Deep-Learning for Change Detection Using Multi-Modal Fusion of Remote Sensing Images: A Review
Abstract
1. Introduction
2. Literature Review Methodology
2.1. Search Strategy
- (“data fusion” OR “multisource” OR “multimodal”) AND (“deep learning” OR “neural networks”) AND (“remote sensing” OR “satellite images”) AND (“change detection”).
- (“homogeneous” OR “heterogeneous”) AND (“deep learning” OR “neural networks”) AND (“remote sensing” OR “satellite images”).
- (“optical and SAR”) AND (“deep learning” OR “neural networks”) AND (“remote sensing” OR “satellite images”) AND (“change detection”).
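The three Boolean strings above can also be assembled programmatically before being submitted to a bibliographic search engine (e.g., Scopus or Web of Science). The short Python sketch below is illustrative only; the or_group helper and the variable names are not part of the original search methodology.

```python
# Illustrative sketch: build the Boolean query strings listed above.
# The or_group() helper and variable names are hypothetical.

def or_group(terms):
    """Join alternative keywords into a parenthesised OR clause."""
    return "(" + " OR ".join(f'"{t}"' for t in terms) + ")"

fusion  = or_group(["data fusion", "multisource", "multimodal"])
dl      = or_group(["deep learning", "neural networks"])
rs      = or_group(["remote sensing", "satellite images"])
cd      = or_group(["change detection"])
hom_het = or_group(["homogeneous", "heterogeneous"])
opt_sar = or_group(["optical and SAR"])

queries = [
    " AND ".join([fusion, dl, rs, cd]),   # first search string
    " AND ".join([hom_het, dl, rs]),      # second search string
    " AND ".join([opt_sar, dl, rs, cd]),  # third search string
]

for q in queries:
    print(q)
```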
2.2. Study Selection
3. Statistical Analysis and Results
4. Multi-Modal Datasets
4.1. Single-Source Data
4.2. Multi-Sensor Data
4.3. Multi-Source Data
4.4. Data Quality and Limitations
5. Multi-Modal Data Fusion for Change Detection
5.1. Feature Fusion Strategy
5.2. Homogeneous-RSCD
5.2.1. CNN-Based
Standard CNNs
CNNs with Attention Mechanisms
5.2.2. Deep Belief Network-Based
5.2.3. RNN-Based
5.2.4. Transformers
5.2.5. Multi-Model Combinations
5.3. Heterogeneous-RSCD
5.3.1. Multi-Scale Change Detection (Optical–Optical)
CNN-Based Methods
GAN-Based Methods
Transformers
Multi-Model Combinations
5.3.2. Multi-Modal Change Detection (Optical–SAR)
CNN-Based Methods
Transformers
GAN-Based Methods
6. Discussion
6.1. Quantitative Evaluation of Hom-RSCD Models
6.2. Quantitative Evaluation of Het-RSCD Models
6.3. Challenges and Future Directions
7. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Aplin, P. Remote sensing: Land cover. Prog. Phys. Geogr. 2004, 28, 283–293. [Google Scholar] [CrossRef]
- Rees, G. Physical Principles of Remote Sensing; Cambridge University Press: Cambridge, UK, 2013. [Google Scholar]
- Pettorelli, N. Satellite Remote Sensing and the Management of Natural Resources; Oxford University Press: Oxford, UK, 2019. [Google Scholar]
- Yin, J.; Dong, J.; Hamm, N.A.; Li, Z.; Wang, J.; Xing, H.; Fu, P. Integrating remote sensing and geospatial big data for urban land use mapping: A review. Int. J. Appl. Earth Obs. Geoinf. 2021, 103, 102514. [Google Scholar] [CrossRef]
- Dash, J.P.; Pearse, G.D.; Watt, M.S. UAV multispectral imagery can complement satellite data for monitoring forest health. Remote Sens. 2018, 10, 1216. [Google Scholar] [CrossRef]
- Cillero Castro, C.; Domínguez Gómez, J.A.; Delgado Martín, J.; Hinojo Sánchez, B.A.; Cereijo Arango, J.L.; Cheda Tuya, F.A.; Díaz-Varela, R. An UAV and satellite multispectral data approach to monitor water quality in small reservoirs. Remote Sens. 2020, 12, 1514. [Google Scholar] [CrossRef]
- Shirmard, H.; Farahbakhsh, E.; Müller, R.D.; Chandra, R. A review of machine learning in processing remote sensing data for mineral exploration. Remote Sens. Environ. 2022, 268, 112750. [Google Scholar] [CrossRef]
- Demchev, D.; Eriksson, L.; Smolanitsky, V. SAR image texture entropy analysis for applicability assessment of area-based and feature-based sea ice tracking approaches. In Proceedings of the EUSAR 2021; 13th European Conference on Synthetic Aperture Radar, VDE, Online, 29 March–1 April 2021; pp. 1–3. [Google Scholar]
- Wen, D.; Huang, X.; Bovolo, F.; Li, J.; Ke, X.; Zhang, A.; Benediktsson, J.A. Change detection from very-high-spatial-resolution optical remote sensing images: Methods, applications, and future directions. IEEE Geosci. Remote Sens. Mag. 2021, 9, 68–101. [Google Scholar] [CrossRef]
- Moreira, A.; Prats-Iraola, P.; Younis, M.; Krieger, G.; Hajnsek, I.; Papathanassiou, K.P. A tutorial on synthetic aperture radar. IEEE Geosci. Remote Sens. Mag. 2013, 1, 6–43. [Google Scholar] [CrossRef]
- Zhang, J. Multi-source remote sensing data fusion: Status and trends. Int. J. Image Data Fusion 2010, 1, 5–24. [Google Scholar] [CrossRef]
- Liu, Y.; Pang, C.; Zhan, Z.; Zhang, X.; Yang, X. Building change detection for remote sensing images using a dual-task constrained deep siamese convolutional network model. IEEE Geosci. Remote Sens. Lett. 2020, 18, 811–815. [Google Scholar] [CrossRef]
- Shi, S.; Zhong, Y.; Zhao, J.; Lv, P.; Liu, Y.; Zhang, L. Land-use/land-cover change detection based on class-prior object-oriented conditional random field framework for high spatial resolution remote sensing imagery. IEEE Trans. Geosci. Remote Sens. 2020, 60, 1–16. [Google Scholar] [CrossRef]
- Brunner, D.; Bruzzone, L.; Lemoine, G. Change detection for earthquake damage assessment in built-up areas using very high resolution optical and SAR imagery. In Proceedings of the 2010 IEEE International Geoscience and Remote Sensing Symposium, IEEE, Honolulu, HI, USA, 25–30 July 2010; pp. 3210–3213. [Google Scholar]
- You, Y.; Cao, J.; Zhou, W. A survey of change detection methods based on remote sensing images for multi-source and multi-objective scenarios. Remote Sens. 2020, 12, 2460. [Google Scholar] [CrossRef]
- Deng, J.; Wang, K.; Deng, Y.; Qi, G. PCA-based land-use change detection and analysis using multitemporal and multisensor satellite data. Int. J. Remote Sens. 2008, 29, 4823–4838. [Google Scholar] [CrossRef]
- Bovolo, F.; Bruzzone, L.; Marconcini, M. A novel approach to unsupervised change detection based on a semisupervised SVM and a similarity measure. IEEE Trans. Geosci. Remote Sens. 2008, 46, 2070–2082. [Google Scholar] [CrossRef]
- Hao, M.; Zhou, M.; Jin, J.; Shi, W. An advanced superpixel-based Markov random field model for unsupervised change detection. IEEE Geosci. Remote Sens. Lett. 2019, 17, 1401–1405. [Google Scholar] [CrossRef]
- Zhou, L.; Cao, G.; Li, Y.; Shang, Y. Change detection based on conditional random field with region connection constraints in high-resolution remote sensing images. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2016, 9, 3478–3488. [Google Scholar] [CrossRef]
- Tan, K.; Jin, X.; Plaza, A.; Wang, X.; Xiao, L.; Du, P. Automatic change detection in high-resolution remote sensing images by using a multiple classifier system and spectral–spatial features. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2016, 9, 3439–3451. [Google Scholar] [CrossRef]
- Seo, D.K.; Kim, Y.H.; Eo, Y.D.; Lee, M.H.; Park, W.Y. Fusion of SAR and multispectral images using random forest regression for change detection. ISPRS Int. J. Geo-Inf. 2018, 7, 401. [Google Scholar] [CrossRef]
- Wang, C.; Wang, X. Building change detection from multi-source remote sensing images based on multi-feature fusion and extreme learning machine. Int. J. Remote Sens. 2021, 42, 2246–2257. [Google Scholar] [CrossRef]
- Touati, R.; Mignotte, M.; Dahmane, M. Multimodal change detection in remote sensing images using an unsupervised pixel pairwise-based Markov random field model. IEEE Trans. Image Process. 2019, 29, 757–767. [Google Scholar] [CrossRef]
- Cheng, G.; Huang, Y.; Li, X.; Lyu, S.; Xu, Z.; Zhao, H.; Zhao, Q.; Xiang, S. Change detection methods for remote sensing in the last decade: A comprehensive review. Remote Sens. 2024, 16, 2355. [Google Scholar] [CrossRef]
- He, K.; Zhang, X.; Ren, S.; Sun, J. Deep residual learning for image recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 770–778. [Google Scholar]
- Schmidt, R.M. Recurrent neural networks (rnns): A gentle introduction and overview. arXiv 2019, arXiv:1912.05911. [Google Scholar]
- Goodfellow, I.; Pouget-Abadie, J.; Mirza, M.; Xu, B.; Warde-Farley, D.; Ozair, S.; Courville, A.; Bengio, Y. Generative adversarial nets. Adv. Neural Inf. Process. Syst. 2014, 27, 2672–2680. [Google Scholar]
- Li, J.; Hong, D.; Gao, L.; Yao, J.; Zheng, K.; Zhang, B.; Chanussot, J. Deep learning in multimodal remote sensing data fusion: A comprehensive review. Int. J. Appl. Earth Obs. Geoinf. 2022, 112, 102926. [Google Scholar] [CrossRef]
- Shafique, A.; Cao, G.; Khan, Z.; Asad, M.; Aslam, M. Deep learning-based change detection in remote sensing images: A review. Remote Sens. 2022, 14, 871. [Google Scholar] [CrossRef]
- Jiang, H.; Peng, M.; Zhong, Y.; Xie, H.; Hao, Z.; Lin, J.; Ma, X.; Hu, X. A survey on deep learning-based change detection from high-resolution remote sensing images. Remote Sens. 2022, 14, 1552. [Google Scholar] [CrossRef]
- Bai, T.; Wang, L.; Yin, D.; Sun, K.; Chen, Y.; Li, W.; Li, D. Deep learning for change detection in remote sensing: A review. Geo-Spat. Inf. Sci. 2023, 26, 262–288. [Google Scholar] [CrossRef]
- Parelius, E.J. A review of deep-learning methods for change detection in multispectral remote sensing images. Remote Sens. 2023, 15, 2092. [Google Scholar] [CrossRef]
- Moher, D.; Liberati, A.; Tetzlaff, J.; Altman, D.G.; PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. Ann. Intern. Med. 2009, 151, 264–269. [Google Scholar] [CrossRef]
- LeCun, Y.; Bengio, Y.; Hinton, G. Deep learning. Nature 2015, 521, 436–444. [Google Scholar] [CrossRef]
- Daudt, R.C.; Le Saux, B.; Boulch, A.; Gousseau, Y. Urban change detection for multispectral earth observation using convolutional neural networks. In Proceedings of the IGARSS 2018—2018 IEEE International Geoscience and Remote Sensing Symposium, Valencia, Spain, 22–27 July 2018; pp. 2115–2118. [Google Scholar]
- Wang, X.; Cheng, W.; Feng, Y.; Song, R. TSCNet: Topological structure coupling network for change detection of heterogeneous remote sensing images. Remote Sens. 2023, 15, 621. [Google Scholar] [CrossRef]
- Chen, H.; Yokoya, N.; Wu, C.; Du, B. Unsupervised multimodal change detection based on structural relationship graph representation learning. IEEE Trans. Geosci. Remote Sens. 2022, 60, 1–18. [Google Scholar] [CrossRef]
- Chen, H.; Shi, Z. A spatial-temporal attention-based method and a new dataset for remote sensing image change detection. Remote Sens. 2020, 12, 1662. [Google Scholar] [CrossRef]
- Ji, S.; Wei, S.; Lu, M. Fully convolutional networks for multisource building extraction from an open aerial and satellite imagery data set. IEEE Trans. Geosci. Remote Sens. 2018, 57, 574–586. [Google Scholar] [CrossRef]
- Feng, S.; Fan, Y.; Tang, Y.; Cheng, H.; Zhao, C.; Zhu, Y.; Cheng, C. A change detection method based on multi-scale adaptive convolution kernel network and multimodal conditional random field for multi-temporal multispectral images. Remote Sens. 2022, 14, 5368. [Google Scholar] [CrossRef]
- Shen, L.; Lu, Y.; Chen, H.; Wei, H.; Xie, D.; Yue, J.; Chen, R.; Lv, S.; Jiang, B. S2Looking: A satellite side-looking dataset for building change detection. Remote Sens. 2021, 13, 5094. [Google Scholar] [CrossRef]
- Lebedev, M.; Vizilter, Y.V.; Vygolov, O.; Knyaz, V.A.; Rubis, A.Y. Change detection in remote sensing images using conditional adversarial networks. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2018, 42, 565–571. [Google Scholar] [CrossRef]
- Wang, M.; Tan, K.; Jia, X.; Wang, X.; Chen, Y. A deep siamese network with hybrid convolutional feature extraction module for change detection based on multi-sensor remote sensing images. Remote Sens. 2020, 12, 205. [Google Scholar] [CrossRef]
- Volpi, M.; Camps-Valls, G.; Tuia, D. Spectral alignment of multi-temporal cross-sensor images with automated kernel canonical correlation analysis. ISPRS J. Photogramm. Remote Sens. 2015, 107, 50–63. [Google Scholar] [CrossRef]
- Saha, S.; Bovolo, F.; Bruzzone, L. Unsupervised multiple-change detection in VHR multisensor images via deep-learning based adaptation. In Proceedings of the IGARSS 2019—2019 IEEE International Geoscience and Remote Sensing Symposium, Yokohama, Japan, 28 July–2 August 2019; pp. 5033–5036. [Google Scholar]
- Jiang, H.; Hu, X.; Li, K.; Zhang, J.; Gong, J.; Zhang, M. PGA-SiamNet: Pyramid feature-based attention-guided siamese network for remote sensing orthoimagery building change detection. Remote Sens. 2020, 12, 484. [Google Scholar] [CrossRef]
- Shao, R.; Du, C.; Chen, H.; Li, J. SUNet: Change detection for heterogeneous remote sensing images from satellite and UAV using a dual-channel fully convolution network. Remote Sens. 2021, 13, 3750. [Google Scholar] [CrossRef]
- Li, Y.; Zhou, Y.; Zhang, Y.; Zhong, L.; Wang, J.; Chen, J. DKDFN: Domain knowledge-guided deep collaborative fusion network for multimodal unitemporal remote sensing land cover classification. ISPRS J. Photogramm. Remote Sens. 2022, 186, 170–189. [Google Scholar] [CrossRef]
- Robinson, C.; Malkin, K.; Jojic, N.; Chen, H.; Qin, R.; Xiao, C.; Schmitt, M.; Ghamisi, P.; Hänsch, R.; Yokoya, N. Global land-cover mapping with weak supervision: Outcome of the 2020 IEEE GRSS data fusion contest. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2021, 14, 3185–3199. [Google Scholar] [CrossRef]
- Rottensteiner, F.; Sohn, G.; Jung, J.; Gerke, M.; Baillard, C.; Benitez, S.; Breitkopf, U. The ISPRS benchmark on urban object classification and 3D building reconstruction. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. I-3 2012, 1, 293–298. [Google Scholar] [CrossRef]
- Lv, Z.; Huang, H.; Gao, L.; Benediktsson, J.A.; Zhao, M.; Shi, C. Simple multiscale UNet for change detection with heterogeneous remote sensing images. IEEE Geosci. Remote Sens. Lett. 2022, 19, 1–5. [Google Scholar] [CrossRef]
- Xu, Y.; Du, B.; Zhang, L.; Cerra, D.; Pato, M.; Carmona, E.; Prasad, S.; Yokoya, N.; Hänsch, R.; Le Saux, B. Advanced multi-sensor optical remote sensing for urban land use and land cover classification: Outcome of the 2018 IEEE GRSS data fusion contest. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2019, 12, 1709–1724. [Google Scholar] [CrossRef]
- Hong, D.; Hu, J.; Yao, J.; Chanussot, J.; Zhu, X.X. Multimodal remote sensing benchmark datasets for land cover classification with a shared and specific feature learning model. ISPRS J. Photogramm. Remote Sens. 2021, 178, 68–80. [Google Scholar] [CrossRef]
- Gader, P.; Zare, A.; Close, R.; Aitken, J.; Tuell, G. Muufl Gulfport Hyperspectral and Lidar Airborne Data Set; University of Florida: Gainesville, FL, USA, 2013. [Google Scholar]
- Li, X.; Du, Z.; Huang, Y.; Tan, Z. A deep translation (GAN) based change detection network for optical and SAR remote sensing images. ISPRS J. Photogramm. Remote Sens. 2021, 179, 14–34. [Google Scholar] [CrossRef]
- Huang, C.; Chen, Y.; Zhang, S.; Wu, J. Detecting, extracting, and monitoring surface water from space using optical sensors: A review. Rev. Geophys. 2018, 56, 333–360. [Google Scholar] [CrossRef]
- Zhu, X.X.; Tuia, D.; Mou, L.; Xia, G.S.; Zhang, L.; Xu, F.; Fraundorfer, F. Deep learning in remote sensing: A comprehensive review and list of resources. IEEE Geosci. Remote Sens. Mag. 2017, 5, 8–36. [Google Scholar] [CrossRef]
- Ghamisi, P.; Rasti, B.; Yokoya, N.; Wang, Q.; Hofle, B.; Bruzzone, L.; Bovolo, F.; Chi, M.; Anders, K.; Gloaguen, R.; et al. Multisource and multitemporal data fusion in remote sensing: A comprehensive review of the state of the art. IEEE Geosci. Remote Sens. Mag. 2019, 7, 6–39. [Google Scholar] [CrossRef]
- Gómez-Chova, L.; Tuia, D.; Moser, G.; Camps-Valls, G. Multimodal classification of remote sensing images: A review and future directions. Proc. IEEE 2015, 103, 1560–1584. [Google Scholar] [CrossRef]
- Daudt, R.C.; Le Saux, B.; Boulch, A.; Gousseau, Y. Multitask learning for large-scale semantic change detection. Comput. Vis. Image Underst. 2019, 187, 102783. [Google Scholar] [CrossRef]
- Peng, D.; Zhang, Y.; Guan, H. End-to-end change detection for high resolution satellite images using improved UNet++. Remote Sens. 2019, 11, 1382. [Google Scholar] [CrossRef]
- Zheng, Z.; Wan, Y.; Zhang, Y.; Xiang, S.; Peng, D.; Zhang, B. CLNet: Cross-layer convolutional neural network for change detection in optical remote sensing imagery. ISPRS J. Photogramm. Remote Sens. 2021, 175, 247–267. [Google Scholar] [CrossRef]
- Lei, Y.; Peng, D.; Zhang, P.; Ke, Q.; Li, H. Hierarchical paired channel fusion network for street scene change detection. IEEE Trans. Image Process. 2020, 30, 55–67. [Google Scholar] [CrossRef] [PubMed]
- Zhang, M.; Xu, G.; Chen, K.; Yan, M.; Sun, X. Triplet-based semantic relation learning for aerial remote sensing image change detection. IEEE Geosci. Remote Sens. Lett. 2018, 16, 266–270. [Google Scholar] [CrossRef]
- Zhang, C.; Yue, P.; Tapete, D.; Jiang, L.; Shangguan, B.; Huang, L.; Liu, G. A deeply supervised image fusion network for change detection in high resolution bi-temporal remote sensing images. ISPRS J. Photogramm. Remote Sens. 2020, 166, 183–200. [Google Scholar] [CrossRef]
- Bertinetto, L.; Valmadre, J.; Henriques, J.F.; Vedaldi, A.; Torr, P.H. Fully-convolutional siamese networks for object tracking. In Proceedings of the Computer Vision–ECCV 2016 Workshops, Amsterdam, The Netherlands, 8–10 and 15–16 October 2016; Proceedings, Part II 14. Springer: Berlin/Heidelberg, Germany, 2016; pp. 850–865. [Google Scholar]
- Adarme, M.O.; Feitosa, R.Q.; Happ, P.N.; De Almeida, C.A.; Gomes, A.R. Evaluation of Deep Learning Techniques for Deforestation Detection in the Brazilian Amazon and Cerrado Biomes From Remote Sensing Imagery. Remote Sens. 2020, 12, 910. [Google Scholar] [CrossRef]
- Zhang, J.; Wang, Z.; Bai, L.; Song, G.; Tao, J.; Chen, L. Deforestation Detection Based on U-Net and LSTM in Optical Satellite Remote Sensing Images. In Proceedings of the 2021 IEEE International Geoscience and Remote Sensing Symposium IGARSS, IEEE, Brussels, Belgium, 11–16 July 2021; pp. 3753–3756. [Google Scholar]
- John, D.; Zhang, C. An attention-based U-Net for detecting deforestation within satellite sensor imagery. Int. J. Appl. Earth Obs. Geoinf. 2022, 107, 102685. [Google Scholar] [CrossRef]
- Alshehri, M.; Ouadou, A.; Scott, G.J. Deep Transformer-based Network Deforestation Detection in the Brazilian Amazon Using Sentinel-2 Imagery. IEEE Geosci. Remote Sens. Lett. 2024, 21, 1–5. [Google Scholar] [CrossRef]
- Bidari, I.; Chickerur, S. Deep Recurrent Residual U-Net with Semi-Supervised Learning for Deforestation Change Detection. SN Comput. Sci. 2024, 5, 893. [Google Scholar] [CrossRef]
- Papadomanolaki, M.; Verma, S.; Vakalopoulou, M.; Gupta, S.; Karantzalos, K. Detecting urban changes with recurrent neural networks from multitemporal Sentinel-2 data. In Proceedings of the IGARSS 2019-2019 IEEE International Geoscience and Remote Sensing Symposium, Yokohama, Japan, 28 July–2 August 2019; pp. 214–217. [Google Scholar]
- Khusni, U.; Dewangkoro, H.I.; Arymurthy, A.M. Urban area change detection with combining CNN and RNN from Sentinel-2 multispectral remote sensing data. In Proceedings of the 2020 3rd International Conference on Computer and Informatics Engineering (IC2IE), Yogyakarta, Indonesia, 15–16 September 2020; pp. 171–175. [Google Scholar]
- Huang, F.; Shen, G.; Hong, H.; Wei, L. Change detection of buildings with the utilization of a deep belief network and high-resolution remote sensing images. Fractals 2022, 30, 2240255. [Google Scholar] [CrossRef]
- Pang, L.; Sun, J.; Chi, Y.; Yang, Y.; Zhang, F.; Zhang, L. CD-TransUNet: A hybrid transformer network for the change detection of urban buildings using l-band SAR images. Sustainability 2022, 14, 9847. [Google Scholar] [CrossRef]
- Shafique, A.; Seydi, S.T.; Cao, G. BCD-Net: Building change detection based on fully scale connected U-Net and subpixel convolution. Int. J. Remote Sens. 2023, 44, 7416–7438. [Google Scholar] [CrossRef]
- Xiong, J.; Liu, F.; Wang, X.; Yang, C. Siamese Transformer-Based Building Change Detection in Remote Sensing Images. Sensors 2024, 24, 1268. [Google Scholar] [CrossRef]
- Ahmed, N.; Hoque, M.A.A.; Arabameri, A.; Pal, S.C.; Chakrabortty, R.; Jui, J. Flood susceptibility mapping in Brahmaputra floodplain of Bangladesh using deep boost, deep learning neural network, and artificial neural network. Geocarto Int. 2022, 37, 8770–8791. [Google Scholar] [CrossRef]
- Lemenkova, P. Deep Learning Methods of Satellite Image Processing for Monitoring of Flood Dynamics in the Ganges Delta, Bangladesh. Water 2024, 16, 1141. [Google Scholar] [CrossRef]
- Daudt, R.C.; Le Saux, B.; Boulch, A. Fully convolutional siamese networks for change detection. In Proceedings of the 25th IEEE International Conference on Image Processing (ICIP), Athens, Greece, 7–10 October 2018; pp. 4063–4067. [Google Scholar]
- Yang, Y.; Zhu, D.; Qu, T.; Wang, Q.; Ren, F.; Cheng, C. Single-stream CNN with learnable architecture for multisource remote sensing data. IEEE Trans. Geosci. Remote Sens. 2022, 60, 1–18. [Google Scholar] [CrossRef]
- Chen, H.; Wu, C.; Du, B.; Zhang, L. Deep siamese multi-scale convolutional network for change detection in multi-temporal VHR images. In Proceedings of the 2019 10th International Workshop on the Analysis of Multitemporal Remote Sensing Images (MultiTemp), Shanghai, China, 5–7 August 2019; pp. 1–4. [Google Scholar]
- Zhang, M.; Shi, W. A feature difference convolutional neural network-based change detection method. IEEE Trans. Geosci. Remote Sens. 2020, 58, 7232–7246. [Google Scholar] [CrossRef]
- Iftene, M.; Larabi, M.E.A.; Karoui, M.S. End-to-end change detection in satellite remote sensing imagery. In Proceedings of the 2021 IEEE International Geoscience and Remote Sensing Symposium IGARSS, Brussels, Belgium, 11–16 July 2021; pp. 4356–4359. [Google Scholar]
- Zhang, H.; Lin, M.; Yang, G.; Zhang, L. ESCNet: An end-to-end superpixel-enhanced change detection network for very-high-resolution remote sensing images. IEEE Trans. Neural Netw. Learn. Syst. 2021, 34, 28–42. [Google Scholar] [CrossRef]
- Chen, P.; Li, C.; Zhang, B.; Chen, Z.; Yang, X.; Lu, K.; Zhuang, L. A region-based feature fusion network for VHR image change detection. Remote Sens. 2022, 14, 5577. [Google Scholar] [CrossRef]
- Zhang, X.; He, L.; Qin, K.; Dang, Q.; Si, H.; Tang, X.; Jiao, L. SMD-Net: Siamese multi-scale difference-enhancement network for change detection in remote sensing. Remote Sens. 2022, 14, 1580. [Google Scholar] [CrossRef]
- Wang, Q.; Li, M.; Li, G.; Zhang, J.; Yan, S.; Chen, Z.; Zhang, X.; Chen, G. High-resolution remote sensing image change detection method based on improved siamese U-Net. Remote Sens. 2023, 15, 3517. [Google Scholar] [CrossRef]
- Wang, J.; Liu, F.; Jiao, L.; Wang, H.; Yang, H.; Liu, X.; Li, L.; Chen, P. SSCFNet: A spatial-spectral cross fusion network for remote sensing change detection. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2023, 16, 4000–4012. [Google Scholar] [CrossRef]
- Zhang, W.; Zhang, Y.; Su, L.; Mei, C.; Lu, X. Difference-enhancement triplet network for change detection in multispectral images. IEEE Geosci. Remote Sens. Lett. 2023, 20, 1–5. [Google Scholar] [CrossRef]
- Yu, X.; Fan, J.; Chen, J.; Zhang, P.; Zhou, Y.; Han, L. NestNet: A multiscale convolutional neural network for remote sensing image change detection. Int. J. Remote Sens. 2021, 42, 4898–4921. [Google Scholar] [CrossRef]
- Zhang, X.; Yue, Y.; Gao, W.; Yun, S.; Su, Q.; Yin, H.; Zhang, Y. DifUnet++: A satellite images change detection network based on UNet++ and differential pyramid. IEEE Geosci. Remote Sens. Lett. 2021, 19, 1–5. [Google Scholar] [CrossRef]
- Fang, S.; Li, K.; Shao, J.; Li, Z. SNUNet-CD: A densely connected siamese network for change detection of VHR images. IEEE Geosci. Remote Sens. Lett. 2021, 19, 1–5. [Google Scholar] [CrossRef]
- Qian, J.; Xia, M.; Zhang, Y.; Liu, J.; Xu, Y. TCDNet: Trilateral change detection network for Google Earth image. Remote Sens. 2020, 12, 2669. [Google Scholar] [CrossRef]
- Zhang, W.; Lu, X. The spectral-spatial joint learning for change detection in multispectral imagery. Remote Sens. 2019, 11, 240. [Google Scholar] [CrossRef]
- Ye, Y.; Zhou, L.; Zhu, B.; Yang, C.; Sun, M.; Fan, J.; Fu, Z. Feature decomposition-optimization-reorganization network for building change detection in remote sensing images. Remote Sens. 2022, 14, 722. [Google Scholar] [CrossRef]
- Lei, J.; Gu, Y.; Xie, W.; Li, Y.; Du, Q. Boundary extraction constrained siamese network for remote sensing image change detection. IEEE Trans. Geosci. Remote Sens. 2022, 60, 1–13. [Google Scholar] [CrossRef]
- Ding, L.; Zhu, K.; Peng, D.; Tang, H.; Yang, K.; Bruzzone, L. Adapting segment anything model for change detection in VHR remote sensing images. IEEE Trans. Geosci. Remote Sens. 2024, 62, 1–11. [Google Scholar] [CrossRef]
- Zhao, X.; Ding, W.; An, Y.; Du, Y.; Yu, T.; Li, M.; Tang, M.; Wang, J. Fast segment anything. arXiv 2023, arXiv:2306.12156. [Google Scholar]
- Vaswani, A.; Shazeer, N.; Parmar, N.; Uszkoreit, J.; Jones, L.; Gomez, A.N.; Kaiser, Ł.; Polosukhin, I. Attention is all you need. In Proceedings of the 31st Conference on Neural Information Processing Systems (NIPS 2017), Long Beach, CA, USA, 4–9 December 2017; Volume 30. [Google Scholar]
- Jiang, M.; Zhang, X.; Sun, Y.; Feng, W.; Gan, Q.; Ruan, Y. AFSNet: Attention-guided full-scale feature aggregation network for high-resolution remote sensing image change detection. Giscience Remote Sens. 2022, 59, 1882–1900. [Google Scholar] [CrossRef]
- Adriano, B.; Yokoya, N.; Xia, J.; Miura, H.; Liu, W.; Matsuoka, M.; Koshimura, S. Learning from multimodal and multitemporal earth observation data for building damage mapping. ISPRS J. Photogramm. Remote Sens. 2021, 175, 132–143. [Google Scholar] [CrossRef]
- Li, H.; Wang, L.; Cheng, S. HARNU-Net: Hierarchical attention residual nested U-Net for change detection in remote sensing images. Sensors 2022, 22, 4626. [Google Scholar] [CrossRef]
- Chen, J.; Yuan, Z.; Peng, J.; Chen, L.; Huang, H.; Zhu, J.; Liu, Y.; Li, H. DASNet: Dual attentive fully convolutional siamese networks for change detection in high-resolution satellite images. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2020, 14, 1194–1206. [Google Scholar] [CrossRef]
- Lu, D.; Wang, L.; Cheng, S.; Li, Y.; Du, A. CANet: A combined attention network for remote sensing image change detection. Information 2021, 12, 364. [Google Scholar] [CrossRef]
- Li, X.; Lei, L.; Sun, Y.; Li, M.; Kuang, G. Multimodal bilinear fusion network with second-order attention-based channel selection for land cover classification. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2020, 13, 1011–1026. [Google Scholar] [CrossRef]
- Ma, J.; Shi, G.; Li, Y.; Zhao, Z. MAFF-Net: Multi-attention guided feature fusion network for change detection in remote sensing images. Sensors 2022, 22, 888. [Google Scholar] [CrossRef] [PubMed]
- Chen, J.; Fan, J.; Zhang, M.; Zhou, Y.; Shen, C. MSF-Net: A multiscale supervised fusion network for building change detection in high-resolution remote sensing images. IEEE Access 2022, 10, 30925–30938. [Google Scholar] [CrossRef]
- Xu, X.; Zhou, Y.; Lu, X.; Chen, Z. FERA-Net: A building change detection method for high-resolution remote sensing imagery based on residual attention and high-frequency features. Remote Sens. 2023, 15, 395. [Google Scholar] [CrossRef]
- Zhong, H.; Wu, C. T-UNet: Triplet UNet for change detection in high-resolution remote sensing images. arXiv 2023, arXiv:2308.02356. [Google Scholar] [CrossRef]
- Sivasankari, A.; Jayalakshmi, S. Land cover clustering for change detection using deep belief network. In Proceedings of the 2022 International Conference on Electronics and Renewable Systems (ICEARS), Tuticorin, India, 16–18 March 2022; pp. 815–822. [Google Scholar]
- Jia, M.; Zhao, Z. Change detection in synthetic aperture radar images based on a generalized gamma deep belief networks. Sensors 2021, 21, 8290. [Google Scholar] [CrossRef]
- Samadi, F.; Akbarizadeh, G.; Kaabi, H. Change detection in SAR images using deep belief network: A new training approach based on morphological images. IET Image Process. 2019, 13, 2255–2264. [Google Scholar] [CrossRef]
- Mou, L.; Zhu, X.X. A recurrent convolutional neural network for land cover change detection in multispectral images. In Proceedings of the IGARSS 2018—2018 IEEE International Geoscience and Remote Sensing Symposium, Valencia, Spain, 22–27 July 2018; pp. 4363–4366. [Google Scholar]
- Mou, L.; Bruzzone, L.; Zhu, X.X. Learning spectral-spatial-temporal features via a recurrent convolutional neural network for change detection in multispectral imagery. IEEE Trans. Geosci. Remote Sens. 2018, 57, 924–935. [Google Scholar] [CrossRef]
- Lyu, H.; Lu, H.; Mou, L.; Li, W.; Wright, J.; Li, X.; Li, X.; Zhu, X.X.; Wang, J.; Yu, L.; et al. Long-term annual mapping of four cities on different continents by applying a deep information learning method to landsat data. Remote Sens. 2018, 10, 471. [Google Scholar] [CrossRef]
- Sun, S.; Mu, L.; Wang, L.; Liu, P. L-UNet: An LSTM network for remote sensing image change detection. IEEE Geosci. Remote Sens. Lett. 2020, 19, 1–5. [Google Scholar] [CrossRef]
- Zhao, Y.; Chen, P.; Chen, Z.; Bai, Y.; Zhao, Z.; Yang, X. A triple-stream network with cross-stage feature fusion for high-resolution image change detection. IEEE Trans. Geosci. Remote Sens. 2023, 61, 1–17. [Google Scholar] [CrossRef]
- Zhu, Y.; Lv, K.; Yu, Y.; Xu, W. Edge-guided parallel network for VHR remote sensing image change detection. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2023, 16, 7791–7803. [Google Scholar] [CrossRef]
- Sefrin, O.; Riese, F.M.; Keller, S. Deep learning for land cover change detection. Remote Sens. 2020, 13, 78. [Google Scholar] [CrossRef]
- Jing, R.; Liu, S.; Gong, Z.; Wang, Z.; Guan, H.; Gautam, A.; Zhao, W. Object-Based change detection for VHR remote sensing images based on a trisiamese-LSTM. Int. J. Remote Sens. 2020, 41, 6209–6231. [Google Scholar] [CrossRef]
- Bandara, W.G.C.; Patel, V.M. A transformer-based siamese network for change detection. In Proceedings of the IGARSS 2022—2022 IEEE International Geoscience and Remote Sensing Symposium, Kuala Lumpur, Malaysia, 17–22 July 2022; pp. 207–210. [Google Scholar]
- Yuan, P.; Zhao, Q.; Zhao, X.; Wang, X.; Long, X.; Zheng, Y. A transformer-based siamese network and an open optical dataset for semantic change detection of remote sensing images. Int. J. Digit. Earth 2022, 15, 1506–1525. [Google Scholar] [CrossRef]
- Yan, T.; Wan, Z.; Zhang, P. Fully transformer network for change detection of remote sensing images. In Proceedings of the Asian Conference on Computer Vision, Macao, China, 4–8 December 2022; pp. 1691–1708. [Google Scholar]
- Zhang, C.; Wang, L.; Cheng, S.; Li, Y. SwinSUNet: Pure transformer network for remote sensing image change detection. IEEE Trans. Geosci. Remote Sens. 2022, 60, 1–13. [Google Scholar] [CrossRef]
- Pan, J.; Bai, Y.; Shu, Q.; Zhang, Z.; Hu, J.; Wang, M. M-Swin: Transformer-based Multi-scale Feature Fusion Change Detection Network within Cropland for Remote Sensing Images. IEEE Trans. Geosci. Remote Sens. 2024, 62, 1–16. [Google Scholar] [CrossRef]
- Song, L.; Xia, M.; Xu, Y.; Weng, L.; Hu, K.; Lin, H.; Qian, M. Multi-granularity siamese transformer-based change detection in remote sensing imagery. Eng. Appl. Artif. Intell. 2024, 136, 108960. [Google Scholar] [CrossRef]
- Xu, X.; Li, J.; Chen, Z. TCIANet: Transformer-based context information aggregation network for remote sensing image change detection. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2023, 16, 1951–1971. [Google Scholar] [CrossRef]
- Ma, J.; Duan, J.; Tang, X.; Zhang, X.; Jiao, L. Eatder: Edge-assisted adaptive transformer detector for remote sensing change detection. IEEE Trans. Geosci. Remote Sens. 2023, 62, 1–15. [Google Scholar] [CrossRef]
- Chen, H.; Qi, Z.; Shi, Z. Remote sensing image change detection with transformers. IEEE Trans. Geosci. Remote Sens. 2021, 60, 1–14. [Google Scholar] [CrossRef]
- Song, X.; Hua, Z.; Li, J. PSTNet: Progressive sampling transformer network for remote sensing image change detection. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2022, 15, 8442–8455. [Google Scholar] [CrossRef]
- Zhang, K.; Zhao, X.; Zhang, F.; Ding, L.; Sun, J.; Bruzzone, L. Relation changes matter: Cross-temporal difference transformer for change detection in remote sensing images. IEEE Trans. Geosci. Remote Sens. 2023, 61, 1–5. [Google Scholar] [CrossRef]
- Ding, L.; Zhang, J.; Guo, H.; Zhang, K.; Liu, B.; Bruzzone, L. Joint spatio-temporal modeling for semantic change detection in remote sensing images. IEEE Trans. Geosci. Remote Sens. 2024, 62, 1–14. [Google Scholar] [CrossRef]
- Zhou, Y.; Huo, C.; Zhu, J.; Huo, L.; Pan, C. DCAT: Dual cross-attention-based transformer for change detection. Remote Sens. 2023, 15, 2395. [Google Scholar] [CrossRef]
- Noman, M.; Fiaz, M.; Cholakkal, H.; Narayan, S.; Anwer, R.M.; Khan, S.; Khan, F.S. Remote sensing change detection with transformers trained from scratch. IEEE Trans. Geosci. Remote Sens. 2024, 62, 4704214. [Google Scholar] [CrossRef]
- Yuan, J.; Wang, L.; Cheng, S. STransUNet: A siamese transUNet-based remote sensing image change detection network. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2022, 15, 9241–9253. [Google Scholar] [CrossRef]
- Deng, Y.; Meng, Y.; Chen, J.; Yue, A.; Liu, D.; Chen, J. TChange: A hybrid transformer-CNN change detection network. Remote Sens. 2023, 15, 1219. [Google Scholar] [CrossRef]
- Wang, G.; Li, B.; Zhang, T.; Zhang, S. A network combining a transformer and a convolutional neural network for remote sensing image change detection. Remote Sens. 2022, 14, 2228. [Google Scholar] [CrossRef]
- Li, Q.; Zhong, R.; Du, X.; Du, Y. TransUNetCD: A hybrid transformer network for change detection in optical remote-sensing images. IEEE Trans. Geosci. Remote Sens. 2022, 60, 1–19. [Google Scholar] [CrossRef]
- Liu, M.; Chai, Z.; Deng, H.; Liu, R. A CNN-transformer network with multiscale context aggregation for fine-grained cropland change detection. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2022, 15, 4297–4306. [Google Scholar] [CrossRef]
- Yin, M.; Chen, Z.; Zhang, C. A CNN-transformer network combining CBAM for change detection in high-resolution remote sensing images. Remote Sens. 2023, 15, 2406. [Google Scholar] [CrossRef]
- Wang, W.; Tan, X.; Zhang, P.; Wang, X. A CBAM based multiscale transformer fusion approach for remote sensing image change detection. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2022, 15, 6817–6825. [Google Scholar] [CrossRef]
- Song, X.; Hua, Z.; Li, J. LHDACT: Lightweight hybrid dual attention CNN and transformer network for remote sensing image change detection. IEEE Geosci. Remote Sens. Lett. 2023, 20, 1–5. [Google Scholar] [CrossRef]
- Jiang, M.; Chen, Y.; Dong, Z.; Liu, X.; Zhang, X.; Zhang, H. Multiscale fusion CNN-transformer network for high-resolution remote sensing image change detection. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2024, 17, 5280–5293. [Google Scholar] [CrossRef]
- Tang, W.; Wu, K.; Zhang, Y.; Zhan, Y. A siamese network based on multiple attention and multilayer transformers for change detection. IEEE Trans. Geosci. Remote Sens. 2023, 61, 5219015. [Google Scholar] [CrossRef]
- Niu, Y.; Guo, H.; Lu, J.; Ding, L.; Yu, D. SMNet: Symmetric multi-task network for semantic change detection in remote sensing images based on CNN and transformer. Remote Sens. 2023, 15, 949. [Google Scholar] [CrossRef]
- Li, W.; Xue, L.; Wang, X.; Li, G. Mctnet: A multi-scale cnn-transformer network for change detection in optical remote sensing images. In Proceedings of the 2023 26th International Conference on Information Fusion (FUSION), Charleston, SC, USA, 27–30 July 2023; pp. 1–5. [Google Scholar]
- Tang, X.; Zhang, T.; Ma, J.; Zhang, X.; Liu, F.; Jiao, L. Wnet: W-shaped hierarchical network for remote sensing image change detection. IEEE Trans. Geosci. Remote Sens. 2023, 61, 5615814. [Google Scholar] [CrossRef]
- Zhang, X.; Cheng, S.; Wang, L.; Li, H. Asymmetric cross-attention hierarchical network based on CNN and transformer for bitemporal remote sensing images change detection. IEEE Trans. Geosci. Remote Sens. 2023, 61, 1–15. [Google Scholar] [CrossRef]
- Feng, Y.; Xu, H.; Jiang, J.; Liu, H.; Zheng, J. ICIF-Net: Intra-scale cross-interaction and inter-scale feature fusion network for bitemporal remote sensing images change detection. IEEE Trans. Geosci. Remote Sens. 2022, 60, 1–13. [Google Scholar] [CrossRef]
- Fu, Z.; Li, J.; Ren, L.; Chen, Z. Slddnet: Stage-wise short and long distance dependency network for remote sensing change detection. IEEE Trans. Geosci. Remote Sens. 2023, 61, 1–19. [Google Scholar] [CrossRef]
- Zhang, C.; Wang, L.; Cheng, S. HCGNet: A Hybrid Change Detection Network Based on CNN and GNN. IEEE Trans. Geosci. Remote Sens. 2024, 62, 1–12. [Google Scholar] [CrossRef]
- Zhu, Y.; Li, Q.; Lv, Z.; Falco, N. Novel land cover change detection deep learning framework with very small initial samples using heterogeneous remote sensing images. Remote Sens. 2023, 15, 4609. [Google Scholar] [CrossRef]
- Liu, M.; Shi, Q.; Marinoni, A.; He, D.; Liu, X.; Zhang, L. Super-resolution-based change detection network with stacked attention module for images with different resolutions. IEEE Trans. Geosci. Remote Sens. 2021, 60, 1–18. [Google Scholar] [CrossRef]
- Tian, J.; Peng, D.; Guan, H.; Ding, H. RACDNet: Resolution-and alignment-aware change detection network for optical remote sensing imagery. Remote Sens. 2022, 14, 4527. [Google Scholar] [CrossRef]
- Liu, M.; Shi, Q.; Liu, P.; Wan, C. Siamese generative adversarial network for change detection under different scales. In Proceedings of the IGARSS 2020—2020 IEEE International Geoscience and Remote Sensing Symposium, Waikoloa, HI, USA, 26 September–2 October 2020; pp. 2543–2546. [Google Scholar]
- Prexl, J.; Saha, S.; Zhu, X.X. Mitigating spatial and spectral differences for change detection using super-resolution and unsupervised learning. In Proceedings of the 2021 IEEE International Geoscience and Remote Sensing Symposium IGARSS, Brussels, Belgium, 11–16 July 2021; pp. 3113–3116. [Google Scholar]
- Li, S.; Wang, Y.; Cai, H.; Lin, Y.; Wang, M.; Teng, F. MF-SRCDNet: Multi-feature fusion super-resolution building change detection framework for multi-sensor high-resolution remote sensing imagery. Int. J. Appl. Earth Obs. Geoinf. 2023, 119, 103303. [Google Scholar] [CrossRef]
- Dosovitskiy, A.; Beyer, L.; Kolesnikov, A.; Weissenborn, D.; Zhai, X.; Unterthiner, T.; Dehghani, M.; Minderer, M.; Heigold, G.; Gelly, S.; et al. An image is worth 16x16 words: Transformers for image recognition at scale. arXiv 2020, arXiv:2010.11929. [Google Scholar]
- Wolf, T.; Debut, L.; Sanh, V.; Chaumond, J.; Delangue, C.; Moi, A.; Cistac, P.; Rault, T.; Louf, R.; Funtowicz, M.; et al. Transformers: State-of-the-art natural language processing. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: System Demonstrations, Online, 16–20 November 2020; pp. 38–45. [Google Scholar]
- Liu, M.; Shi, Q.; Li, J.; Chai, Z. Learning token-aligned representations with multimodel transformers for different-resolution change detection. IEEE Trans. Geosci. Remote Sens. 2022, 60, 1–13. [Google Scholar] [CrossRef]
- Sun, B.; Liu, Q.; Yuan, N.; Tan, J.; Gao, X.; Yu, T. Spectral token guidance transformer for multisource images change detection. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2023, 16, 2559–2572. [Google Scholar] [CrossRef]
- Chen, H.; Zhang, H.; Chen, K.; Zhou, C.; Chen, S.; Zou, Z.; Shi, Z. Continuous cross-resolution remote sensing image change detection. IEEE Trans. Geosci. Remote Sens. 2023, 61, 5623320. [Google Scholar] [CrossRef]
- Chen, H.; Wu, C.; Du, B.; Zhang, L.; Wang, L. Change detection in multisource VHR images via deep siamese convolutional multiple-layers recurrent neural network. IEEE Trans. Geosci. Remote Sens. 2019, 58, 2848–2864. [Google Scholar] [CrossRef]
- Benedetti, P.; Ienco, D.; Gaetano, R.; Ose, K.; Pensa, R.G.; Dupuy, S. M3Fusion: A deep learning architecture for multiscale multimodal multitemporal satellite data fusion. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2018, 11, 4939–4949. [Google Scholar] [CrossRef]
- Ebel, P.; Saha, S.; Zhu, X.X. Fusing multi-modal data for supervised change detection. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2021, 43, 243–249. [Google Scholar] [CrossRef]
- Hafner, S.; Nascetti, A.; Azizpour, H.; Ban, Y. Sentinel-1 and Sentinel-2 data fusion for urban change detection using a dual stream u-net. IEEE Geosci. Remote Sens. Lett. 2021, 19, 1–5. [Google Scholar] [CrossRef]
- He, X.; Zhang, S.; Xue, B.; Zhao, T.; Wu, T. Cross-modal change detection flood extraction based on convolutional neural network. Int. J. Appl. Earth Obs. Geoinf. 2023, 117, 103197. [Google Scholar] [CrossRef]
- Li, H.; Zhu, F.; Zheng, X.; Liu, M.; Chen, G. MSCDUNet: A deep learning framework for built-Up area change detection integrating multispectral, SAR, and VHR data. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2022, 15, 5163–5176. [Google Scholar] [CrossRef]
- Chen, H.; Wu, C.; Du, B.; Zhang, L. DSDANet: Deep siamese domain adaptation convolutional neural network for cross-domain change detection. arXiv 2020, arXiv:2006.09225. [Google Scholar]
- Zhang, C.; Feng, Y.; Hu, L.; Tapete, D.; Pan, L.; Liang, Z.; Cigna, F.; Yue, P. A domain adaptation neural network for change detection with heterogeneous optical and SAR remote sensing images. Int. J. Appl. Earth Obs. Geoinf. 2022, 109, 102769. [Google Scholar] [CrossRef]
- Luppino, L.T.; Hansen, M.A.; Kampffmeyer, M.; Bianchi, F.M.; Moser, G.; Jenssen, R.; Anfinsen, S.N. Code-aligned autoencoders for unsupervised change detection in multimodal remote sensing images. IEEE Trans. Neural Netw. Learn. Syst. 2022, 5, 60–72. [Google Scholar] [CrossRef]
- Wu, Y.; Li, J.; Yuan, Y.; Qin, A.; Miao, Q.G.; Gong, M.G. Commonality autoencoder: Learning common features for change detection from heterogeneous images. IEEE Trans. Neural Netw. Learn. Syst. 2021, 33, 4257–4270. [Google Scholar] [CrossRef]
- Farahani, M.; Mohammadzadeh, A. Domain adaptation for unsupervised change detection of multisensor multitemporal remote-sensing images. Int. J. Remote Sens. 2020, 41, 3902–3923. [Google Scholar] [CrossRef]
- Jiang, X.; Li, G.; Liu, Y.; Zhang, X.P.; He, Y. Change detection in heterogeneous optical and SAR remote sensing images via deep homogeneous feature fusion. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2020, 13, 1551–1566. [Google Scholar] [CrossRef]
- Touati, R.; Mignotte, M.; Dahmane, M. Anomaly feature learning for unsupervised change detection in heterogeneous images: A deep sparse residual model. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2020, 13, 588–600. [Google Scholar] [CrossRef]
- Zheng, X.; Chen, X.; Lu, X.; Sun, B. Unsupervised change detection by cross-resolution difference learning. IEEE Trans. Geosci. Remote Sens. 2021, 60, 1–16. [Google Scholar] [CrossRef]
- Wei, L.; Chen, G.; Zhou, Q.; Liu, C.; Cai, C. Cross-mapping net: Unsupervised change detection from heterogeneous remote sensing images using a transformer network. In Proceedings of the 2023 8th International Conference on Computer and Communication Systems (ICCCS), Guangzhou, China, 21–24 April 2023; pp. 1021–1026. [Google Scholar]
- Lu, T.; Zhong, X.; Zhong, L. mSwinUNet: A multi-modal U-shaped swin transformer for supervised change detection. J. Intell. Fuzzy Syst. 2024; Preprint. [Google Scholar]
- Hu, X.; Zhang, P.; Ban, Y.; Rahnemoonfar, M. GAN-based SAR and optical image translation for wildfire impact assessment using multi-source remote sensing data. Remote Sens. Environ. 2023, 289, 113522. [Google Scholar] [CrossRef]
- Zhao, T.; Wang, L.; Zhao, C.; Liu, T.; Ohtsuki, T. Heterogeneous image change detection based on deep image translation and feature refinement-aggregation. In Proceedings of the 2023 IEEE International Conference on Image Processing (ICIP), Kuala Lumpur, Malaysia, 8–11 October 2023; pp. 1705–1709. [Google Scholar]
- Manocha, A.; Afaq, Y. Optical and SAR images-based image translation for change detection using generative adversarial network (GAN). Multimed. Tools Appl. 2023, 82, 26289–26315. [Google Scholar] [CrossRef]
- Du, Z.; Li, X.; Miao, J.; Huang, Y.; Shen, H.; Zhang, L. Concatenated deep learning framework for multi-task change detection of optical and SAR images. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2023, 17, 719–731. [Google Scholar] [CrossRef]
- Wang, M.; Huang, L.; Tang, B.H.; Le, W.; Tian, Q. TDSCCNet: Twin-depthwise separable convolution connect network for change detection with heterogeneous images. Geocarto Int. 2024, 39, 2329673. [Google Scholar] [CrossRef]
- Su, Z.; Wan, G.; Zhang, W.; Wei, Z.; Wu, Y.; Liu, J.; Jia, Y.; Cong, D.; Yuan, L. Edge-bound change detection in multisource remote sensing images. Electronics 2024, 13, 867. [Google Scholar] [CrossRef]
- Xu, J.; Luo, C.; Chen, X.; Wei, S.; Luo, Y. Remote sensing change detection based on multidirectional adaptive feature fusion and perceptual similarity. Remote Sens. 2021, 13, 3053. [Google Scholar] [CrossRef]
- Peng, X.; Zhong, R.; Li, Z.; Li, Q. Optical remote sensing image change detection based on attention mechanism and image difference. IEEE Trans. Geosci. Remote Sens. 2020, 59, 7296–7307. [Google Scholar] [CrossRef]
- Ienco, D.; Interdonato, R.; Gaetano, R.; Minh, D.H.T. Combining Sentinel-1 and Sentinel-2 satellite image time series for land cover mapping via a multi-source deep learning architecture. ISPRS J. Photogramm. Remote Sens. 2019, 158, 11–22. [Google Scholar] [CrossRef]
- Wang, L.; Wang, L.; Wang, H.; Wang, X.; Bruzzone, L. SPCNet: A subpixel convolution-based change detection network for hyperspectral images with different spatial resolutions. IEEE Trans. Geosci. Remote Sens. 2022, 60, 1–14. [Google Scholar] [CrossRef]
- Xu, X.; Li, W.; Ran, Q.; Du, Q.; Gao, L.; Zhang, B. Multisource remote sensing data classification based on convolutional neural network. IEEE Trans. Geosci. Remote Sens. 2017, 56, 937–949. [Google Scholar] [CrossRef]
- Chen, Y.; Li, C.; Ghamisi, P.; Jia, X.; Gu, Y. Deep fusion of remote sensing data for accurate classification. IEEE Geosci. Remote Sens. Lett. 2017, 14, 1253–1257. [Google Scholar] [CrossRef]
- Feng, Q.; Zhu, D.; Yang, J.; Li, B. Multisource hyperspectral and LiDAR data fusion for urban land-use mapping based on a modified two-branch convolutional neural network. ISPRS Int. J. Geo-Inf. 2019, 8, 28. [Google Scholar] [CrossRef]
- Mohla, S.; Pande, S.; Banerjee, B.; Chaudhuri, S. Fusatnet: Dual attention based spectrospatial multimodal fusion network for hyperspectral and lidar classification. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, Seattle, WA, USA, 14–19 June 2020; pp. 92–93. [Google Scholar]
- Ma, W.; Karakuş, O.; Rosin, P.L. AMM-FuseNet: Attention-based multi-modal image fusion network for land cover mapping. Remote Sens. 2022, 14, 4458. [Google Scholar] [CrossRef]
- Liu, J.; Gong, M.; Qin, K.; Zhang, P. A deep convolutional coupling network for change detection based on heterogeneous optical and radar images. IEEE Trans. Neural Netw. Learn. Syst. 2016, 29, 545–559. [Google Scholar] [CrossRef]
- Liu, Z.; Li, G.; Mercier, G.; He, Y.; Pan, Q. Change detection in heterogenous remote sensing images via homogeneous pixel transformation. IEEE Trans. Image Process. 2017, 27, 1822–1834. [Google Scholar] [CrossRef]
- Roy, S.K.; Deria, A.; Hong, D.; Rasti, B.; Plaza, A.; Chanussot, J. Multimodal fusion transformer for remote sensing image classification. IEEE Trans. Geosci. Remote Sens. 2023, 61, 1–20. [Google Scholar] [CrossRef]
- Luppino, L.T.; Bianchi, F.M.; Moser, G.; Anfinsen, S.N. Unsupervised image regression for heterogeneous change detection. arXiv 2019, arXiv:1909.05948. [Google Scholar] [CrossRef]
Journal Name | Total Publications | Impact Factor (2023) | Publisher | Cite Score (2023) |
---|---|---|---|---|
IEEE Transactions on Geoscience and Remote Sensing | 28 | 8.2 | IEEE | 10.9 |
Remote Sensing | 25 | 5 | MDPI | 7.9 |
IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing | 17 | 5.5 | IEEE | 7.8 |
IEEE Geoscience and Remote Sensing Letters | 9 | 4.8 | IEEE | 6.4 |
ISPRS Journal of Photogrammetry and Remote Sensing | 9 | 12.7 | Elsevier | 19.2 |
IEEE Transactions on Neural Networks and Learning Systems | 4 | 10.4 | IEEE | 21.9 |
International Journal of Remote Sensing | 3 | 3.5 | Taylor & Francis | 6.5 |
| Category | DataSet | Data Type | Resolution (m) | Satellite Types |
|---|---|---|---|---|
| Single Source | OSCD [35] | Optical | 10/20/60 | Sentinel-2 |
| | Lake overflow [36] | Optical | 30 | Landsat-5 (NIR/RGB) |
| | Farmland [37] | SAR | 3 | Radarsat-2 (single/four look) |
| | CLCD | Optical | 0.5–2 | Gaofen-2 |
| | LEVIR-CD [38] | Optical | 0.5 | Google Earth |
| | WHU-CD [39] | Optical | 0.3 | QuickBird/WorldView |
| Multi-sensor | Wang, M [40] | Optical | 5.8/4 | ZY-3/GF-2 |
| | S2Looking [41] | Optical | 0.5/0.8 | GF, SV, and BJ-2 |
| | CCD [42] | Optical | 0.03/1 | Google Earth |
| | MRCDD | Optical | 0.5/2 | Google Earth |
| | Mengxi Liu [43] | Optical | 4/1 | Google Earth |
| | Bastrop [44] | Optical | 30 | Landsat-5/EO-1 ALI |
| | Saha, S [45] | Optical | 0.5/0.6 | QuickBird/Pleiades |
| | Reunion | Optical | 10/2 | Sentinel-2/SPOT6/7 |
| | EV-CD building [46] | Optical | 0.2/2 | Variety of sensors |
| Multi-source | HTCD [47] | UAV/Optical | 0.5971/0.07 | Google Earth/Open Aerial Map |
| | MSBC | Optical/SAR | 2/20 | GF-2/Sentinel1-2A |
| | MSOSCD | Optical/SAR | - | Sentinel-2/Google Earth |
| | Hunan [48] | Optical/SAR | 10/30 | Sentinel-1/2, SRTM |
| | DFC2020 [49] | Optical/SAR | 10/20 | Sentinel-1/2 |
| | Potsdam [50] | Optical/LiDAR | 0.05 | - |
| | California dataset [51] | Optical/SAR | 20/30 | Landsat-8/Sentinel-1A |
| | Houston2018 [52] | HS/LiDAR/RGB | 0.5/1 | ITRES CASI 1500/Titan MW |
| | Berlin data [53] | HS/SAR | 13.89 | HyMap HS/Sentinel-1 |
| | MUUFL Gulfport [54] | HS/LiDAR | 0.54/1 | - |
| | Gloucester I [55] | Optical/SAR | 0.65 | QuickBird 2/TerraSAR-X |
| | Gloucester II [55] | Optical/SAR | ≈25 | SPOT/ERS-1 |
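Most of the multi-sensor and multi-source datasets above pair co-registered images from different sensors or modalities (e.g., optical and SAR), which the fusion networks of Section 5 combine either before or after feature extraction. The snippet below is a minimal, hypothetical sketch contrasting early fusion (channel concatenation of the inputs into a shared encoder) with late fusion (separate encoders whose feature maps are concatenated); the tensor shapes and toy encoders are assumptions for illustration, not the architecture of any reviewed method.

```python
# Hypothetical PyTorch sketch: early vs. late fusion of a co-registered
# optical/SAR pair. Shapes and encoders are illustrative only.
import torch
import torch.nn as nn

optical = torch.randn(1, 3, 256, 256)  # e.g., three optical bands
sar     = torch.randn(1, 1, 256, 256)  # e.g., a single SAR backscatter band

# Early fusion: stack modalities along the channel axis, one shared encoder.
early_encoder  = nn.Sequential(nn.Conv2d(4, 16, 3, padding=1), nn.ReLU())
early_features = early_encoder(torch.cat([optical, sar], dim=1))

# Late fusion: one encoder per modality, then concatenate the feature maps.
opt_encoder   = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU())
sar_encoder   = nn.Sequential(nn.Conv2d(1, 16, 3, padding=1), nn.ReLU())
late_features = torch.cat([opt_encoder(optical), sar_encoder(sar)], dim=1)

print(early_features.shape, late_features.shape)  # (1,16,256,256) (1,32,256,256)
```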
| Method Name/Ref | Network Structure | DataSet | Precision (%) | F1 (%) | OA (%) |
|---|---|---|---|---|---|
| DSMS-FCN [82] | Siamese UNet | SZTAKI-Szada | 52.78 | 57.72 | 94.57 |
| | | SZTAKI-Tiszadob | 89.18 | 88.86 | 96.20 |
| ESCNet [85] | Siamese UNet | SZTAKI-Tiszadob | 76.33 | 74.56 | 93.95 |
| | | SZTAKI-Szada | 48.89 | 53.73 | 94.07 |
| RFNet [86] | Siamese CNN | WHU-CD | 95.72 | 92.49 | - |
| SMD-Net [87] | Siamese UNet | CDD | 96.6 | 97 | 99.3 |
| | | BCDD | 94.80 | 94.33 | 99.48 |
| | | OSCD | 96.6 | 97.0 | 99.3 |
| SSCFNet [89] | Siamese UNet | LEVIR-CD | 93.71 | 95.31 | - |
| | | SZTAKI | 96.54 | 96.58 | - |
| Siam-FAUNet [88] | Siamese UNet | CDD | 95.62 | 94.58 | 98.14 |
| | | WHU-CD | 44.47 | 55.50 | 94.95 |
| DASNet [104] | Siamese UNet + Attention | CDD | 92.2 | 92.7 | 98.2 |
| DifUNet++ [92] | Siamese UNet++ | SVCD | 92.15 | 92.37 | - |
| | | LEVIR-CD | 92.15 | 89.6 | - |
| SNUNet-CD [93] | Siamese UNet++ | CDD | 96.3 | 96.2 | - |
| TCDNet [94] | Siamese CNN | Google Earth | 71.18 | - | - |
| SSJLN [95] | Siamese CNN | GF-1 Data | - | 94.94 | - |
| | | EMT+ Data | - | 98.75 | - |
| SAM-CD [98] | Siamese CNN | LEVIR-CD | 95.87 | 95.50 | 99.14 |
| | | CLCD | 88.25 | 86.89 | 96.26 |
| | | WHU-CD | 97.97 | 97.58 | 99.60 |
| | | S2Looking | 72.80 | 65.13 | - |
| NestNet [91] | Siamese UNet++, Attention | CDD | 88.26 | 88.62 | - |
| | | OSCD | 49.01 | 49.32 | - |
| HARNU-Net [103] | Siamese UNet, Attention | CDD | 97.10 | 97.20 | 99.34 |
| AFSNet [101] | Siamese UNet, Attention | CDD | 98.44 | 95.56 | 98.94 |
| CANet [105] | Siamese UNet, Attention | CDD | 93.2 | 93.2 | 98.4 |
| PGA-SiamNet [46] | Siamese UNet, Attention | EV-CD building | 94.01 | 91.74 | 99.68 |
| MFPNet [186] | Siamese UNet, Attention | SVCD | 97.54 | - | |
| | | Zhang dataset | 68.45 | - | |
| MAFF-Net [107] | Siamese UNet, Attention | CDD | 96.5 | 99.2 | - |
| | | LEVIR-CD | 89.7 | 98.9 | - |
| | | WHU-CD | 92.4 | 99.4 | - |
| MSF-Net [108] | Siamese UNet, Attention | LEVIR-CD | 90 | 88.66 | - |
| FERA-Net [109] | Siamese UNet, Attention | LEVIR-CD | 91.57 | 89.58 | - |
| | | WHU-CD | 93.51 | 92.48 | - |
| T-UNet [110] | Triplet UNet, Attention | LEVIR-CD | 92.60 | 91.63 | 99.16 |
| | | WHU-CD | 95.44 | 91.77 | 99.42 |
| | | DSIFN | 70.86 | 69.52 | 89.83 |
| ChangeFormer [122] | Siamese Transformer | LEVIR-CD | 92.05 | 90.40 | 99.04 |
| | | DSIFN | 88.48 | 86.67 | 95.56 |
| SwinSUNet [125] | Siamese Transformer | CDD | 95.7 | 94.0 | 98.5 |
| | | OSCD | 55.0 | 54.5 | 95.3 |
| | | WHU | 95.0 | 93.8 | 99.4 |
| BiT [130] | Siamese Transformer | LEVIR-CD | 89.24 | 89.31 | 98.92 |
| | | DSIFN | 68.36 | 69.26 | 89.41 |
| EATDer [129] | Siamese Transformer | LEVIR-CD | 91.74 | 91.20 | 98.75 |
| | | CDD | 96.83 | 95.97 | 98.97 |
| | | WHU-CD | 91.32 | 90 | 98.58 |
| CTD-Former [132] | Siamese Transformer | LEVIR-CD | 91.85 | 92.71 | 98.62 |
| | | WHU-CD | 96.74 | 96.86 | 99.5 |
| | | CLCD | 87.29 | 85.08 | 96.11 |
| SCanFormer [133] | Siamese Transformer | SECOND | - | 63.66 | 87.86 |
| | | Landsat-SCD | - | 89.27 | 96.26 |
| TransUNetCD [139] | Siamese UNet + Transformer | CDD | 93.2 | 93.2 | 98.4 |
| | | S2Looking | 93.2 | 93.2 | 98.4 |
| CTCANet [141] | Siamese CNN + Transformer | LEVIR-CD | 92.19 | 91.21 | 99.11 |
| | | SYSU-CD | 80.50 | 81.23 | 91.40 |
| DCAT [134] | Siamese (CNN + Transformer) | LEVIR-CD+ | 84.72 | 84.02 | - |
| | | SYSU-CD | 87.00 | 79.63 | - |
| | | WHU-CD | 91.53 | 88.19 | - |
| SMART [145] | Siamese (CNN + Transformer) | LEVIR-CD | 94.29 | 93.04 | 98.69 |
| | | SYSU-CD | 86.17 | 84.80 | 89.42 |
| | | WHU-CD | 89.9 | 91.57 | 98.70 |
| | | DSIFN | 76.89 | 78.7 | 87 |
| WNet [148] | Siamese CNN + Siamese Transformer | LEVIR-CD | 91.16 | 90.67 | 99.06 |
| | | WHU-CD | 92.37 | 91.25 | 99.31 |
| | | SYSU-CD | 81.71 | 80.64 | 90.98 |
| | | SVCD | 97.71 | 97.56 | 99.42 |
| ACAHNet [149] | Siamese (CNN + Transformer) | CDD | 97.5 | 97.72 | 99.48 |
| | | LEVIR-CD | 92.36 | 91.51 | 99.14 |
| | | SYSU-CD | 83.96 | 82.73 | 91.97 |
| ICIF-Net [150] | Siamese (CNN + Transformer) | LEVIR-CD+ | 87.79 | 83.65 | 98.73 |
| | | WHU-CD | 92.98 | 88.32 | 98.96 |
| | | SYSU-CD | 83.37 | 80.74 | 91.24 |
| Slddnet [151] | Siamese (CNN + Transformer) | LEVIR-CD | - | 91.75 | - |
| | | WHU-CD | - | 92.76 | - |
| | | GZ-CD | - | 86.61 | - |
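The Precision, F1, and OA (overall accuracy) figures reported in the table above, and in the heterogeneous-RSCD table that follows, are the standard metrics derived from the confusion matrix between a predicted binary change map and its reference map. A minimal sketch of these formulas is given below; the function and variable names are illustrative.

```python
# Minimal sketch: Precision, Recall, F1, and Overall Accuracy (OA) computed
# from a predicted binary change map and its reference map (0/1 arrays).
import numpy as np

def change_detection_metrics(pred, ref):
    pred = np.asarray(pred).astype(bool)
    ref = np.asarray(ref).astype(bool)
    tp = np.sum(pred & ref)    # changed pixels correctly detected
    fp = np.sum(pred & ~ref)   # false alarms
    fn = np.sum(~pred & ref)   # missed changes
    tn = np.sum(~pred & ~ref)  # unchanged pixels correctly kept
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    oa = (tp + tn) / (tp + fp + fn + tn)
    return {"Precision": precision, "Recall": recall, "F1": f1, "OA": oa}

# Toy example: a 2x2 map with one correct detection and one false alarm.
print(change_detection_metrics([[1, 0], [0, 1]], [[1, 0], [0, 0]]))
```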
| Method Name/Reference | Network Structure | DataSet | Precision (%) | F1 (%) | OA (%) |
|---|---|---|---|---|---|
| M-UNet [51] | Single UNet | Shuguang | - | 84.73 | 98.69 |
| | | Sardinia | - | 67 | 98.01 |
| | | California | - | 61.33 | 96.66 |
| OB-DSCNH [43] | Siamese CNN | Mengxi Liu [43] | - | - | 97.92 |
| SepDGConv [81] | Single CNN | Houston2018 | 56.55 | - | 63.74 |
| | | Berlin | 54.23 | - | 68.21 |
| | | MUUFL | 72.75 | - | 83.23 |
| MM-Trans [161] | Siamese CNN + Transformer | 8×/11× CCD | 95.48/95.17 | 90.44/90.07 | - |
| | | 5×/8× S2Looking | 65.37/64.57 | 58.62/56.99 | - |
| | | 8× HTCD | 82.13 | 74.99 | - |
| MSCDUNet [169] | Siamese UNet++ | MSBC Dataset | - | 64.21 | - |
| | | MSOSCD Dataset | - | 92.81 | - |
| RACDNet [155] | GAN + Siamese UNet | MRCDD Dataset | 91.18 | 96.79 | |
| SUNet [139] | Siamese UNet | HTCD dataset | 97.3 | 91 | 99.6 |
| Patrick et al. [166] | Siamese UNet | ONERA CD data | 60.2 | 58.1 | - |
| STCD-Former [162] | Siamese Transformer | Bastrop data | - | 99.25 | - |
| M3Fusion [165] | Siamese CNN + RNN | Reunion Island | 90.09 | 89.96 | - |
| AMM-FuseNet [194] | Siamese UNet + Attention | Hunan | 59.13 | 79.06 | |
| | | DFC2020 | - | 90.33 | 94.56 |
| | | Potsdam | 79.31 | 85.28 | |
| MFT [197] | Siamese CNN + Transformer | Houston2013 | 90.56 | - | 89.15 |
| | | MUUFL | 81 | - | 94.18 |
| | | Trento | 95.91 | - | 97.76 |
| Chen et al. [191] | Siamese CNN | Houston2013 | 98.57 | - | 98.61 |
| | | Bayview Park | 99.75 | - | 99.41 |
| | | Recology | 98.90 | - | 98.15 |
| MBFNet [106] | Siamese CNN + Attention | PoDelta | - | - | 82.61 |
| | | CHONGMING | - | - | 93.61 |
| TWINNS [188] | Siamese CNN, GRU | Reunion Island | 89.87 | 89.88 | - |
| SiamCRNN [164] | Siamese CNN + LSTM | LiDAR-Opt | 87.38 | 82.15 | 82.15 |
| MF-SRCDNet [158] | GAN + Siamese UNet | WXCD | 84.5 | 88.1 | 95.3 |
| | | BCDD | 96.4 | 96.4 | 98.5 |
| SiamGAN [156] | Siamese GAN | Guangzhou | 69.5 | 76.06 | - |
| SRCDNet [154] | GAN + Siamese UNet, Attention | 4×/8× BCDD | 84.44/81.61 | 85.66/81.69 | - |
| | | 4×/8× CDD | 92.07/91.95 | - | - |
| SILI [163] | Siamese CNN + Transformer | LEVIR-CD (4×) | 90 | 88 | 98 |
| | | SV-CD (8×) | 95 | 94 | 98 |
| | | DE-CD (3.3×) | 61 | 50 | - |
| DAMSCDNet [171] | Siamese CNN | Data1 | 78.89 | 82.17 | - |
| | | Data2 | 92.04 | 93.86 | - |
| | | Data3 | 71.51 | - | 71.71 |
| CA_AE [172] | Autoencoders | Lake overflow | - | - | 92.2 |
| | | Constructions | - | - | 85.9 |
| CAE [173] | Autoencoders | Yellow River | - | - | 97.74 |
| | | Sardinia | - | - | 97.47 |
| | | Farmland | - | - | 97.91 |
| Farahani et al. [174] | Autoencoders | San Francisco | - | 96.44 | 72/68 |
| DHFF [175] | Siamese VGG (IST) | Tōhoku | 84.66 | - | 98.63 |
| | | Haiti | 58.19 | - | 98.23 |
| TSCNet [36] | Autoencoders + Attention | Flood California [198] | 49.4 | 5.74 | 93.9 |
| Niu et al. [195] | Autoencoders | Yellow River | - | - | 97.7 |
| | | Farmland | - | - | 98.26 |
| CM-Net [178] | Autoencoder + Transformer | Sardinia | 90.55 | 97.52 | |
| | | Shuguang | 95.00 | - | 98.57 |
| | | Gloucestershire | 93.51 | - | 96.92 |
| DTCDN [55] | CycleGAN | Gloucester I | 89.96 | 89.95 | 97.98 |
| | | Gloucester II | 90.78 | 88.67 | 96.33 |
| | | California | 66.73 | 72.03 | 97.61 |
| | | Shuguang | 92.92 | 91.56 | 99.75 |
| DACDT [182] | CycleGAN | Gloucester I | - | - | 98.67 |
| | | Gloucester II | - | - | 97.68 |
| | | California | - | - | 98.87 |
| MTCDN [183] | CycleGAN | Gloucester I | 88.86 | 88.22 | 97.65 |
| | | Gloucester II | 89.49 | 88.87 | 96.34 |
| | | California | 55.20 | 61.54 | 95.83 |
| TDSCCNet [184] | CycleGAN | Italy | 85.64 | 81.07 | 97.62 |
| | | WV-3 | 91.34 | 91.37 | 98.01 |
| | | Gloucester | 93.29 | 93.75 | 97.36 |
| | | Shuguang | 82.58 | 88.58 | 97.01 |
| EO-GAN [185] | CGAN | Yellow River | - | - | 98.01 |
| | | Shuguang | - | - | 98.16 |