Use of GAN to Help Networks to Detect Urban Change Accurately
Abstract
1. Introduction
2. Dataset
2.1. Xi’an Change Detection Dataset
2.2. Change Detection Dataset (CDD)
3. Method
3.1. Generator
3.1.1. Generator Architecture
3.1.2. ASPP Module
3.1.3. Center-Surround Architecture
3.1.4. Attention Fusion Module
3.2. Discriminator
Discriminator Architecture
3.3. Loss Function
4. Experiments and Result Analysis
4.1. Evaluation Indicators
4.2. Experiment Design
4.3. Analysis and Comparison
4.3.1. Comparison of UNET-CD and Existing CD Algorithms
4.3.2. Enhancement of UNET-CD with GAN
4.3.3. Comparison of UNET-GAN-CD and Existing CD Algorithms on CDD
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
| Layer | Generator, UNET-CD |
|---|---|
| L1 | ConvBlock (I3, O32, K3), ReLU, P |
| L2 | ConvBlock (I32, O64, K3), ReLU, P |
| L3 | ConvBlock (I64, O96, K3), ReLU, P |
| L4 | ConvBlock (I96, O128, K3), ReLU, P |
| L5 | ConvBlock (I128, O128, K3) |
| L6 | ASPPBlock (I128, O128) |
| L7 | UpSampling2D (2), DeconvBlock (I512, O128, K3) |
| L8 | UpSampling2D (2), DeconvBlock (I320, O96, K3) |
| L9 | UpSampling2D (2), DeconvBlock (I224, O64, K3) |
| L10 | UpSampling2D (2), DeconvBlock (I96, O32, K3) |
| L11 | FinalConvBlock (I32, O3, K1) |

(I = input channels, O = output channels, K = kernel size, P = pooling step.)
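The four pooling steps (P) in layers L1–L4 halve the spatial resolution each time, and the four UpSampling2D (2) steps in L7–L10 restore it, so the generator's output map matches the input tile size. A minimal sketch of that bookkeeping, assuming a 256 × 256 input tile (the tile size is not stated in the table):

```python
def generator_spatial_sizes(tile=256):
    """Track the feature-map side length through UNET-CD's encoder/decoder."""
    sizes = [tile]
    for _ in range(4):           # L1-L4 each end with a pooling step (P)
        sizes.append(sizes[-1] // 2)
    # L5 (ConvBlock) and L6 (ASPPBlock) keep the bottleneck resolution
    for _ in range(4):           # L7-L10 each start with UpSampling2D(2)
        sizes.append(sizes[-1] * 2)
    return sizes

print(generator_spatial_sizes())  # [256, 128, 64, 32, 16, 32, 64, 128, 256]
```

The bottleneck (L5/L6, where the ASPP module sits) therefore operates at 1/16 of the input resolution.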
| Layer | Discriminator, D |
|---|---|
| L1 | Conv (I8, O16, K3, S2), LeakyReLU, BN |
| L2 | Conv (I16, O32, K3, S2), LeakyReLU, BN |
| L3 | Conv (I32, O64, K3, S2), LeakyReLU, BN |
| L4 | Conv (I64, O128, K3, S2), LeakyReLU, BN |
| L5 | UpSampling2D (2), Conv (I192, O64, K3, S1), LeakyReLU |
| L6 | UpSampling2D (4), Conv (I64, O64, K3, S1), LeakyReLU |
| L7 | Conv (I64, O2, K3, S1) |
| Prediction \ Ground Truth | 1 | 0 | Row Total |
|---|---|---|---|
| 1 | TP | FP | T1 |
| 0 | FN | TN | T2 |
| Column Total | C1 | C2 | All_Num |
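The indicators reported in the following tables all follow from this confusion matrix: OA = (TP + TN) / All_Num, F1 is the harmonic mean of precision and recall on the change class, and Kappa corrects OA for chance agreement using the marginals T1, T2, C1, C2. A sketch of these standard formulas:

```python
def change_detection_metrics(tp, fp, fn, tn):
    """Compute OA, Kappa, and F1 (change class) from confusion-matrix counts."""
    n = tp + fp + fn + tn                       # All_Num
    oa = (tp + tn) / n                          # overall accuracy
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    # chance agreement from row totals (T1, T2) and column totals (C1, C2)
    t1, t2 = tp + fp, fn + tn
    c1, c2 = tp + fn, fp + tn
    pe = (t1 * c1 + t2 * c2) / (n * n)
    kappa = (oa - pe) / (1 - pe)
    return oa, kappa, f1
```

For example, `change_detection_metrics(40, 10, 20, 130)` gives OA = 0.85, Kappa = 0.625, and F1 ≈ 0.727 (the counts here are illustrative, not from the paper's experiments).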
| Evaluation Indicators | OA | Kappa | F1 |
|---|---|---|---|
| UNET-CD | 0.9515 | 0.5654 | 0.5767 |
| SLN | 0.9495 | 0.5097 | 0.5129 |
| FC-Siam-diff | 0.9272 | 0.4072 | 0.4437 |
| Evaluation Indicators | OA | Kappa | F1 |
|---|---|---|---|
| UNET-CD | 0.9515 | 0.5654 | 0.5767 |
| UNET-GAN-CD | 0.9587 | 0.5846 | 0.6022 |
| Method/Channel | Params (M) | Precision | Recall | F1-Score | FPS |
|---|---|---|---|---|---|
| FC-EF/16 | 1.35 | 0.609 | 0.583 | 0.592 | - |
| FC-Siam-conc/16 | 1.54 | 0.709 | 0.603 | 0.637 | - |
| FC-Siam-diff/16 * | 1.35 | 0.879 | 0.436 | 0.583 | 38 |
| FC-Siam-diff/32 | 5.39 | 0.783 | 0.626 | 0.692 | - |
| Unet++_MSOF/32 | 11.00 | 0.895 | 0.871 | 0.876 | - |
| IFN/- | 35.72 | 0.950 | 0.861 | 0.903 | - |
| SLN/16 * | 1.66 | 0.793 | 0.622 | 0.697 | 35 |
| UNET-GAN-CD (Ours) | 5.35 | 0.913 | 0.915 | 0.914 | 21 |
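As a consistency check, the F1-Score column should be the harmonic mean of the Precision and Recall columns; for the UNET-GAN-CD row, 0.913 and 0.915 do yield roughly 0.914 (some baseline rows deviate slightly, presumably because those figures are quoted from the original papers with their own rounding). A quick check:

```python
def harmonic_mean(p, r):
    """F1 as the harmonic mean of precision p and recall r."""
    return 2 * p * r / (p + r)

print(round(harmonic_mean(0.913, 0.915), 3))  # 0.914
```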
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
He, C.; Zhao, Y.; Dong, J.; Xiang, Y. Use of GAN to Help Networks to Detect Urban Change Accurately. Remote Sens. 2022, 14, 5448. https://doi.org/10.3390/rs14215448