Assessing the Potential of Multi-Temporal Conditional Generative Adversarial Networks in SAR-to-Optical Image Translation for Early-Stage Crop Monitoring
Abstract
1. Introduction
2. Methodology
2.1. Pix2Pix
2.2. S-CycleGAN
2.3. MTcGAN
3. Experiments
3.1. Study Area
3.2. Data
3.3. Experimental Design
3.3.1. Optimization of Model Hyperparameters
3.3.2. Training and Test Setup
3.3.3. Experiment Setup
3.4. Evaluation
3.5. Implementation
4. Results
4.1. Analysis of Temporal Characteristics of Corn and Soybean
4.2. Impact of Different Prediction Dates in MTcGAN (Case A)
4.3. Impact of Temporal Distance between Reference and Prediction Dates in MTcGAN (Case B)
4.4. Comparison of Different SAR-to-Optical Image Translation Methods (Case C)
5. Discussion
5.1. Applicability of MTcGAN
5.2. Future Research Directions
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
| Specification | Sentinel-1 | Sentinel-2 |
|---|---|---|
| Product type | Level-1 GRD | Level-2A BOA |
| Polarization or spectral bands (central wavelength) | VV and VH | Blue (490 nm), green (560 nm), red (665 nm), RE1–3 (705, 740, and 783 nm), NIR (842 nm), SWIR1–2 (1610 and 2190 nm) |
| Spatial resolution | 10 m | 10 m (blue, green, red, and NIR); 20 m (RE1–3 and SWIR1–2) |
| Acquisition dates | 19 June 2022 | 14 June 2022 |
| | 1 July 2022 | 29 June 2022 |
| | 13 July 2022 | 12 July 2022 |
| | 25 July 2022 | 22 July 2022 |
| | 18 August 2022 | 13 August 2022 |
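Because the Sentinel-2 bands come at two native resolutions, the 20 m bands (RE1–3 and SWIR1–2) have to be brought onto the 10 m grid before they can be stacked with the 10 m bands and the Sentinel-1 backscatter. The sketch below shows one way to do this; it is illustrative only, the file names and band order are hypothetical, and the choice of bilinear resampling is an assumption rather than a step stated in the paper.

```python
# Illustrative sketch (not the authors' preprocessing chain): resample the 20 m
# Sentinel-2 bands listed above to the 10 m grid and stack all nine bands into
# one array. File paths are hypothetical; bilinear resampling is an assumption.
import numpy as np
import rasterio
from rasterio.enums import Resampling

BANDS_10M = ["B02_blue.tif", "B03_green.tif", "B04_red.tif", "B08_nir.tif"]
BANDS_20M = ["B05_re1.tif", "B06_re2.tif", "B07_re3.tif",
             "B11_swir1.tif", "B12_swir2.tif"]

def read_band(path, out_shape=None):
    # Read a single-band GeoTIFF; optionally resample it to a target grid.
    with rasterio.open(path) as src:
        if out_shape is None:
            return src.read(1)
        return src.read(1, out_shape=out_shape, resampling=Resampling.bilinear)

def stack_sentinel2(folder="."):
    ref = read_band(f"{folder}/{BANDS_10M[0]}")                       # 10 m reference grid
    bands = [read_band(f"{folder}/{b}") for b in BANDS_10M]
    bands += [read_band(f"{folder}/{b}", out_shape=ref.shape) for b in BANDS_20M]
    return np.stack(bands, axis=-1)                                   # (H, W, 9)
```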
| Generator (Encoder) | Generator (Decoder) | Discriminator |
|---|---|---|
| CL (128, 64, 4, 2) | DcBDR (2, 1024, 4, 2) | CL (128, 64, 4, 2) |
| CBL (64, 128, 4, 2) | DcBDR (4, 1024, 4, 2) | CBL (64, 128, 4, 2) |
| CBL (32, 256, 4, 2) | DcBDR (8, 1024, 4, 2) | CBL (32, 256, 4, 2) |
| CBL (16, 512, 4, 2) | DcBR (16, 1024, 4, 2) | ZCBL (31, 512, 4, 1) |
| CBL (8, 512, 4, 2) | DcBR (32, 512, 4, 2) | ZCS (30, 1, 4, 1) |
| CBL (4, 512, 4, 2) | DcBR (64, 256, 4, 2) | - |
| CBL (2, 512, 4, 2) | DcBR (128, 128, 4, 2) | - |
| CBL (1, 512, 4, 2) | DcT (256, N, 4, 2) | - |
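The table's abbreviations are not expanded here, so the following Keras sketch rests on the usual Pix2Pix reading of them: CL as convolution + LeakyReLU, CBL as convolution + batch normalization + LeakyReLU, DcBDR/DcBR as transposed convolution + batch normalization (with/without dropout) + ReLU, ZCBL/ZCS as zero-padded stride-1 convolutions, and DcT as a transposed convolution with tanh output, with each tuple read as (output size, channels, kernel size, stride) and the decoder channel counts interpreted as values after skip-connection concatenation. It is a minimal illustration under those assumptions, not the authors' implementation.

```python
# Minimal Keras sketch of a Pix2Pix-style U-Net generator and PatchGAN
# discriminator consistent with the layer table above, under the assumed
# readings of CL, CBL, DcBDR, DcBR, DcT, ZCBL, and ZCS described in the text.
import tensorflow as tf
from tensorflow.keras import Model, layers

def down(x, filters, norm=True):
    # CL / CBL: 4x4 convolution, stride 2, optional batch norm, LeakyReLU(0.2)
    x = layers.Conv2D(filters, 4, strides=2, padding="same", use_bias=not norm)(x)
    if norm:
        x = layers.BatchNormalization()(x)
    return layers.LeakyReLU(0.2)(x)

def up(x, skip, filters, dropout=False):
    # DcBDR / DcBR: 4x4 transposed convolution, stride 2, batch norm,
    # optional dropout, ReLU, then U-Net skip-connection concatenation
    x = layers.Conv2DTranspose(filters, 4, strides=2, padding="same", use_bias=False)(x)
    x = layers.BatchNormalization()(x)
    if dropout:
        x = layers.Dropout(0.5)(x)
    x = layers.ReLU()(x)
    return layers.Concatenate()([x, skip])

def build_generator(in_channels, out_channels, size=256):
    inp = layers.Input((size, size, in_channels))
    skips = [down(inp, 64, norm=False)]                    # CL  (128, 64, 4, 2)
    for f in (128, 256, 512, 512, 512, 512, 512):          # CBL rows
        skips.append(down(skips[-1], f))
    x = skips[-1]                                          # 1x1 bottleneck
    dec = [(512, True), (512, True), (512, True),          # DcBDR rows (dropout)
           (512, False), (256, False), (128, False), (64, False)]  # DcBR rows
    for (f, drop), skip in zip(dec, reversed(skips[:-1])):
        x = up(x, skip, f, dropout=drop)
    out = layers.Conv2DTranspose(out_channels, 4, strides=2, padding="same",
                                 activation="tanh")(x)     # DcT (256, N, 4, 2)
    return Model(inp, out)

def build_discriminator(in_channels, size=256):
    # Right-hand column: the caller concatenates the conditioning and target
    # images along the channel axis before passing them in.
    inp = layers.Input((size, size, in_channels))
    x = down(inp, 64, norm=False)                          # CL  (128, 64, 4, 2)
    x = down(x, 128)                                       # CBL (64, 128, 4, 2)
    x = down(x, 256)                                       # CBL (32, 256, 4, 2)
    x = layers.ZeroPadding2D()(x)                          # ZCBL (31, 512, 4, 1)
    x = layers.Conv2D(512, 4, strides=1, use_bias=False)(x)
    x = layers.BatchNormalization()(x)
    x = layers.LeakyReLU(0.2)(x)
    x = layers.ZeroPadding2D()(x)                          # ZCS (30, 1, 4, 1)
    return Model(inp, layers.Conv2D(1, 4, strides=1)(x))
```

Under this reading, a decoder entry such as DcBDR (2, 1024, 4, 2) corresponds to 512 new filters concatenated with a 512-channel skip connection. As a purely illustrative call, stacking VV/VH from two Sentinel-1 dates with the nine Sentinel-2 bands of the reference date would give build_generator(13, 9); the channel counts actually used in the experiments are not stated in the table.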
| Cases | Model | Training and Test: Input Images | Training and Test: Output Image |
|---|---|---|---|
| A-1 | MTcGAN | S1 (, ) and S2 () | S2 () |
| A-2 | MTcGAN | S1 (, ) and S2 () | S2 () |
| A-3 | MTcGAN | S1 (, ) and S2 () | S2 () |
| A-4 | MTcGAN | S1 (, ) and S2 () | S2 () |
| B-1 | MTcGAN | S1 (, ) and S2 () | S2 () |
| B-2 | MTcGAN | S1 (, ) and S2 () | S2 () |
| C-1 | Pix2Pix | S1 () | S2 () |
| C-2 | S-CycleGAN | S1 () | S2 () |
| C-3 | Pix2Pix | S1 () | S2 () |
| C-4 | S-CycleGAN | S1 () | S2 () |
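The input columns above differ only in how many acquisitions are stacked: MTcGAN receives Sentinel-1 from two dates together with Sentinel-2 from one date, whereas Pix2Pix and S-CycleGAN receive Sentinel-1 from a single date; in every case the output is the Sentinel-2 image on the prediction date. The sketch below illustrates that stacking, reading the two Sentinel-1 parentheses as the reference and prediction dates and the Sentinel-2 parenthesis as the reference date, which is an assumption consistent with the case names in Sections 4.2 and 4.3; the specific acquisition dates per case are missing from the extracted table and are not reproduced here.

```python
# Illustrative assembly of the model inputs in the table above. Arrays are
# assumed to be co-registered patches of shape (H, W, bands) scaled to a
# common range; variable names such as s1_ref are hypothetical.
import numpy as np

def mtcgan_input(s1_ref, s1_pred, s2_ref):
    """Cases A and B: S1 at the reference and prediction dates plus S2 at the
    reference date, concatenated along the channel axis."""
    return np.concatenate([s1_ref, s1_pred, s2_ref], axis=-1)

def single_sar_input(s1_pred):
    """Case C: Pix2Pix and S-CycleGAN use only S1 at the prediction date."""
    return s1_pred

# In every case the target (output) image is S2 at the prediction date.
```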
| Crop | | | | |
|---|---|---|---|---|
| Corn | 0.739 | - | 0.317 | - |
| | - | 0.789 | 0.647 | - |
| | - | - | 0.907 | - |
| | - | - | - | 0.842 |
| Soybean | 0.883 | - | 0.236 | - |
| | - | 0.784 | 0.489 | - |
| | - | - | 0.866 | - |
| | - | - | - | 0.755 |