DCFENet: A Dual-Branch Collaborative Feature Enhancement Network for Farmland Boundary Detection
Abstract
1. Introduction
- (1) A multi-scale feature fusion attention module, TA-ASPP, is proposed. By fusing features at different scales, it enhances the network’s adaptability to complex farmland image scenes, enabling more precise boundary localization.
- (2) A dual-branch decoding structure is proposed. By constructing a boundary decoder and a skeleton decoder, the network’s abilities in boundary localization and boundary-skeleton topology modeling are enhanced, respectively, effectively strengthening the continuity modeling of fractured boundaries.
- (3) A constraint mechanism tailored to the dual-branch structure is proposed. A boundary loss and a skeleton loss are designed to collaboratively optimize the two branches, enabling simultaneous improvement of boundary localization and structural integrity.
- (4) A high-resolution farmland boundary dataset is constructed, providing a data basis for farmland boundary detection research and for validating the effectiveness of models.
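To make contribution (3) concrete, the dual-branch supervision can be sketched as a weighted sum of a boundary term and a skeleton term. The exact loss formulations and weights used by DCFENet are not reproduced here; `bce`, `lambda_b`, and `lambda_s` below are illustrative placeholders, and the sketch assumes both branches are supervised with a pixel-wise binary cross-entropy.

```python
import math

def bce(pred, target, eps=1e-7):
    """Mean binary cross-entropy over flat lists of probabilities and 0/1 labels."""
    total = 0.0
    for p, t in zip(pred, target):
        p = min(max(p, eps), 1.0 - eps)  # clamp for numerical stability
        total += -(t * math.log(p) + (1.0 - t) * math.log(1.0 - p))
    return total / len(pred)

def dual_branch_loss(boundary_pred, boundary_gt, skeleton_pred, skeleton_gt,
                     lambda_b=1.0, lambda_s=0.5):
    """Hypothetical combined objective: weighted boundary term plus skeleton term."""
    l_boundary = bce(boundary_pred, boundary_gt)
    l_skeleton = bce(skeleton_pred, skeleton_gt)
    return lambda_b * l_boundary + lambda_s * l_skeleton
```

In practice the two terms pull the shared encoder toward complementary goals: precise boundary localization and topologically continuous skeletons.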
2. Related Work
2.1. Traditional Farmland Boundary Extraction Methods
2.2. Deep Learning Methods for Boundary Segmentation
2.3. Multi-Task and Pixel-Wise Instance Segmentation Methods
3. Materials and Methods
3.1. Dataset Construction
3.1.1. Study Area
3.1.2. Dataset Preparation
3.2. The Proposed DCFENet
3.2.1. Overall Architecture
3.2.2. TA-ASPP
3.2.3. Dual-Branch Structure
The Skeleton Decoder
The Boundary Decoder
3.2.4. Bibranch Loss
Skeleton Loss
Boundary Branch Supervision Loss
3.3. Experimental Environment
3.4. Evaluation Metrics
4. Results
4.1. Ablation Study
4.2. Multi-Scale Comparative Experiment
4.3. Comparison of Different Models
4.3.1. Performance Comparison of Different Models
4.3.2. Model Computational Efficiency Analysis
4.3.3. Cross-Validation Analysis
4.4. Experimental Results Analysis
5. Discussion
5.1. Model Advantages
5.2. Limitations and Future Work
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Potapov, P.; Turubanova, S.; Hansen, M.C.; Tyukavina, A.; Zalles, V.; Khan, A.; Cortez, J. Global maps of cropland extent and change show accelerated cropland expansion in the twenty-first century. Nat. Food 2022, 3, 19–28.
- Lambin, E.F.; Meyfroidt, P. Global land use change, economic globalization, and the looming land scarcity. Proc. Natl. Acad. Sci. USA 2011, 108, 3465–3472.
- Li, Q.; Huang, Y.; Sun, J.; Chen, S.; Zou, J. Balancing Urban Expansion and Food Security: A Spatiotemporal Assessment of Cropland Loss and Productivity Compensation in the Yangtze River Delta, China. Land 2025, 14, 1476.
- Li, L. The State of the World’s Land and Water Resources for Food and Agriculture (SOLAW): Systems at Breaking Point; FAO: Rome, Italy, 2021.
- Wang, M.; Wang, J.; Cui, Y.; Liu, J.; Chen, L. Agricultural field boundary delineation with satellite image segmentation for high-resolution crop mapping: A case study of rice paddy. Agronomy 2022, 12, 2342.
- Wang, S.; Waldner, F.; Lobell, D.B. Unlocking large-scale crop field delineation in smallholder farming systems with transfer learning and weak supervision. Remote Sens. 2022, 14, 5738.
- Zhang, J.; Yang, X.; Dai, J.; Wang, X.; Fang, Z.; Liu, X.; Wang, Z. Automated remote sensing monitoring of cropland non-agricultural and non-grain conversion at parcel scale in complex environments through multi-source data fusion. Geo-Spat. Inf. Sci. 2025, 29, 168–192.
- De Bruin, S.; Heuvelink, G.B.M.; Brown, J.D. Propagation of positional measurement errors to agricultural field boundaries and associated costs. Comput. Electron. Agric. 2008, 63, 245–256.
- Omia, E.; Bae, H.; Park, E.; Kim, M.S.; Baek, I.; Kabenge, I.; Cho, B.K. Remote sensing in field crop monitoring: A comprehensive review of sensor systems, data analyses and recent advances. Remote Sens. 2023, 15, 354.
- Xu, Y.; Xue, X.; Sun, Z.; Gu, W.; Cui, L.; Jin, Y.; Lan, Y. Deriving agricultural field boundaries for crop management from satellite images using semantic feature pyramid network. Remote Sens. 2023, 15, 2937.
- Wang, X.; Shu, L.; Han, R.; Yang, F.; Gordon, T.; Wang, X.; Xu, H. A survey of farmland boundary extraction technology based on remote sensing images. Electronics 2023, 12, 1156.
- Waldner, F.; Diakogiannis, F.I. Deep learning on edge: Extracting field boundaries from satellite images with a convolutional neural network. Remote Sens. Environ. 2020, 245, 111741.
- Neupane, B.; Teerayut, H.; Jagannath, A. Deep learning-based semantic segmentation of urban features in satellite images: A review and meta-analysis. Remote Sens. 2021, 13, 808.
- Lu, R.; Zhang, Y.; Huang, Q.; Zeng, P.; Shi, Z.; Ye, S. A refined edge-aware convolutional neural networks for agricultural parcel delineation. Int. J. Appl. Earth Obs. Geoinf. 2024, 133, 104084.
- Chen, H.; Qi, X.; Yu, L.; Heng, P.A. DCAN: Deep contour-aware networks for accurate gland segmentation. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 2487–2496.
- Bai, M.; Urtasun, R. Deep watershed transform for instance segmentation. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 5221–5229.
- Watkins, B.; Van Niekerk, A. A comparison of object-based image analysis approaches for field boundary delineation using multi-temporal Sentinel-2 imagery. Comput. Electron. Agric. 2019, 158, 294–302.
- Hong, R.; Park, J.; Jang, S.; Shin, H.; Kim, H.; Song, I. Development of a parcel-level land boundary extraction algorithm for aerial imagery of regularly arranged agricultural areas. Remote Sens. 2021, 13, 1167.
- Wagner, M.P.; Oppelt, N. Extracting agricultural fields from remote sensing imagery using graph-based growing contours. Remote Sens. 2020, 12, 1205.
- Xu, L.; Ming, D.; Zhou, W.; Bao, H.; Chen, Y.; Ling, X. Farmland extraction from high spatial resolution remote sensing images based on stratified scale pre-estimation. Remote Sens. 2019, 11, 108.
- Felzenszwalb, P.F.; Huttenlocher, D.P. Efficient graph-based image segmentation. Int. J. Comput. Vis. 2004, 59, 167–181.
- Xue, Y.; Zhao, J.; Zhang, M. A watershed-segmentation-based improved algorithm for extracting cultivated land boundaries. Remote Sens. 2021, 13, 939.
- Tetteh, G.O.; Schwieder, M.; Erasmi, S.; Conrad, C.; Gocht, A. Comparison of an optimised multiresolution segmentation approach with deep neural networks for delineating agricultural fields from Sentinel-2 images. PFG-J. Photogramm. Remote Sens. Geoinf. Sci. 2023, 91, 295–312.
- Shunying, W.; Ya’nan, Z.; Xianzeng, Y.; Li, F.; Tianjun, W.; Jiancheng, L. BSNet: Boundary-semantic-fusion network for farmland parcel mapping in high-resolution satellite images. Comput. Electron. Agric. 2023, 206, 107683.
- Zhang, H.; Liu, M.; Wang, Y.; Shang, J.; Liu, X.; Li, B.; Li, Q. Automated delineation of agricultural field boundaries from Sentinel-2 images using recurrent residual U-Net. Int. J. Appl. Earth Obs. Geoinf. 2021, 105, 102557.
- Wu, S.; Su, Y.; Lu, X.; Xu, H.; Kang, S.; Zhang, B.; Liu, L. Extraction and mapping of cropland parcels in typical regions of southern China using unmanned aerial vehicle multispectral images and deep learning. Drones 2023, 7, 285.
- Persello, C.; Tolpekin, V.A.; Bergado, J.R.; De By, R.A. Delineation of agricultural fields in smallholder farms from satellite images using fully convolutional networks and combinatorial grouping. Remote Sens. Environ. 2019, 231, 111253.
- Lu, H.; Wang, H.; Ma, Z.; Ren, Y.; Fu, W.; Shan, Y.; Meng, Z. Farmland boundary extraction based on the AttMobile-DeeplabV3+ network and least squares fitting of straight lines. Front. Plant Sci. 2023, 14, 1228590.
- Li, Z.; Wang, Y.; Tian, F.; Zhang, J.; Chen, Y.; Li, K. BAFormer: A Novel Boundary-Aware Compensation UNet-like Transformer for High-Resolution Cropland Extraction. Remote Sens. 2024, 16, 2526.
- Huang, L.; Zhang, Z.; Yu, Y.; Tang, B.H. DEDANet: Mountainous Cropland Extraction From Remote Sensing Imagery with Detail Enhancement and Distance Attenuation. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2025, 18, 17565–17579.
- Shi, Z.; Fan, J.; Du, Y.; Zhou, Y.; Zhang, Y. LULC-SegNet: Enhancing land use and land cover semantic segmentation with denoising diffusion feature fusion. Remote Sens. 2024, 16, 4573.
- Liu, X.; Zhang, J.; Duan, Y.; Zhou, J. CPVF: Vectorization of agricultural cultivation field parcels via a boundary–parcel multi-task learning network in ultra-high-resolution remote sensing images. ISPRS J. Photogramm. Remote Sens. 2025, 226, 267–299.
- Ren, J.; Jing, Y.; Zheng, X.; Li, S.; Li, K.; Mu, G. Cropland Extraction Based on PlanetScope Images and a Newly Developed CAFM-Net Model. Remote Sens. 2026, 18, 646.
- Wang, Y.; Yang, M.; Zhang, T.; Hu, S.; Zhuang, Q. DAENet: A Deep Attention-Enhanced Network for Cropland Extraction in Complex Terrain from High-Resolution Satellite Imagery. Agriculture 2025, 15, 1318.
- Chen, H.; Wang, Q.; Xie, K.; Lei, L.; Wu, X. MPF-Net: Multi-projection filtering network for few-shot object detection. Appl. Intell. 2024, 54, 7777–7792.
- Vayssade, J.A.; Paoli, J.N.; Gée, C.; Jones, G. DeepIndices: Remote sensing indices based on approximation of functions through deep-learning, application to uncalibrated vegetation images. Remote Sens. 2021, 13, 2261.
- Ronneberger, O.; Fischer, P.; Brox, T. U-Net: Convolutional networks for biomedical image segmentation. In Proceedings of the International Conference on Medical Image Computing and Computer-Assisted Intervention, Munich, Germany, 5–9 October 2015; pp. 234–241.
- Jie, J.; Guo, Y.; Wu, G.; Wu, J.; Hua, B. EdgeNAT: Transformer for efficient edge detection. arXiv 2024, arXiv:2408.10527.
- Zhu, Z.; Yan, Y.; Xu, R.; Zi, Y.; Wang, J. Attention-Unet: A deep learning approach for fast and accurate segmentation in medical imaging. J. Comput. Sci. Softw. Appl. 2022, 2, 24–31.
- Pu, M.; Huang, Y.; Liu, Y.; Guan, Q.; Ling, H. EDTER: Edge detection with transformer. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, New Orleans, LA, USA, 18–24 June 2022; pp. 1402–1412.
- Long, J.; Li, M.; Wang, X.; Stein, A. Delineation of agricultural fields using multi-task BsiNet from high-resolution satellite images. Int. J. Appl. Earth Obs. Geoinf. 2022, 112, 102871.
- Qin, X.; Zhang, Z.; Huang, C.; Dehghan, M.; Zaiane, O.R.; Jagersand, M. U2-Net: Going deeper with nested U-structure for salient object detection. Pattern Recognit. 2020, 106, 107404.
- Pan, J.; Bulat, A.; Tan, F.; Zhu, X.; Dudziak, L.; Li, H.; Tzimiropoulos, G.; Martinez, B. EdgeViTs: Competing light-weight CNNs on mobile devices with vision transformers. In European Conference on Computer Vision; Springer Nature: Cham, Switzerland, 2022; pp. 294–311.
- Mulla, D.J. Twenty five years of remote sensing in precision agriculture: Key advances and remaining knowledge gaps. Biosyst. Eng. 2013, 114, 358–371.
| Baseline | TA-ASPP | Dual-Branch | Dual-Branch Loss | Precision | Recall | F1 Score | BIoU | ODS | OIS |
|---|---|---|---|---|---|---|---|---|---|
| √ | × | × | × | 47.7 | 31.8 | 36.8 | 64.2 | 36.8 | 39.5 |
| √ | √ | × | × | 59.8 | 50.0 | 54.9 | 62.8 | 54.9 | 58.3 |
| √ | × | √ | × | 53.3 | 53.1 | 51.2 | 56.9 | 51.2 | 55.9 |
| √ | × | √ | √ | 64.6 | 49.3 | 59.8 | 61.3 | 59.8 | 64.9 |
| √ | √ | √ | √ | 74.5 | 68.1 | 68.1 | 77.4 | 68.1 | 70.4 |
| Model Variant | Precision | Recall | F1 Score | BIoU | ODS | OIS |
|---|---|---|---|---|---|---|
| Dual ResNet18_UNet | 64.6 | 49.3 | 59.8 | 61.3 | 59.8 | 64.9 |
| +ASPP | 51.9 | 64.3 | 60.7 | 62.9 | 60.7 | 63.4 |
| +EMA | 56.6 | 62.7 | 61.5 | 64.1 | 61.5 | 64.6 |
| +TA-ASPP | 74.5 | 68.1 | 68.1 | 77.4 | 68.1 | 70.4 |
| Model | Precision | Recall | F1 Score | BIoU | ODS | OIS |
|---|---|---|---|---|---|---|
| UNet | 29.5 | 40.8 | 31.7 | 37.4 | 31.7 | 34.4 |
| EdgeNAT | 40.6 | 38.0 | 35.3 | 26.6 | 37.4 | 38.2 |
| AttentionUnet | 25.1 | 16.3 | 18.7 | 45.2 | 18.7 | 19.6 |
| EDTER | 43.0 | 47.1 | 35.7 | 15.0 | 37.8 | 38.7 |
| BSiNet | 66.2 | 64.3 | 64.1 | 48.2 | 66.5 | 68.2 |
| ResNet18_UNet | 47.7 | 31.8 | 36.8 | 64.2 | 36.8 | 39.5 |
| ResNet18_UNet (mean) | 61.6 | 36.9 | 44.7 | 73.5 | 44.7 | 46.3 |
| ResNet18_UNet (sum) | 72.9 | 44.1 | 54.0 | 77.5 | 54.0 | 56.5 |
| DCFENet (ours) | 74.5 | 68.1 | 68.1 | 77.4 | 68.1 | 70.4 |
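As a reminder of how the tabulated metrics relate, F1 is the harmonic mean of precision and recall. Note that ODS and OIS are edge-detection F-measures computed at the optimal dataset-level and image-level thresholds, respectively, so the reported F1/ODS values need not equal the harmonic mean of the reported mean precision and recall. A minimal sketch:

```python
def f1_score(precision, recall):
    """Harmonic mean of precision and recall (both in percent or in [0, 1])."""
    if precision + recall == 0:
        return 0.0
    return 2.0 * precision * recall / (precision + recall)

# Example with the DCFENet row: f1_score(74.5, 68.1) is about 71.2, which
# differs from the tabulated 68.1 because of threshold-wise aggregation.
```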
| Model | Parameters (M) | FLOPs (G) | Inference Speed (FPS) | Memory (GB) |
|---|---|---|---|---|
| UNet | 31.00 | 218.97 | 51.75 | 2.93 |
| EdgeNAT | 31.50 | 76.40 | 23.60 | 2.50 |
| AttentionUnet | 10.74 | 10.691 | 38.72 | 1.04 |
| EDTER | 48.30 | 118.60 | 12.30 | 3.40 |
| ResNet18_UNet | 15.78 | 28.37 | 166.94 | 0.68 |
| DCFENet (ours) | 26.43 | 37.43 | 97.97 | 1.03 |
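The parameter totals above come from summing per-layer counts over each network. As an illustrative (not DCFENet-specific) sketch, the learnable parameters of a single 2-D convolution are:

```python
def conv2d_params(c_in, c_out, k, bias=True):
    """Learnable parameters of a k x k 2-D convolution: weights plus optional bias."""
    return c_in * c_out * k * k + (c_out if bias else 0)

# Example: the 7x7 stem convolution of a ResNet-18 (3 -> 64 channels, no bias)
stem = conv2d_params(3, 64, 7, bias=False)  # 9408 parameters
```

FLOPs scale the same per-layer counts by the spatial size of each output feature map, which is why the FLOPs column grows much faster than the parameter column for high-resolution inputs.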
| Model | Precision | Recall | F1 Score | BIoU | ODS | OIS |
|---|---|---|---|---|---|---|
| U-Net | 24.53 ± 3.47 | 42.04 ± 2.75 | 29.81 ± 2.75 | 34.60 ± 1.67 | 31.71 ± 2.86 | 29.92 ± 3.12 |
| ResNet18_UNet | 45.08 ± 2.50 | 29.74 ± 2.22 | 34.89 ± 2.16 | 64.90 ± 0.94 | 35.42 ± 3.07 | 34.24 ± 3.50 |
| DCFENet | 74.37 ± 0.51 | 68.22 ± 1.73 | 70.64 ± 0.92 | 75.38 ± 0.81 | 71.15 ± 0.89 | 68.36 ± 0.92 |
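The mean ± std entries above are presumably aggregates over cross-validation folds. A minimal sketch, assuming the sample (n − 1) standard deviation (the paper may instead use the population form):

```python
import math

def fold_stats(scores):
    """Mean and sample standard deviation of per-fold metric scores."""
    n = len(scores)
    mean = sum(scores) / n
    var = sum((s - mean) ** 2 for s in scores) / (n - 1)  # Bessel-corrected
    return mean, math.sqrt(var)
```

The much smaller standard deviations in the DCFENet row indicate more stable behavior across folds, not just a higher mean.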
© 2026 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license.
Share and Cite
Lan, M.; Huang, B.; Wu, P. DCFENet: A Dual-Branch Collaborative Feature Enhancement Network for Farmland Boundary Detection. Agronomy 2026, 16, 964. https://doi.org/10.3390/agronomy16100964

