Parcel-Level Mapping of Horticultural Crop Orchards in Complex Mountain Areas Using VHR and Time-Series Images
Abstract
1. Introduction
- (a) Many horticultural crops belong to the same families as natural forest species and share similar phenological characteristics, which makes them difficult to distinguish from one another.
- (b) Horticultural crops are mainly planted in mountainous areas. Restricted by planting conditions, many agricultural parcels have irregular shapes and fuzzy edges [17]. Moreover, the high heterogeneity among mountain parcels makes them difficult to extract.
- (c) Mixed planting occurs in many parcels, so conventional methods of determining a parcel's category are not suitable for mixed parcels in complex mountainous areas.
2. Study Area and Dataset
2.1. Study Area
2.2. Field Sampling Data
2.3. Remote-Sensing Data Acquisition and Preprocessing
3. Methods
- A parcel extraction framework with zoning and hierarchical strategies based on VHR images. In this part, a texture-based and an edge-based deep learning model are combined to extract the parcels in the study area.
- Crop classification based on time-series data. Time-series features of the crops are constructed from Sentinel-2 images, and the land surface cover is classified into four categories using an LSTM network (a minimal sketch of such a classifier follows this list).
- Given the complex agricultural planting conditions in mountainous areas, we take the parcel as a spatial constraint and fill it with pixel-level classification results to determine its category, rather than feeding the parcel into the classifier as a classification unit. A category filling strategy is designed: with this strategy, the categories of candidate parcels are determined from the pixel-level classification results obtained in the second part. Finally, the distribution of horticultural orchards is obtained.
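To make the second step concrete, the following is a minimal sketch of a per-pixel time-series classifier using an LSTM, written in PyTorch. The network size, the number of input features per date, the sequence length, and the four target classes (apple, cherry, greenhouse, non-orchard) are illustrative assumptions and do not reproduce the exact configuration used in this study.

```python
import torch
import torch.nn as nn

class CropLSTM(nn.Module):
    """Minimal LSTM classifier for per-pixel Sentinel-2 time series.
    Input: (batch, time_steps, n_features); output: logits over the
    four land-cover classes (apple, cherry, greenhouse, non-orchard)."""
    def __init__(self, n_features=4, hidden_size=64, n_classes=4):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden_size, batch_first=True)
        self.fc = nn.Linear(hidden_size, n_classes)

    def forward(self, x):
        _, (h_n, _) = self.lstm(x)   # h_n: (num_layers, batch, hidden_size)
        return self.fc(h_n[-1])      # classify from the last hidden state

# Example: 32 pixels, 20 acquisition dates, 4 spectral/index features per date
model = CropLSTM()
x = torch.randn(32, 20, 4)
pred = model(x).argmax(dim=1)        # predicted class index per pixel
```

The predicted per-pixel labels are what the filling strategy of Section 3.3 aggregates into parcel categories.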
3.1. Parcel Extraction Based on a Hierarchical Extraction Scheme
3.1.1. Farmland Classification System Based on Geographical Divisions
3.1.2. Parcel Extraction Based on the RCF Model
3.1.3. Parcel Extraction Based on DABNet
3.2. Horticultural Crop Classification with Time-Series Images
3.2.1. Time-Series Feature Construction
3.2.2. Classification of Parcels Based on an LSTM Model
3.3. Classification Result Filling Strategy
- Parcels containing multiple pixels. This kind of parcel has a large area and therefore contains many Sentinel-2 pixels. All pixels lying entirely within the parcel, together with the boundary pixels whose overlap with the parcel exceeds half of a pixel's area, are used as the filling pixels of the parcel.
- Parcels covered by multiple pixels. When the area where a pixel intersects the parcel is greater than half of the pixel area, that pixel is regarded as a filling pixel of the parcel.
- Parcels covered by a single pixel. For such a parcel, the pixel covering it is its filling pixel (a sketch of the overlap rule is given after this list).
- Cherry trees have gradually encroached on apple orchards and are interplanted within them. In some parcels, the distribution of cherry trees has no regular boundary, which leads to complex and diverse mixed planting in many parcels.
- In VHR images, the textures of apple and cherry trees are similar. When DABNet is used to extract large slope parcels in the mountain area according to textural features, the two fruit trees fall into a single category. Therefore, the network can only obtain orchard parcels from the land surface cover, rather than separate apple or cherry orchards. If apple and cherry orchards are forced to be separate DABNet input categories, not only is the effectiveness of parcel extraction greatly reduced, but the parcels also become over-segmented, diverging from the actual parcels and losing semantic information.
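As a concrete illustration of the filling rules listed above, the sketch below implements the half-pixel overlap test with shapely geometries. The 10 m Sentinel-2 pixel size, the majority-share decision with a purity threshold, and all helper names are assumptions for illustration rather than the authors' exact implementation.

```python
from collections import Counter
from shapely.geometry import box, Polygon

PIXEL_SIZE = 10.0      # assumed Sentinel-2 ground sampling distance (m)
PURE_THRESHOLD = 0.8   # assumed share above which a parcel is treated as pure

def filling_pixels(parcel: Polygon, pixel_grid):
    """Return the predicted classes of a parcel's filling pixels.
    pixel_grid: iterable of (pixel_geometry, predicted_class) tuples.
    A pixel fills the parcel if it lies inside it or if its intersection
    with the parcel exceeds half of the pixel area."""
    half_area = 0.5 * PIXEL_SIZE * PIXEL_SIZE
    return [label for geom, label in pixel_grid
            if parcel.contains(geom) or parcel.intersection(geom).area > half_area]

def parcel_category(parcel: Polygon, pixel_grid):
    """Assign a category from the class shares of the filling pixels;
    the 'mixed' label and purity threshold are illustrative assumptions."""
    labels = filling_pixels(parcel, pixel_grid)
    if not labels:
        return None
    cls, count = Counter(labels).most_common(1)[0]
    return cls if count / len(labels) >= PURE_THRESHOLD else "mixed"

# Example: a 25 m x 25 m parcel over a toy 3 x 3 grid of 10 m pixels
grid = [(box(i * 10, j * 10, (i + 1) * 10, (j + 1) * 10),
         "apple" if i < 2 else "cherry")
        for i in range(3) for j in range(3)]
print(parcel_category(box(0, 0, 25, 25), grid))  # -> apple
```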
4. Results
4.1. Candidate Parcel Extraction Results
4.2. Time-Series Curve Construction Results
4.3. Accuracy Evaluation of LSTM Model Parameters and Classification Results
4.4. Parcel Filling Results
5. Discussion
- We used a hierarchical framework to extract parcels layer by layer. It is worth noting that, in practical applications, the geographical characteristics of the region should be fully considered when applying a zoning and hierarchical strategy. An inappropriate zoning scheme increases the workload while reducing the classification accuracy. Because the study area of this paper is located in complex mountainous terrain, we mainly used terrain characteristics for zoning.
- Due to the complex planting situation in mountainous areas, we selected two deep learning models for parcel extraction and obtained a good extraction effect; the resulting parcel distribution is very close to the actual situation. However, the combination of the two models is not always needed. For an area with simple planting, where the edge characteristics of the parcels are very clear, the edge-based model alone is sufficient to extract the parcel distribution.
- In the case of mixed planting among crops, we chose to determine the parcel category from pixel-level classification results rather than first constructing parcel-level features. If parcel features are constructed from the mean value of the pixels and then classified, mixed parcels are likely to be misclassified because their features are not close to those of any single crop. The parcel filling strategy in this paper avoids such incorrect estimation of the crop area.
- The parcel extraction framework presented in this paper depends mainly on VHR optical images. There is also a limitation on the image acquisition time, which should preferably be in autumn. However, the long revisit cycle of VHR images limits the available data sources.
- The study area is close to the sea, and frequent clouds and rain in summer degrade the optical image quality. Hence, using only Sentinel-2 data to construct temporal features leads to large gaps in the image sequence in July and August. To address this problem, future work will fuse multisource data to construct more accurate crop characteristic curves.
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Kozhoridze, G.; Orlovsky, N.; Orlovsky, L.; Blumberg, D.G.; Golan-Goldhirsh, A. Classification-based mapping of trees in commercial orchards and natural forests. Int. J. Remote Sens. 2018, 39, 8784–8797.
- Zhu, Y.; Yang, G.; Yang, H.; Wu, J.; Lei, L.; Zhao, F.; Fan, L.; Zhao, C. Identification of Apple Orchard Planting Year Based on Spatiotemporally Fused Satellite Images and Clustering Analysis of Foliage Phenophase. Remote Sens. 2020, 12, 1199.
- National Bureau of Statistics. Available online: http://www.stats.gov.cn/ (accessed on 16 May 2021).
- Yang, Y.P.; Huang, Q.T.; Wu, W.; Luo, J.C.; Gao, L.J.; Dong, W.; Wu, T.J.; Hu, X.D. Geo-Parcel Based Crop Identification by Integrating High Spatial-Temporal Resolution Imagery from Multi-Source Satellite Data. Remote Sens. 2017, 9, 1298.
- Ashourloo, D.; Shahrabi, H.S.; Azadbakht, M.; Aghighi, H.; Nematollahi, H.; Alimohammadi, A.; Matkan, A.A. Automatic canola mapping using time series of Sentinel-2 images. ISPRS J. Photogramm. Remote Sens. 2019, 156, 63–76.
- Beeri, O.; Peled, A. Geographical model for precise agriculture monitoring with real-time remote sensing. ISPRS J. Photogramm. Remote Sens. 2009, 64, 47–54.
- Meroni, M.; Marinho, E.; Sghaier, N.; Verstrate, M.M.; Leo, O. Remote Sensing Based Yield Estimation in a Stochastic Framework—Case Study of Durum Wheat in Tunisia. Remote Sens. 2013, 5, 539–557.
- Xie, D.F.; Sun, P.J.; Zhang, J.S.; Zhu, X.F.; Wang, W.N.; Yuan, Z.M.Q. Autumn crop identification using high-spatial-temporal resolution time series data generated by MODIS and Landsat remote sensing images. In Proceedings of the IEEE Joint International Geoscience and Remote Sensing Symposium (IGARSS)/35th Canadian Symposium on Remote Sensing, Quebec City, QC, Canada, 13–18 July 2014; pp. 2118–2121.
- Liu, J.; Wang, L.M.; Yao, B.M.; Yang, F.G.; Yang, L.B.; Dong, Q.H. Comparative Study on Crop Recognition of Landsat-OLI and RapidEye Data. In Proceedings of the 6th International Conference on Agro-Geoinformatics, Fairfax, VA, USA, 7–10 August 2017; pp. 178–183.
- An, R.; Li, W.; Wang, H.L.; Ruan, R.Z. Crop classification using per-field method based on ETM plus image and MODIS EVI time series analysis. In Proceedings of the 5th International Symposium on Integrated Water Resources Management/3rd International Symposium on Methodology in Hydrology, Hohai University, Nanjing, China, 19–21 November 2010; p. 674.
- Zhang, M.; Li, Q.Z.; Wu, B.F. Investigating the capability of multi-temporal Landsat images for crop identification in high farmland fragmentation regions. In Proceedings of the 1st International Conference on Agro-Geoinformatics (Agro-Geoinformatics), Shanghai, China, 2–4 August 2012; pp. 26–29.
- Pluto-Kossakowska, J. Review on Multitemporal Classification Methods of Satellite Images for Crop and Arable Land Recognition. Agriculture 2021, 11, 999.
- Ramadhani, F.; Koswara, M.R.S.; Apriyana, Y.; Harmanto. The comparison of numerous machine learning algorithms performance in classifying rice growth stages based on Sentinel-2 to enhance crop monitoring in national level. In Proceedings of the 1st International Conference on Sustainable Tropical Land Management (ICSTLM), Bogor, Indonesia, 16–18 September 2020.
- Baidar, T.; Fernandez-Beltran, R.; Pla, F. Sentinel-2 multi-temporal data for rice crop classification in Nepal. In Proceedings of the IEEE International Geoscience and Remote Sensing Symposium (IGARSS), Waikoloa, HI, USA, 26 September–2 October 2020; pp. 4259–4262.
- She, B.; Yang, Y.Y.; Zhao, Z.G.; Huang, L.S.; Liang, D.; Zhang, D.Y. Identification and mapping of soybean and maize crops based on Sentinel-2 data. Int. J. Agric. Biol. Eng. 2020, 13, 171–182.
- Phiri, D.; Simwanda, M.; Salekin, S.; Nyirenda, V.R.; Murayama, Y.; Ranagalage, M. Sentinel-2 Data for Land Cover/Use Mapping: A Review. Remote Sens. 2020, 12, 2291.
- Liu, W.; Wang, J.; Luo, J.; Wu, Z.; Chen, J.; Zhou, Y.; Sun, Y.; Shen, Z.; Xu, N.; Yang, Y. Farmland Parcel Mapping in Mountain Areas Using Time-Series SAR Data and VHR Optical Images. Remote Sens. 2020, 12, 3733.
- Ok, A.O.; Akar, O.; Gungor, O. Evaluation of random forest method for agricultural crop classification. Eur. J. Remote Sens. 2012, 45, 421–432.
- Gasparovic, M.; Jogun, T. The effect of fusing Sentinel-2 bands on land-cover classification. Int. J. Remote Sens. 2018, 39, 822–841.
- Huang, S.D.; Xu, W.H.; Xiong, Y.; Wu, C.; Dai, F.; Xu, H.F.; Wang, L.G.; Kou, W.L. Combining Textures and Spatial Features to Extract Tea Plantations Based on Object-Oriented Method by Using Multispectral Image. Spectrosc. Spectr. Anal. 2021, 41, 2565–2571.
- Deng, J.S.; Shi, Y.Y.; Chen, L.S.; Wang, K.; Zhu, J.X. Cotton Identification and Extraction Using Near Infrared Sensor and Object-Oriented Spectral Segmentation Technique. Spectrosc. Spectr. Anal. 2009, 29, 1754–1758.
- Cao, X.; Li, Q.Z.; Du, X.; Zhang, M.; Zheng, X.Q. Exploring effect of segmentation scale on orient-based crop identification using HJ CCD data in Northeast China. In Proceedings of the 35th International Symposium on Remote Sensing of Environment (ISRSE35), Beijing, China, 22–26 April 2013; Institute of Remote Sensing and Digital Earth: Beijing, China, 2013.
- Jiao, X.F.; Kovacs, J.M.; Shang, J.L.; McNairn, H.; Walters, D.; Ma, B.L.; Geng, X.Y. Object-oriented crop mapping and monitoring using multi-temporal polarimetric RADARSAT-2 data. ISPRS J. Photogramm. Remote Sens. 2014, 96, 38–46.
- Hong, R.; Park, J.; Jang, S.; Shin, H.; Kim, H.; Song, I. Development of a Parcel-Level Land Boundary Extraction Algorithm for Aerial Imagery of Regularly Arranged Agricultural Areas. Remote Sens. 2021, 13, 1167.
- Dai, W.; Na, J.; Huang, N.; Hu, G.; Yang, X.; Tang, G.; Xiong, L.; Li, F. Integrated edge detection and terrain analysis for agricultural terrace delineation from remote sensing images. Int. J. Geogr. Inf. Sci. 2020, 34, 484–503.
- Jintian, C.; Xin, Z.; Weisheng, W.; Lei, W. Integration of optical and SAR remote sensing images for crop-type mapping based on a novel object-oriented feature selection method. Int. J. Agric. Biol. Eng. 2020, 13, 178–190.
- Haoyu, W.; Zhanfeng, S.; Zihan, Z.; Zeyu, X.; Shuo, L.; Shuhui, J.; Yating, L. Improvement of Region-Merging Image Segmentation Accuracy Using Multiple Merging Criteria. Remote Sens. 2021, 13, 2782.
- Hossain, M.; Chen, D. Segmentation for Object-Based Image Analysis (OBIA): A Review of Algorithms and Challenges from Remote Sensing Perspective. ISPRS J. Photogramm. Remote Sens. 2019, 150, 115–134.
- Wang, S.; Chen, Y.L. The information extraction of Gannan citrus orchard based on the GF-1 remote sensing image. IOP Conf. Ser. Earth Environ. Sci. 2017, 57, 012001.
- Richter, G.M.; Agostini, F.; Barker, A.; Costomiris, D.; Qi, A. Assessing on-farm productivity of Miscanthus crops by combining soil mapping, yield modelling and remote sensing. Biomass Bioenergy 2016, 85, 252–261.
- Xie, S.; Tu, Z. Holistically-Nested Edge Detection. Int. J. Comput. Vis. 2017, 125, 1395–1403.
- Cheng, G.; Zhou, P.; Han, J. Learning Rotation-Invariant Convolutional Neural Networks for Object Detection in VHR Optical Remote Sensing Images. IEEE Trans. Geosci. Remote Sens. 2016, 54, 7405–7415.
- Ding, P.; Zhang, Y.; Deng, W.-J.; Jia, P.; Kuijper, A. A light and faster regional convolutional neural network for object detection in optical remote sensing images. ISPRS J. Photogramm. Remote Sens. 2018, 141, 208–218.
- Rabbi, J.; Ray, N.; Schubert, M.; Chowdhury, S.; Chao, D. Small-Object Detection in Remote Sensing Images with End-to-End Edge-Enhanced GAN and Object Detector Network. Remote Sens. 2020, 12, 1432.
- Xia, G.S.; Bai, X.; Ding, J.; Zhu, Z.; Belongie, S.; Luo, J.B.; Datcu, M.; Pelillo, M.; Zhang, L.P. A Large-Scale Dataset for Object Detection in Aerial Images. In Proceedings of the 31st IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Salt Lake City, UT, USA, 18–23 June 2018; pp. 3974–3983.
- Akarsh, A.; Manoj, K. Image surface texture analysis and classification using deep learning. Multimed. Tools Appl. 2020, 80, 1289–1309.
- Castelluccio, M.; Poggi, G.; Sansone, C.; Verdoliva, L. Land Use Classification in Remote Sensing Images by Convolutional Neural Networks. arXiv 2015, arXiv:1508.00092.
- Emmanuel, M.; Yuliya, T.; Guillaume, C.; Pierre, A. Convolutional Neural Networks for Large-Scale Remote-Sensing Image Classification. IEEE Trans. Geosci. Remote Sens. 2017, 55, 645–657.
- Paoletti, M.E.; Haut, J.M.; Plaza, J.; Plaza, A. A new deep convolutional neural network for fast hyperspectral image classification. ISPRS J. Photogramm. Remote Sens. 2017, 145, 120–147.
- Tao, H.; Li, W.; Qin, X.; Wang, P.; Yu, W.; Li, J. Terrain Classification of Polarimetric Synthetic Aperture Radar Images Based on Deep Learning and Conditional Random Field Model. J. Radars 2019, 8, 471–478.
- Fan, Y.; Ding, X.; Wu, J.; Ge, J.; Li, Y. High spatial-resolution classification of urban surfaces using a deep learning method. Build. Environ. 2021, 200, 107949.
- Cheng, G.; Han, J.W.; Lu, X.Q. Remote Sensing Image Scene Classification: Benchmark and State of the Art. Proc. IEEE 2017, 105, 1865–1883.
- Ma, L.; Liu, Y.; Zhang, X.L.; Ye, Y.X.; Yin, G.F.; Johnson, B.A. Deep learning in remote sensing applications: A meta-analysis and review. ISPRS J. Photogramm. Remote Sens. 2019, 152, 166–177.
- Luo, L.H.; Li, F.Y.; Dai, Z.Y.; Yang, X.; Liu, W.; Fang, X. Terrace extraction based on remote sensing images and digital elevation model in the Loess Plateau, China. Earth Sci. Inform. 2020, 13, 433–446.
- Zhao, F.; Xiong, L.Y.; Wang, C.; Wang, H.R.; Wei, H.; Tang, G.A. Terraces mapping by using deep learning approach from remote sensing images and digital elevation models. Trans. GIS 2021, 25, 2438–2454.
- Blaes, X.; Vanhalle, L.; Defourny, P. Efficiency of crop identification based on optical and SAR image time series. Remote Sens. Environ. 2005, 96, 352–365.
- Wu, B.; Li, Q. Crop planting and type proportion method for crop acreage estimation of complex agricultural landscapes. Int. J. Appl. Earth Obs. Geoinf. 2011, 16, 101–112.
- Haest, B.; Borre, J.V.; Spanhove, T.; Thoonen, G.; Delalieux, S.; Kooistra, L.; Mücher, C.A.; Paelinckx, D.; Scheunders, P.; Kempeneers, P. Habitat Mapping and Quality Assessment of NATURA 2000 Heathland Using Airborne Imaging Spectroscopy. Remote Sens. 2017, 9, 266.
- Kristin, F.; Hannes, F.; Michael, F.; Marion, S.; Björn, W. Hierarchical classification with subsequent aggregation of heathland habitats using an intra-annual RapidEye time-series. Int. J. Appl. Earth Obs. Geoinf. 2019, 87, 102036.
- Sun, Y.; Luo, J.; Xia, L.; Wu, T.; Gao, L.; Dong, W.; Hu, X.; Hai, Y. Geo-parcel-based crop classification in very-high-resolution images via hierarchical perception. Int. J. Remote Sens. 2020, 41, 1603–1624.
- Wang, X.; Ma, H.; Chen, X.; You, S. Edge Preserving and Multi-Scale Contextual Neural Network for Salient Object Detection. IEEE Trans. Image Process. 2018, 27, 121–134.
- Liu, Y.; Cheng, M.-M.; Hu, X.; Wang, K.; Bai, X. Richer Convolutional Features for Edge Detection. IEEE Trans. Pattern Anal. Mach. Intell. 2019, 41, 1939–1946.
- Tun, N.L.; Gavrilov, A.; Tun, N.M.; Trieu, D.; Aung, H. Remote Sensing Data Classification Using a Hybrid Pre-Trained VGG16 CNN-SVM Classifier. In Proceedings of the IEEE Conference of Russian Young Researchers in Electrical and Electronic Engineering (ElConRus), Saint Petersburg, Russia, 26–28 January 2021; pp. 2171–2175.
- Long, J.; Shelhamer, E.; Darrell, T. Fully Convolutional Networks for Semantic Segmentation. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Boston, MA, USA, 7–12 June 2015; pp. 3431–3440.
- He, K.M.; Zhang, X.Y.; Ren, S.Q.; Sun, J. Deep Residual Learning for Image Recognition. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 770–778.
- Romera, E.; Alvarez, J.M.; Bergasa, L.M.; Arroyo, R. ERFNet: Efficient Residual Factorized ConvNet for Real-Time Semantic Segmentation. IEEE Trans. Intell. Transp. Syst. 2018, 19, 263–272.
- Li, G.; Yun, I.; Kim, J.; Kim, J. DABNet: Depth-wise Asymmetric Bottleneck for Real-time Semantic Segmentation. arXiv 2019, arXiv:1907.11357.
- Ren, T.; Liu, Z.; Zhang, L.; Liu, D.; Xi, X.; Kang, Y.; Zhao, Y.; Zhang, C.; Li, S.; Zhang, X. Early Identification of Seed Maize and Common Maize Production Fields Using Sentinel-2 Images. Remote Sens. 2020, 12, 2140.
- Varlamova, E.V.; Solovyev, V.S. Investigation of Eastern Siberia vegetation index variations on long-term satellite data. Atmos. Ocean Opt. 2018, 10833, 108338C.
- Richetti, J.; Judge, J.; Boote, K.J.; Johann, J.A.; Uribe-Opazo, M.A.; Becker, W.R.; Paludo, A.; Silva, L.C.D. Using phenology-based enhanced vegetation index and machine learning for soybean yield estimation in Parana State, Brazil. J. Appl. Remote Sens. 2018, 12, 026029.
- Huete, A.R. A soil-adjusted vegetation index (SAVI). Remote Sens. Environ. 1988, 25, 295–309.
- Li, T.T.; Wang, Y.F.; Liu, C.Q.; Tu, S.S. Research on Identification of Multiple Cropping Index of Farmland and Regional Optimization Scheme in China Based on NDVI Data. Land 2021, 10, 861.
- Ndikumana, E.; Minh, D.H.T.; Baghdadi, N.; Courault, D.; Hossard, L. Deep Recurrent Neural Network for Agricultural Classification using multitemporal SAR Sentinel-1 for Camargue, France. Remote Sens. 2018, 10, 1217.
- Hochreiter, S.; Schmidhuber, J. Long short-term memory. Neural Comput. 1997, 9, 1735–1780.
- Arun, P.; Karnieli, A. Deep Learning-Based Phenological Event Modeling for Classification of Crops. Remote Sens. 2021, 13, 2477.
- Russwurm, M.; Korner, M. Temporal Vegetation Modelling using Long Short-Term Memory Networks for Crop Identification from Medium-Resolution Multi-Spectral Satellite Images. In Proceedings of the 30th IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), Honolulu, HI, USA, 21–26 July 2017; pp. 1496–1504.
- Pan, Y.Z.; Hu, T.G.; Zhu, X.F.; Zhang, J.S.; Wang, X.D. Mapping Cropland Distributions Using a Hard and Soft Classification Model. IEEE Trans. Geosci. Remote Sens. 2012, 50, 4301–4312.
- Shuai, G.Y.; Zhang, J.S.; Basso, B.; Pan, Y.Z.; Zhu, X.F.; Zhu, S.; Liu, H.L. Multi-temporal RADARSAT-2 polarimetric SAR for maize mapping supported by segmentations from high-resolution optical image. Int. J. Appl. Earth Obs. Geoinf. 2019, 74, 1–15.
| Geographic Area | Farmland Type | Features |
| --- | --- | --- |
| Plain Area | Greenhouse parcels | Regular shape, clear boundary, distributed in plain areas and different from the surrounding crop background. |
| Plain Area | Regular parcels | Plain parcels, regular shape, clear boundary, uniform internal texture and uniform area. |
| Mountain Area | Terrace parcels | Long and narrow shape with uniform width, clear boundary, uniform internal texture and regular arrangement. |
| Mountain Area | Slope parcels | Fuzzy boundary, uniform internal texture, irregular shape, irregularly distributed on the hillside, great difference in area and mostly mixed parcels in Miaohou Town. |
| Parcel Type | Number | Total Area (m²) | Average Area (m²) |
| --- | --- | --- | --- |
| Regular parcels | 3151 | 8,104,969 | 2572 |
| Greenhouse parcels | 207 | 605,478 | 2925 |
| Slope parcels | 903 | 22,332,936 | 24,732 |
| Terrace parcels | 8962 | 10,033,835 | 1120 |
| Total | 13,223 | 41,077,218 | 3106 |
| Classified \ Reference | Apple | Cherry | Greenhouse | Non-Orchard |
| --- | --- | --- | --- | --- |
| Apple | 163 | 24 | 0 | 10 |
| Cherry | 12 | 337 | 0 | 1 |
| Greenhouse | 0 | 1 | 191 | 3 |
| Non-orchard | 16 | 18 | 2 | 467 |
| Producer accuracy | 85.34% | 88.68% | 98.96% | 97.08% |
| User accuracy | 82.74% | 96.29% | 97.95% | 92.84% |
| Overall accuracy | 93.01% | | | |
| Kappa | 0.9015 | | | |
| Parcel Type | Number | Area (m²) |
| --- | --- | --- |
| Pure apple orchard parcel | 1084 | 380,784 |
| Pure cherry orchard parcel | 4163 | 6,200,552 |
| Greenhouse | 207 | 605,478 |
| Non-orchard parcel | 3262 | 5,435,964 |
| Mixed parcel | 4507 | 28,454,440 |