Seeing the Trees from Above: A Survey on Real and Synthetic Agroforestry Datasets for Remote Sensing Applications
Highlights
- Fields such as remote sensing, statistical methods, and deep learning rely on the availability of high-quality tree datasets for effective analysis.
- Publicly available high-resolution aerial tree datasets are surveyed, highlighting their characteristics and alignment with FAIR principles.
- Identifying and evaluating these datasets can support future research in tree monitoring and management at scale.
- Both the availability of datasets and their alignment with FAIR principles are crucial for model development and the advancement of analytical methods in agroforestry.
Abstract
1. Introduction
2. Methodology
- Ultra-High Resolution (UHR): 1 cm to 5 cm ground sampling distance (GSD). This level of detail makes individual leaves, small branches, and tree crowns visible, which is useful for forest monitoring tasks such as precise species identification, disease detection, and structural analysis. Drones typically achieve this GSD thanks to their ability to fly at low altitudes with high-quality sensors.
- High Resolution (HR): 5 cm to 30 cm GSD. This range is common for manned aircraft and some higher-altitude drone operations. Individual tree crowns are distinguishable, but smaller details within the canopy are often less clear.
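As an illustration of how these resolution classes relate to platform altitude, GSD for a nadir-pointing camera can be approximated from the sensor's pixel pitch, the focal length, and the flight altitude. The sketch below is a minimal example; the camera parameters used are hypothetical and do not correspond to any dataset in this survey.

```python
def ground_sampling_distance(altitude_m, focal_length_mm, pixel_pitch_um):
    """Approximate GSD in cm/pixel for a nadir-pointing camera.

    GSD = (pixel pitch * altitude) / focal length
    """
    pixel_pitch_m = pixel_pitch_um * 1e-6
    focal_length_m = focal_length_mm * 1e-3
    gsd_m = pixel_pitch_m * altitude_m / focal_length_m
    return gsd_m * 100.0  # m/pixel -> cm/pixel

# Hypothetical UAV camera: 8.8 mm focal length, 2.4 um pixel pitch.
# At 30 m altitude this lands in the UHR range (~0.82 cm/pixel),
# while the same sensor at 1000 m on a manned aircraft falls in
# the HR range (~27 cm/pixel).
print(round(ground_sampling_distance(30, 8.8, 2.4), 2))    # 0.82
print(round(ground_sampling_distance(1000, 8.8, 2.4), 1))  # 27.3
```

This simple relationship explains why UAVs, which can fly at tens of meters above the canopy, dominate the UHR category.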
FAIRness Analysis
3. Datasets
3.1. Published Datasets
- Application: This parameter refers to the specific use or purpose of a dataset according to the dataset authors, offering insights for individuals seeking tailored or annotated data suited to particular applications. Notably, this criterion serves as the secondary categorization factor after the distinction between published and unpublished datasets. Table 5 shows the applications of the identified datasets, described later in this section. Figure 3 shows the distribution of use-case applications in the identified datasets. Apart from miscellaneous, the most frequent applications are ITCD, SI, and SS.
- Source: Describes the data’s structure and format. The term “sensor” is intentionally omitted in this context to encompass various data acquisition methods; for instance, point cloud data can be obtained through laser scanning or generated via structure from motion, yielding photogrammetric point clouds (PPCs). Figure 4 shows the distribution of data sources across the identified datasets. The most-used source is the RGB camera, followed by the multispectral camera and LiDAR (Light Detection and Ranging). It should be noted that thermal cameras are regularly used in forestry-related studies [7]; however, none of the datasets we found contain purely thermal-spectrum data, and any dataset with a thermal band modality is included in the multispectral category.
- Platform: Refers to the remote sensing platform that carries the sensor payload. Understanding the platform is crucial, as it determines the data perspective, the achievable resolution (though not definitively), and the feasibility of data collection in diverse environmental conditions. It significantly influences the quality and scope of the acquired data, impacting subsequent analysis and applications. Figure 5 shows the distribution of platforms carrying the sensors in the selected datasets. The graph highlights the predominant platforms used for sensor deployment, with UAVs being the most extensively employed, followed by piloted aircraft and satellites. While our filtering criteria prioritize aerial-perspective datasets, the graph also encompasses terrestrial point-of-view (POV) data. This inclusion reflects instances where datasets offer both aerial and terrestrial data as complementary sources, thus meeting the eligibility criteria. Moreover, the presence of three instances of synthetic data is worth noting. Two of these datasets offer synthesized data from a UAV’s POV, and the third offers 3D models of trees, which can be used from any POV; therefore, they are presented under the UAV category. Synthetic datasets are explored in more detail in Section 3.1.11.
- Dataset Size: Indicates the volume or scale of the dataset, which plays a pivotal role in various aspects of analysis and application. Understanding the dataset size is fundamental, as it influences computational requirements, model training, and the feasibility of certain analytical approaches. As an example, larger datasets may demand advanced computational resources, while smaller ones might be more manageable for specific analyses or applications. Assessing dataset size aids in determining the data’s comprehensiveness and potential limitations, guiding researchers in selecting appropriate methodologies and tools for analysis.
- Annotation: Represents the level of labeling, tagging, or additional information embedded within the dataset. Annotation is a crucial aspect, especially in supervised learning and applications requiring labeled data, or simply for assessing the performance of a developed algorithm. Understanding the annotation level informs researchers about the data’s usability for specific tasks such as object detection, classification, or segmentation. Higher levels of annotation often indicate enhanced data richness but may also involve increased resource investment. Assessing annotation levels guides the selection of datasets aligned with the granularity required for precise analysis or model training. Figure 6 shows the distribution of annotation types in the selected datasets across the years they were published. Among the identified datasets, a predominant observation is the prevalence of unannotated data. When annotations are present, semantic-level annotations are the most frequently encountered type, followed by polygon and bounding box annotations. The distinction between polygon and semantic-level annotation arises from the nuanced application of polygons in certain cases, where they are used to achieve a higher degree of precision in object detection than the more common bounding boxes.
- Open Access Status: Indicates whether the dataset is publicly available or restricted in its accessibility. This parameter holds immense significance in fostering collaboration, reproducibility, and the advancement of research. Open access datasets promote transparency and facilitate broader utilization by the scientific community, accelerating innovation and knowledge dissemination. Understanding the open access status aids researchers in identifying available resources for validation, comparison, or extension of existing studies. Moreover, it influences the reproducibility and credibility of research findings, emphasizing the importance of accessible data in driving scientific progress.
- Country: Denotes two key parameters: the country of the corresponding author’s affiliated institution and the country associated with the dataset’s geographical extent. Understanding the author’s country provides insights into regional focuses, environmental contexts, or unique challenges that might impact the study’s scope or applicability. Simultaneously, recognizing the country covered by the dataset’s geographical extent contributes to a broader understanding of global perspectives and diverse approaches within the field, highlighting regional contributions and fostering collaborative opportunities across different geographical areas. Figure 7 depicts the distribution of published datasets by the author’s country. Germany leads with 10 published datasets, followed by the United States and China with 9 and 7, respectively. Figure 8 shows the distribution of published datasets by the geographical extent of the dataset. The most frequent countries are the United States with 9 and Germany with 8, followed by China and New Zealand with 5 each. Examining the parameters “Corresponding Author’s Country” (CA) and “Geographical Extent Country” (GE), it is evident that they exhibit congruent coverage, indicating a substantial overlap in the regions they represent.
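The annotation types discussed above differ in how much geometric information they retain: a polygon crown annotation can always be reduced to a bounding box, but the reverse is not possible, and a semantic mask drops instance identity altogether. The sketch below illustrates the first of these reductions; the crown coordinates are hypothetical and purely for illustration.

```python
def polygon_to_bbox(polygon):
    """Reduce a polygon annotation (a list of (x, y) vertices) to its
    axis-aligned bounding box (x_min, y_min, x_max, y_max).

    The crown outline is lost in the process, which is why polygon
    annotations support more precise delineation than bounding boxes.
    """
    xs = [p[0] for p in polygon]
    ys = [p[1] for p in polygon]
    return (min(xs), min(ys), max(xs), max(ys))

# Hypothetical tree-crown polygon in image pixel coordinates.
crown = [(120, 80), (160, 70), (185, 110), (150, 150), (110, 120)]
print(polygon_to_bbox(crown))  # (110, 70, 185, 150)
```

The one-way nature of this conversion is what makes polygon-annotated datasets more broadly reusable: they can serve detection, delineation, and (after rasterization) segmentation tasks alike.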
3.1.1. Individual Tree Crown Delineation
3.1.2. Individual Tree Detection
3.1.3. Semantic Segmentation
3.1.4. Disease Detection
3.1.5. Species Identification
3.1.6. Geometrical Measurement
3.1.7. Carbon Stock Estimation
3.1.8. Health Status Analysis
3.1.9. Phenology Monitoring
3.1.10. Miscellaneous
3.1.11. Synthetic Data
3.2. Unpublished Data
3.3. Repositories
- Operating Country: Understanding the operational location of a repository sheds light on regional focus, legal considerations, and potential data limitations based on geographical boundaries or regulations.
- Size Limit: This parameter delineates the maximum capacity for data storage within a repository, crucial for users assessing data contribution or access possibilities.
- Number of Datasets: Reflecting the repository’s scale and diversity, a higher count often indicates a broader range of available datasets, appealing to a larger user base.
- Establishment Date: Signifying the repository’s maturity and experience, the creation date offers insights into its reliability and history of dataset curation.
- Geographical Extent: Describing the coverage area of datasets, this parameter assists researchers in evaluating the data’s relevance to specific regions or global applicability.
4. Conclusions
- UAVs are the dominant platform when it comes to HR and UHR data: The deployment of UAVs has grown remarkably in prominence in the data acquisition phase of forestry-related studies. These aerial platforms offer a versatile and efficient means of gathering high-resolution and comprehensive datasets, pushing the envelope of conventional approaches to forest monitoring. UAVs equipped with various sensors, such as RGB, multispectral, and hyperspectral cameras, LiDAR, and thermal sensors, provide researchers with a diverse array of data modalities. The ability to navigate challenging terrain and rapidly capture detailed information about forest ecosystems makes drones indispensable tools in forestry research. Their cost-effectiveness, agility, and capacity to cover vast areas make them ideal for tasks ranging from tree species classification to monitoring forest health and assessing environmental changes over time. As technology continues to advance, the utilization of UAVs in forestry studies is poised to expand further, fostering more precise and comprehensive insights into the dynamics of forest ecosystems.
- There are more unpublished data than published: The visible imbalance between the amount of published and unpublished data uncovered through our study underscores a challenge in forestry-related remote sensing datasets. It becomes evident that a substantial amount of valuable data remains unshared, inaccessible to the wider research community. While the past years have witnessed a growth in the publication of forestry-related remote sensing data, there remains a pressing need for increased efforts to motivate researchers to share their data openly. The prevalence of unpublished data suggests untapped potential and unexplored possibilities for advancing forestry, robotics, and computer vision research. Initiatives promoting data sharing, fostering collaborative platforms, and highlighting the benefits of open data practices are essential to bridge this gap. Encouraging a culture of data transparency and openness is crucial to harness the full potential of available datasets, fostering innovation and contributing to a more comprehensive understanding of forest ecosystems.
- The scarcity of co-aligned datasets: The scarcity of co-aligned data across different modalities is a notable gap in forestry-related datasets. A comprehensive understanding of forest ecosystems requires the fusion of data from various sources and modalities. Integrated datasets, encompassing information captured through diverse sensors such as RGB, multispectral, and hyperspectral cameras and LiDAR sensors, offer a holistic perspective on forest characteristics. Data fusion not only enhances the accuracy of forestry analyses but also enables the extraction of richer insights through the collaborative use of complementary information. In machine learning, particularly deep learning and neural networks, the integration of different data modalities becomes imperative. Techniques such as attention mechanisms benefit significantly from diverse data sources, allowing models to focus on relevant features and relationships. Addressing the scarcity of co-aligned data can help advance research methodologies, optimize the performance of machine learning algorithms, and unlock the full potential of data-driven approaches in forestry studies and beyond.
- Synthetic data have immense potential: Synthetic data emerge as a transformative solution to the complexities and limitations of current data acquisition methods in forestry research. By offering a controlled and versatile environment for generating datasets that mirror real-world scenarios, their role extends beyond mere supplementation. The controlled nature of synthetic data allows researchers to manipulate variables such as environmental conditions, tree species distribution, and terrain characteristics, providing a tailored and efficient means of curating datasets. One notable contribution is the generation of annotation masks, which are essential for training machine learning models. Synthetic datasets facilitate the creation of diverse and precisely labeled datasets, streamlining the development of robust algorithms for tasks like object detection and classification. This comprehensive simulation extends to various sensors, enhancing the applicability of synthetic data across modalities. As technology continues to advance, the integration of synthetic data into forestry research methodologies seems very promising for propelling the field toward more accurate and comprehensive insights. However, it is crucial to acknowledge that synthetic data are still in their early stages, requiring substantial further development and refinement before they can become a comprehensive and widely adopted solution in forestry research.
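The co-alignment requirement discussed in the conclusions above can be sketched with a minimal early-fusion example: stacking an RGB orthomosaic with a LiDAR-derived canopy height model (CHM) into a single multi-channel input is only a channel concatenation, but it presupposes that both rasters share the same grid. The array shapes and the function name below are hypothetical, not taken from any surveyed dataset or library.

```python
import numpy as np

def early_fusion(rgb, chm):
    """Stack a co-registered RGB raster (H, W, 3) and a canopy height
    model (H, W) into a single 4-channel array, e.g. as input to a
    neural network. Both rasters must share the same grid, i.e. be
    co-aligned; otherwise the per-pixel correspondence is meaningless.
    """
    if rgb.shape[:2] != chm.shape:
        raise ValueError("modalities are not co-aligned on the same grid")
    return np.concatenate([rgb, chm[..., None]], axis=-1)

# Hypothetical 256 x 256 tile.
rgb = np.zeros((256, 256, 3), dtype=np.float32)
chm = np.zeros((256, 256), dtype=np.float32)
fused = early_fusion(rgb, chm)
print(fused.shape)  # (256, 256, 4)
```

The `ValueError` branch is the crux: without co-aligned acquisition (or careful co-registration after the fact), this kind of straightforward fusion is impossible, which is precisely why the scarcity of co-aligned datasets limits multimodal work.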
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Abbreviations
AI | Artificial Intelligence |
CA | Corresponding Author |
CSE | Carbon Stock Estimation |
DB | Database |
DBR | Database Repository |
DD | Disease Detection |
DL | Deep Learning |
DS | Dataset |
GB | Gigabyte |
GE | Geographical Extent |
GM | Geometrical Measurement |
GSD | Ground Sampling Distance |
HR | High Resolution |
HSA | Health Status Analysis |
HSI | Hyperspectral Imaging |
ITCD | Individual Tree Crown Delineation |
ITD | Individual Tree Detection |
LiDAR | Light Detection and Ranging |
Misc. | Miscellaneous |
ML | Machine Learning |
MSI | Multispectral Imaging |
N/A | Not Available |
PM | Phenology Monitoring |
POV | Point Of View |
PPC | Photogrammetric Point Cloud |
RGB | Red Green Blue |
SI | Species Identification |
SS | Semantic Segmentation |
UAV | Unmanned Aerial Vehicle |
UHR | Ultra-High Resolution |
References
- Okorie, N.; Aba, S.; Amu, C.; Baiyeri, K. The role of trees and plantation agriculture in mitigating global climate change. Afr. J. Food Agric. Nutr. Dev. 2017, 17, 12691–12707. [Google Scholar] [CrossRef]
- Prevedello, J.A.; Almeida-Gomes, M.; Lindenmayer, D.B. The importance of scattered trees for biodiversity conservation: A global meta-analysis. J. Appl. Ecol. 2018, 55, 205–214. [Google Scholar] [CrossRef]
- Seth, M.K. Trees and their economic importance. Bot. Rev. 2003, 69, 321–376. [Google Scholar] [CrossRef]
- Turner-Skoff, J.B.; Cavender, N. The benefits of trees for livable and sustainable communities. Plants People Planet 2019, 1, 323–335. [Google Scholar] [CrossRef]
- Nyyssönen, A. An international review of forestry and forest products. Unasylva 1962, 16, 5–14. [Google Scholar]
- Toth, C.; Jóźków, G. Remote sensing platforms and sensors: A survey. ISPRS J. Photogramm. Remote Sens. 2016, 115, 22–36. [Google Scholar] [CrossRef]
- Chehreh, B.; Moutinho, A.; Viegas, C. Latest Trends on Tree Classification and Segmentation Using UAV Data—A Review of Agroforestry Applications. Remote Sens. 2023, 15, 2263. [Google Scholar] [CrossRef]
- Cotrozzi, L. Spectroscopic detection of forest diseases: A review (1970–2020). J. For. Res. 2022, 33, 21–38. [Google Scholar] [CrossRef]
- Fassnacht, F.E.; Latifi, H.; Stereńczak, K.; Modzelewska, A.; Lefsky, M.; Waser, L.T.; Straub, C.; Ghosh, A. Review of studies on tree species classification from remotely sensed data. Remote Sens. Environ. 2016, 186, 64–87. [Google Scholar] [CrossRef]
- Tian, L.; Wu, X.; Tao, Y.; Li, M.; Qian, C.; Liao, L.; Fu, W. Review of Remote Sensing-Based Methods for Forest Aboveground Biomass Estimation: Progress, Challenges, and Prospects. Forests 2023, 14, 1086. [Google Scholar] [CrossRef]
- Lu, Y.; Young, S. A survey of public datasets for computer vision tasks in precision agriculture. Comput. Electron. Agric. 2020, 178, 105760. [Google Scholar] [CrossRef]
- Wen, D.; Khan, S.M.; Xu, A.J.; Ibrahim, H.; Smith, L.; Caballero, J.; Zepeda, L.; de Blas Perez, C.; Denniston, A.K.; Liu, X.; et al. Characteristics of publicly available skin cancer image datasets: A systematic review. Lancet Digit. Health 2022, 4, e64–e74. [Google Scholar] [CrossRef] [PubMed]
- Yavanoglu, O.; Aydos, M. A Review on Cyber Security Datasets for Machine Learning Algorithms. In Proceedings of the 2017 IEEE International Conference on Big Data (Big Data), Boston, MA, USA, 11–14 December 2017; pp. 2186–2193. [Google Scholar]
- Ozdarici-ok, A.; Ok, A.O. Using remote sensing to identify individual tree species in orchards: A review. Sci. Hortic. 2023, 321, 112333. [Google Scholar] [CrossRef]
- Ecke, S.; Dempewolf, J.; Frey, J.; Schwaller, A.; Endres, E.; Klemmt, H.J.; Tiede, D.; Seifert, T. UAV-Based Forest Health Monitoring: A Systematic Review. Remote Sens. 2022, 14, 3205. [Google Scholar] [CrossRef]
- Wilkinson, M.D.; Dumontier, M.; Aalbersberg, I.J.; Appleton, G.; Axton, M.; Baak, A.; Blomberg, N.; Boiten, J.W.; da Silva Santos, L.B.; Bourne, P.E.; et al. The FAIR Guiding Principles for scientific data management and stewardship. Sci. Data 2016, 3, 160018. [Google Scholar] [CrossRef]
- GO FAIR Initiative. 2023. Available online: https://www.go-fair.org/ (accessed on 15 November 2023).
- Asa Research Station. UAV—Multispectral Orthomosaic from Nybygget, 2018-05-04. 2021. Available online: https://hdl.handle.net/11676.1/mVvMKy2o4mTDYanOrJXCQVEV (accessed on 18 November 2023).
- Asa Research Station. UAV—Multispectral Point Cloud from Nybygget, 2018-04-25. 2021. Available online: https://hdl.handle.net/11676.1/1PBN9iyK3Y2n2pK3J3nDH2vI (accessed on 18 November 2023).
- Breunig, F.M. Unmanned Aerial Vehicle (UAV) data acquired over an experimental area of the UFSM campus Frederico Westphalen on December 13, 2017 in Rio Grande do Sul, Brazil. 2021. [Google Scholar] [CrossRef]
- Breunig, F.M.; Rieder, E. Unmanned Aerial Vehicle (UAV) data acquired over a subtropical forest area of the UFSM campus Frederico Westphalen, at February 18, 2021, Rio Grande do Sul, Brazil. 2021. [Google Scholar] [CrossRef]
- Chaschatzis, C.; Siniosoglou, I.; Triantafyllou, A.; Karaiskou, C.; Liatifis, A.; Radoglou-Grammatikis, P.; Pliatsios, D.; Kelli, V.; Lagkas, T.; Argyriou, V.; et al. Cherry Tree Disease Detection Dataset. 2022. [Google Scholar] [CrossRef]
- Chaschatzis, C.; Siniosoglou, I.; Triantafyllou, A.; Karaiskou, C.; Liatifis, A.; Radoglou-Grammatikis, P.; Pliatsios, D.; Kelli, V.; Lagkas, T.; Argyriou, V.; et al. Peach Tree Disease Detection Dataset. 2022. [Google Scholar] [CrossRef]
- Gonçalves, V.P.; Ribeiro, E.A.W.; Imai, N.N. Mapping Areas Invaded by Pinus sp. from Geographic Object-Based Image Analysis (GEOBIA) Applied on RPAS (Drone) Color Images. Remote Sens. 2022, 14, 2805. [Google Scholar] [CrossRef]
- Jansen, A.; Nicholson, J.; Esparon, A.; Whiteside, T.; Welch, M.; Tunstill, M.; Paramjyothi, H.; Gadhiraju, V.; van Bodegraven, S.; Bartolo, R. A Deep Learning Dataset for Savanna Tree Species in Northern Australia. 2022. [Google Scholar] [CrossRef]
- Kentsch, S.; Diez, Y. 2 Datasets of Forests for Computer Vision and Deep Learning Techniques. 2020. [Google Scholar] [CrossRef]
- Kleinsmann, J.; Verbesselt, J.; Kooistra, L. Monitoring Individual Tree Phenology in a Multi-Species Forest Using High Resolution UAV Images. Remote Sens. 2023, 15, 3599. [Google Scholar] [CrossRef]
- Morley, P.; Jump, A.; Donoghue, D. Co-Aligned Hyperspectral and LiDAR Data Collected in Drought-Stressed European Beech Forest, Rhön Biosphere Reserve, Germany, 2020. 2023. [Google Scholar] [CrossRef]
- Morley, P.; Jump, A.; Donoghue, D. Unmanned Aerial Vehicle Images of Drought-stressed European Beech Forests in the Rhön Biosphere Reserve, Germany, 2020. 2023. [Google Scholar] [CrossRef]
- Mõttus, M.; Markiet, V.; Hernández-Clemente, R.; Perheentupa, V.; Majasalmi, T. SPYSTUF Hyperspectral Data. 2021. [Google Scholar] [CrossRef]
- Puliti, S.; Pearse, G.; Surový, P.; Wallace, L.; Hollaus, M.; Wielgosz, M.; Astrup, R. FOR-instance: A UAV Laser Scanning Benchmark Dataset for Semantic and Instance Segmentation of Individual Trees. 2023. [Google Scholar] [CrossRef]
- Rajesh, C.B.; Changalagari, M.; Jha, S.; Ramachandran, K.I.; Nidamanuri, R.R. Agriculture_Forest_Hyperspectral_Data. Mendeley Data, V1. 2023. Available online: https://data.mendeley.com/datasets/p7n6ktjdx7/1 (accessed on 18 November 2023).
- Reiersen, G.; Dao, D.; Lütjens, B.; Klemmer, K.; Amara, K.; Zhang, C.; Zhu, X. ReforesTree: A Dataset for Estimating Tropical Forest Carbon Stock with Deep Learning and Aerial Imagery. 2022. [Google Scholar]
- Rock, G.; Mölter, T.; Deffert, P.; Dzunic, F.; Giese, L.; Rommel, E.; Kathoefer, F. mDRONES4rivers-project: UAV-Imagery of the Project Area Kuehkopf Knoblochsaue at the Rhine River, Germany. 2022. [Google Scholar] [CrossRef]
- Rock, G.; Mölter, T.; Deffert, P.; Dzunic, F.; Giese, L.; Rommel, E.; Kathoefer, F. mDRONES4rivers-project: UAV-Imagery of the Project area Nonnenwerth at the Rhine River, Germany. 2022. [Google Scholar] [CrossRef]
- Saarinen, N.; Kankare, V.; Yrttimaa, T.; Viljanen, N.; Honkavaara, E.; Holopainen, M.; Hyyppä, J.; Huuskonen, S.; Hynynen, J.; Vastaranta, M. Detailed Point Cloud Data on Stem Size and Shape of Scots Pine Trees. 2020. [Google Scholar] [CrossRef]
- Schürholz, D.; Castellanos-Galindo, G.A.; Casella, E.; Mejía-Rentería, J.C.; Chennu, A. Seeing the Forest for the Trees: Mapping Cover and Counting Trees from Aerial Images of a Mangrove Forest Using Artificial Intelligence. Remote Sens. 2023, 15, 3334. [Google Scholar] [CrossRef]
- van Geffen, F.; Brieger, F.; Pestryakova, L.A.; Zakharov, E.S.; Herzschuh, U.; Kruse, S. SiDroForest: Synthetic Siberian Larch Tree Crown Dataset of 10.000 Instances in the Microsoft’s Common Objects in Context Dataset (Coco) Format. 2021. [Google Scholar] [CrossRef]
- Varney, N.; Asari, V.K.; Graehling, Q. DALES: A Large-scale Aerial LiDAR Data Set for Semantic Segmentation. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, Seattle, WA, USA, 13–19 June 2020; pp. 186–187. [Google Scholar]
- Weinstein, B. DeepForest—Street Trees Dataset. 2020. [Google Scholar] [CrossRef]
- Yang, R.; Fang, W.; Sun, X.; Jing, X.; Fu, L.; Wei, X.; Li, R. An Aerial Point Cloud Dataset of Apple Tree Detection and Segmentation with Integrating RGB Information and Coordinate Information. 2023. [Google Scholar] [CrossRef]
- Ye, H.; Chen, S.; Guo, A.; Nie, C.; Wang, J. A Dataset of UAV Multispectral Images for Banana Fusarium Wilt Survey. 2023. [Google Scholar] [CrossRef]
- Lefebvre, I.; Laliberté, E. UAV LiDAR, UAV Imagery, Tree Segmentations and Ground Measurements for Estimating Tree Biomass in Canadian (Quebec) Plantations. 2024. [Google Scholar] [CrossRef]
- Ramesh, V.; Ouaknine, A.; Rolnick, D. Forest Monitoring Dataset. 2024. [Google Scholar] [CrossRef]
- Schulz, C.; Ahlswede, S.; Gava, C.; Helber, P.; Bischke, B.; Arias, F.; Förster, M.; Hees, J.; Demir, B.; Kleinschmit, B. TreeSatAI Benchmark Archive for Deep Learning in Forest Applications. 2022. [Google Scholar] [CrossRef]
- Veitch-Michaelis, J.; Cottam, A.; Schweizer, D.; Broadbent, E.N.; Dao, D.; Zhang, C.; Zambrano, A.A.; Max, S. OAM-TCD: A globally diverse dataset of high-resolution tree cover maps. Adv. Neural Inf. Process. Syst. 2024, 37, 49749–49767. [Google Scholar]
- Beloiu Schwenke, M.; Xia, Z.; Novoselova, I.; Gessler, A.; Kattenborn, T.; Mosig, C.; Puliti, S.; Waser, L.; Rehush, N.; Cheng, Y.; et al. TreeAI Global Initiative—Advancing Tree Species Identification from aerial Images with Deep Learning. 2025. [Google Scholar] [CrossRef]
- Zhang, J.; Lei, F.; Fan, X. Parameter-Efficient Fine-Tuning for Individual Tree Crown Detection and Species Classification Using UAV-Acquired Imagery. Remote Sens. 2025, 17, 1272. [Google Scholar] [CrossRef]
- Niedz, R.; Bowman, K.D. UAV Image and Ground Data of Two Citrus ‘Valencia’ orange (Citrus sinensis [L.] Osbeck) Rootstock Trials. 2024. [Google Scholar] [CrossRef]
- Degenhardt, D. UAV-Based LiDAR and Multispectral Point Cloud Data for Reclaimed Wellsites in Alberta, Canada. 2025. Available online: https://github.com/NRCan/TreeAIBox (accessed on 17 July 2025).
- Pacheco-Prado, D.; Bravo-López, E.; Martínez, E.; Ruiz, L.Á. Urban Tree Species Identification Based on Crown RGB Point Clouds Using Random Forest and PointNet. Remote Sens. 2025, 17, 1863. [Google Scholar] [CrossRef]
- Popp, M.R.; Kalwij, J.M. Consumer-grade UAV imagery facilitates semantic segmentation of species-rich savanna tree layers. Sci. Rep. 2023, 13, 13892. [Google Scholar] [CrossRef]
- Allen, M.J.; Moreno-Fernández, D.; Ruiz-Benito, P.; Grieve, S.W.; Lines, E.R. Low-cost tree crown dieback estimation using deep learning-based segmentation. Environ. Data Sci. 2024, 3, e18. [Google Scholar] [CrossRef]
- Junttila, S.; Näsi, R.; Koivumäki, N.; Imangholiloo, M.; Saarinen, N.; Raisio, J.; Holopainen, M.; Hyyppä, H.; Hyyppä, J.; Lyytikäinen-Saarenmaa, P.; et al. Data for Estimating Spruce Tree Health Using Drone-Based RGB and Multispectral Imagery. 2024. [Google Scholar] [CrossRef]
- Gaydon, C.; Roche, F. PureForest: A Large-Scale Aerial Lidar and Aerial Imagery Dataset for Tree Species Classification in Monospecific Forests. arXiv 2024, arXiv:2404.12064. [Google Scholar]
- Mirela, B.; Lucca, H.; Nataliia, R.; Arthur, G.; Verena, G. Tree Species Annotations for Deep Learning. 2023. [Google Scholar] [CrossRef]
- Li, Y.; Qi, H.; Chen, H.; Liang, X.; Zhao, G. Deep Change Monitoring: A Hyperbolic Representative Learning Framework and a Dataset for Long-term Fine-grained Tree Change Detection. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Nashville, TN, USA, 11–15 June 2025; pp. 27346–27356. [Google Scholar]
- Lee, J.J.; Li, B.; Beery, S.; Huang, J.; Fei, S.; Yeh, R.A.; Benes, B. Tree-D Fusion: Simulation-Ready Tree Dataset from Single Images with Diffusion Priors. In Computer Vision—ECCV 2024; Springer: Cham, Switzerland, 2025; pp. 439–460. [Google Scholar] [CrossRef]
- Tupinamba-Simoes, F.; Bravo, F. Virtual Forest Twins Based on Marteloscope Point Cloud Data from Different Forest Ecosystem Types and Its Associated Teaching Workload. 2023. [Google Scholar] [CrossRef]
- Lammoglia, S.K.; Danumah, J.H.; Akpa, Y.L.; Assoua Brou, Y.L.; Kassi, N.J. Dataset of RGB and Multispectral Aerial Images of Cocoa Agroforestry Plots (Divo, Côte d’Ivoire). 2024. [Google Scholar] [CrossRef]
- Jackisch, R. UAV-Based Lidar Point Clouds of Experimental Station Britz, Brandenburg. 2024. [Google Scholar] [CrossRef]
- Pirotti, F. LiDAR and Photogrammetry Point Clouds. 2023. [Google Scholar] [CrossRef]
- Agricultural Research Service. UAV Image and Ground Data of Citrus Bingo Mandarin Hybrid (Citrus reticulata Blanco) Rootstock Trial, Fort Pierce, Florida, USA. 2025. Available online: https://catalog.data.gov/dataset/uav-image-and-ground-data-of-citrus-bingo-mandarin-hybrid-icitrus-ireticulata-blanco-roots (accessed on 17 July 2024).
- Weinstein, B.; Marconi, S.; Zare, A.; Bohlman, S.; Graves, S.; Singh, A.; White, E. NEON Tree Crowns Dataset. 2020. [Google Scholar] [CrossRef]
- Boguszewski, A.; Batorski, D.; Ziemba-Jankowska, N.; Dziedzic, T.; Zambrzycka, A. LandCover.ai: Dataset for Automatic Mapping of Buildings, Woodlands, Water and Roads from Aerial Imagery. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Nashville, TN, USA, 20–25 June 2021; pp. 1102–1110. [Google Scholar]
- Gurgu, M.M.; Queralta, J.P.; Westerlund, T. Vision-Based GNSS-Free Localization for UAVs in the Wild. In Proceedings of the 2022 7th International Conference on Mechanical Engineering and Robotics Research (ICMERR), Krakow, Poland, 9–11 December 2022. [Google Scholar]
- Fonder, M.; Van Droogenbroeck, M. Mid-Air: A Multi-Modal Dataset for Extremely Low Altitude Drone Flights. In Proceedings of the 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), Long Beach, CA, USA, 15–20 June 2019; pp. 553–562. [Google Scholar] [CrossRef]
- TreeDataset. Tree-Top-View Dataset. 2023. Available online: https://universe.roboflow.com/treedataset-clsqo/tree-top-view (accessed on 19 October 2023).
- Kemker, R.; Salvaggio, C.; Kanan, C. Algorithms for semantic segmentation of multispectral remote sensing imagery using deep learning. ISPRS J. Photogramm. Remote Sens. 2018, 145, 60–77. [Google Scholar] [CrossRef]
- Parkan, M.; Junod, P.; Lugrin, R.; Ginzler, C. A Reference Airborne Lidar Dataset For Forest Research. 2018. [Google Scholar] [CrossRef]
- Project. Tree Counting Dataset. 2023. Available online: https://universe.roboflow.com/project-s402o/tree-counting-qiw3h (accessed on 19 October 2023).
- Ammar, A.; Koubaa, A. Aerial Images of Palm Trees. 2023. [Google Scholar] [CrossRef]
- Jones, A. Drone Imagery Mangrove Biomass Datasets. 2019. [Google Scholar] [CrossRef]
- Land Information New Zealand (LINZ). Southland 0.25 m Rural Aerial Photos (2023–2024). 2024. Available online: https://data.linz.govt.nz/layer/114532-southland-025m-rural-aerial-photos-2023-2024/ (accessed on 17 July 2024).
- Land Information New Zealand (LINZ). Auckland 0.25 m Rural Aerial Photos (2024). 2024. Available online: https://data.linz.govt.nz/layer/119795-auckland-025m-rural-aerial-photos-2024/ (accessed on 17 July 2024).
- Land Information New Zealand (LINZ). Gisborne 0.2 m Rural Aerial Photos (2023–2024). 2024. Available online: https://data.linz.govt.nz/layer/118823-gisborne-02m-rural-aerial-photos-2023-2024/ (accessed on 17 July 2024).
- Land Information New Zealand (LINZ). Gisborne 0.1 m Rural Aerial Photos (2023–2024). 2024. Available online: https://data.linz.govt.nz/layer/117730-gisborne-01m-rural-aerial-photos-2023-2024/ (accessed on 17 July 2024).
- Ahmadi, S.A.; Ghorbanian, A.; Golparvar, F.; Mohammadzadeh, A.; Jamali, S. Individual tree detection from unmanned aerial vehicle (UAV) derived point cloud data in a mixed broadleaf forest using hierarchical graph approach. Eur. J. Remote Sens. 2022, 55, 520–539. [Google Scholar] [CrossRef]
- Liu, M. LiDAR dataset of Forest Park. 2020. [Google Scholar] [CrossRef]
- Culman, M.; Delalieux, S.; Van Tricht, K. Individual Palm Tree Detection Using Deep Learning on RGB Imagery to Support Tree Inventory. Remote Sens. 2020, 12, 3476. [Google Scholar] [CrossRef]
- Pan, L. sUAV RGB Dataset. 2024. [Google Scholar] [CrossRef]
- Image AI Development. Forest Analysis Dataset. 2023. Available online: https://universe.roboflow.com/image-ai-development/forest-analysis (accessed on 17 July 2024).
- Arura UAV. UAV Tree Identification—NEW Dataset. 2023. Available online: https://universe.roboflow.com/arura-uav/uav-tree-identification-new (accessed on 19 October 2023).
- Kristensen, T. Mapping Above- and Below-Ground Carbon Pools in Boreal Forests: The Case for Airborne Lidar. Dataset. 2015. [Google Scholar] [CrossRef]
- Timilsina, S.; Aryal, J.; Kirkpatrick, J.B. Mapping Urban Tree Cover Changes Using Object-Based Convolution Neural Network (OB-CNN). Remote Sens. 2020, 12, 3017. [Google Scholar] [CrossRef]
- Guo, X.; Li, H.; Jing, L.; Wang, P. Individual Tree Species Classification Based on Convolutional Neural Networks and Multitemporal High-Resolution Remote Sensing Images. Sensors 2022, 22, 3157. [Google Scholar] [CrossRef]
- Weishaupt, M. Tree Species Classification from Very High-Resolution UAV Images Using Deep Learning: Integrating Approaches Using Individual Tree Crowns for Semantic Segmentation. Master’s Thesis, School of Life Science, Technical University of Munich, Munich, Germany, 20 December 2024. [Google Scholar]
- Yang, M.; Mou, Y.; Liu, S.; Meng, Y.; Liu, Z.; Li, P.; Xiang, W.; Zhou, X.; Peng, C. Detecting and mapping tree crowns based on convolutional neural network and Google Earth images. Int. J. Appl. Earth Obs. Geoinf. 2022, 108, 102764. [Google Scholar] [CrossRef]
- Man, Q.; Yang, X.; Liu, H.; Zhang, B.; Dong, P.; Wu, J.; Liu, C.; Han, C.; Zhou, C.; Tan, Z.; et al. Comparison of UAV-Based LiDAR and Photogrammetric Point Cloud for Individual Tree Species Classification of Urban Areas. Remote Sens. 2025, 17, 1212. [Google Scholar] [CrossRef]
- Gibril, M.B.A.; Shafri, H.Z.M.; Shanableh, A.; Al-Ruzouq, R.; bin Hashim, S.J.; Wayayok, A.; Sachit, M.S. Large-scale assessment of date palm plantations based on UAV remote sensing and multiscale vision transformer. Remote Sens. Appl. Soc. Environ. 2024, 34, 101195. [Google Scholar] [CrossRef]
- Al-Ruzouq, R.; Gibril, M.B.A.; Shanableh, A.; Bolcek, J.; Lamghari, F.; Hammour, N.A.; El-Keblawy, A.; Jena, R. Spectral–Spatial transformer-based semantic segmentation for large-scale mapping of individual date palm trees using very high-resolution satellite data. Ecol. Indic. 2024, 163, 112110. [Google Scholar] [CrossRef]
- Kwong, Q.B.; Kon, Y.T.; Rusik, W.R.W.; Shabudin, M.N.A.; Rahman, S.S.A.; Kulaveerasingam, H.; Appleton, D.R. Enhancing oil palm segmentation model with GAN-based augmentation. J. Big Data 2024, 11, 126. [Google Scholar] [CrossRef]
- Moysiadis, V.; Siniosoglou, I.; Kokkonis, G.; Argyriou, V.; Lagkas, T.; Goudos, S.K.; Sarigiannidis, P. Cherry Tree Crown Extraction Using Machine Learning Based on Images from UAVs. Agriculture 2024, 14, 322. [Google Scholar] [CrossRef]
- He, H.; Zhou, F.; Zhang, Y.; Chen, T.; Wei, Y. GACNet: A Geometric and Attribute Co-Evolutionary Network for Citrus Tree Height Extraction From UAV Photogrammetry-Derived Data. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2025, 18, 6363–6381. [Google Scholar] [CrossRef]
- Rodríguez-Malpica, A.E.L.; Zouaghi, H.; Moshkenani, M.M.; Peng, W. Tree crown prediction of spruce tree using a machine-learning-based hybrid image processing method. Urban For. Urban Green. 2025, 107, 128815. [Google Scholar] [CrossRef]
- Yao, Z.; Chai, G.; Lei, L.; Jia, X.; Zhang, X. Individual Tree Species Identification and Crown Parameters Extraction Based on Mask R-CNN: Assessing the Applicability of Unmanned Aerial Vehicle Optical Images. Remote Sens. 2023, 15, 5164. [Google Scholar] [CrossRef]
- Htun, N.M.; Owari, T.; Tsuyuki, S.; Hiroshima, T. Mapping the Distribution of High-Value Broadleaf Tree Crowns through Unmanned Aerial Vehicle Image Analysis Using Deep Learning. Algorithms 2024, 17, 84. [Google Scholar] [CrossRef]
- Long, Y.; Ye, S.; Wang, L.; Wang, W.; Liao, X.; Jia, S. Scale Pyramid Graph Network for Hyperspectral Individual Tree Segmentation. IEEE Trans. Geosci. Remote Sens. 2024, 62, 1–14. [Google Scholar] [CrossRef]
- Amarasingam, N.; Hele, M.; Alvarez, F.V.; Warfield, A.; Trotter, P.; Granados, R.B.; Gonzalez, L.F. Broad-Leaved Pepper and Pandanus Classification and Mapping Using Drone Imagery—Proof of Concept Trial. In Proceedings of the 2nd Pest Animal and Weed Symposium 2023 (PAWS 2023), Dalby, QLD, Australia, 28–31 August 2023; pp. 196–208. [Google Scholar]
- Hayashi, Y.; Deng, S.; Katoh, M.; Nakamura, R. Individual tree canopy detection and species classification of conifers by deep learning. Jpn. J. For. Plan. 2021, 55, 3–22. [Google Scholar] [CrossRef]
- Ferreira, M.P.; de Almeida, D.R.A.; de Almeida Papa, D.; Minervino, J.B.S.; Veras, H.F.P.; Formighieri, A.; Santos, C.A.N.; Ferreira, M.A.D.; Figueiredo, E.O.; Ferreira, E.J.L. Individual tree detection and species classification of Amazonian palms using UAV images and deep learning. For. Ecol. Manag. 2020, 475, 118397. [Google Scholar] [CrossRef]
- Kouvaras, L.; Petropoulos, G.P. A Novel Technique Based on Machine Learning for Detecting and Segmenting Trees in Very High Resolution Digital Images from Unmanned Aerial Vehicles. Drones 2024, 8, 43. [Google Scholar] [CrossRef]
- Cheng, J.; Zhu, Y.; Zhao, Y.; Li, T.; Chen, M.; Sun, Q.; Gu, Q.; Zhang, X. Application of an improved U-Net with image-to-image translation and transfer learning in peach orchard segmentation. Int. J. Appl. Earth Obs. Geoinf. 2024, 130, 103871. [Google Scholar] [CrossRef]
- Ampatzidis, Y.; Partel, V. UAV-based High Throughput Phenotyping in Citrus Utilizing Multispectral Imaging and Artificial Intelligence. Remote Sens. 2019, 11, 410. [Google Scholar] [CrossRef]
- Hosingholizade, A.; Erfanifard, Y.; Alavipanah, S.K.; Millan, V.E.G.; Mielcarek, M.; Pirasteh, S.; Stereńczak, K. Assessment of Pine Tree Crown Delineation Algorithms on UAV Data: From K-Means Clustering to CNN Segmentation. Forests 2025, 16, 228. [Google Scholar] [CrossRef]
- Ma, J.; Yan, L.; Chen, B.; Zhang, L. A Tree Crown Segmentation Approach for Unmanned Aerial Vehicle Remote Sensing Images on Field Programmable Gate Array (FPGA) Neural Network Accelerator. Sensors 2025, 25, 2729. [Google Scholar] [CrossRef] [PubMed]
- Castro, W.; Avila-George, H.; Nauray, W.; Guerra, R.; Rodriguez, J.; Castro, J.; de-la Torre, M. Deep Classification of Algarrobo Trees in Seasonally Dry Forests of Peru Using Aerial Imagery. IEEE Access 2025, 13, 54960–54975. [Google Scholar] [CrossRef]
- Hao, Z.; Yao, S.; Post, C.J.; Mikhailova, E.A.; Lin, L. Comparative performance of convolutional neural networks for detecting and mapping a young Casuarina equisetifolia L. forest from unmanned aerial vehicle (UAV) imagery. New For. 2025, 56, 40. [Google Scholar] [CrossRef]
- Zhang, Y.; Wang, M.; Mango, J.; Xin, L.; Meng, C.; Li, X. Individual tree detection and counting based on high-resolution imagery and the canopy height model data. Geo-Spat. Inf. Sci. 2024, 27, 2162–2178. [Google Scholar] [CrossRef]
- Ozdarici-Ok, A.; Ok, A.O.; Zeybek, M.; Atesoglu, A. Automated extraction and validation of Stone Pine (Pinus pinea L.) trees from UAV-based digital surface models. Geo-Spat. Inf. Sci. 2024, 27, 142–162. [Google Scholar] [CrossRef]
- Wang, R.; Hu, C.; Han, J.; Hu, X.; Zhao, Y.; Wang, Q.; Sun, H.; Xie, Y. A Hierarchic Method of Individual Tree Canopy Segmentation Combing UAV Image and LiDAR. Arab. J. Sci. Eng. 2025, 50, 7567–7585. [Google Scholar] [CrossRef]
- Dersch, S.; Schöttl, A.; Krzystek, P.; Heurich, M. Towards complete tree crown delineation by instance segmentation with Mask R–CNN and DETR using UAV-based multispectral imagery and lidar data. ISPRS Open J. Photogramm. Remote Sens. 2023, 8, 100037. [Google Scholar] [CrossRef]
- Lv, L.; Li, X.; Mao, F.; Zhou, L.; Xuan, J.; Zhao, Y.; Yu, J.; Song, M.; Huang, L.; Du, H. A Deep Learning Network for Individual Tree Segmentation in UAV Images with a Coupled CSPNet and Attention Mechanism. Remote Sens. 2023, 15, 4420. [Google Scholar] [CrossRef]
- Yang, K.; Ye, Z.; Liu, H.; Su, X.; Yu, C.; Zhang, H.; Lai, R. A new framework for GEOBIA: Accurate individual plant extraction and detection using high-resolution RGB data from UAVs. Int. J. Digit. Earth 2023, 16, 2599–2622. [Google Scholar] [CrossRef]
- Carnegie, A.J.; Eslick, H.; Barber, P.; Nagel, M.; Stone, C. Airborne multispectral imagery and deep learning for biosecurity surveillance of invasive forest pests in urban landscapes. Urban For. Urban Green. 2023, 81, 127859. [Google Scholar] [CrossRef]
- Cheng, Z.; Qi, L.; Cheng, Y.; Wu, Y.; Zhang, H. Interlacing Orchard Canopy Separation and Assessment using UAV Images. Remote Sens. 2020, 12, 767. [Google Scholar] [CrossRef]
- Correa Martins, J.A.; Menezes, G.; Gonçalves, W.; Sant’Ana, D.A.; Osco, L.P.; Liesenberg, V.; Li, J.; Ma, L.; Oliveira, P.T.; Astolfi, G.; et al. Machine learning and SLIC for Tree Canopies Segmentation in Urban Areas. Ecol. Inform. 2021, 66, 101465. [Google Scholar] [CrossRef]
- Dong, X.; Zhang, Z.; Yu, R.; Tian, Q.; Zhu, X. Extraction of Information about Individual Trees from High-Spatial-Resolution UAV-Acquired Images of an Orchard. Remote Sens. 2020, 12, 133. [Google Scholar] [CrossRef]
- Di Gennaro, S.F.; Nati, C.; Dainelli, R.; Pastonchi, L.; Berton, A.; Toscano, P.; Matese, A. An Automatic UAV Based Segmentation Approach for Pruning Biomass Estimation in Irregularly Spaced Chestnut Orchards. Forests 2020, 11, 308. [Google Scholar] [CrossRef]
- Fawcett, D.; Azlan, B.; Hill, T.C.; Kho, L.K.; Bennie, J.; Anderson, K. Unmanned aerial vehicle (UAV) derived structure-from-motion photogrammetry point clouds for oil palm (Elaeis guineensis) canopy segmentation and height estimation. Int. J. Remote Sens. 2019, 40, 7538–7560. [Google Scholar] [CrossRef]
- Gan, Y.; Wang, Q.; Iio, A. Tree Crown Detection and Delineation in a Temperate Deciduous Forest from UAV RGB Imagery Using Deep Learning Approaches: Effects of Spatial Resolution and Species Characteristics. Remote Sens. 2023, 15, 778. [Google Scholar] [CrossRef]
- Ghasemi, M.; Latifi, H.; Pourhashemi, M. A Novel Method for Detecting and Delineating Coppice Trees in UAV Images to Monitor Tree Decline. Remote Sens. 2022, 14, 5910. [Google Scholar] [CrossRef]
- Illana Rico, S.; Martínez Gila, D.M.; Cano Marchal, P.; Gómez Ortega, J. Automatic Detection of Olive Tree Canopies for Groves with Thick Plant Cover on the Ground. Sensors 2022, 22, 6219. [Google Scholar] [CrossRef]
- Ji, Y.; Yan, E.; Yin, X.; Song, Y.; Wei, W.; Mo, D. Automated extraction of Camellia oleifera crown using unmanned aerial vehicle visible images and the ResU-Net deep learning model. Front. Plant Sci. 2022, 13, 958940. [Google Scholar] [CrossRef] [PubMed]
- Kolanuvada, S.R.; Ilango, K.K. Automatic Extraction of Tree Crown for the Estimation of Biomass from UAV Imagery Using Neural Networks. J. Indian Soc. Remote Sens. 2021, 49, 651–658. [Google Scholar] [CrossRef]
- Liang, Y.; Sun, Y.; Kou, W.; Xu, W.; Wang, J.; Wang, Q.; Wang, H.; Lu, N. Rubber Tree Recognition Based on UAV RGB Multi-Angle Imagery and Deep Learning. Drones 2023, 7, 547. [Google Scholar] [CrossRef]
- Ma, K.; Chen, Z.; Fu, L.; Tian, W.; Jiang, F.; Yi, J.; Du, Z.; Sun, H. Performance and Sensitivity of Individual Tree Segmentation Methods for UAV-LiDAR in Multiple Forest Types. Remote Sens. 2022, 14, 298. [Google Scholar] [CrossRef]
- Moradi, F.; Javan, F.D.; Samadzadegan, F. Potential evaluation of visible-thermal UAV image fusion for individual tree detection based on convolutional neural network. Int. J. Appl. Earth Obs. Geoinf. 2022, 113, 103011. [Google Scholar] [CrossRef]
- Parkan, M.J. Combined Use of Airborne Laser Scanning and Hyperspectral Imaging for Forest Inventories. Infoscience, EPFL: Lausanne, Switzerland, 2019. [Google Scholar] [CrossRef]
- Ponce, J.M.; Aquino, A.; Tejada, D.; Al-Hadithi, B.M.; Andújar, J.M. A Methodology for the Automated Delineation of Crop Tree Crowns from UAV-Based Aerial Imagery by Means of Morphological Image Analysis. Agronomy 2022, 12, 43. [Google Scholar] [CrossRef]
- Qi, Y.; Dong, X.; Chen, P.; Lee, K.H.; Lan, Y.; Lu, X.; Jia, R.; Deng, J.; Zhang, Y. Canopy Volume Extraction of Citrus reticulata Blanco cv. Shatangju Trees Using UAV Image-Based Point Cloud Deep Learning. Remote Sens. 2021, 13, 3437. [Google Scholar] [CrossRef]
- Roslan, Z.; Long, Z.A.; Ismail, R. Individual Tree Crown Detection using GAN and RetinaNet on Tropical Forest. In Proceedings of the 2021 15th International Conference on Ubiquitous Information Management and Communication (IMCOM), Seoul, Republic of Korea, 4–6 January 2021; pp. 1–7. [Google Scholar] [CrossRef]
- Șandric, I.; Irimia, R.; Petropoulos, G.P.; Stateras, D.; Kalivas, D.; Pleșoianu, A. Drone Imagery in Support of Orchards Trees Vegetation Assessment Based on Spectral Indices and Deep Learning. In Information and Communication Technologies for Agriculture—Theme I: Sensors; Springer International Publishing: Cham, Switzerland, 2022; pp. 233–248. [Google Scholar] [CrossRef]
- Torres, D.L.; Feitosa, R.Q.; La Rosa, L.E.C.; Happ, P.N.; Marcato, J.; Gonçalves, W.; Martins, J.; Liesenberg, V. Semantic Segmentation of Endangered Tree Species in Brazilian Savanna Using DeepLabv3+ Variants. In Proceedings of the 2020 IEEE Latin American GRSS & ISPRS Remote Sensing Conference (LAGIRS), Santiago, Chile, 22–26 March 2020; pp. 515–520. [Google Scholar] [CrossRef]
- Wu, J.; Yang, G.; Yang, H.; Zhu, Y.; Li, Z.; Lei, L.; Zhao, C. Extracting apple tree crown information from remote imagery using deep learning. Comput. Electron. Agric. 2020, 174, 105504. [Google Scholar] [CrossRef]
- Zaforemska, A.; Xiao, W.; Gaulton, R. Individual Tree Detection from UAV LiDAR Data in a Mixed Species Woodland. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, XLII-2/W13, 657–663. [Google Scholar] [CrossRef]
- Zhang, W.; Chen, X.; Qi, J.; Yang, S. Automatic instance segmentation of orchard canopy in unmanned aerial vehicle imagery using deep learning. Front. Plant Sci. 2022, 13, 1041791. [Google Scholar] [CrossRef]
- Zheng, J.; Yuan, S.; Wu, W.; Li, W.; Yu, L.; Fu, H.; Coomes, D. Surveying coconut trees using high-resolution satellite imagery in remote atolls of the Pacific Ocean. Remote Sens. Environ. 2023, 287, 113485. [Google Scholar] [CrossRef]
- Thapa, N.; Khanal, R.; Bhattarai, B.; Lee, J. Pine Wilt Disease Segmentation with Deep Metric Learning Species Classification for Early-Stage Disease and Potential False Positive Identification. Electronics 2024, 13, 1951. [Google Scholar] [CrossRef]
- Wang, J.; Lin, Q.; Meng, S.; Huang, H.; Liu, Y. Individual Tree-Level Monitoring of Pest Infestation Combining Airborne Thermal Imagery and Light Detection and Ranging. Forests 2024, 15, 112. [Google Scholar] [CrossRef]
- Xiang, Z.; Li, T.; Lv, Y.; Wang, R.; Sun, T.; Gao, Y.; Wu, H. Identification of Damaged Canopies in Farmland Artificial Shelterbelts Based on Fusion of Unmanned Aerial Vehicle LiDAR and Multispectral Features. Forests 2024, 15, 891. [Google Scholar] [CrossRef]
- Ye, X.; Pan, J.; Liu, G.; Shao, F. Exploring the Close-Range Detection of UAV-Based Images on Pine Wilt Disease by an Improved Deep Learning Method. Plant Phenomics 2023, 5, 0129. [Google Scholar] [CrossRef]
- Kavithamani, V.; UmaMaheswari, S. Investigation of deep learning for whitefly identification in coconut tree leaves. Intell. Syst. Appl. 2023, 20, 200290. [Google Scholar] [CrossRef]
- Woolfson, L.S. Detecting Disease in Citrus Trees using Multispectral UAV Data and Deep Learning Algorithm. Master’s Thesis, University of the Witwatersrand, Johannesburg, South Africa, 2024. [Google Scholar]
- Guo, H.; Cheng, Y.; Liu, J.; Wang, Z. Low-cost and precise traditional Chinese medicinal tree pest and disease monitoring using UAV RGB image only. Sci. Rep. 2024, 14, 25562. [Google Scholar] [CrossRef]
- Feng, H.; Li, Q.; Wang, W.; Bashir, A.K.; Singh, A.K.; Xu, J.; Fang, K. Security of target recognition for UAV forestry remote sensing based on multi-source data fusion transformer framework. Inf. Fusion 2024, 112, 102555. [Google Scholar] [CrossRef]
- Liu, W.T.; Xie, Z.R.; Du, J.; Li, Y.H.; Long, Y.B.; Lan, Y.B.; Liu, T.Y.; Sun, S.; Zhao, J. Early detection of pine wilt disease based on UAV reconstructed hyperspectral image. Front. Plant Sci. 2024, 15, 1453761. [Google Scholar] [CrossRef]
- Hu, K.; Liu, J.; Xiao, H.; Zeng, Q.; Liu, J.; Zhang, L.; Li, M.; Wang, Z. A new BWO-based RGB vegetation index and ensemble learning strategy for the pests and diseases monitoring of CCB trees using unmanned aerial vehicle. Front. Plant Sci. 2024, 15, 1464723. [Google Scholar] [CrossRef] [PubMed]
- Li, H.; Tan, B.; Sun, L.; Liu, H.; Zhang, H.; Liu, B. Multi-Source Image Fusion Based Regional Classification Method for Apple Diseases and Pests. Appl. Sci. 2024, 14, 7695. [Google Scholar] [CrossRef]
- Yao, J.; Song, B.; Chen, X.; Zhang, M.; Dong, X.; Liu, H.; Liu, F.; Zhang, L.; Lu, Y.; Xu, C.; et al. Pine-YOLO: A Method for Detecting Pine Wilt Disease in Unmanned Aerial Vehicle Remote Sensing Images. Forests 2024, 15, 737. [Google Scholar] [CrossRef]
- Yu, R.; Liu, Y.; Gao, B.; Ren, L.; Luo, Y. Detection of pine wood nematode infections in Chinese pine (Pinus tabuliformis) using hyperspectral drone images. Pest Manag. Sci. 2025, 81, 5659–5674. [Google Scholar] [CrossRef]
- Bozzini, A.; Huo, L.; Brugnaro, S.; Morgante, G.; Persson, H.J.; Finozzi, V.; Battisti, A.; Faccoli, M. Multispectral drone images for the early detection of bark beetle infestations: Assessment over large forest areas in the Italian South-Eastern Alps. Front. For. Glob. Chang. 2025, 8, 1532954. [Google Scholar] [CrossRef]
- Blekos, K.; Tsakas, A.; Xouris, C.; Evdokidis, I.; Alexandropoulos, D.; Alexakos, C.; Katakis, S.; Makedonas, A.; Theoharatos, C.; Lalos, A. Analysis, Modeling and Multi-Spectral Sensing for the Predictive Management of Verticillium Wilt in Olive Groves. J. Sens. Actuator Netw. 2021, 10, 15. [Google Scholar] [CrossRef]
- Cai, P.; Chen, G.; Yang, H.; Li, X.; Zhu, K.; Wang, T.; Liao, P.; Han, M.; Gong, Y.; Wang, Q.; et al. Detecting Individual Plants Infected with Pine Wilt Disease Using Drones and Satellite Imagery: A Case Study in Xianning, China. Remote Sens. 2023, 15, 2671. [Google Scholar] [CrossRef]
- Gao, B.; Yu, L.; Ren, L.; Zhan, Z.; Luo, Y. Early Detection of Dendroctonus valens Infestation at Tree Level with a Hyperspectral UAV Image. Remote Sens. 2023, 15, 407. [Google Scholar] [CrossRef]
- Hu, G.; Wang, T.; Wan, M.; Bao, W.; Zeng, W. UAV remote sensing monitoring of pine forest diseases based on improved Mask R-CNN. Int. J. Remote Sens. 2022, 43, 1274–1305. [Google Scholar] [CrossRef]
- Izzudin, M.A.; Hamzah, A.; Nisfariza, M.N.; Idris, A.S. Analysis of Multispectral Imagery from Unmanned Aerial Vehicle (UAV) using Object-Based Image Analysis for Detection of Ganoderma Disease in Oil Palm. J. Oil Palm Res. 2020, 32, 497–508. [Google Scholar] [CrossRef]
- Junttila, S.; Näsi, R.; Koivumäki, N.; Imangholiloo, M.; Saarinen, N.; Raisio, J.; Holopainen, M.; Hyyppä, H.; Hyyppä, J.; Lyytikäinen-Saarenmaa, P.; et al. Multispectral Imagery Provides Benefits for Mapping Spruce Tree Decline Due to Bark Beetle Infestation When Acquired Late in the Season. Remote Sens. 2022, 14, 909. [Google Scholar] [CrossRef]
- Lim, W.; Choi, K.; Cho, W.; Chang, B.; Ko, D.W. Efficient dead pine tree detecting method in the Forest damaged by pine wood nematode (Bursaphelenchus xylophilus) through utilizing unmanned aerial vehicles and deep learning-based object detection techniques. For. Sci. Technol. 2022, 18, 36–43. [Google Scholar] [CrossRef]
- Ma, L.; Huang, X.; Hai, Q.; Gang, B.; Tong, S.; Bao, Y.; Dashzebeg, G.; Nanzad, T.; Dorjsuren, A.; Enkhnasan, D.; et al. Model-Based Identification of Larix sibirica Ledeb. Damage Caused by Erannis jacobsoni Djak. Based on UAV Multispectral Features and Machine Learning. Forests 2022, 13, 2104. [Google Scholar] [CrossRef]
- Minařík, R.; Langhammer, J.; Lendzioch, T. Detection of Bark Beetle Disturbance at Tree Level Using UAS Multispectral Imagery and Deep Learning. Remote Sens. 2021, 13, 4768. [Google Scholar] [CrossRef]
- Pan, J.; Lin, J.; Xie, T. Exploring the Potential of UAV-Based Hyperspectral Imagery on Pine Wilt Disease Detection: Influence of Spatio-Temporal Scales. Remote Sens. 2023, 15, 2281. [Google Scholar] [CrossRef]
- Pulakkatu-thodi, I.; Dzurisin, J.; Follett, P. Evaluation of macadamia felted coccid (Hemiptera: Eriococcidae) damage and cultivar susceptibility using imagery from a small unmanned aerial vehicle (sUAV), combined with ground truthing. Pest Manag. Sci. 2022, 78, 4533–4543. [Google Scholar] [CrossRef]
- Sandino, J.; Pegg, G.; Gonzalez, F.; Smith, G. Aerial Mapping of Forests Affected by Pathogens Using UAVs, Hyperspectral Sensors, and Artificial Intelligence. Sensors 2018, 18, 944. [Google Scholar] [CrossRef] [PubMed]
- Trang, N.H. Evaluation And Identification Of Bark Beetle-Induced Forest Degradation Using RGB Acquired UAV Images And Machine Learning Techniques. Ph.D. Thesis, Iwate University, Morioka, Japan, 2022. [Google Scholar] [CrossRef]
- Turkulainen, E. Comparison of Deep Neural Networks in Classification of Spruce Trees Damaged by the Bark Beetle Using UAS RGB, Multi- and Hyperspectral Imagery. Master’s Thesis, School of Science, Aalto University, Helsinki, Finland, 2023. [Google Scholar]
- Wu, D.; Yu, L.; Yu, R.; Zhou, Q.; Li, J.; Zhang, X.; Ren, L.; Luo, Y. Detection of the Monitoring Window for Pine Wilt Disease Using Multi-Temporal UAV-Based Multispectral Imagery and Machine Learning Algorithms. Remote Sens. 2023, 15, 444. [Google Scholar] [CrossRef]
- Wu, Z.; Jiang, X. Extraction of Pine Wilt Disease Regions Using UAV RGB Imagery and Improved Mask R-CNN Models Fused with ConvNeXt. Forests 2023, 14, 1672. [Google Scholar] [CrossRef]
- Yu, R.; Huo, L.; Huang, H.; Yuan, Y.; Gao, B.; Liu, Y.; Yu, L.; Li, H.; Yang, L.; Ren, L.; et al. Early detection of pine wilt disease tree candidates using time-series of spectral signatures. Front. Plant Sci. 2022, 13, 1000093. [Google Scholar] [CrossRef]
- Yu, R.; Ren, L.; Luo, Y. Early detection of pine wilt disease in Pinus tabuliformis in North China using a field portable spectrometer and UAV-based hyperspectral imagery. For. Ecosyst. 2021, 8, 44. [Google Scholar] [CrossRef]
- Yu, R.; Luo, Y.; Li, H.; Yang, L.; Huang, H.; Yu, L.; Ren, L. Three-Dimensional Convolutional Neural Network Model for Early Detection of Pine Wilt Disease Using UAV-Based Hyperspectral Images. Remote Sens. 2021, 13, 4065. [Google Scholar] [CrossRef]
- Yu, R.; Luo, Y.; Zhou, Q.; Zhang, X.; Wu, D.; Ren, L. Early detection of pine wilt disease using deep learning algorithms and UAV-based multispectral imagery. For. Ecol. Manag. 2021, 497, 119493. [Google Scholar] [CrossRef]
- Zhang, S.; Huang, H.; Huang, Y.; Cheng, D.; Huang, J. A GA and SVM Classification Model for Pine Wilt Disease Detection Using UAV-Based Hyperspectral Imagery. Appl. Sci. 2022, 12, 6676. [Google Scholar] [CrossRef]
- Zhou, Y.; Liu, W.; Bi, H.; Chen, R.; Zong, S.; Luo, Y. A Detection Method for Individual Infected Pine Trees with Pine Wilt Disease Based on Deep Learning. Forests 2022, 13, 1880. [Google Scholar] [CrossRef]
- Zhu, X.; Wang, R.; Shi, W.; Yu, Q.; Li, X.; Chen, X. Automatic Detection and Classification of Dead Nematode-Infested Pine Wood in Stages Based on YOLO v4 and GoogLeNet. Forests 2023, 14, 601. [Google Scholar] [CrossRef]
- Bulut, S.; Günlü, A.; Aksoy, H.; Bolat, F.; Sönmez, M.Y. Integration of field measurements with unmanned aerial vehicle to predict forest inventory metrics at tree and stand scales in natural pure Crimean pine forests. Int. J. Remote Sens. 2024, 45, 3846–3870. [Google Scholar] [CrossRef]
- Conti, L.A.; Barcellos, R.L.; Oliveira, P.; Neto, F.C.N.; Cunha-Lignon, M. Geographic Object-Oriented Analysis of UAV Multispectral Images for Tree Distribution Mapping in Mangroves. Remote Sens. 2025, 17, 1500. [Google Scholar] [CrossRef]
- Zhong, H.; Zhang, Z.; Liu, H.; Wu, J.; Lin, W. Individual Tree Species Identification for Complex Coniferous and Broad-Leaved Mixed Forests Based on Deep Learning Combined with UAV LiDAR Data and RGB Images. Forests 2024, 15, 293. [Google Scholar] [CrossRef]
- Chen, C.; Jing, L.; Li, H.; Tang, Y.; Chen, F.; Tan, B. Using time-series imagery and 3DLSTM model to classify individual tree species. Int. J. Digit. Earth 2024, 17, 2308728. [Google Scholar] [CrossRef]
- Ou, J.; Tian, Y.; Zhang, Q.; Xie, X.; Zhang, Y.; Tao, J.; Lin, J. Coupling UAV Hyperspectral and LiDAR Data for Mangrove Classification Using XGBoost in China’s Pinglu Canal Estuary. Forests 2023, 14, 1838. [Google Scholar] [CrossRef]
- Miao, S.; Zhang, K.F.; Zeng, H.; Liu, J. AI-Based Tree Species Classification Using Pseudo Tree Crown Derived From UAV Imagery. 2024. [Google Scholar] [CrossRef]
- Kulhandjian, H.; Irineo, B.; Sales, J.; Kulhandjian, M. Low-Cost Tree Health Categorization and Localization Using Drones and Machine Learning. In Proceedings of the 2024 International Conference on Computing, Networking and Communications (ICNC), Big Island, HI, USA, 19–22 February 2024; pp. 296–300. [Google Scholar] [CrossRef]
- Dietenberger, S.; Mueller, M.M.; Stöcker, B.; Dubois, C.; Arlaud, H.; Adam, M.; Hese, S.; Meyer, H.; Thiel, C. Accurate Mapping of Downed Deadwood in a Dense Deciduous Forest Using UAV-SfM Data and Deep Learning. Remote Sens. 2025, 17, 1610. [Google Scholar] [CrossRef]
- Xiong, Y.; Zeng, X.; Lai, W.; Liao, J.; Chen, Y.; Zhu, M.; Huang, K. Detecting and Mapping Individual Fruit Trees in Complex Natural Environments via UAV Remote Sensing and Optimized YOLOv5. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2024, 17, 7554–7576. [Google Scholar] [CrossRef]
- Sánchez-Vega, J.A.; Silva-López, J.O.; Lopez, R.S.; Medina-Medina, A.J.; Tuesta-Trauco, K.M.; Rivera-Fernandez, A.S.; Silva-Melendez, T.B.; Oliva-Cruz, M.; Barboza, E.; da Silva Junior, C.A.; et al. Automatic Detection of Ceroxylon Palms by Deep Learning in a Protected Area in Amazonas (NW Peru). Forests 2025, 16, 1061. [Google Scholar] [CrossRef]
- Jarahizadeh, S.; Salehi, B. Deep Learning Analysis of UAV Lidar Point Cloud for Individual Tree Detecting. In Proceedings of the IGARSS 2024—2024 IEEE International Geoscience and Remote Sensing Symposium, Athens, Greece, 7–12 July 2024; pp. 9953–9956. [Google Scholar] [CrossRef]
- Ameslek, O.; Zahir, H.; Mitro, S.; Bachaoui, E.M. Identification and Mapping of Individual Trees from Unmanned Aerial Vehicle Imagery Using an Object-Based Convolutional Neural Network. Remote Sens. Earth Syst. Sci. 2024, 7, 172–182. [Google Scholar] [CrossRef]
- Zhang, Z.; Li, Y.; Cao, Y.; Wang, Y.; Guo, X.; Hao, X. MTSC-Net: A Semi-Supervised Counting Network for Estimating the Number of Slash Pine New Shoots. Plant Phenomics 2024, 6, 0228. [Google Scholar] [CrossRef]
- Lv, L.; Zhao, Y.; Li, X.; Yu, J.; Song, M.; Huang, L.; Mao, F.; Du, H. UAV-Based Intelligent Detection of Individual Trees in Moso Bamboo Forests With Complex Canopy Structure. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2024, 17, 11915–11930. [Google Scholar] [CrossRef]
- Persson, D. Tree Crown Detection Using Machine Learning: A Study on Using the DeepForest Deep-Learning Model for Tree Crown Detection in Drone-Captured Aerial Footage; KTH Royal Institute of Technology: Stockholm, Sweden, 2024. [Google Scholar]
- Zhang, S.; Cui, Y.; Zhou, Y.; Dong, J.; Li, W.; Liu, B.; Dong, J. A Mapping Approach for Eucalyptus Plantations Canopy and Single Tree Using High-Resolution Satellite Images in Liuzhou, China. IEEE Trans. Geosci. Remote Sens. 2023, 61, 1–13. [Google Scholar] [CrossRef]
- Arce, L.S.D.; Osco, L.P.; dos Santos de Arruda, M.; Furuya, D.E.G.; Ramos, A.P.M.; Aoki, C.; Pott, A.; Fatholahi, S.; Li, J.; de Araújo, F.F.; et al. Mauritia flexuosa palm trees airborne mapping with deep convolutional neural network. Sci. Rep. 2021, 11, 19619. [Google Scholar] [CrossRef] [PubMed]
- Bennett, L.; Wilson, B.; Selland, S.; Qian, L.; Wood, M.; Zhao, H.; Boisvert, J. Image to attribute model for trees (ITAM-T): Individual tree detection and classification in Alberta boreal forest for wildland fire fuel characterization. Int. J. Remote Sens. 2022, 43, 1848–1880. [Google Scholar] [CrossRef]
- Cabrera-Ariza, A.M.; Lara-Gómez, M.A.; Santelices-Moya, R.E.; Meroño de Larriva, J.E.; Mesas-Carrascosa, F.J. Individualization of Pinus radiata Canopy from 3D UAV Dense Point Clouds Using Color Vegetation Indices. Sensors 2022, 22, 1331. [Google Scholar] [CrossRef] [PubMed]
- Csillik, O.; Cherbini, J.; Johnson, R.; Lyons, A.; Kelly, M. Identification of Citrus Trees from Unmanned Aerial Vehicle Imagery Using Convolutional Neural Networks. Drones 2018, 2, 39. [Google Scholar] [CrossRef]
- Han, P.; Ma, C.; Chen, J.; Chen, L.; Bu, S.; Xu, S.; Zhao, Y.; Zhang, C.; Hagino, T. Fast Tree Detection and Counting on UAVs for Sequential Aerial Images with Generating Orthophoto Mosaicing. Remote Sens. 2022, 14, 4113. [Google Scholar] [CrossRef]
- Kestur, R.; Kulkarni, A.; Bhaskar, R.; Sreenivasa, P.; Sri, D.D.; Choudhary, A.; Prabhu, B.V.B.; Anand, G.; Narasipura, O. MangoGAN: A general adversarial network-based deep learning architecture for mango tree crown detection. J. Appl. Remote Sens. 2022, 16, 014527. [Google Scholar] [CrossRef]
- La Rosa, L.E.C.; Zortea, M.; Gemignani, B.H.; Oliveira, D.A.B.; Feitosa, R.Q. FCRN-Based Multi-Task Learning for Automatic Citrus Tree Detection From UAV Images. In Proceedings of the 2020 IEEE Latin American GRSS & ISPRS Remote Sensing Conference (LAGIRS), Santiago, Chile, 22–26 March 2020; pp. 403–408. [Google Scholar] [CrossRef]
- Luo, M.; Tian, Y.; Zhang, S.; Huang, L.; Wang, H.; Liu, Z.; Yang, L. Individual Tree Detection in Coal Mine Afforestation Area Based on Improved Faster RCNN in UAV RGB Images. Remote Sens. 2022, 14, 5545. [Google Scholar] [CrossRef]
- Lou, X.; Huang, Y.; Fang, L.; Huang, S.; Gao, H.; Yang, L.; Weng, Y.; Hung, I.K. Measuring loblolly pine crowns with drone imagery through deep learning. J. For. Res. 2022, 33, 227–238. [Google Scholar] [CrossRef]
- Miraki, M.; Sohrabi, H.; Fatehi, P. Citrus trees identification and trees stress detection based on spectral data derived from UAVs. Res. Hortic. Sci. 2022, 1, 27–40. [Google Scholar] [CrossRef]
- Miyoshi, G.T.; Arruda, M.d.S.; Osco, L.P.; Marcato Junior, J.; Gonçalves, D.N.; Imai, N.N.; Tommaselli, A.M.G.; Honkavaara, E.; Gonçalves, W.N. A Novel Deep Learning Method to Identify Single Tree Species in UAV-Based Hyperspectral Images. Remote Sens. 2020, 12, 1294. [Google Scholar] [CrossRef]
- Mohan, M.; Leite, R.V.; Broadbent, E.N.; Jaafar, W.S.W.M.; Srinivasan, S.; Bajaj, S.; Corte, A.P.D.; do Amaral, C.H.; Gopan, G.; Saad, S.N.M.; et al. Individual tree detection using UAV-lidar and UAV-SfM data: A tutorial for beginners. Open Geosci. 2021, 13, 1028–1039. [Google Scholar] [CrossRef]
- Osco, L.P.; dos Santos de Arruda, M.; Marcato Junior, J.; da Silva, N.B.; Ramos, A.P.M.; Moryia, É.A.S.; Imai, N.N.; Pereira, D.R.; Creste, J.E.; Matsubara, E.T.; et al. A convolutional neural network approach for counting and geolocating citrus-trees in UAV multispectral imagery. ISPRS J. Photogramm. Remote Sens. 2020, 160, 97–106. [Google Scholar] [CrossRef]
- Salamí, E.; Gallardo, A.; Skorobogatov, G.; Barrado, C. On-the-Fly Olive Tree Counting Using a UAS and Cloud Services. Remote Sens. 2019, 11, 316. [Google Scholar] [CrossRef]
- Wu, Z.; Jiang, M.; Li, H.; Shen, Y.; Song, J.; Zhong, X.; Ye, Z. Urban carbon stock estimation based on deep learning and UAV remote sensing: A case study in Southern China. All Earth 2023, 35, 272–286. [Google Scholar] [CrossRef]
- Xia, K.; Wang, H.; Yang, Y.; Du, X.; Feng, H. Automatic Detection and Parameter Estimation of Ginkgo biloba in Urban Environment Based on RGB Images. J. Sens. 2021, 2021, 1–12. [Google Scholar] [CrossRef]
- Zhang, L.; Lin, H.; Wang, F. Individual Tree Detection Based on High-Resolution RGB Images for Urban Forestry Applications. IEEE Access 2022, 10, 46589–46598. [Google Scholar] [CrossRef]
- Akinbiola, S.; Salami, A.T.; Awotoye, O.O.; Popoola, S.O.; Olusola, J.A. Application of UAV photogrammetry for the assessment of forest structure and species network in the tropical forests of Southern Nigeria. Geocarto Int. 2023, 38, 2190621. [Google Scholar] [CrossRef]
- Duan, X.; Chang, M.; Wu, G.; Situ, S.; Zhu, S.; Zhang, Q.; Huangfu, Y.; Wang, W.; Chen, W.; Yuan, B.; et al. Estimation of biogenic volatile organic compound (BVOC) emissions in forest ecosystems using drone-based lidar, photogrammetry, and image recognition technologies. Atmos. Meas. Tech. 2024, 17, 4065–4079. [Google Scholar] [CrossRef]
- Gyawali, A.; Aalto, M.; Ranta, T. Tree Species Detection and Enhancing Semantic Segmentation Using Machine Learning Models with Integrated Multispectral Channels from PlanetScope and Digital Aerial Photogrammetry in Young Boreal Forest. Remote Sens. 2025, 17, 1811. [Google Scholar] [CrossRef]
- Jo, W.K.; Park, J.H. High-Accuracy Tree Type Classification in Urban Forests Using Drone-Based RGB Imagery and Optimized SVM. Korean J. Remote Sens. 2025, 41, 209–223. [Google Scholar] [CrossRef]
- Gahrouei, O.R.; Côté, J.F.; Bournival, P.; Giguère, P.; Béland, M. Comparison of Deep and Machine Learning Approaches for Quebec Tree Species Classification Using a Combination of Multispectral and LiDAR Data. Can. J. Remote Sens. 2024, 50, 2359433. [Google Scholar] [CrossRef]
- Hooshyar, M.; Li, Y.S.; Chun Tang, W.; Chen, L.W.; Huang, Y.M. Economic Fruit Trees Recognition in Hillsides: A CNN-Based Approach Using Enhanced UAV Imagery. IEEE Access 2024, 12, 61991–62005. [Google Scholar] [CrossRef]
- Miao, S.; Zhang, K.F.; Zeng, H.; Liu, J. Improving Artificial-Intelligence-Based Individual Tree Species Classification Using Pseudo Tree Crown Derived from Unmanned Aerial Vehicle Imagery. Remote Sens. 2024, 16, 1849. [Google Scholar] [CrossRef]
- Yang, Y.; Meng, Z.; Zu, J.; Cai, W.; Wang, J.; Su, H.; Yang, J. Fine-Scale Mangrove Species Classification Based on UAV Multispectral and Hyperspectral Remote Sensing Using Machine Learning. Remote Sens. 2024, 16, 3093. [Google Scholar] [CrossRef]
- Pierdicca, R.; Nepi, L.; Mancini, A.; Malinverni, E.S.; Balestra, M. UAV4TREE: Deep Learning-Based System for Automatic Classification of Tree Species Using RGB Optical Images Obtained by an Unmanned Aerial Vehicle. ISPRS Ann. Photogramm. Remote Sens. Spatial Inf. Sci. 2023, X-1/W1-2023, 1089–1096. [Google Scholar] [CrossRef]
- Xu, S.; Wang, R.; Shi, W.; Wang, X. Classification of Tree Species in Transmission Line Corridors Based on YOLO v7. Forests 2023, 15, 61. [Google Scholar] [CrossRef]
- Chen, X.; Shen, X.; Cao, L. Tree Species Classification in Subtropical Natural Forests Using High-Resolution UAV RGB and SuperView-1 Multispectral Imageries Based on Deep Learning Network Approaches: A Case Study within the Baima Snow Mountain National Nature Reserve, China. Remote Sens. 2023, 15, 2697. [Google Scholar] [CrossRef]
- Li, Y.F.; Xu, Z.H.; Hao, Z.B.; Yao, X.; Zhang, Q.; Huang, X.Y.; Li, B.; He, A.Q.; Li, Z.L.; Guo, X.Y. A comparative study of the performances of joint RFE with machine learning algorithms for extracting Moso bamboo (Phyllostachys pubescens) forest based on UAV hyperspectral images. Geocarto Int. 2023, 38, 2207550. [Google Scholar] [CrossRef]
- Imangholiloo, M.; Luoma, V.; Holopainen, M.; Vastaranta, M.; Mäkeläinen, A.; Koivumäki, N.; Honkavaara, E.; Khoramshahi, E. A New Approach for Feeding Multispectral Imagery into Convolutional Neural Networks Improved Classification of Seedlings. Remote Sens. 2023, 15, 5233. [Google Scholar] [CrossRef]
- Egli, S.; Höpke, M. CNN-Based Tree Species Classification Using High Resolution RGB Image Data from Automated UAV Observations. Remote Sens. 2020, 12, 3892. [Google Scholar] [CrossRef]
- Huang, Y.; Wen, X.; Gao, Y.; Zhang, Y.; Lin, G. Tree Species Classification in UAV Remote Sensing Images Based on Super-Resolution Reconstruction and Deep Learning. Remote Sens. 2023, 15, 2942. [Google Scholar] [CrossRef]
- Martins, G.B.; La Rosa, L.E.C.; Happ, P.N.; Filho, L.C.T.C.; Santos, C.J.F.; Feitosa, R.Q.; Ferreira, M.P. Deep learning-based tree species mapping in a highly diverse tropical urban setting. Urban For. Urban Green. 2021, 64, 127241. [Google Scholar] [CrossRef]
- Moura, M.M.; de Oliveira, L.E.S.; Sanquetta, C.R.; Bastos, A.; Mohan, M.; Corte, A.P.D. Towards Amazon Forest Restoration: Automatic Detection of Species from UAV Imagery. Remote Sens. 2021, 13, 2627. [Google Scholar] [CrossRef]
- Quan, Y.; Li, M.; Hao, Y.; Liu, J.; Wang, B. Tree species classification in a typical natural secondary forest using UAV-borne LiDAR and hyperspectral data. GIScience Remote Sens. 2023, 60, 2171706. [Google Scholar] [CrossRef]
- Shi, W.; Liao, X.; Sun, J.; Zhang, Z.; Wang, D.; Wang, S.; Qu, W.; He, H.; Ye, H.; Yue, H.; et al. Optimizing Observation Plans for Identifying Faxon Fir (Abies fargesii var. Faxoniana) Using Monthly Unmanned Aerial Vehicle Imagery. Remote Sens. 2023, 15, 2205. [Google Scholar] [CrossRef]
- Koreň, M.; Scheer, Ľ.; Sedmák, R.; Fabrika, M. Evaluation of tree stump measurement methods for estimating diameter at breast height and tree height. Int. J. Appl. Earth Obs. Geoinf. 2024, 129, 103828. [Google Scholar] [CrossRef]
- Albuquerque, R.W.; Matsumoto, M.H.; Calmon, M.; Ferreira, M.E.; Vieira, D.L.M.; Grohmann, C.H. A protocol for canopy cover monitoring on forest restoration projects using low-cost drones. Open Geosci. 2022, 14, 921–929. [Google Scholar] [CrossRef]
- Nasiri, V.; Darvishsefat, A.A.; Arefi, H.; Griess, V.C.; Sadeghi, S.M.M.; Borz, S.A. Modeling Forest Canopy Cover: A Synergistic Use of Sentinel-2, Aerial Photogrammetry Data, and Machine Learning. Remote Sens. 2022, 14, 1453. [Google Scholar] [CrossRef]
- Solvin, T.M.; Puliti, S.; Steffenrem, A. Use of UAV photogrammetric data in forest genetic trials: Measuring tree height, growth, and phenology in Norway spruce (Picea abies L. Karst.). Scand. J. For. Res. 2020, 35, 322–333. [Google Scholar] [CrossRef]
- Vivar-Vivar, E.D.; Pompa-García, M.; Martínez-Rivas, J.A.; Mora-Tembre, L.A. UAV-Based Characterization of Tree-Attributes and Multispectral Indices in an Uneven-Aged Mixed Conifer-Broadleaf Forest. Remote Sens. 2022, 14, 2775. [Google Scholar] [CrossRef]
- Vasilakos, C.; Verykios, V.S. Burned Olive Trees Identification with a Deep Learning Approach in Unmanned Aerial Vehicle Images. Remote Sens. 2024, 16, 4531. [Google Scholar] [CrossRef]
- Leidemer, T.; Caceres, M.L.L.; Diez, Y.; Ferracini, C.; Tsou, C.Y.; Katahira, M. Evaluation of Temporal Trends in Forest Health Status Using Precise Remote Sensing. Drones 2025, 9, 337. [Google Scholar] [CrossRef]
- Joshi, D.; Witharana, C. Vision Transformer-Based Unhealthy Tree Crown Detection in Mixed Northeastern US Forests and Evaluation of Annotation Uncertainty. Remote Sens. 2025, 17, 1066. [Google Scholar] [CrossRef]
- Tahrupath, K.; Guttila, J. Detecting Weligama Coconut Leaf Wilt Disease in Coconut Using UAV-Based Multispectral Imaging and Object-Based Classification. 2025. [Google Scholar] [CrossRef]
- Kwang, C.S.; Razak, S.F.A.; Yogarayan, S.; Zahisham, M.Z.A.; Tam, T.H.; Noor, M.K.A.M.; Abidin, H. Ganoderma Disease in Oil Palm Trees Using Hyperspectral Imaging and Machine Learning. J. Hum. Earth Future 2025, 6, 67–83. [Google Scholar] [CrossRef]
- Afsar, M.M.; Iqbal, M.S.; Bakhshi, A.D.; Hussain, E.; Iqbal, J. MangiSpectra: A Multivariate Phenological Analysis Framework Leveraging UAV Imagery and LSTM for Tree Health and Yield Estimation in Mango Orchards. Remote Sens. 2025, 17, 703. [Google Scholar] [CrossRef]
- Yin, D.; Cai, Y.; Li, Y.; Yuan, W.; Zhao, Z. Assessment of the Health Status of Old Trees of Platycladus orientalis L. Using UAV Multispectral Imagery. Drones 2024, 8, 91. [Google Scholar] [CrossRef]
- Yuan, B.K.K.; Ling, L.S.; Avtar, R. Oil palm tree health status detection using canopy height model and NDVI. AIP Conf. Proc. 2024, 3240, 020012. [Google Scholar] [CrossRef]
- Anwander, J.; Brandmeier, M.; Paczkowski, S.; Neubert, T.; Paczkowska, M. Evaluating Different Deep Learning Approaches for Tree Health Classification Using High-Resolution Multispectral UAV Data in the Black Forest, Harz Region, and Göttinger Forest. Remote Sens. 2024, 16, 561. [Google Scholar] [CrossRef]
- Johansen, K.; Duan, Q.; Tu, Y.H.; Searle, C.; Wu, D.; Phinn, S.; Robson, A.; McCabe, M.F. Mapping the condition of macadamia tree crops using multi-spectral UAV and WorldView-3 imagery. ISPRS J. Photogramm. Remote Sens. 2020, 165, 28–40. [Google Scholar] [CrossRef]
- Kahsay, A.G. Comparison of Thermal Infrared and Multispectral UAV Imagery for Detecting Pine Trees (Pinus brutia) Health Status in Lefka Ori National Park in West Crete, Greece. Master’s Thesis, University of Twente, Enschede, The Netherlands, 2022. [Google Scholar]
- Hao, X.; Cao, Y.; Zhang, Z.; Tomasetto, F.; Yan, W.; Xu, C.; Luan, Q.; Li, Y. CountShoots: Automatic Detection and Counting of Slash Pine New Shoots Using UAV Imagery. Plant Phenomics 2023, 5, 0065. [Google Scholar] [CrossRef]
- Qiu, S.; Zhu, X.; Zhang, Q.; Tao, X.; Zhou, K. Enhanced Estimation of Crown-Level Leaf Dry Biomass of Ginkgo Saplings Based on Multi-Height UAV Imagery and Digital Aerial Photogrammetry Point Cloud Data. Forests 2024, 15, 1720. [Google Scholar] [CrossRef]
- Juan-Ovejero, R.; Elghouat, A.; Navarro, C.J.; Reyes-Martín, M.P.; Jiménez, M.N.; Navarro, F.B.; Alcaraz-Segura, D.; Castro, J. Estimation of aboveground biomass and carbon stocks of Quercus ilex L. saplings using UAV-derived RGB imagery. Ann. For. Sci. 2023, 80, 44. [Google Scholar] [CrossRef]
- Goswami, A.; Khati, U.; Goyal, I.; Sabir, A.; Jain, S. Automated Stock Volume Estimation Using UAV-RGB Imagery. Sensors 2024, 24, 7559. [Google Scholar] [CrossRef]
- Safonova, A.; Guirado, E.; Maglinets, Y.; Alcaraz-Segura, D.; Tabik, S. Olive Tree Biovolume from UAV Multi-Resolution Image Segmentation with Mask R-CNN. Sensors 2021, 21, 1617. [Google Scholar] [CrossRef] [PubMed]
- Thanh, H.N.; Le, T.H.; Van, D.N.; Thuy, Q.H.; Kim, A.N.; Duc, H.L.; Van, D.P.; Trong, T.P.; Van, P.T. Forest cover change mapping based on Deep Neuron Network, GIS, and High-resolution Imagery. Vietnam. J. Earth Sci. 2025, 47, 151–175. [Google Scholar] [CrossRef]
- Carneiro, G.A.; Santos, J.; Sousa, J.J.; Cunha, A.; Pádua, L. Chestnut Burr Segmentation for Yield Estimation Using UAV-Based Imagery and Deep Learning. Drones 2024, 8, 541. [Google Scholar] [CrossRef]
- Wang, F.; Song, L.; Liu, X.; Zhong, S.; Wang, J.; Zhang, Y.; Wu, Y. Forest stand spectrum reconstruction using spectrum spatial feature gathering and multilayer perceptron. Front. Plant Sci. 2023, 14, 1223366. [Google Scholar] [CrossRef] [PubMed]
- Jayathunga, S.; Pearse, G.D.; Watt, M.S. Unsupervised Methodology for Large-Scale Tree Seedling Mapping in Diverse Forestry Settings Using UAV-Based RGB Imagery. Remote Sens. 2023, 15, 5276. [Google Scholar] [CrossRef]
- Sos, J.; Penglase, K.; Lewis, T.; Srivastava, P.K.; Singh, H.; Srivastava, S.K. Mapping and monitoring of vegetation regeneration and fuel under major transmission power lines through image and photogrammetric analysis of drone-derived data. Geocarto Int. 2023, 38, 2280597. [Google Scholar] [CrossRef]
- Hobart, M.; Pflanz, M.; Tsoulias, N.; Weltzien, C.; Kopetzky, M.; Schirrmann, M. Fruit Detection and Yield Mass Estimation from a UAV Based RGB Dense Cloud for an Apple Orchard. Drones 2025, 9, 60. [Google Scholar] [CrossRef]
- Oswald, D.; Pourreza, A.; Chakraborty, M.; Khalsa, S.D.S.; Brown, P.H. 3D radiative transfer modeling of almond canopy for nitrogen estimation by hyperspectral imaging. Precis. Agric. 2025, 26, 12. [Google Scholar] [CrossRef]
- Chen, H.; Cao, J.; An, J.; Xu, Y.; Bai, X.; Xu, D.; Li, W. Research on Walnut (Juglans regia L.) Yield Prediction Based on a Walnut Orchard Point Cloud Model. Agriculture 2025, 15, 775. [Google Scholar] [CrossRef]
- Lu, X.; Li, W.; Xiao, J.; Zhu, H.; Yang, D.; Yang, J.; Xu, X.; Lan, Y.; Zhang, Y. Inversion of Leaf Area Index in Citrus Trees Based on Multi-Modal Data Fusion from UAV Platform. Remote Sens. 2023, 15, 3523. [Google Scholar] [CrossRef]
- Paul, S.; Poliyapram, V.; İmamoğlu, N.; Uto, K.; Nakamura, R.; Kumar, D.N. Canopy Averaged Chlorophyll Content Prediction of Pear Trees Using Convolutional Autoencoder on Hyperspectral Data. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2020, 13, 1426–1437. [Google Scholar] [CrossRef]
Aspect | Keywords |
---|---|
Context | Tree, Forest, Forestry |
Source | UAV, Unmanned Aerial Vehicle, UAS, Unmanned Aerial System, Aerial |
Sensor | RGB, Multispectral, Hyperspectral, LiDAR |
Purpose | Artificial Intelligence, Machine Learning, Deep Learning, Segmentation, Classification |
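The aspects above combine so that synonyms within an aspect broaden the search while the aspects jointly narrow it. A minimal sketch of building such a boolean query string, assuming OR within each aspect and AND across aspects (the exact query syntax the survey used in its literature databases is not reproduced here; the function name is illustrative):

```python
# Keyword aspects from the table above.
KEYWORDS = {
    "Context": ["Tree", "Forest", "Forestry"],
    "Source": ["UAV", "Unmanned Aerial Vehicle", "UAS",
               "Unmanned Aerial System", "Aerial"],
    "Sensor": ["RGB", "Multispectral", "Hyperspectral", "LiDAR"],
    "Purpose": ["Artificial Intelligence", "Machine Learning",
                "Deep Learning", "Segmentation", "Classification"],
}

def build_query(keywords: dict) -> str:
    """OR together synonyms within an aspect, AND across aspects."""
    clauses = []
    for terms in keywords.values():
        # Quote multi-word terms so they are matched as phrases.
        quoted = [f'"{t}"' if " " in t else t for t in terms]
        clauses.append("(" + " OR ".join(quoted) + ")")
    return " AND ".join(clauses)

print(build_query(KEYWORDS))
```

Any record must therefore match at least one term from every aspect, which keeps the result set focused on aerial tree datasets rather than, say, terrestrial forestry surveys.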
Category | Definition | Scoring Logic
---|---|---
F1 | (Meta)data are assigned globally unique and persistent identifiers | Score 0: no DOI or URL (no unique identifier); Score 0.5: either a DOI or a URL; Score 1: recognized and established source
F2 | Data are described with rich metadata (defined by R1 below) | Score 0: R1 is nonexistent; Score 1: R1 exhibits a non-zero value
F3 | Metadata clearly and explicitly include the identifier of the data they describe | Score 0: metadata lack a distinct reference to the identifier; Score 1: metadata explicitly include the identifier
F4 | (Meta)data are registered or indexed in a searchable resource | Score 1: assigned universally, since dataset existence implies indexing
A1.1 | The protocol is open, free, and universally implementable | Score 1: assigned universally, due to accessibility via standard internet protocols
A1.2 | The protocol allows for an authentication and authorization procedure | Score 1: assigned universally, recognizing protocol capability to support authentication and authorization procedures
A2 | Metadata are accessible, even when the data are no longer available | Score 0: URL only, which may be unreliable; Score 1: DOI, ensuring persistent access
I1 | (Meta)data use a formal, accessible, shared, and broadly applicable language | Score 0: no identifiable metadata; Score 1: metadata exist in any format (e.g., webpage)
I2 | (Meta)data use vocabularies that follow FAIR principles | Score 0: no metadata or vocabularies; Score 0.5: metadata exist but lack standard vocabularies; Score 1: full use of accepted vocabularies
I3 | (Meta)data include qualified references to other (meta)data | Score 0: no references to source datasets; Score 1: original work requiring no references
R1.1 | (Meta)data are released with a clear and accessible data usage license | Score 0: no or unclear license; Score 1: clear and explicit license
R1.2 | (Meta)data are associated with detailed provenance | Score 0: provenance unclear or missing; Score 0.5: vague or incomplete provenance; Score 1: clear and complete documentation
R1.3 | (Meta)data meet domain-relevant community standards | Score 0: unclear or non-standard metadata; Score 0.5: weak metadata; Score 1: structured and descriptive metadata
Dataset | F1 | F2 | F3 | F4 | A1.1 | A1.2 | A2 | I1 | I2 | I3 | R1.1 | R1.2 | R1.3 | FAIR Score
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
[18] [19] [20] [21] [22] [23] [24] [25] [26] [27] [28] [29] [30] [31] [32] [33] [34] [35] [36] [37] [38] [39] [40] [41] [42] [43] [44] [45] [46] [47] [48] [49] [50] [51] [52] [53] [54] [55] [56] [57] [58] [59] [60] [61] [62] | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 4.00 |
[63] | 0.5 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 3.88 |
[64] | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0.5 | 1 | 1 | 1 | 3.83 |
[65] [66] [67] | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 3.75 |
[68] | 0.5 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0.5 | 1 | 1 | 1 | 1 | 3.70 |
[69] [70] | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 1 | 1 | 3.66 |
[71] | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0.5 | 1 | 1 | 0 | 1 | 3.50 |
[72] | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 3.33 |
[73] | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 1 | 0.5 | 1 | 3.25 |
[74] [75] [76] [77] | 0.5 | 1 | 0 | 1 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 3.13 |
[78] | 0.5 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0.5 | 1 | 0 | 1 | 1 | 3.13 |
[79] | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 1 | 1 | 3.08 |
[80] | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 3.08 |
[81] | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 1 | 0.5 | 0 | 2.92 |
[82] | 0.5 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0.5 | 0 | 1 | 0 | 0.5 | 2.87 |
[83] | 0.5 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0.5 | 0 | 1 | 0 | 0.5 | 2.87 |
[84] | 1 | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 1 | 0 | 1 | 2.83 |
[85] | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0.5 | 0 | 2.58 |
[86] | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0.5 | 0 | 1 | 0 | 2.33 |
[87] | 0.5 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 0.5 | 1 | 1 | 0.5 | 0.5 | 2.29 |
[88] | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 1 | 0 | 2.16 |
[89] | 0.5 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 0.5 | 1 | 1 | 0.5 | 0.5 | 2.04 |
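The aggregate FAIR score in the table above is consistent with averaging the sub-criteria within each of the four principles and summing the four averages, so each principle contributes up to 1 and the maximum score is 4. A minimal sketch under that assumed grouping (the function name is illustrative; some published values appear to be truncated rather than rounded, e.g. 3.66):

```python
# Sub-criteria grouped by FAIR principle, as in the rubric table.
GROUPS = {
    "F": ["F1", "F2", "F3", "F4"],
    "A": ["A1.1", "A1.2", "A2"],
    "I": ["I1", "I2", "I3"],
    "R": ["R1.1", "R1.2", "R1.3"],
}

def fair_score(subscores: dict) -> float:
    """Sum of per-principle means of the 13 sub-criterion scores (0-4)."""
    total = sum(
        sum(subscores[c] for c in criteria) / len(criteria)
        for criteria in GROUPS.values()
    )
    return round(total, 2)

# Dataset [63] in the table: F1 = 0.5, every other criterion = 1,
# giving F = 0.875 and a total of 3.88.
scores_63 = {c: 1.0 for criteria in GROUPS.values() for c in criteria}
scores_63["F1"] = 0.5
print(fair_score(scores_63))  # → 3.88
```

Because the principles are averaged separately, a missing criterion in the four-item F group costs less (0.25) than one in a three-item group (0.33), which matches the spread between, for example, the 3.88 and 3.83 rows.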
Dataset/Study | Application | Source | Platform | Dataset Size (GB) | Annotation | Open Access | CA (*) Country | GE (*) Country
---|---|---|---|---|---|---|---|---|
[43] | ITCD, SI, GM | RGB, LiDAR, PPC | UAV | 742.7 | Polygons | Yes | Canada | Canada |
[83] | ITCD | RGB | UAV | 0.32 | Polygons | Yes | N/A | N/A |
[24] | ITCD | RGB | UAV | 10 | Semantic labels | Yes | Brazil | Brazil |
[82] | ITCD | RGB | UAV | 0.15 | Semantic labels | Yes | N/A | N/A |
[89] | ITCD, SI | RGB, HSI, PPC, LiDAR | UAV | N/A | Polygons, Semantic labels | Yes | USA | China |
[31] | ITCD, SS | LiDAR | UAV | 1.6 | Semantic labels | Yes | Norway | Europe/Australia |
[37] | ITCD | RGB | UAV | 11 | Polygons | Yes | Germany | Colombia |
[85] | ITCD | MSI | Satellite | Undisclosed | Semantic labels | No | Australia | Australia |
[88] | ITCD | RGB | Satellite | Undisclosed | Semantic labels | Yes | China | USA |
[41] | ITCD | RGB, LiDAR | UAV | 0.18 | Semantic labels | No | China | N/A |
[44] | ITCD | RGB | UAV | 24.6 | Polygons, Bounding boxes | Yes | Canada | Canada |
[45] | ITCD | RGB | PA, Satellite | 16 | Polygons, Bounding boxes | Yes | Germany | Germany |
[46] | ITCD | RGB | UAV, PA | 4 | Polygons | Yes | Switzerland | Global |
[47] | ITD, SS, SI | RGB | PA | 8.1 | Polygons, Bounding boxes | Yes | Switzerland | Switzerland |
[48] | ITD, SI | RGB | UAV | 150 | Polygons, Bounding boxes | Yes | China | Canada |
[49] | ITD, GM | RGB | UAV | 2.48 | Keypoints | Yes | USA | USA |
[50] | ITD, Misc. | MSI, LiDAR | UAV | 4.5 | Semantic labels | Yes | USA | USA |
[63] | ITD | RGB | UAV | 0.4 | Keypoints | Yes | USA | USA |
[78] | ITD | RGB, PPC | UAV | Undisclosed | No | Yes | Sweden | Iran |
[72] | ITD | RGB | UAV | 0.82 | Bounding boxes | Yes | Saudi Arabia | Saudi Arabia |
[80] | ITD | RGB | PA | Undisclosed | Bounding boxes | No | Belgium | Spain |
[71] | ITD | RGB | N/A | 0.008 | Bounding boxes | Yes | France | N/A |
[68] | ITD | RGB | PA | 0.075 | Bounding boxes | Yes | N/A | N/A |
[64] | ITD | RGB, HSI, LiDAR | PA | 27.4 | Bounding boxes | Yes | USA | USA |
[40] | ITD | RGB | PA | 0.032 | Bounding boxes, Keypoints | Yes | USA | USA |
[87] | SS, SI | RGB | UAV | 69 | Polygons, Semantic labels | Yes | Germany | Germany |
[51] | SS, SI | RGB | UAV | 0.48 | Semantic labels | Yes | Ecuador | Ecuador |
[52] | SS, SI | RGB | UAV | 7.5 | Polygons | Yes | S. Africa | S. Africa |
[65] | SS | RGB | PA | 1.4 | Semantic labels | Yes | Poland | Poland |
[69] | SS | RGB, MSI | UAV | 2.8 | Semantic labels | Yes | USA | USA |
[26] | SS | RGB | UAV | 3.7 | Semantic labels | Yes | Japan | Japan |
[38] | SS | RGB | UAV (Synthetic) | 0.74 | Semantic labels | Yes | Germany | Russia |
[39] | SS | LiDAR | PA | 10 | Semantic labels | Yes | USA | Canada |
[53] | SS, Misc. | RGB | UAV | 9.3 | Polygons | Yes | UK | Spain |
[54] | DD, HSA | RGB, MSI | UAV | 18 | Polygons | Yes | Finland | Finland |
[22] | DD | RGB, MSI | UAV, Terrestrial | 39.25 | Keypoints | No | Greece | Macedonia |
[23] | DD | RGB, MSI | UAV, Terrestrial | 59.9 | Keypoints | No | Greece | Greece |
[42] | DD | MSI | UAV | 1.14 | No | Yes | China | China |
[55] | SI | MSI, LiDAR | PA | 147.5 | Polygons | Yes | France | France |
[56] | SI | RGB | UAV | Undisclosed | Bounding boxes | No | Switzerland | Switzerland
[86] | SI | RGB, MSI | Satellite | Undisclosed | Semantic labels | Yes | China | China |
[25] | SI | RGB | UAV | 19.3 | Semantic labels | Yes | Australia | Australia |
[73] | CSE | RGB, PPC | UAV | 0.2 | No | Yes | Australia | Australia |
[84] | CSE | LiDAR | PA | 0.52 | No | Yes | USA | N/A |
[33] | CSE | RGB | UAV | 7.5 | Bounding boxes | Yes | Germany | Ecuador |
[28] | HSA | LiDAR, HSI | UAV | 126 | No | Yes | UK | Germany |
[29] | HSA | RGB | UAV | 8.45 | No | Yes | UK | Germany |
[36] | GM | TLS, PPC | UAV, Terrestrial | 11.2 | No | Yes | Finland | Finland |
[57] | PM | RGB | UAV | 5.2 | Bounding boxes | Yes | China | Finland |
[27] | PM | MSI | UAV | 11.3 | No | Yes | Netherlands | Netherlands |
[58] | Misc. | RGB (Synthetic) | 3D model | 2000 | Semantic labels | Yes | USA | USA |
[59] | Misc. | LiDAR | UAV, Terrestrial | 10.7 | No | Yes | Spain | Germany |
[74] | Misc. | RGB | PA | Undisclosed | No | Yes | New Zealand | New Zealand |
[75] | Misc. | RGB | PA | Undisclosed | No | Yes | New Zealand | New Zealand |
[76] | Misc. | RGB | PA | Undisclosed | No | Yes | New Zealand | New Zealand |
[77] | Misc. | RGB | PA | Undisclosed | No | Yes | New Zealand | New Zealand |
[81] | Misc. | RGB | UAV | 0.25 | No | Yes | N/A | N/A |
[60] | Misc. | RGB, MSI | UAV | 16 | No | Yes | France | Côte d’Ivoire |
[61] | Misc. | LiDAR | UAV | 3.5 | No | Yes | Germany | Germany |
[62] | Misc. | RGB, LiDAR, PPC | UAV, Helicopter | 0.35 | No | Yes | Italy | Italy |
[18] | Misc. | MSI | UAV | 0.212 | No | Yes | Sweden | Sweden |
[19] | Misc. | PPC | UAV | 0.173 | No | Yes | Sweden | Sweden |
[20] | Misc. | RGB | UAV | 1.4 | No | Yes | Brazil | Brazil |
[21] | Misc. | RGB, MSI | UAV | 7.6 | No | Yes | Brazil | Brazil |
[67] | Misc. | RGB | UAV (Synthetic) | Undisclosed | Semantic labels | Yes | Belgium | N/A |
[66] | Misc. | RGB | UAV | 1.6 | No | Yes | Finland | Finland |
[79] | Misc. | LiDAR | PA | 0.7 | N/A | No | China | China |
[30] | Misc. | HSI | PA | 28.66 | No | No | Finland | Finland |
[70] | Misc. | LiDAR | PA | Undisclosed | N/A | No | Switzerland | Switzerland |
[32] | Misc. | HSI | PA | 4.61 | No | No | India | India |
[34] | Misc. | RGB, MSI | UAV | 3.5 | No | Yes | Germany | Germany |
[35] | Misc. | RGB, MSI | UAV | 1.5 | No | Yes | Germany | Germany |
Application | Description |
---|---|
Individual Tree Crown Delineation (ITCD) | Identifies individual tree crowns, providing detailed tree-level data, but is computationally intensive and depends on image quality. |
Individual Tree Detection (ITD) | Detects individual trees, estimating tree density and distribution, but faces challenges in dense forests and possible counting errors. |
Semantic Segmentation (SS) | Classifies image pixels into categories with high accuracy in tree recognition but requires large labeled datasets and is computationally demanding. |
Disease Detection (DD) | Identifies tree diseases for early detection and timely intervention but may need hyperspectral data, and subtle disease symptoms can be hard to detect. |
Species Identification (SI) | Determines tree species, supporting biodiversity studies, but similar species may be hard to distinguish. |
Carbon Stock Estimation (CSE) | Estimates forest carbon storage, important for climate studies, but requires accurate biomass models and is affected by terrain variability. |
Health Status Analysis (HSA) | Assesses tree health, monitoring forest vitality and aiding conservation, but involves complex health indicators and environmental influences. |
Geometrical Measurement (GM) | Measures tree dimensions, providing quantitative data for growth studies, but needs precise data and can have canopy overlap issues. |
Phenology Monitoring (PM) | Observes seasonal changes, studying climate impacts and aiding agricultural planning, but faces temporal resolution limits and weather interference. |
Miscellaneous (Misc.) | Corresponds to datasets with no clear indication of a specific use case. |
Dataset/Study | Application | Source | Platform | Annotation | CA (*) Country | GE (*) Country
---|---|---|---|---|---|---|
[90] | ITCD, ITD, HSA | RGB | UAV | Polygons, Semantic labels | Malaysia | UAE |
[91] | ITCD, ITD | MSI | SAT | Polygons | UAE | UAE |
[92] | ITCD, ITD | RGB | UAV | Polygons, Image labels | Malaysia | Malaysia |
[93] | ITCD, ITD | MSI | UAV | Polygons | Greece | Greece |
[94] | ITCD, ITD | RGB, PPC | UAV | Polygons | China | China |
[95] | ITCD, ITD | RGB | UAV | Polygons, Bounding boxes | Mexico | Canada |
[96] | ITCD, SI, GM | HSI | UAV | Polygons | China | China |
[97] | ITCD, SI | RGB | UAV | Polygons, Semantic labels | Japan | Japan |
[98] | ITCD, SI | HSI | UAV | Polygons, Semantic labels | China | China |
[99] | ITCD, SI | RGB, MSI | UAV | Polygons, Semantic labels | Australia | Australia |
[100] | ITCD, SI | RGB, LiDAR | UAV, PA | Semantic labels | Japan | Japan |
[101] | ITCD, SI | RGB | UAV | Semantic labels | Brazil | Brazil |
[102] | ITCD, HSA | MSI | UAV | Polygons, Semantic labels | Italy | Greece |
[103] | ITCD, SS | RGB | UAV, SAT | Polygons, Semantic labels | China | China |
[104] | ITCD, PM | MSI | UAV | Bounding boxes, Semantic labels | USA | USA |
[105] | ITCD | RGB, PPC | UAV | Polygons, Semantic labels | Poland | Iran |
[106] | ITCD | RGB | UAV | Polygons | China | China |
[107] | ITCD | MSI | UAV | Polygons | Mexico | Peru |
[108] | ITCD | RGB | UAV | Polygons | China | China |
[109] | ITCD | RGB, PPC | PA | Polygons | China | China |
[110] | ITCD | RGB | UAV | Polygons | Turkey | Turkey |
[111] | ITCD | RGB, LiDAR | PA | Polygons | China | China |
[112] | ITCD | MSI, LiDAR | UAV | Polygons | Germany | Germany
[113] | ITCD | RGB | UAV | Polygons | China | China |
[114] | ITCD | RGB | UAV | Polygons | China | China |
[115] | ITCD | MSI | PA | Polygons | Australia | Australia |
[116] | ITCD | RGB | UAV | Semantic labels | China | China |
[117] | ITCD | RGB | UAV | Semantic labels | Brazil | Brazil |
[118] | ITCD | RGB, PPC | UAV | No | China | China |
[119] | ITCD | MSI | UAV | No | Italy | Italy |
[120] | ITCD | RGB, PPC | UAV | No | UK | Malaysia |
[121] | ITCD | RGB | UAV | Bounding boxes, Polygons | Japan | Japan |
[122] | ITCD | RGB | UAV | No | Iran | Iran |
[123] | ITCD | MSI | UAV | Semantic labels | Spain | Spain |
[124] | ITCD | RGB | UAV | Semantic labels | China | China |
[125] | ITCD | MSI | UAV | Semantic labels | India | India |
[126] | ITCD | RGB | UAV | Semantic labels | China | China |
[127] | ITCD | LiDAR | UAV | No | China | China |
[128] | ITCD | RGB, Thermal | UAV | Semantic labels | Netherlands | Iran
[129] | ITCD | HSI, LiDAR | PA | Semantic labels | Switzerland | Switzerland |
[130] | ITCD | MSI | UAV | Semantic labels | Spain | Spain |
[131] | ITCD, GM | RGB, PPC | UAV | Semantic labels | China | China |
[132] | ITCD | RGB | UAV | Bounding boxes | Malaysia | Malaysia |
[133] | ITCD, HSA | RGB | UAV | N/A | Romania | N/A |
[134] | ITCD, SI | RGB | UAV | Semantic labels | Brazil | Brazil |
[135] | ITCD | RGB | UAV | Bounding boxes, Semantic labels | China | China |
[136] | ITCD | LiDAR | UAV | No | UK | UK |
[137] | ITCD | RGB | UAV | Bounding boxes, Semantic labels | China | China |
[138] | ITCD | RGB | Satellite | Keypoints | China | French Polynesia |
[139] | DD, SS | RGB | UAV | Image labels | S. Korea | S. Korea |
[140] | DD, HSA | RGB, Thermal, LiDAR | UAV | Semantic labels | China | China |
[141] | DD, HSA | MSI, LiDAR | UAV | Polygons, Semantic labels | China | China |
[142] | DD | RGB | UAV | Bounding boxes | China | China |
[143] | DD | RGB | UAV | Bounding boxes | India | India |
[144] | DD | MSI | UAV | Polygons | S. Africa | S. Africa |
[145] | DD | RGB | UAV | Polygons | China | China |
[146] | DD | N/A | UAV | Bounding boxes | China | China |
[147] | DD | RGB | UAV | Image labels | China | China |
[148] | DD | RGB | UAV | Semantic labels | China | China |
[149] | DD | RGB, MSI | UAV | Image labels | China | China |
[150] | DD | RGB | UAV | Image labels | China | China |
[151] | DD | HSI | UAV | Polygons, Bounding boxes | China | China |
[152] | DD | MSI | UAV | Polygons, Semantic labels | Italy | Italy |
[153] | DD | MSI | UAV | Semantic labels | Greece | Greece |
[154] | DD | RGB | UAV | Bounding boxes | China | China |
[155] | DD | HSI | UAV | Bounding boxes | China | China |
[156] | DD | RGB | UAV | Bounding boxes, Semantic labels | China | China
[157] | DD | MSI | UAV | No | Malaysia | Malaysia |
[158] | DD | RGB, MSI | UAV | N/A | Finland | Finland |
[159] | DD | RGB | UAV | Image labels | S. Korea | S. Korea |
[160] | DD | MSI | UAV | N/A | China | Mongolia |
[161] | DD | MSI | UAV | Bounding boxes | Czech R. | Czech R. |
[162] | DD | HSI | UAV | N/A | China | China |
[163] | DD | RGB | UAV | No | USA | USA |
[164] | DD | HSI | UAV | Semantic labels | Australia | Australia |
[165] | DD | RGB | UAV | Keypoints | Japan | Japan |
[166] | DD | RGB, MSI, HSI | UAV | Image labels | Finland | Finland |
[167] | DD | RGB, MSI | UAV | No | China | China |
[168] | DD | RGB | UAV | Semantic labels | China | China |
[169] | DD | HSI, LiDAR | UAV | No | China | China
[170] | DD | HSI | UAV | No | China | China |
[171] | DD | HSI, LiDAR | UAV | Bounding boxes, Semantic labels | China | China
[172] | DD | MSI | UAV | Bounding boxes | China | China |
[173] | DD | RGB, HSI | UAV | N/A | China | China |
[174] | DD | MSI | UAV | Bounding boxes | China | China |
[175] | DD | RGB | PA | Bounding boxes | China | China |
[176] | ITD, SS, GM | RGB, MSI, LiDAR | UAV | Polygons, Semantic labels | Russia | Russia |
[176] | ITD, GM, Misc. | RGB | UAV | Keypoints | Turkey | Turkey |
[177] | ITD, SI | MSI | UAV | Polygons | Brazil | Brazil |
[178] | ITD, SI | RGB, LiDAR | UAV | Bounding boxes, Semantic labels | China | China |
[179] | ITD, SI | RGB, MSI | UAV, Satellite | Polygons | China | China |
[180] | ITD, SI | HSI, LiDAR | UAV | Keypoints, Semantic labels | China | China |
[181] | ITD, SI | RGB | UAV | Polygons | China | China |
[182] | ITD, HSA | RGB | UAV | Image labels | USA | USA |
[183] | ITD, Misc. | RGB, PPC | UAV | Polygons | Germany | Germany |
[184] | ITD, Misc. | RGB, MSI | UAV | Polygons, Bounding boxes | China | China |
[185] | ITD | RGB | UAV | Bounding boxes | Peru | Peru |
[186] | ITD | LiDAR | UAV | Polygons, Bounding boxes | USA | USA |
[187] | ITD | RGB | UAV | Image labels | Morocco | Morocco |
[188] | ITD | RGB | UAV | Keypoints | China | China |
[189] | ITD | RGB | UAV | Bounding boxes | China | China |
[190] | ITD | RGB | UAV | Bounding boxes | Sweden | Sweden |
[191] | ITD | RGB | Satellite | Keypoints | China | China |
[192] | ITD | RGB | UAV | Keypoints | Brazil | Brazil |
[193] | ITD | RGB | UAV | Bounding boxes | Canada | Canada |
[194] | ITD | RGB, PPC | UAV | N/A | Chile | Chile |
[195] | ITD | MSI | UAV | Semantic labels | USA | USA |
[196] | ITD | RGB | UAV | Keypoints | China | China |
[197] | ITD | RGB | UAV | Image labels | India | India |
[198] | ITD | RGB | UAV | N/A | Brazil | Brazil |
[199] | ITD | RGB, MSI | UAV | Bounding boxes | China | China |
[200] | ITD, GM | RGB | UAV | Bounding boxes | USA | USA |
[201] | ITD | RGB | UAV | No | Iran | Iran |
[202] | ITD, SI | HSI | UAV | Keypoints | Brazil | Brazil |
[203] | ITD | PPC, LiDAR | UAV | No | Brazil | Brazil |
[204] | ITD | MSI | UAV | Keypoints | Brazil | Brazil |
[205] | ITD | RGB | UAV | No | Spain | Spain |
[206] | ITD, BE | RGB | UAV | Bounding boxes | China | China |
[207] | ITD | RGB | UAV | Bounding boxes | China | China |
[208] | ITD | RGB | UAV | Bounding boxes | China | Brazil |
[209] | SI, CSE | RGB | UAV | Semantic labels | Nigeria | Nigeria |
[210] | SI, Misc. | RGB, LiDAR | UAV | Semantic labels | China | China |
[211] | SI | RGB, MSI | UAV | Bounding boxes | Finland | Finland |
[212] | SI | RGB | UAV | Polygons | S. Korea | S. Korea
[213] | SI | MSI, LiDAR | PA | Polygons, Semantic labels | Canada | Canada |
[214] | SI | RGB, MSI | UAV | Semantic labels | Taiwan | Taiwan |
[215] | SI | RGB | UAV | Polygons | China | China |
[216] | SI | RGB, MSI, HSI | UAV | Polygons | China | China |
[217] | SI | RGB | UAV | Image labels | Italy | Italy |
[218] | SI | MSI | UAV | Polygons | China | China |
[219] | SI | RGB, MSI | UAV, Satellite | Polygons | China | China |
[220] | SI | HSI | UAV | Semantic labels | China | China |
[221] | SI | RGB, MSI, LiDAR | UAV | Keypoints | Finland | Finland |
[222] | SI | RGB | UAV | Image labels | Germany | Germany |
[223] | SI | RGB | UAV | Image labels | China | China |
[224] | SI | RGB | PA | Semantic labels | Brazil | Brazil |
[225] | SI | RGB | UAV | Semantic labels | Brazil | Brazil |
[226] | SI | LiDAR, HSI | UAV | Semantic labels, Polygons | China | China |
[227] | SI | RGB, MSI | UAV | Semantic labels | China | China |
[228] | GM | RGB | UAV | Polygons | Slovakia | Slovakia |
[229] | GM | RGB | UAV | N/A | Brazil | Brazil |
[230] | GM | RGB | UAV | Semantic labels | Romania | Iran |
[231] | GM, PM | RGB | UAV | N/A | Norway | Norway |
[232] | GM | RGB, MSI | UAV | N/A | Mexico | Mexico |
[233] | HSA, Misc. | RGB, MSI, LiDAR | UAV | Polygons, Image labels | Greece | Greece |
[234] | HSA | RGB | UAV | Image labels | Japan | Japan |
[235] | HSA | MSI | PA | Polygons | USA | USA |
[236] | HSA | MSI | UAV | Polygons | Sri Lanka | Sri Lanka |
[237] | HSA | HSI | UAV | Polygons, Semantic labels | Malaysia | Malaysia |
[238] | HSA | RGB, MSI | UAV | Polygons, Image labels | Pakistan | Pakistan |
[239] | HSA | MSI | UAV | Polygons | China | China |
[240] | HSA | RGB, MSI | UAV | Image labels | Malaysia | Malaysia |
[241] | HSA | MSI | UAV | Polygons, Semantic labels | Germany | Germany |
[242] | HSA | MSI | UAV | Semantic labels | Australia | Australia |
[243] | HSA | RGB, MSI, Thermal | UAV | N/A | Netherlands | Netherlands |
[244] | PM | RGB | UAV | Bounding boxes | China | China |
[245] | CSE, HSA | RGB, PPC | UAV | Polygons, Semantic labels | China | China |
[246] | CSE | RGB | UAV | Polygons | Spain | Spain |
[247] | CSE | RGB | UAV | Polygons, Semantic labels | India | India |
[248] | CSE | RGB, MSI | UAV | Semantic labels | Russia | Russia |
[249] | SS, Misc. | RGB | UAV | Polygons | Vietnam | Vietnam |
[250] | SS, Misc. | RGB | UAV | Polygons, Semantic labels | Portugal | Portugal |
[251] | SS | MSI, LiDAR | UAV | Polygons, Semantic labels | China | China |
[252] | Misc. | RGB | UAV | N/A | New Zealand | New Zealand |
[253] | Misc. | RGB | UAV | Polygons, Semantic labels | Australia | Australia |
[254] | Misc. | RGB | UAV | N/A | Germany | Germany |
[255] | Misc. | HSI | UAV | Polygons, Semantic labels | USA | USA |
[256] | Misc. | RGB | UAV | Semantic labels | China | China |
[257] | Misc. | RGB | UAV | Bounding boxes | China | China |
[258] | Misc. | HSI | UAV | N/A | India | Belgium |
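A catalog like the one above is most useful when it can be queried programmatically, for example to find all UAV datasets with LiDAR data. The sketch below is illustrative only: the field names (`ref`, `application`, `sensors`, `platform`, `annotation`, `country`) and the four sample rows are assumptions transcribed for this example, not a machine-readable artifact provided by the survey.

```python
# Minimal sketch: representing a few catalog rows as records and
# filtering them by application, sensor modality, and platform.
catalog = [
    {"ref": 188, "application": ["ITD"], "sensors": ["RGB"],
     "platform": "UAV", "annotation": "Keypoints", "country": "China"},
    {"ref": 202, "application": ["ITD", "SI"], "sensors": ["HSI"],
     "platform": "UAV", "annotation": "Keypoints", "country": "Brazil"},
    {"ref": 213, "application": ["SI"], "sensors": ["MSI", "LiDAR"],
     "platform": "PA", "annotation": "Polygons", "country": "Canada"},
    {"ref": 251, "application": ["SS"], "sensors": ["MSI", "LiDAR"],
     "platform": "UAV", "annotation": "Polygons", "country": "China"},
]

def filter_catalog(rows, application=None, sensor=None, platform=None):
    """Return rows matching all given criteria (None = no constraint)."""
    out = []
    for row in rows:
        if application and application not in row["application"]:
            continue
        if sensor and sensor not in row["sensors"]:
            continue
        if platform and platform != row["platform"]:
            continue
        out.append(row)
    return out

# UAV datasets that include LiDAR among the sample rows:
lidar_uav = filter_catalog(catalog, sensor="LiDAR", platform="UAV")
print([r["ref"] for r in lidar_uav])  # → [251]
```

The same pattern extends naturally to the full table, e.g. by loading it from a CSV export and adding further criteria such as annotation type.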
Name | Operator/ Country | Size Limit | Number of Datasets | Establishment Date | Geographical Extent |
---|---|---|---|---|---|
data.gov.uk | UK | N/A | N/A | 2010 | UK |
Datarade ($) | Germany | N/A | N/A | 2018 | Worldwide |
Direção-Geral do Território ($) | Portugal | N/A | 378 | N/A | Portugal |
DRYAD ($) | Germany | Free individual files: 300 MB | +50,000 | 2007 | Worldwide |
EDI Data Portal | United States | N/A | +85,000 | N/A | Worldwide |
GitHub ($) | United States | Free: 500 MB, Paid: 50 GB | N/A | 2008 | Worldwide |
Global Forest Watch | WRI | N/A | 142 | 1997 | Worldwide |
Hugging Face | USA | No limit | 70,735 | 2016 | Worldwide |
IEEE Dataport ($) | USA | 2 TB/10 TB | +8000 | 2018 | Worldwide |
Kaggle | USA | 100 GB/dataset | 260,779 | 2010 | Worldwide |
Land use Opportunities-Data Supermarket | New Zealand | N/A | 74 | 2020 | New Zealand |
LILA BC | USA | N/A | 39 | N/A | Worldwide |
Mendeley Data ($) | Netherlands | Free: 10 GB, Paid: 100 GB | 26.9 million | 2015 | Worldwide |
NEON Data portal | United States | N/A | 20 regions | 2016 | USA |
ORNL DAAC | United States | N/A | N/A | 1994 | Worldwide |
Official portal for European data ($) | European Union | N/A | 1,544,800 | 2015 | Europe |
OAM | Community driven | N/A | 14,181 images | 2007 | Worldwide |
OpenTopography ($) | United States | N/A | 3532 | 2009 | Worldwide |
Pacific Data Hub | SPC | N/A | 12,333 | N/A | Pacific Islands and countries |
Pacific Environment Data Portal | SPREP | N/A | 18,994 | N/A | Pacific Islands and countries |
PANGAEA | Germany | 15 GB/file | 422,680 | 1987, online 1995 | Worldwide |
Projeto áGIL—Dados LiDAR | ICNF/Portugal | N/A | 7 datasets | 2020 | Portugal |
RNDT | Italy | N/A | 20,222 | 2012 | Italy |
Roboflow ($) | United States | Free: 10,000 images | +200,000 | 2019 | Worldwide |
UC Irvine Machine Learning Repository | United States | N/A | 657 | 1987, 2023 | Worldwide |
Zenodo ($) | CERN/Europe | 50 GB/file | +3,000,000 | 2013 | Worldwide |
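The size limits in the table above directly constrain where a given dataset can be deposited free of charge. The sketch below illustrates that check; the limit values are a partial, illustrative transcription of the table (only repositories with a stated free limit are included), and the byte conversions assume decimal SI units.

```python
# Hedged sketch: free per-file/per-dataset upload limits in bytes,
# transcribed from a subset of the repository table for illustration.
FREE_LIMIT_BYTES = {
    "DRYAD": 300 * 10**6,         # free individual files: 300 MB
    "GitHub": 500 * 10**6,        # free tier: 500 MB
    "Mendeley Data": 10 * 10**9,  # free: 10 GB
    "PANGAEA": 15 * 10**9,        # 15 GB/file
    "Zenodo": 50 * 10**9,         # 50 GB/file
    "Kaggle": 100 * 10**9,        # 100 GB/dataset
}

def repositories_for(size_bytes):
    """Repositories whose free limit accommodates a file of `size_bytes`."""
    return sorted(name for name, limit in FREE_LIMIT_BYTES.items()
                  if size_bytes <= limit)

# Which repositories accept a 2 GB orthomosaic for free?
print(repositories_for(2 * 10**9))
# → ['Kaggle', 'Mendeley Data', 'PANGAEA', 'Zenodo']
```

In practice the decision also involves factors the table captures qualitatively, such as geographical scope, establishment date (a proxy for longevity), and whether paid tiers exist.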
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Chehreh, B.; Moutinho, A.; Viegas, C. Seeing the Trees from Above: A Survey on Real and Synthetic Agroforestry Datasets for Remote Sensing Applications. Remote Sens. 2025, 17, 3346. https://doi.org/10.3390/rs17193346