A Review of Drones in Smart Agriculture: Issues, Models, Trends, and Challenges
Abstract
1. Introduction
2. Theoretical Background
2.1. Theoretical Evolution: From Precision Agriculture to Smart Agriculture
2.2. Drones as the Critical Enabling Layer in Agricultural Paradigm Shift
2.3. Smart Agriculture
2.4. AI/ML/DL on Drone Imagery
3. Review Method
3.1. Research Problems
- RQ1: What quartile levels are represented by the journals that have published research related to drones in smart agriculture?
- RQ2: What countries lead research on drones applied to smart agriculture, and how is the global scientific cooperation network structured?
- RQ3: What thematic categories can be identified in studies addressing the use of drones and their influence on smart agriculture?
- RQ4: How is the frequency of co-citation manifested among the authors referenced in studies examining the use of drones and their impact on smart agriculture?
- RQ5: In what ways do keywords show frequent co-occurrences in research on drones and their impact on smart agriculture?
3.2. Information Sources and Search Strategies
3.3. Identified Studies
3.4. Study Selection
3.5. Quality Assessment
- QA1: Is the thematic area addressed in the study clearly defined?
- QA2: Are the results of the experiments accurately identified and reported?
- QA3: Is the methodological pipeline clearly and reproducibly described (including data sources, sensors, flight parameters, and main processing steps)?
- QA4: Are the methods employed appropriate for the analysis of the results?
- QA5: Are the research objectives clearly formulated in the paper?
- QA6: Are the results and limitations reported with sufficient detail to enable replication or re-use (e.g., access to datasets or code, or detailed reporting of experimental settings)?
- QA7: Is the developed experiment valid and methodologically acceptable?
- Operationalization of the QA criteria
- QA1—Clarity of the thematic area addressed
- Yes: the study clearly defines the agricultural problem, context, or domain it addresses (e.g., crop, stress type, monitoring objective).
- No: the thematic area is vague, overly general, or only implied.
- QA2—Accuracy and transparency in reporting experimental results
- Yes: the study reports quantitative results clearly (e.g., metrics, comparisons, tables, or figures) with sufficient detail.
- No: results are incomplete, ambiguous, or missing essential information.
- QA3—Clarity and reproducibility of the methodological pipeline
- Yes: the paper clearly describes the methodological pipeline (data sources, sensors, flight parameters, and main processing steps) in a way that would allow reproduction.
- No: the methodological description is incomplete, fragmented, or lacks sufficient detail for reproduction.
- QA4—Appropriateness of methods for analyzing results
- Yes: the analytical methods (ML/DL/statistics/indices) are clearly described and appropriate for the research question.
- No: the methods are poorly explained, unjustified, or mismatched to the objectives.
- QA5—Clarity of research objectives
- Yes: the research goals are explicitly stated and connected with the methods and results.
- No: objectives are absent, vague, or inconsistent.
- QA6—Transparency and reproducibility of results and limitations
- Yes: the results and limitations are reported with enough detail to enable replication or re-use (e.g., datasets or code are accessible, or experimental settings are thoroughly described).
- No: results and limitations are reported superficially, lack key details, or do not provide enough information to support replication or re-use.
- QA7—Validity and methodological soundness of the experiment
- Yes: the study includes experimental validity elements (e.g., metrics, evaluation design, validation procedure).
- No: experiments lack validity, justification, or methodological rigor.
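To make the aggregation of these criteria concrete, the following minimal sketch illustrates how per-criterion ratings could be summed into a study-level quality score. It is illustrative only: the three-level mapping (Yes = 3, Partially = 2, No = 1), the qa_score helper, and the inclusion threshold are assumptions made for this sketch and are not taken from the review protocol itself.

```python
# Minimal sketch of aggregating per-criterion QA ratings into a study-level score.
# The three-level mapping (Yes = 3, Partially = 2, No = 1) and the inclusion
# threshold are assumptions for illustration, not values taken from the paper.

RATING = {"yes": 3, "partially": 2, "no": 1}

def qa_score(ratings: dict[str, str]) -> int:
    """Sum the numeric values of the seven QA criteria (QA1..QA7)."""
    return sum(RATING[ratings[f"QA{i}"].lower()] for i in range(1, 8))

example = {
    "QA1": "Yes", "QA2": "Partially", "QA3": "No", "QA4": "Partially",
    "QA5": "Yes", "QA6": "Partially", "QA7": "Yes",
}
score = qa_score(example)
threshold = 12  # hypothetical cut-off for retaining a study
print(score, "included" if score >= threshold else "excluded")
```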
3.6. Data Extraction Strategies
3.7. Data Synthesis
4. Results and Discussion
4.1. General Description of the Studies
- Method Category: The thematic distribution reveals wide methodological diversity, with a predominance of applied research lines focused on agricultural productivity and efficiency. The most representative categories are Yield & Biomass Estimation (22.8%), UAV Systems, Operations, Edge & Communications (15.2%), and Pest/Disease/Weed Detection (12.1%), which indicates a shift from experimental approaches toward integrated, field-based applications. In contrast, topics such as Cross-domain AI and Time Series Models (1.5% each) are only beginning to consolidate, suggesting methodological areas still under development.
- Methods Used: The reviewed methods converge across machine learning, remote sensing, and 3D modeling, consolidating an interdisciplinary framework that spans classical algorithms (RF, SVR, KNN) and deep networks (YOLOv5, U-Net, LSTM). This pattern reflects growing technical maturity in agricultural automation, where the combination of sensor fusion, MCDM, and edge AI signals a trend toward more autonomous and adaptive systems. Notably, hybrid methods are concentrated in the categories with the highest proportion of studies (≈50% of the total), demonstrating the widespread adoption of artificial intelligence applied to UAVs.
- Datasets: The analyzed datasets reveal marked heterogeneity in scale, origin, and structure, with a predominance of data obtained from RGB, multispectral, and thermal UAVs, occasionally integrated with LiDAR and Sentinel-2 sensors. This diversity, represented in 70% of the investigations, reinforces the ecological validity of the results while limiting inter-context reproducibility. Consequently, studies with more rigorous field trials and ground truthing (≈20%) establish methodological benchmarks for the creation of standardized repositories.
- Performance: The reported performance indicators, such as R² values above 0.9, IoU values above 70%, and low RMSE, demonstrate substantial advances in predictive accuracy and in the detection of pests, water stress, and biomass. However, the heterogeneity of metrics hinders cross-study comparison and limits replicability. Notably, studies that integrate deep learning and multitemporal analysis show the largest gains in accuracy and computational efficiency.
- Key Contributions: The main contributions center on sensor integration, analytical precision, and operational efficiency, highlighting the development of reproducible models, open frameworks, and interactive decision support systems (DSS). Notably, the most represented categories (22.8% and 15.2%) not only deliver technical advances but also carry direct economic implications for productivity and cost reduction. Overall, these contributions consolidate a transition from a purely technological focus toward scalable and sustainable agro-industrial solutions.
- Limitations: Beyond productivity and impact indicators, the synthesis of study-level limitations reveals several cross-cutting challenges for drone-based smart agriculture. Three constraints appear in more than 60% of the analyzed studies: weak model generalization (72% of works calibrated for specific crops, phenological stages, topographic conditions or management regimes and rarely validated under substantially different environments), data-standardization gaps (81% of studies lacking harmonized protocols for sensor calibration, flight parameters, ground-truth collection or evaluation metrics), and economic feasibility being neglected (89% of technical papers omitting cost–benefit analysis, ROI calculations, or consideration of implementation barriers for smallholder farmers). These patterns make it difficult to compare results or to reuse pipelines across contexts and systems, even when predictive performance is high in the original setting. Taken together, they indicate that reproducibility and scalability remain the most persistent methodological bottlenecks and constitute a central thread for the discussion developed in the next subsection.
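Because the reviewed studies report heterogeneous metrics (R², RMSE, IoU), the following minimal sketch shows how these common indicators are typically computed on toy data. The arrays are invented for illustration and do not correspond to any study in the corpus.

```python
import numpy as np

# Toy ground-truth and predicted yield values in t/ha (assumed data for illustration).
y_true = np.array([5.1, 6.3, 4.8, 7.2, 5.9])
y_pred = np.array([5.4, 6.0, 4.5, 7.5, 6.1])

rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
ss_res = np.sum((y_true - y_pred) ** 2)
ss_tot = np.sum((y_true - y_true.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot

# Intersection over Union for a binary segmentation mask (e.g., weed vs. background).
mask_true = np.array([[1, 1, 0], [0, 1, 0], [0, 0, 0]], dtype=bool)
mask_pred = np.array([[1, 0, 0], [0, 1, 1], [0, 0, 0]], dtype=bool)
iou = np.logical_and(mask_true, mask_pred).sum() / np.logical_or(mask_true, mask_pred).sum()

print(f"RMSE = {rmse:.3f}, R2 = {r2:.3f}, IoU = {iou:.3f}")
```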
4.2. Responses to the Research Questions
- Application trends and country-level implications
5. Conclusions and Future Research
Supplementary Materials
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Afsar, M.M.; Iqbal, M.S.; Bakhshi, A.D.; Hussain, E.; Iqbal, J. MangiSpectra: A multivariate phenological analysis framework leveraging UAV imagery and LSTM for tree health and yield estimation in mango orchards. Remote Sens. 2025, 17, 703. [Google Scholar] [CrossRef]
- Ali, N.; Mohammed, A.; Bais, A.; Berraies, S.; Ruan, Y.; Cuthbert, R.D.; Sangha, J.S. Field-scale precision: Predicting grain yield of diverse wheat breeding lines using high-throughput UAV multispectral imaging. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2024, 17, 11419–11433. [Google Scholar] [CrossRef]
- Amarasingam, N.; Gonzalez, F.; Salgadoe, A.S.A.; Sandino, J.; Powell, K. Detection of white leaf disease in sugarcane crops using UAV-derived RGB imagery with existing deep learning models. Remote Sens. 2022, 14, 6137. [Google Scholar] [CrossRef]
- Bah, M.D.; Hafiane, A.; Canals, R. CRowNet: Deep network for crop row detection in UAV images. IEEE Access 2019, 8, 5189–5200. [Google Scholar] [CrossRef]
- Dorbu, F.; Hashemi-Beni, L. Detection of individual corn crop and canopy delineation from unmanned aerial vehicle imagery. Remote Sens. 2024, 16, 2679. [Google Scholar] [CrossRef]
- Guimarães, N.; Sousa, J.J.; Couto, P.; Bento, A.; Pádua, L. Combining UAV-based multispectral and thermal infrared data with regression modeling and SHAP analysis for predicting stomatal conductance in almond orchards. Remote Sens. 2024, 16, 2467. [Google Scholar] [CrossRef]
- Htun, N.-N.; Rojo, D.; Ooge, J.; De Croon, R.; Kasimati, A.; Verbert, K. Developing visual-assisted decision support systems across diverse agricultural use cases. Agriculture 2022, 12, 1027. [Google Scholar] [CrossRef]
- Gupta, R.; Bhatnagar, V.; Kumar, G.; Singh, G. Selection of suitable IoT-based end-devices, tools, and technologies for implementing smart farming: Issues and challenges. Int. J. Stud. Res. Technol. Manag. 2022, 10, 28–35. [Google Scholar] [CrossRef]
- Chiu, M.S.; Wang, J. Evaluation of machine learning regression techniques for estimating winter wheat biomass using biophysical, biochemical, and UAV multispectral data. Drones 2024, 8, 287. [Google Scholar] [CrossRef]
- Comba, L.; Biglia, A.; Ricauda Aimonino, D.; Tortia, C.; Mania, E.; Guidoni, S.; Gay, P. Leaf area index evaluation in vineyards using 3D point clouds from UAV imagery. Precis. Agric. 2020, 21, 881–896. [Google Scholar] [CrossRef]
- Di Gennaro, S.; Dainelli, R.; Palliotti, A.; Toscano, P.; Matese, A. Sentinel-2 validation for spatial variability assessment in overhead trellis system viticulture versus UAV and agronomic data. Remote Sens. 2019, 11, 2573. [Google Scholar] [CrossRef]
- Guo, Y.; Guo, J.; Liu, C.; Xiong, H.; Chai, L.; He, D. Precision landing test and simulation of the agricultural UAV on apron. Sensors 2020, 20, 3369. [Google Scholar] [CrossRef]
- Brewer, K.; Clulow, A.; Sibanda, M.; Gokool, S.; Odindi, J.; Mutanga, O.; Naiken, V.; Chimonyo, V.G.P.; Mabhaudhi, T. Estimation of maize foliar temperature and stomatal conductance as indicators of water stress based on optical and thermal imagery acquired using an unmanned aerial vehicle (UAV) platform. Drones 2022, 6, 169. [Google Scholar] [CrossRef]
- Ndlovu, H.S.; Odindi, J.; Sibanda, M.; Mutanga, O. A systematic review on the application of UAV-based thermal remote sensing for assessing and monitoring crop water status in crop farming systems. Int. J. Remote Sens. 2024, 45, 4923–4960. [Google Scholar] [CrossRef]
- Bukowiecki, J.; Rose, T.; Holzhauser, K.; Rothardt, S.; Rose, M.; Komainda, M.; Herrmann, A.; Kage, H. UAV-based canopy monitoring: Calibration of a multispectral sensor for green area index and nitrogen uptake across several crops. Precis. Agric. 2024, 25, 1556–1580. [Google Scholar] [CrossRef]
- Flores Peña, P.; Ale Isaac, M.S.; Gîfu, D.; Pechlivani, E.M.; Ragab, A.R. Unmanned aerial vehicle-based hyperspectral imaging and soil texture mapping with robust AI algorithms. Drones 2025, 9, 129. [Google Scholar] [CrossRef]
- Gao, J.; Bambrah, C.K.; Parihar, N.; Kshirsagar, S.; Mallarapu, S.; Yu, H.; Wu, J.; Yang, Y. Analysis of various machine learning algorithms for using drone images in livestock farms. Agriculture 2024, 14, 522. [Google Scholar] [CrossRef]
- Chin, R.; Catal, C.; Kassahun, A. Plant disease detection using drones in precision agriculture. Precis. Agric. 2023, 24, 1663–1682. [Google Scholar] [CrossRef]
- Gao, M.; Yang, F.; Wei, H.; Liu, X. Automatic monitoring of maize seedling growth using unmanned aerial vehicle-based RGB imagery. Remote Sens. 2023, 15, 3671. [Google Scholar] [CrossRef]
- Silva, J.A.O.S.; de Siqueira, V.S.; Mesquita, M.; Vale, L.S.R.; da Silva, J.L.B.; da Silva, M.V.; Lemos, J.P.B.; Lacerda, L.N.; Ferrarezi, R.S.; de Oliveira, H.F.E. Artificial intelligence applied to support agronomic decisions for the automatic aerial analysis images captured by UAV: A systematic review. Agronomy 2024, 14, 2697. [Google Scholar] [CrossRef]
- Andrade-Mogollon, T.; Gamboa-Cruzado, J.; Amayo-Gamboa, F. Systematic literature review of generative AI and IoT as key technologies for precision agriculture. Comput. Sist. 2025, 29, 857–882. [Google Scholar] [CrossRef]
- Vasileiou, M.; Kyrgiakos, L.S.; Kleisiari, C.; Kleftodimos, G.; Vlontzos, G.; Belhouchette, H.; Pardalos, P.M. Transforming weed management in sustainable agriculture with artificial intelligence: A systematic literature review towards weed identification and deep learning. Crop Prot. 2024, 176, 106522. [Google Scholar] [CrossRef]
- Balyan, S.; Jangir, H.; Tripathi, S.N.; Tripathi, A.; Jhang, T.; Pandey, P. Seeding a sustainable future: Navigating the digital horizon of smart agriculture. Sustainability 2024, 16, 475. [Google Scholar] [CrossRef]
- Huang, L.; Tan, J.; Chen, Z. Mamba-UAV-SegNet: A multi-scale adaptive feature fusion network for real-time semantic segmentation of UAV aerial imagery. Drones 2024, 8, 671. [Google Scholar] [CrossRef]
- Sott, M.K.; Nascimento, L.d.S.; Foguesatto, C.R.; Furstenau, L.B.; Faccin, K.; Zawislak, P.A.; Mellado, B.; Kong, J.D.; Bragazzi, N.L. A bibliometric network analysis of recent publications on digital agriculture to depict strategic themes and evolution structure. Sensors 2021, 21, 7889. [Google Scholar] [CrossRef]
- Rejeb, A.; Abdollahi, A.; Rejeb, K.; Treiblmaier, H. Drones in agriculture: A review and bibliometric analysis. Comput. Electron. Agric. 2022, 198, 107017. [Google Scholar] [CrossRef]
- Kabir, M.S.; Pervez, A.K.M.K.; Jahan, M.N.; Khan, M.T.A.; Kabiraj, U.K. Unmanned aerial vehicles in agricultural sciences: A bibliometric analysis study based on Scopus database. Afr. J. Biol. Sci. 2024, 6, 14335–14357. [Google Scholar]
- Moraes, H.M.F.E.; Júnior, M.R.F.; Da Vitória, E.L.; Martins, R.N. A bibliometric and scientometric analysis on the use of UAVs in agriculture, livestock and forestry. Cienc. Rural 2023, 53, e20220130. [Google Scholar] [CrossRef]
- Mühl, D.D.; Oliveira, L. A bibliometric and thematic approach to agriculture 4.0. Heliyon 2022, 8, e09369. [Google Scholar] [CrossRef]
- Bertoglio, R.; Corbo, C.; Renga, F.M.; Matteucci, M. The digital agricultural revolution: A bibliometric analysis literature review. IEEE Access 2021, 9, 134762–134782. [Google Scholar] [CrossRef]
- Abrahams, M.; Sibanda, M.; Dube, T.; Chimonyo, V.G.P.; Mabhaudhi, T. A systematic review of UAV applications for mapping neglected and underutilised crop species’ spatial distribution and health. Remote Sens. 2023, 15, 4672. [Google Scholar] [CrossRef]
- Zambrano, P.; Calderon, F.; Villegas, H.; Paillacho, J.; Pazmiño, D.; Realpe, M. UAV remote sensing applications and current trends in crop monitoring and diagnostics: A systematic literature review. In Proceedings of the 2023 IEEE 13th International Conference on Pattern Recognition Systems (ICPRS), Guayaquil, Ecuador, 4–7 July 2023. [Google Scholar] [CrossRef]
- Bento, N.L.; Ferraz, G.A.E.S.; Santana, L.S.; Silva, M.D.L.O.E. Coffee growing with remotely piloted aircraft system: Bibliometric review. AgriEngineering 2023, 5, 2458–2477. [Google Scholar] [CrossRef]
- Marques, P.; Pádua, L.; Sousa, J.J.; Fernandes-Silva, A. Advancements in remote sensing imagery applications for precision management in olive growing: A systematic review. Remote Sens. 2024, 16, 1324. [Google Scholar] [CrossRef]
- de Jesus Diaz Lara, M.; Bernabe, J.G.; Benitez, R.A.G.; Toxqui, J.M.; Huerta, M.K. Bibliometric analysis of the use of the internet of things in precision agriculture. In Proceedings of the 2021 IEEE International Conference on Engineering Veracruz (ICEV), Boca del Río, Veracruz, Mexico, 25–28 October 2021. [Google Scholar] [CrossRef]
- Darra, N.; Anastasiou, E.; Kriezi, O.; Lazarou, E.; Kalivas, D.; Fountas, S. Can yield prediction be fully digitilized? A systematic review. Agronomy 2023, 13, 2441. [Google Scholar] [CrossRef]
- Mustafa, G.; Liu, Y.; Khan, I.H.; Hussain, S.; Jiang, Y.; Liu, J.; Arshad, S.; Osman, R. Establishing a knowledge structure for yield prediction in cereal crops using unmanned aerial vehicles. Front. Plant Sci. 2024, 15, 1401246. [Google Scholar] [CrossRef]
- Guimarães, N.; Sousa, J.J.; Pádua, L.; Bento, A.; Couto, P. Remote sensing applications in almond orchards: A comprehensive systematic review of current insights, research gaps, and future prospects. Appl. Sci. 2024, 14, 1749. [Google Scholar] [CrossRef]
- Escandón-Panchana, P.; Herrera-Franco, G.; Jaya-Montalvo, M.; Martínez-Cuevas, S. Geomatic tools used in the management of agricultural activities: A systematic review. Environ. Dev. Sustain. 2024, 27, 15275–15309. [Google Scholar] [CrossRef]
- Joice, A.; Tufaique, T.; Tazeen, H.; Igathinathane, C.; Zhang, Z.; Whippo, C.; Hendrickson, J.; Archer, D. Applications of Raspberry Pi for precision agriculture: A systematic review. Agriculture 2025, 15, 227. [Google Scholar] [CrossRef]
- Abdollahi, A.; Rejeb, K.; Rejeb, A.; Mostafa, M.M.; Zailani, S. Wireless sensor networks in agriculture: Insights from bibliometric analysis. Sustainability 2021, 13, 12011. [Google Scholar] [CrossRef]
- Singh, A.P.; Yerudkar, A.; Mariani, V.; Iannelli, L.; Glielmo, L. A bibliometric review of the use of unmanned aerial vehicles in precision agriculture and precision viticulture for sensing applications. Remote Sens. 2022, 14, 1604. [Google Scholar] [CrossRef]
- Wang, J.; Wang, S.; Zou, D.; Chen, H.; Zhong, R.; Li, H.; Zhou, W.; Yan, K. Social network and bibliometric analysis of unmanned aerial vehicle remote sensing applications from 2010 to 2021. Remote Sens. 2021, 13, 2912. [Google Scholar] [CrossRef]
- Brenya, R.; Zhu, J.; Sampene, A.K. Can agriculture technology improve food security in low- and middle-income nations? A systematic review. Sustain. Food Technol. 2023, 1, 484–499. [Google Scholar] [CrossRef]
- Wang, G.; Li, S.; Yi, Y.; Wang, Y.; Shin, C. Digital technology increases the sustainability of cross-border agro-food supply chains: A review. Agriculture 2024, 14, 900. [Google Scholar] [CrossRef]
- Yacoob, A.; Gokool, S.; Clulow, A.; Mahomed, M.; Mabhaudhi, T. Leveraging unmanned aerial vehicle technologies to facilitate precision water management in smallholder farms: A scoping review and bibliometric analysis. Drones 2024, 8, 476. [Google Scholar] [CrossRef]
- Kitchenham, B. Guidelines for Performing Systematic Literature Reviews in Software Engineering; Software Engineering Group, School of Computer Science and Mathematics, Keele University: Keele, UK; Department of Computer Science, University of Durham: Durham, UK, 2007. [Google Scholar]
- Petersen, K.; Vakkalanka, S.; Kuzniarz, L. Guidelines for conducting systematic mapping studies in software engineering: An update. Inf. Softw. Technol. 2015, 64, 1–18. [Google Scholar] [CrossRef]
- PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses. Available online: https://www.prismastatement.org (accessed on 28 July 2025).
- Ivezić, A.; Trudić, B.; Stamenković, Z.; Kuzmanović, B.; Perić, S.; Ivošević, B.; Buđen, M.; Petrović, K. Drone-related agrotechnologies for precise plant protection in Western Balkans: Applications, possibilities, and legal framework limitations. Agronomy 2023, 13, 2615. [Google Scholar] [CrossRef]
- Jasim, A.N.; Fourati, L.C.; Albahri, O.S. Evaluation of unmanned aerial vehicles for precision agriculture based on integrated fuzzy decision-making approach. IEEE Access 2023, 11, 75037–75062. [Google Scholar] [CrossRef]
- Jindo, K.; Teklu, M.G.; van Boheeman, K.; Njehia, N.S.; Narabu, T.; Kempenaar, C.; Molendijk, L.P.G.; Schepel, E.; Been, T.H. Unmanned aerial vehicle (UAV) for detection and prediction of damage caused by potato cyst nematode G. pallida on selected potato cultivars. Remote Sens. 2023, 15, 1429. [Google Scholar] [CrossRef]
- Jorge, J.; Vallbé, M.; Soler, J.A. Detection of irrigation inhomogeneities in an olive grove using the NDRE vegetation index obtained from UAV images. Eur. J. Remote Sens. 2019, 52, 169–177. [Google Scholar] [CrossRef]
- Jurado, J.M.; Ortega, L.; Cubillas, J.J.; Feito, F.R. Multispectral mapping on 3D models and multi-temporal monitoring for individual characterization of olive trees. Remote Sens. 2020, 12, 1106. [Google Scholar] [CrossRef]
- Kakamoukas, G.A.; Lagkas, T.D.; Argyriou, V.; Goudos, S.K.; Grammatikis, P.R.; Bibi, S.; Sarigiannidis, P.G. A novel air-to-ground communication scheme for advanced big data collection in smart farming using UAVs. IEEE Access 2025, 13, 16564–16583. [Google Scholar] [CrossRef]
- Kapari, M.; Sibanda, M.; Magidi, J.; Mabhaudhi, T.; Nhamo, L.; Mpandeli, S. Comparing machine learning algorithms for estimating the maize crop water stress index (CWSI) using UAV-acquired remotely sensed data in smallholder croplands. Drones 2024, 8, 61. [Google Scholar] [CrossRef]
- Khaliq, A.; Comba, L.; Biglia, A.; Ricauda Aimonino, D.; Chiaberge, M.; Gay, P. Comparison of satellite and UAV-based multispectral imagery for vineyard variability assessment. Remote Sens. 2019, 11, 436. [Google Scholar] [CrossRef]
- Khun, K.; Tremblay, N.; Panneton, B.; Vigneault, P.; Lord, E.; Cavayas, F.; Codjia, C. Use of oblique RGB imagery and apparent surface area of plants for early estimation of above-ground corn biomass. Remote Sens. 2021, 13, 4032. [Google Scholar] [CrossRef]
- Killeen, P.; Kiringa, I.; Yeap, T.; Branco, P. Corn grain yield prediction using UAV-based high spatiotemporal resolution imagery, machine learning, and spatial cross-validation. Remote Sens. 2024, 16, 683. [Google Scholar] [CrossRef]
- Koubaa, A.; Ammar, A.; Abdelkader, M.; Alhabashi, Y.; Ghouti, L. AERO: AI-enabled remote sensing observation with onboard edge computing in UAVs. Remote Sens. 2023, 15, 1873. [Google Scholar] [CrossRef]
- Kovalev, I.V.; Kovalev, D.I.; Voroshilova, A.A.; Podoplelova, V.A.; Borovinsky, D.A. GERT analysis of UAV transport technological cycles when used in precision agriculture. IOP Conf. Ser. Earth Environ. Sci. 2022, 1076, 012055. [Google Scholar] [CrossRef]
- Li, M.; Shamshiri, R.R.; Schirrmann, M.; Weltzien, C.; Shafian, S.; Laursen, M.S. UAV oblique imagery with an adaptive micro-terrain model for estimation of leaf area index and height of maize canopy from 3D point clouds. Remote Sens. 2022, 14, 585. [Google Scholar] [CrossRef]
- Li, Z.; Zhou, X.; Cheng, Q.; Fei, S.; Chen, Z. A machine-learning model based on the fusion of spectral and textural features from UAV multi-sensors to analyse the total nitrogen content in winter wheat. Remote Sens. 2023, 15, 2152. [Google Scholar] [CrossRef]
- Liu, Y.; Feng, H.; Yue, J.; Fan, Y.; Jin, X.; Song, X.; Yang, H.; Yang, G. Estimation of potato above-ground biomass based on vegetation indices and green-edge parameters obtained from UAVs. Remote Sens. 2022, 14, 5323. [Google Scholar] [CrossRef]
- Liu, Z.; Li, H.; Ding, X.; Cao, X.; Chen, H.; Zhang, S. Estimating maize maturity by using UAV multi-spectral images combined with a CCC-based model. Drones 2023, 7, 586. [Google Scholar] [CrossRef]
- Dybå, T.; Dingsøyr, T. Empirical studies of agile software development: A systematic review. Inf. Softw. Technol. 2008, 50, 833–859. [Google Scholar] [CrossRef]
- Duarte, A.; Borralho, N.; Cabral, P.; Caetano, M. Recent advances in forest insect pests and diseases monitoring using UAV-based data: A systematic review. Forests 2022, 13, 911. [Google Scholar] [CrossRef]
- Alok, K.; Varshney, M. A study of the smart drone with an artificial neural network for precision farming. NeuroQuantology 2022, 20, 4454–4458. [Google Scholar]
- Vijayakumar, S.; Shanmugapriya, P.; Saravanane, P.; Ramesh, T.; Murugaiyan, V.; Ilakkiya, S. Precision weed control using unmanned aerial vehicles and robots: Assessing feasibility, bottlenecks, and recommendations for scaling. New Dev. Technol. 2025, 3, 10. [Google Scholar] [CrossRef]
- Farhad, M.M.; Kurum, M.; Gurbuz, A.C. A ubiquitous GNSS-R methodology to estimate surface reflectivity using spinning smartphone onboard a small UAS. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2023, 16, 6568–6578. [Google Scholar] [CrossRef]
- Hosseiny, B.; Rastiveis, H.; Homayouni, S. An automated framework for plant detection based on deep simulated learning from drone imagery. Remote Sens. 2020, 12, 3521. [Google Scholar] [CrossRef]
- Mia, M.S.; Tanabe, R.; Habibi, L.N.; Hashimoto, N.; Homma, K.; Maki, M.; Matsui, T.; Tanaka, T.S.T. Multimodal deep learning for rice yield prediction using UAV-based multispectral imagery and weather data. Remote Sens. 2023, 15, 2511. [Google Scholar] [CrossRef]
- Moresi, F.V.; Cirigliano, P.; Rengo, A.; Brunori, E.; Biasi, R.; Mugnozza, G.S.; Maesano, M. Monitoring abiotic stressors in rainfed vineyards involves combining UAV and field monitoring techniques to enhance precision management. Remote Sens. 2025, 17, 803. [Google Scholar] [CrossRef]
- Moysiadis, V.; Siniosoglou, I.; Kokkonis, G.; Argyriou, V.; Lagkas, T.; Goudos, S.K.; Sarigiannidis, P. Cherry tree crown extraction using machine learning based on images from UAVs. Agriculture 2024, 14, 322. [Google Scholar] [CrossRef]
- Nahrstedt, K.; Reuter, T.; Trautz, D.; Waske, B.; Jarmer, T. Classifying stand compositions in clover grass based on high-resolution multispectral UAV images. Remote Sens. 2024, 16, 2684. [Google Scholar] [CrossRef]
- Nanavati, R.V.; Meng, Y.; Coombes, M.; Liu, C. Generalized data-driven optimal path planning framework for uniform coverage missions using crop spraying UAVs. Precis. Agric. 2023, 24, 1497–1525. [Google Scholar] [CrossRef]
- Ndlovu, H.S.; Odindi, J.; Sibanda, M.; Mutanga, O.; Clulow, A.; Chimonyo, V.G.P.; Mabhaudhi, T. A comparative estimation of maize leaf water content using machine learning techniques and unmanned aerial vehicle (UAV)-based proximal and remotely sensed data. Remote Sens. 2021, 13, 4091. [Google Scholar] [CrossRef]
- Nnadozie, E.C.; Iloanusi, O.N.; Ani, O.A.; Yu, K. Detecting cassava plants under different field conditions using UAV-based RGB images and deep learning models. Remote Sens. 2023, 15, 2322. [Google Scholar] [CrossRef]
- Nuijten, R.J.G.; Kooistra, L.; De Deyn, G.B. Using unmanned aerial systems (UAS) and object-based image analysis (OBIA) for measuring plant-soil feedback effects on crop productivity. Drones 2019, 3, 54. [Google Scholar] [CrossRef]
- Ortenzi, L.; Violino, S.; Pallottino, F.; Figorilli, S.; Vasta, S.; Tocci, F.; Antonucci, F.; Imperi, G.; Costa, C. Early estimation of olive production from light drone orthophoto through canopy radius. Drones 2021, 5, 118. [Google Scholar] [CrossRef]
- De Padua, E.P.; Amongo, R.C.; Quilloy, E.P.; Suministrado, D.C.; Elauria, J.C. Development of a local unmanned aerial vehicle (UAV) pesticide sprayer for rice production system in the Philippines. IOP Conf. Ser. Mater. Sci. Eng. 2021, 1109, 012022. [Google Scholar] [CrossRef]
- Panjaitan, S.D.; Dewi, Y.S.K.; Hendri, M.I.; Wicaksono, R.A.; Priyatman, H. A drone technology implementation approach to conventional paddy fields application. IEEE Access 2022, 10, 120650–120658. [Google Scholar] [CrossRef]
- Pantos, C.; Hildmann, H.; Valente, J. Experimental connectivity analysis for drones in greenhouses. Drones 2022, 7, 24. [Google Scholar] [CrossRef]
- Quille-Mamani, J.; Ramos-Fernández, L.; Huanuqueño-Murillo, J.; Quispe-Tito, D.; Cruz-Villacorta, L.; Pino-Vargas, E.; del Pino, L.F.; Heros-Aguilar, E.; Ángel Ruiz, L. Rice yield prediction using spectral and textural indices derived from UAV imagery and machine learning models in Lambayeque, Peru. Remote Sens. 2025, 17, 632. [Google Scholar] [CrossRef]
- Rahman, M.F.F.; Fan, S.; Zhang, Y.; Chen, L. A comparative study on application of unmanned aerial vehicle systems in agriculture. Agriculture 2021, 11, 22. [Google Scholar] [CrossRef]
- Ren, J.; Zhang, N.; Liu, X.; Wu, S.; Li, D. Dynamic harvest index estimation of winter wheat based on UAV hyperspectral remote sensing considering crop aboveground biomass change and the grain filling process. Remote Sens. 2022, 14, 1955. [Google Scholar] [CrossRef]
- Ren, P.; Li, H.; Han, S.; Chen, R.; Yang, G.; Yang, H.; Feng, H.; Zhao, C. Estimation of soybean yield by combining maturity group information and unmanned aerial vehicle multi-sensor data using machine learning. Remote Sens. 2023, 15, 4286. [Google Scholar] [CrossRef]
- Ronchetti, G.; Mayer, A.; Facchi, A.; Ortuani, B.; Sona, G. Crop row detection through UAV surveys to optimize on-farm irrigation management. Remote Sens. 2020, 12, 1967. [Google Scholar] [CrossRef]
- Senyurek, V.; Farhad, M.M.; Gurbuz, A.C.; Kurum, M.; Adeli, A. Fusion of reflected GPS signals with multispectral imagery to estimate soil moisture at subfield scale from small UAS platforms. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2022, 15, 6843–6855. [Google Scholar] [CrossRef]
- Shahi, T.B.; Dahal, S.; Sitaula, C.; Neupane, A.; Guo, W. Deep learning-based weed detection using UAV images: A comparative study. Drones 2023, 7, 624. [Google Scholar] [CrossRef]
- Singh, K.; Huang, Y.; Young, W.; Harvey, L.; Hall, M.; Zhang, X.; Lobaton, E.; Jenkins, J.; Shankle, M. Sweet potato yield prediction using machine learning based on multispectral images acquired from a small unmanned aerial vehicle. Agriculture 2025, 15, 420. [Google Scholar] [CrossRef]
- Sofia, S.; Agosta, M.; Asciuto, A.; Crescimanno, M.; Galati, A. Unleashing profitability of vineyards through the adoption of unmanned aerial vehicles technology systems: The case of two Italian wineries. Precis. Agric. 2025, 26, 41. [Google Scholar] [CrossRef]
- Sunoj, S.; Cho, J.; Guinness, J.; van Aardt, J.; Czymmek, K.J.; Ketterings, Q.M. Corn grain yield prediction and mapping from unmanned aerial system (UAS) multispectral imagery. Remote Sens. 2021, 13, 3948. [Google Scholar] [CrossRef]
- Lara-Molina, F.A. Optimization of coverage path planning for agricultural drones in weed-infested fields using semantic segmentation. Agriculture 2025, 15, 1262. [Google Scholar] [CrossRef]
- Vélez, S.; Vacas, R.; Martín, H.; Ruano-Rosa, D.; Álvarez, S. A novel technique using planar area and ground shadows calculated from UAV RGB imagery to estimate pistachio tree (Pistacia vera L.) canopy volume. Remote Sens. 2022, 14, 6006. [Google Scholar] [CrossRef]
- Wang, L.; Lan, Y.; Zhang, Y.; Zhang, H.; Tahir, M.N.; Ou, S.; Liu, X.; Chen, P. Applications and prospects of agricultural unmanned aerial vehicle obstacle avoidance technology in China. Sensors 2019, 19, 642. [Google Scholar] [CrossRef] [PubMed]
- Wang, Y.; Xiao, C.; Wang, Y.; Li, K.; Yu, K.; Geng, J.; Li, Q.; Yang, J.; Zhang, J.; Zhang, M.; et al. Monitoring of cotton boll opening rate based on UAV multispectral data. Remote Sens. 2023, 16, 132. [Google Scholar] [CrossRef]
- Wei, L.; Yu, M.; Zhong, Y.; Zhao, J.; Liang, Y.; Hu, X. Spatial–spectral fusion based on conditional random fields for the fine classification of crops in UAV-borne hyperspectral remote sensing imagery. Remote Sens. 2019, 11, 780. [Google Scholar] [CrossRef]
- Wu, P.; Lei, X.; Zeng, J.; Qi, Y.; Yuan, Q.; Huang, W.; Ma, Z.; Shen, Q.; Lyu, X. Research progress in mechanized and intelligentized pollination technologies for fruit and vegetable crops. Int. J. Agric. Biol. Eng. 2024, 17, 11–21. [Google Scholar] [CrossRef]
- Xu, R.; Li, C.; Bernardes, S. Development and testing of a UAV-based multi-sensor system for plant phenotyping and precision agriculture. Remote Sens. 2021, 13, 3517. [Google Scholar] [CrossRef]
- Yan, P.; Han, Q.; Feng, Y.; Kang, S. Estimating LAI for cotton using multisource UAV data and a modified universal model. Remote Sens. 2022, 14, 4272. [Google Scholar] [CrossRef]
- Ye, N.; Walker, P.; Gao, Y.; PopStefanija, I.; Hills, J. Comparison between thermal-optical and L-band passive microwave soil moisture remote sensing at farm scales: Towards UAV-based near-surface soil moisture mapping. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2024, 17, 633–642. [Google Scholar] [CrossRef]
- Zhang, F.; Hassanzadeh, A.; Kikkert, J.; Pethybridge, S.J.; Van Aardt, J. Evaluation of leaf area index (LAI) of broadacre crops using UAS-based LiDAR point clouds and multispectral imagery. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2022, 15, 4027–4044. [Google Scholar] [CrossRef]
- Zheng, Z.; Yuan, J.; Yao, W.; Yao, H.; Liu, Q.; Guo, L. Crop classification from drone imagery based on lightweight semantic segmentation methods. Remote Sens. 2024, 16, 4099. [Google Scholar] [CrossRef]
- Slimani, H.; El Mhamdi, J.; Jilbab, A. Assessing the advancement of artificial intelligence and drones’ integration in agriculture through a bibliometric study. Int. J. Electr. Comput. Eng. 2024, 14, 878–890. [Google Scholar] [CrossRef]
- Gokool, S.; Mahomed, M.; Kunz, R.; Clulow, A.; Sibanda, M.; Naiken, V.; Chetty, K.; Mabhaudhi, T. Crop monitoring in smallholder farms using unmanned aerial vehicles to facilitate precision agriculture practices: A scoping review and bibliometric analysis. Sustainability 2023, 15, 3557. [Google Scholar] [CrossRef]
- Nduku, L.; Munghemezulu, C.; Mashaba-Munghemezulu, Z.; Kalumba, A.M.; Chirima, G.J.; Masiza, W.; De Villiers, C. Global research trends for unmanned aerial vehicle remote sensing application in wheat crop monitoring. Geomatics 2023, 3, 115–136. [Google Scholar] [CrossRef]
- Lalrochunga, D.; Parida, A.; Choudhury, S. Systematic review on capacity building through renewable energy enabled IoT–unmanned aerial vehicle for smart agroforestry. Clean. Circ. Bioeconomy 2024, 8, 100094. [Google Scholar] [CrossRef]
- Ndour, A.; Blasch, G.; Valente, J.; Gebrekidan, B.H.; Sida, T.S. Optimal machine learning algorithms and UAV multispectral imagery for crop phenotypic trait estimation: A comprehensive review and meta-analysis. Environ. Res. Commun. 2025, 7, 072002. [Google Scholar] [CrossRef]
- Armenta-Medina, D.; Ramirez-del Real, T.A.; Villanueva-Vásquez, D.; Mejia-Aguirre, C. Trends on advanced information and communication technologies for improving agricultural productivities: A bibliometric analysis. Agronomy 2020, 10, 1989. [Google Scholar] [CrossRef]
- Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E.; et al. The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. BMJ 2021, 372, n71. [Google Scholar] [CrossRef] [PubMed]
| Conceptual Group | Descriptor |
|---|---|
| Drone Technologies | drone/unmanned aerial vehicle/uav/unmanned aircraft system/uas/remotely piloted aircraft system/rpas/aerial vehicle/unpiloted aircraft/agricultural drone/spraying drone/monitoring drone/surveillance drone/mapping drone |
| Smart Agriculture | agriculture 4.0/digital agriculture/smart agriculture/precision agriculture/agroindustry 4.0/agrotechnology/connected agriculture/data-driven agriculture/automated agriculture/iot agriculture/artificial intelligence agriculture/machine learning agriculture |
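To illustrate how the descriptors above translate into a search strategy, the sketch below joins each conceptual group with OR and combines the two groups with AND. The descriptor lists mirror the table; the Scopus-style TITLE-ABS-KEY wrapper is an assumption, since each database applies its own field syntax.

```python
# Minimal sketch: assemble a boolean search string from the two conceptual groups.
# The TITLE-ABS-KEY wrapper is an assumption; each database uses its own syntax.

drone_terms = [
    "drone", "unmanned aerial vehicle", "uav", "unmanned aircraft system", "uas",
    "remotely piloted aircraft system", "rpas", "aerial vehicle", "unpiloted aircraft",
    "agricultural drone", "spraying drone", "monitoring drone",
    "surveillance drone", "mapping drone",
]
smart_agri_terms = [
    "agriculture 4.0", "digital agriculture", "smart agriculture", "precision agriculture",
    "agroindustry 4.0", "agrotechnology", "connected agriculture", "data-driven agriculture",
    "automated agriculture", "iot agriculture", "artificial intelligence agriculture",
    "machine learning agriculture",
]

def or_block(terms: list[str]) -> str:
    """Quote each descriptor and join the group with OR."""
    return "(" + " OR ".join(f'"{t}"' for t in terms) + ")"

query = f"TITLE-ABS-KEY({or_block(drone_terms)} AND {or_block(smart_agri_terms)})"
print(query)
```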
| Reference | Type | QA1 | QA2 | QA3 | QA4 | QA5 | QA6 | QA7 | Score |
|---|---|---|---|---|---|---|---|---|---|
| [1] | Journal | 2 | 1 | 1 | 1 | 3 | 2 | 3 | 13 |
| [2] | Journal | 2 | 3 | 1 | 2 | 1 | 2 | 1 | 12 |
| [68] | Journal | 2 | 2 | 1 | 2 | 3 | 1 | 1 | 12 |
| [3] | Journal | 3 | 1 | 1 | 2 | 1 | 3 | 2 | 13 |
| [69] | Journal | 2 | 3 | 1 | 2 | 2 | 2 | 1 | 13 |
| [4] | Journal | 1 | 1 | 1 | 3 | 3 | 3 | 1 | 13 |
| [23] | Journal | 3 | 1 | 3 | 1 | 1 | 3 | 1 | 13 |
| [13] | Journal | 2 | 1 | 2 | 2 | 1 | 2 | 2 | 12 |
| [15] | Journal | 2 | 2 | 2 | 2 | 1 | 1 | 2 | 12 |
| [18] | Journal | 1 | 1 | 3 | 2 | 1 | 2 | 2 | 12 |
| [9] | Journal | 2 | 2 | 2 | 2 | 1 | 2 | 2 | 13 |
| [10] | Journal | 1 | 1 | 2 | 2 | 3 | 2 | 1 | 12 |
| [11] | Journal | 3 | 2 | 2 | 3 | 1 | 2 | 1 | 14 |
| [5] | Journal | 1 | 2 | 2 | 2 | 2 | 2 | 1 | 12 |
| [70] | Journal | 3 | 3 | 1 | 1 | 3 | 1 | 1 | 13 |
| [16] | Journal | 3 | 2 | 2 | 1 | 2 | 3 | 2 | 15 |
| [17] | Journal | 1 | 3 | 1 | 3 | 3 | 1 | 1 | 13 |
| [19] | Journal | 2 | 2 | 2 | 3 | 1 | 1 | 2 | 13 |
| [6] | Journal | 2 | 1 | 3 | 3 | 2 | 1 | 1 | 13 |
| [12] | Journal | 1 | 1 | 2 | 3 | 3 | 1 | 3 | 14 |
| [8] | Journal | 2 | 2 | 1 | 2 | 2 | 2 | 2 | 13 |
| [71] | Journal | 2 | 2 | 3 | 1 | 2 | 2 | 1 | 13 |
| [7] | Journal | 1 | 2 | 2 | 2 | 1 | 2 | 2 | 12 |
| [24] | Journal | 3 | 1 | 1 | 2 | 2 | 2 | 1 | 12 |
| [50] | Journal | 2 | 2 | 1 | 3 | 1 | 2 | 2 | 13 |
| [51] | Journal | 1 | 3 | 2 | 2 | 2 | 1 | 2 | 13 |
| [52] | Journal | 2 | 2 | 2 | 2 | 2 | 1 | 2 | 13 |
| [53] | Journal | 3 | 2 | 1 | 2 | 2 | 2 | 1 | 13 |
| [54] | Journal | 1 | 1 | 3 | 2 | 2 | 2 | 2 | 13 |
| [55] | Journal | 2 | 1 | 2 | 2 | 2 | 2 | 1 | 12 |
| [56] | Journal | 3 | 3 | 1 | 1 | 1 | 1 | 2 | 12 |
| [57] | Journal | 2 | 2 | 3 | 1 | 2 | 2 | 3 | 15 |
| [58] | Journal | 1 | 3 | 2 | 2 | 3 | 2 | 2 | 15 |
| [59] | Journal | 3 | 2 | 2 | 3 | 1 | 2 | 2 | 15 |
| [60] | Journal | 2 | 1 | 2 | 2 | 2 | 1 | 2 | 12 |
| [61] | Journal | 3 | 1 | 2 | 1 | 1 | 2 | 2 | 12 |
| [62] | Journal | 2 | 1 | 2 | 2 | 1 | 2 | 2 | 12 |
| [63] | Journal | 3 | 2 | 1 | 2 | 2 | 1 | 2 | 13 |
| [64] | Journal | 1 | 3 | 2 | 2 | 3 | 1 | 2 | 14 |
| [65] | Journal | 2 | 2 | 3 | 1 | 1 | 1 | 3 | 13 |
| [72] | Journal | 3 | 2 | 2 | 2 | 1 | 1 | 3 | 14 |
| [73] | Journal | 2 | 3 | 2 | 1 | 1 | 2 | 2 | 13 |
| [74] | Journal | 1 | 2 | 3 | 2 | 2 | 3 | 2 | 15 |
| [75] | Journal | 2 | 3 | 1 | 2 | 2 | 3 | 2 | 15 |
| [76] | Journal | 2 | 2 | 2 | 1 | 2 | 1 | 2 | 12 |
| [77] | Journal | 2 | 2 | 1 | 2 | 1 | 3 | 3 | 14 |
| [78] | Journal | 3 | 3 | 2 | 1 | 2 | 1 | 2 | 14 |
| [79] | Journal | 1 | 2 | 3 | 3 | 2 | 2 | 2 | 15 |
| [80] | Journal | 1 | 1 | 2 | 2 | 2 | 2 | 2 | 12 |
| [81] | Journal | 2 | 2 | 2 | 2 | 1 | 2 | 2 | 13 |
| [82] | Journal | 1 | 1 | 2 | 2 | 2 | 1 | 3 | 12 |
| [83] | Journal | 3 | 3 | 2 | 2 | 1 | 2 | 2 | 15 |
| [84] | Journal | 1 | 3 | 3 | 1 | 2 | 2 | 2 | 14 |
| [85] | Journal | 2 | 2 | 2 | 2 | 1 | 1 | 2 | 12 |
| [86] | Journal | 3 | 1 | 3 | 2 | 2 | 2 | 2 | 15 |
| [87] | Journal | 1 | 1 | 2 | 2 | 2 | 2 | 2 | 12 |
| [88] | Journal | 2 | 2 | 2 | 2 | 2 | 1 | 2 | 13 |
| [89] | Journal | 3 | 2 | 2 | 1 | 2 | 1 | 2 | 13 |
| [90] | Journal | 2 | 3 | 2 | 2 | 2 | 1 | 1 | 13 |
| [91] | Journal | 1 | 2 | 1 | 2 | 3 | 2 | 2 | 13 |
| [92] | Journal | 3 | 2 | 1 | 2 | 1 | 3 | 1 | 13 |
| [93] | Journal | 2 | 2 | 3 | 1 | 1 | 1 | 2 | 12 |
| [94] | Journal | 1 | 3 | 2 | 1 | 2 | 2 | 2 | 13 |
| [95] | Journal | 3 | 2 | 1 | 2 | 1 | 2 | 3 | 14 |
| [96] | Journal | 2 | 2 | 2 | 2 | 2 | 2 | 1 | 13 |
| [97] | Journal | 1 | 3 | 1 | 2 | 2 | 1 | 2 | 12 |
| [98] | Journal | 2 | 1 | 2 | 1 | 2 | 2 | 2 | 12 |
| [99] | Journal | 3 | 2 | 1 | 1 | 3 | 2 | 1 | 13 |
| [100] | Journal | 2 | 2 | 2 | 2 | 1 | 3 | 1 | 13 |
| [101] | Journal | 2 | 2 | 2 | 1 | 2 | 1 | 2 | 12 |
| [102] | Journal | 1 | 3 | 1 | 2 | 2 | 1 | 2 | 12 |
| [103] | Journal | 3 | 1 | 2 | 1 | 2 | 2 | 2 | 13 |
| [104] | Journal | 2 | 3 | 2 | 1 | 1 | 2 | 3 | 14 |
| Publication Name | 2019 | 2020 | 2021 | 2022 | 2023 | 2024 | 2025 | Total |
|---|---|---|---|---|---|---|---|---|
| Remote Sensing | 3 | 3 | 4 | 6 | 7 | 5 | 3 | 31 |
| Drones | 1 | 0 | 2 | 2 | 2 | 3 | 0 | 10 |
| IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing | 0 | 0 | 0 | 2 | 1 | 2 | 0 | 5 |
| Precision Agriculture | 0 | 1 | 0 | 0 | 2 | 1 | 1 | 5 |
| Agriculture | 0 | 0 | 0 | 1 | 0 | 2 | 1 | 4 |
| IEEE Access | 0 | 1 | 0 | 1 | 1 | 0 | 1 | 4 |
| Sensors | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
| Acta Agriculturae Scandinavica, Section B-Soil and Plant Science | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
| Agronomy | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
| European Journal of Agronomy | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
| European Journal of Remote Sensing | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
| Indian Journal of Computer Science and Engineering | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 |
| International Journal of Agricultural and Biological Engineering | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 |
| International Journal of Students’ Research in Technology and Management | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 |
| IOP Conference Series: Earth and Environmental Science | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 |
| IOP Conference Series: Materials Science and Engineering | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
| … | … | … | … | … | … | … | … | … |
| Total | 6 | 6 | 7 | 16 | 15 | 15 | 8 | 73 |
| Year | N° Papers | Papers (%) | N° Citations | Citations (%) | H-Index | H-Index (%) | Citations/Paper |
|---|---|---|---|---|---|---|---|
| 2022 | 16 | 21.9 | 269 | 16.5 | 2181 | 19.2 | 16.8 |
| 2023 | 15 | 20.5 | 295 | 18.1 | 2593 | 22.8 | 19.7 |
| 2024 | 15 | 20.5 | 133 | 8.2 | 2071 | 18.2 | 8.9 |
| 2025 | 8 | 11.0 | 2 | 0.1 | 1182 | 10.4 | 0.3 |
| 2021 | 7 | 9.6 | 137 | 8.4 | 1011 | 8.9 | 19.6 |
| 2019 | 6 | 8.2 | 439 | 27.0 | 1039 | 9.1 | 73.2 |
| 2020 | 6 | 8.2 | 352 | 21.6 | 1312 | 11.5 | 58.7 |
| Total | 73 | 100.0 | 1627 | 100.0 | 11,389 | 100.0 | 22.3 |
| Bigram | Q1 | Q2 | NQ | Total |
|---|---|---|---|---|
| remote sensing | 43 | 0 | 0 | 43 |
| vegetation indices | 24 | 0 | 0 | 24 |
| machine learning | 21 | 0 | 2 | 23 |
| precision agriculture | 20 | 0 | 1 | 21 |
| vegetation index | 16 | 0 | 0 | 16 |
| deep learning | 12 | 1 | 1 | 14 |
| random forest | 12 | 0 | 1 | 13 |
| spatial resolution | 13 | 0 | 0 | 13 |
| growth stages | 11 | 0 | 0 | 12 |
| unmanned aerial | 9 | 0 | 3 | 12 |
| data collection | 10 | 0 | 0 | 10 |
| ground truth | 9 | 1 | 0 | 10 |
| study area | 10 | 0 | 0 | 10 |
| growth stage | 9 | 0 | 0 | 9 |
| linear regression | 9 | 0 | 0 | 9 |
| red edge | 9 | 0 | 0 | 9 |
| … | … | … | … | … |
| Total | 3214 | 99 | 192 | 3505 |
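Bigram counts such as those above can be derived from the title/abstract corpus with a simple tokenization pass. The sketch below is a minimal, assumed implementation (toy abstracts and a simplified stop-word list) intended only to illustrate the counting logic, not the exact text-processing pipeline used in the review.

```python
import re
from collections import Counter

# Assumed toy corpus standing in for the abstracts analyzed in the review.
abstracts = [
    "Remote sensing with UAV imagery supports precision agriculture and machine learning",
    "Machine learning and vegetation indices improve remote sensing of crop water stress",
]

STOPWORDS = {"and", "with", "of", "the", "a"}  # simplified stop-word list (assumption)

def bigrams(text: str) -> list[tuple[str, str]]:
    """Lowercase, tokenize, drop stop words, and return adjacent word pairs."""
    tokens = [t for t in re.findall(r"[a-z0-9]+", text.lower()) if t not in STOPWORDS]
    return list(zip(tokens, tokens[1:]))

counts = Counter(bg for doc in abstracts for bg in bigrams(doc))
for (w1, w2), n in counts.most_common(5):
    print(f"{w1} {w2}: {n}")
```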
| Method Category | Methods Used | Datasets | Performance | Key Contributions | Limitations | Refs. | Qty. (%) |
|---|---|---|---|---|---|---|---|
| Calibration & Multi-sensor Integration | UAV multispectral calibration; multi-sensor fusion (RGB; multispectral; thermal; hyperspectral); AI integration for soil/texture | Multi-year field trials; cotton field ground data; UAV images (RGB/MS/TIR) | MAE(GAI) = 0.19–0.48 m²/m²; MAE(N) = 0.80–1.21 g/m²; NDVI error = 6.6% (canopy); thermal Δ = 1.02 °C | Crop/season-specific calibration frameworks; open-sourced multi-sensor UAV design; improved fusion pipelines | Limited generalizability across growth stages/environments; calibration complexity; compute/resource demands | [15,16,100] | 3 (4.5) |
| Crop Row & Field-Structure Mapping | CNN + Hough (CRowNet); thresholding/segmentation; OBIA/MIRS; computational geometry | UAV RGB/MS orthomosaics; public beet field images; UAS imagery (NL) | Detection rate = 93.6%; IoU > 70%; OA > 90%; R (crop volume) = 0.71 | Robust pipelines for row/canopy extraction; fast canopy volume estimation; cross-crop heuristics | Shadows and weeds degrade accuracy; crop/phenology specificity; DEM/terrain error propagation | [4,5,79,88] | 4 (6.1) |
| Cross-domain AI (Non-agri, outlier) | LSTM; Seq2Seq; Attention; BLEU evaluation | Dialogue dataset | BLEU-4 = 0.8537 | Benchmarks sequential models and attention | Domain mismatch vs. agri; usability not assessed | [99] | 1 (1.5) |
| Economic Evaluation & DSS/MCDM | Web-based DSS; usability tests; FWZIC/FDOSM (fuzzy MCDM) | Various agri datasets; user studies; vineyard interviews | High usability; mental demand “slightly high”; profit ↑ 12.4%; cost −€6.1/ha | Six interactive DSS modules (open code); UAV profitability evidence; criteria weights for UAV selection | Cognitive load; sample/context specificity; unmodeled features/costs | [7,51,92] | 3 (4.5) |
| Remote Sensing Methodology (UAV vs. Satellite/Index Studies) | NDVI/NDRE/GNDVI/EVI2 comparison; photogrammetry; S2 vs. UAV correlation | UAV RGB/MS; Sentinel-2; field sampling | r = 0.60–0.80 (UAV vs. S2); NDRE improves irrigation issue detection | Validates UAV superiority for vigor; NDRE efficacy; multi-scale comparability | Inter-row bias in satellites; crop/terrain specificity | [11,53,57] | 3 (4.5) |
| Reviews & Surveys (Systematic/Scoping) | Systematic literature review; comparative analysis; digitalization reviews | Literature corpora (155+ studies; last decade) | NR | Comprehensive maps of diseases/tech; barriers and interventions; legal/regulatory landscape | Time-window bias; heterogeneous data quality; limited economic evidence | [8,18,23,50,69,85] | 6 (9.1) |
| Soil/Water Status & Moisture Retrieval | GNSS-R; L-band passive microwave; thermal-optical; RF/ML for EWT/FMC/SLA | Ground SM; airborne L-band; UAV TIR/MS; smallholder plots | RMSE(SM) = 0.05–0.09 m³/m³; rRMSE(EWT) = 3.13%; rRMSE(FMC) = 1% | Sub-field SM at low cost; CWSI tracking across phenology; water-status monitoring framework | Site-specific calibration; sensing-depth mismatch; vegetation cover effects | [70,73,77,89,102] | 5 (7.6) |
| Time Series & Sequence Models (Agri) | LSTM (phenology-aware); decision tree; UAV spectral time series | UAV MS + weather + tree attributes | Accuracy = 93%; AUC = 0.85/0.96/0.92; R²(yield) = 0.21; RMSE = 50.18 | Phenology-conditioned health model; new mango indices; cumulative health index | Yield sub-model underfits; dependence on UAV pipelines | [1] | 1 (1.5) |
| UAV Systems, Operations, Edge & Communications | Precision landing; low-cost sprayers; path-planning (NN/optimization); FANET (AERO-FL); edge AI (YOLOv4/7, DeepSort); risk (GERT); comms (RSS/RTT) | Field tests (paddy/rice/PH); simulations; greenhouse comms | Landing error ≈ 6.8–13.3 cm; PDR = 98.5%; delay = 45 ms; FPS ≈ 15.5; capacity = 0.45 ha/h | Operational readiness (autopilot sprayers); real-time on-board AI; robust FANET routing; downtime risk modeling | Battery limits; comms loss; scenario-specific tuning; legal/operational barriers | [12,55,60,61,68,76,81,82,83,96] | 10 (15.2) |
| Pest/Disease/Weed & Individual-Plant Detection | XGBoost/RF/KNN (WLD); Faster R-CNN/YOLOv5/8; Otsu; Detectron2; crown/mask extraction | UAV RGB/MS; cherry orchards; cassava sets; field RT images | Acc = 94% (WLD); F1 = 94.85%; IoU = 85.30%; mAP@0.5:0.95 = 0.960 | First UAV-ML WLD pipeline; real-time weed/cassava detection; precise crown delineation | Crop-specific tuning; small-object/occlusion errors; environment variability | [3,17,52,71,74,75,78,94] | 8 (12.1) |
| Yield & Biomass Estimation (Supervised/Ensemble) | RF/SVR/PLSR/GPR/Ridge/ElasticNet; XGBoost; stacking; linear models | UAV RGB/MS/HS; ground truth yield/biomass; multi-temporal plots | R² up to 0.92; RMSE = 12.5; CC(RF) = 0.81; RPD = 1.867; mAPE/NRMSE reported | Ensemble/fusion boosts accuracy; optimal timing (flowering/dough); RGB vs. MS trade-offs | Spatial autocorrelation/overfitting; crop/variety specificity; stage dependence | [2,9,58,59,63,64,65,72,80,84,86,87,91,93,97] | 15 (22.8) |
| Phenotyping & 3D/LAI Canopy Modeling | 3D point clouds; LiDAR/MSI fusion; oblique UAV geometry; LAI/height mapping; seedling growth (MCDI) | UAV RGB/MS; LiDAR; ground LAI/height; high-res seedling images | R²(LAI) = 0.78–0.89; R² > 0.8 (OBIA canopy volume); F1 > 98.5% (seedlings) | Cost-effective LAI/height maps; 3D morphology tracking; automated phenotyping | Cultivar/altitude/occlusion sensitivity; dense canopies | [10,19,54,62,95,101,103] | 7 (10.6) |
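Several of the categories above (e.g., Remote Sensing Methodology, Yield & Biomass Estimation) rely on standard vegetation indices computed from UAV multispectral bands. The sketch below computes NDVI and NDRE from per-pixel reflectance arrays; the band values are assumed toy data, and the small epsilon guard is an implementation choice rather than a formula from the reviewed studies.

```python
import numpy as np

# Assumed per-pixel reflectance arrays from a UAV multispectral orthomosaic (toy data).
nir = np.array([[0.45, 0.50], [0.48, 0.52]])
red = np.array([[0.08, 0.10], [0.09, 0.07]])
red_edge = np.array([[0.20, 0.22], [0.21, 0.19]])

eps = 1e-9  # guard against division by zero in masked/shadowed pixels
ndvi = (nir - red) / (nir + red + eps)             # NDVI = (NIR - Red) / (NIR + Red)
ndre = (nir - red_edge) / (nir + red_edge + eps)   # NDRE = (NIR - RedEdge) / (NIR + RedEdge)

print("NDVI:\n", ndvi.round(3))
print("NDRE:\n", ndre.round(3))
```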
| Quartile | Total Citations | No. of Papers | Citations per Paper |
|---|---|---|---|
| Q1 | 1598 | 66 | 24 |
| Q2 | 7 | 2 | 4 |
| NQ | 22 | 5 | 4 |
| Total | 1627 | 73 | 22 |
| Trigram | NQ | Q1 | Q2 | Total |
|---|---|---|---|---|
| difference vegetation index | 0 | 13 | 0 | 13 |
| unmanned aerial vehicles | 2 | 9 | 0 | 11 |
| normalized difference vegetation | 0 | 10 | 0 | 10 |
| machine learning algorithms | 0 | 9 | 0 | 9 |
| remote sensing data | 0 | 8 | 0 | 8 |
| uav remote sensing | 0 | 8 | 0 | 8 |
| machine learning models | 0 | 7 | 0 | 7 |
| root mean square | 0 | 7 | 0 | 7 |
| vegetation index ndvi | 0 | 7 | 0 | 7 |
| convolutional neural networks | 1 | 4 | 1 | 6 |
| leaf area index | 0 | 6 | 0 | 6 |
| mean square error | 0 | 6 | 0 | 6 |
| aerial vehicles uavs | 0 | 5 | 0 | 5 |
| crop water stress | 0 | 5 | 0 | 5 |
| different growth stages | 0 | 5 | 0 | 5 |
| multiple linear regression | 0 | 5 | 0 | 5 |
| satellite remote sensing | 0 | 5 | 0 | 5 |
| deep learning models | 0 | 4 | 0 | 4 |
| high spatial resolution | 0 | 4 | 0 | 4 |
| machine learning model | 0 | 4 | 0 | 4 |
| neural networks cnns | 1 | 3 | 0 | 4 |
| partial least squares | 0 | 4 | 0 | 4 |
| remote sensing technologies | 0 | 4 | 0 | 4 |
| support vector regression | 0 | 4 | 0 | 4 |
| vegetation indices vis | 0 | 4 | 0 | 4 |
| yield prediction model | 0 | 4 | 0 | 4 |
| … | … | … | … | … |
| Total | 93 | 2508 | 41 | 2642 |
| Country | Google Scholar | IEEE Xplore | ProQuest | Scopus | Taylor & Francis Online | Total |
|---|---|---|---|---|---|---|
| China | 1 | 0 | 3 | 15 | 0 | 19 |
| US | 0 | 3 | 2 | 6 | 0 | 11 |
| Italy | 1 | 0 | 1 | 6 | 0 | 8 |
| Canada | 0 | 0 | 0 | 5 | 0 | 5 |
| India | 3 | 0 | 1 | 0 | 1 | 5 |
| Spain | 0 | 0 | 0 | 4 | 1 | 5 |
| … | … | … | … | … | … | … |
| Total | 9 | 15 | 12 | 68 | 2 | 106 |
| Country | No. of Papers | Papers (%) | No. of Citations | Citations (%) | H-Index | Citations/Paper |
|---|---|---|---|---|---|---|
| China | 19 | 17.9 | 292 | 12.5 | 3216 | 15.4 |
| US | 11 | 10.4 | 262 | 11.2 | 1999 | 23.8 |
| Italy | 8 | 7.5 | 376 | 16.0 | 1257 | 47.0 |
| Canada | 5 | 4.7 | 82 | 3.5 | 769 | 16.4 |
| India | 5 | 4.7 | 51 | 2.2 | 258 | 10.2 |
| Spain | 5 | 4.7 | 174 | 7.4 | 766 | 34.8 |
| Australia | 4 | 3.8 | 91 | 3.9 | 705 | 22.8 |
| Germany | 4 | 3.8 | 51 | 2.2 | 749 | 12.8 |
| Greece | 4 | 3.8 | 13 | 0.6 | 517 | 3.3 |
| Netherlands | 3 | 2.8 | 38 | 1.6 | 535 | 12.7 |
| Pakistan | 3 | 2.8 | 79 | 3.4 | 629 | 26.3 |
| South Africa | 3 | 2.8 | 127 | 5.4 | 335 | 42.3 |
| UK | 3 | 2.8 | 25 | 1.1 | 566 | 8.3 |
| Bangladesh | 2 | 1.9 | 37 | 1.6 | 356 | 18.5 |
| Iran | 2 | 1.9 | 32 | 1.4 | 356 | 16.0 |
| Japan | 2 | 1.9 | 41 | 1.7 | 434 | 20.5 |
| … | … | … | … | … | … | … |
| Total | 106 | 100.0 | 2345 | 100.0 | 16,733 | 22.1 |
| Topic | Density | Centrality | Total Citations | Total Documents | Category |
|---|---|---|---|---|---|
| UAS Phenotyping | 0.98 | 0.72 | 603 | 12 | Motor |
| UAV CropSensing | 0.49 | 0.84 | 1091 | 24 | Basic |
| Agricultural AI | 0.12 | 0.14 | 2097 | 61 | Marginal |
| Precision UAVs | 0.10 | 0.23 | 1924 | 53 | Marginal |
| Drone Agriculture | 0.07 | 0.18 | 2347 | 71 | Marginal |
| Precision Agriculture | 0.06 | 0.13 | 1729 | 46 | Marginal |
| Smart Farming | 0.06 | 0.66 | 1928 | 45 | Basic |
| Data-Driven Agriculture | 0.02 | 0.09 | 1874 | 55 | Marginal |
| Precision AgriDrones | 0.02 | 0.18 | 2106 | 61 | Marginal |
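The density and centrality values above follow the logic of co-word strategic diagrams, where density reflects the internal link strength of a thematic cluster and centrality its links to the rest of the network. A minimal sketch of that logic on an assumed toy co-occurrence graph is given below; the normalization used here (simple means) may differ from the exact formulas applied in the review.

```python
# Minimal sketch of co-word density (internal links) and centrality (external links)
# for one thematic cluster in a keyword co-occurrence graph. The toy edge weights and
# the mean-based normalization are assumptions for illustration only.

edges = {  # undirected weighted co-occurrence edges
    ("uav", "crop monitoring"): 5,
    ("uav", "ndvi"): 3,
    ("crop monitoring", "ndvi"): 4,
    ("uav", "iot"): 2,      # link leaving the cluster
    ("ndvi", "yield"): 1,   # link leaving the cluster
}
cluster = {"uav", "crop monitoring", "ndvi"}

internal = [w for (a, b), w in edges.items() if a in cluster and b in cluster]
external = [w for (a, b), w in edges.items() if (a in cluster) != (b in cluster)]

density = sum(internal) / len(internal) if internal else 0.0
centrality = sum(external) / len(external) if external else 0.0
print(f"density = {density:.2f}, centrality = {centrality:.2f}")
```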
| Citation1 | Citation2 | Weight | Citation1 | Citation2 | Weight |
|---|---|---|---|---|---|
| li x. | wang j. | 17 | wang j. | zhang z. | 15 |
| li x. | zhang z. | 17 | wang x. | wang y. | 15 |
| yang g. | yang x. | 17 | wang x. | zhang l. | 15 |
| zhang l. | zhang z. | 17 | wang x. | zhang y. | 15 |
| cao w. | tian y. | 16 | zhang j. | zhang l. | 15 |
| cao w. | zhu y. | 16 | zhang l. | zhang y. | 15 |
| feng h. | yang g. | 16 | zhang y. | zhu y. | 15 |
| li x. | zhang l. | 16 | feng h. | li z. | 14 |
| tian y. | zhu y. | 16 | feng h. | yang x. | 14 |
| wang j. | zhang l. | 16 | li j. | wang j. | 14 |
| li z. | yang g. | 15 | li x. | li y. | 14 |
| liu j. | zhang l. | 15 | li x. | wang x. | 14 |
| liu j. | zhang y. | 15 | li x. | wang y. | 14 |
| wang j. | zhang y. | 15 | liu j. | wang j. | 14 |
| wang j. | zhang z. | 15 | liu j. | zhang z. | 14 |
| Keyword1 | Keyword2 | Weight |
|---|---|---|
| ml | precision agriculture | 8 |
| precision agriculture | uav | 7 |
| ml | remote sensing | 5 |
| precision agriculture | remote sensing | 5 |
| precision agriculture | vegetation indices | 5 |
| deep learning | precision agriculture | 4 |
| drones | precision agriculture | 4 |
| remote sensing | unmanned aerial vehicle | 4 |
| plant detection | precision agriculture | 3 |
| precision agriculture | uav remote sensing | 3 |
| 3d point cloud from photogrammetry | crop phenotyping | 2 |
| 3d point cloud from photogrammetry | leaf area index (lai) | 2 |
| 3d point cloud from photogrammetry | precision agriculture | 2 |
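The weighted keyword pairs above can be assembled into a co-occurrence network for visualization or clustering. The sketch below loads a subset of the pairs into a networkx graph and ranks terms by weighted degree; the choice of networkx and the weighted-degree ranking are illustrative assumptions rather than the tooling used in the review.

```python
import networkx as nx

# Weighted keyword co-occurrence pairs (a subset of the table above).
pairs = [
    ("ml", "precision agriculture", 8),
    ("precision agriculture", "uav", 7),
    ("ml", "remote sensing", 5),
    ("precision agriculture", "remote sensing", 5),
    ("precision agriculture", "vegetation indices", 5),
    ("deep learning", "precision agriculture", 4),
]

G = nx.Graph()
G.add_weighted_edges_from(pairs)

# Weighted degree highlights the hub terms of the network.
strength = dict(G.degree(weight="weight"))
for kw, s in sorted(strength.items(), key=lambda kv: -kv[1]):
    print(f"{kw}: {s}")
```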