Systematic Review

A Review of Drones in Smart Agriculture: Issues, Models, Trends, and Challenges

by Javier Gamboa-Cruzado 1,*, Jhon Estrada-Gutierrez 1, Cesar Bustos-Romero 1, Cristina Alzamora Rivero 1, Jorge Nolasco Valenzuela 2, Carlos Andrés Tavera Romero 3, Juan Gamarra-Moreno 4 and Flavio Amayo-Gamboa 5

1 Facultad de Ingeniería Industrial y de Sistemas, Universidad Nacional Federico Villarreal, Lima 15003, Peru
2 Facultad de Ingeniería, Universidad San Ignacio de Loyola, Lima 15024, Peru
3 Facultad de Ingeniería, Universidad Santiago de Cali, Cali 760032, Colombia
4 Facultad de Ingeniería de Sistemas e Informática, Universidad Nacional Mayor de San Marcos, Lima 15081, Peru
5 Facultad de Ingeniería, Universidad Nacional de Trujillo, Trujillo 13008, Peru
* Author to whom correspondence should be addressed.
Sustainability 2026, 18(1), 507; https://doi.org/10.3390/su18010507
Submission received: 18 October 2025 / Revised: 26 November 2025 / Accepted: 16 December 2025 / Published: 4 January 2026
(This article belongs to the Special Issue Remote Sensing for Sustainable Environmental Ecology)

Abstract

This systematic literature review examines the rapid growth of research on the use of drones applied to smart agriculture, a key field for the digital and sustainable transformation of the agricultural sector. The study aimed to synthesize the current state of knowledge regarding the application of drones in smart agriculture by applying the Kitchenham protocol (SLR), complemented with Petersen’s systematic mapping (SMS). A search was conducted in high-impact academic databases (Scopus, IEEE Xplore, Taylor & Francis Online, Google Scholar, and ProQuest), covering the period 2019–2025 (July). After applying the inclusion, exclusion, and quality criteria, 73 relevant studies were analyzed. The results reveal that 90% of the publications appear in Q1 journals, with China and the United States leading scientific production. The thematic analysis identified “UAS Phenotyping” as the main driving theme in the literature, while “precision agriculture,” “machine learning,” and “remote sensing” were the most recurrent and highly interconnected keywords. An exponential increase in publications was observed between 2022 and 2024. The review confirms the consolidation of drones as a central tool in digital agriculture, with significant advances in yield estimation, pest detection, and 3D modeling, although challenges remain in standardization, model generalization, and technological equity. It is recommended to promote open access repositories and interdisciplinary studies that integrate socioeconomic and environmental dimensions to strengthen the sustainable adoption of drone technologies in agriculture.

1. Introduction

Over the past decade, smart agriculture has incorporated a set of disruptive technologies, among which unmanned aerial systems (UASs or “drones”) have acquired a central role. Their implementation has been the subject of numerous studies that highlight them as strategic tools to optimize agricultural processes, particularly in crop monitoring, localized input application, and the early detection of pests and diseases. The reviewed literature consistently points out that drones enhance precision, reduce operational costs, and generate high-value data for real-time decision-making. Consequently, these applications align with sustainability and productivity goals, establishing drones as a key component in the technological development of contemporary agriculture. In this paper, we use the term “drones” to refer to unmanned aerial vehicles (UAVs) employed as remote sensing platforms in smart agriculture applications, and we use “UAV” only where technical precision or alignment with authors’ keywords is required.
In recent literature, drones play a central role in smart agriculture, mainly in crop monitoring and pest and disease management. Various studies demonstrate their effectiveness: the MangiSpectra framework applied to mangoes achieved 93% accuracy in health classification, while in cereals, multispectral images combined with machine learning algorithms improved yield predictions. Similarly, in sugarcane, the YOLOv5 network outperformed other models in detecting white leaf disease, optimizing phytosanitary response, while the CRowNet method, based on CNN and the Hough transform, achieved more than 93% detection accuracy in crop rows [1,2,3,4].
In maize, progress has been documented in the identification and automatic counting of seedlings, as well as in plot monitoring through UAV integration with machine learning techniques, achieving accuracies close to 99% [5,6,7]. Likewise, other works highlight that operational autonomy is key to technological transfer, where robust in-field detection and precise landing on agricultural platforms have demonstrated minimal error margins [3,4,8]. Regarding the estimation of agronomic variables, Random Forest outperformed SVR in predicting dry aboveground biomass in wheat (R² = 0.81), while the Leaf Area Index (LAI) estimated with 3D point clouds derived from UAV imagery proved to be a cost-effective alternative to manual methods (R² = 0.82) [9,10]. Additionally, in vineyards, strong correlations were obtained between UAV and Sentinel-2 data (R² up to 0.95), and in almond orchards, the XGBoost model improved stomatal conductance prediction by 11% (R² = 0.87) [11,12]. Furthermore, thermal UAVs used to estimate leaf temperature and stomatal conductance have demonstrated their value for diagnosing water stress and optimizing irrigation in diverse agroclimatic environments [13,14].
Advanced remote sensing has also expanded through multispectral calibration frameworks for estimating nitrogen absorption and the use of GNSS-R reflectometry mounted on drones, which reduces operational costs. Studies employing hyperspectral imagery and AI algorithms have enhanced soil characterization, route planning, and crop stress detection [15,16,17]. Likewise, the combination of UAVs with CNN, YOLO, and LSTM models has proven effective for automatic detection of diseases and weeds, while deployments using commercial drones (DJI) have shown economic benefits and reduced human intervention [18,19,20]. In this same direction, recent research on generative AI and IoT has confirmed their contribution to early pest detection and real-time monitoring, although limitations remain due to data availability and the lack of economic assessments [21,22]. In parallel, some studies emphasize both the advantages and barriers of digital agriculture, noting improvements in productivity and sustainability but also challenges related to costs and digital literacy. Similarly, web-based decision-support systems (DSSs) have been developed, which, despite their cognitive load, have proven efficient and adaptable to different contexts [23,24,25].
From a bibliometric and systematic review perspective, the United States, China, and India lead scientific production, with research clusters centered on remote sensing, deep learning, and IoT [26,27,28]. Other studies highlight the need for responsible digital solutions due to emerging socio-environmental impacts and confirm that the Digital Agricultural Revolution acts as a transversal driver of innovation in climate-smart agriculture and site-specific management [25,29,30]. However, gaps persist in the Global South, with few applications in neglected and underutilized species (NUS), as well as regulatory constraints and limitations in multitemporal and environmental stress analyses [31,32]. Finally, applications in strategic crops show tangible impacts: in coffee, a bibliometric analysis identified trends and consolidated the potential of UAVs to improve profitability, and in olives, the integration of drones with remote sensing and geomatics has optimized irrigation practices, disease detection, and yield estimation. These experiences confirm the operational value of UAVs and geomatic tools for more efficient and sustainable agriculture [33,34,35].
Despite the documented advances in the use of drones and digital technologies for crop monitoring, yield prediction, and resource management, existing reviews remain fragmented and topic-specific. Prior systematic reviews and bibliometric analyses have examined yield prediction using remote sensing and UAV platforms (Darra and collaborators [36]; Mustafa and his team [37]), remote sensing in specific high-value crops such as almonds and olives (Guimarães and collaborators [38]; Escandón-Panchana and collaborators [39]), UAV-based thermal sensing and water status assessment (Ndlovu and co-authors [14]), neglected and underutilized crops (Abrahams and co-researchers [31]), applications of Raspberry Pi and wireless sensor networks in precision agriculture (Joice and colleagues [40]; Abdollahi and colleagues [41]; Díaz Lara and collaborators [35]), and deep learning or artificial intelligence for weed control and crop monitoring (Vasileiou and colleagues [22]). Other contributions focus on bibliometric and thematic mappings of precision agriculture, UAV-based remote sensing, and Agriculture 4.0 (Singh and colleagues [42]; Mühl & de Oliveira [29]; Rejeb and collaborators [26]; Wang and colleagues [43]; Bertoglio and co-authors [30]), on cyber–physical systems and IoT-centered architectures for smart farming (Brenya, Zhu, and Sampene [44]), or on digital technologies in agri-food supply chains and smallholder systems (Wang and colleagues [43]; Wang and collaborators [45]; Yacoob and his team [46]). While these works provide valuable thematic and technological insights, they typically concentrate on a single technology, crop type, or value-chain segment, rely on heterogeneous eligibility and quality-assessment procedures, and rarely analyze deployment maturity or socioeconomic constraints in a systematic way. 
Consequently, there is still no integrated review that takes drones themselves as the central enabling technology for smart agriculture across multiple agronomic functions, combines Kitchenham’s SLR guidelines [47] with Petersen’s systematic mapping [48] and PRISMA 2020 reporting [49], and jointly examines methodological quality, bibliometric structure, and the technological equity gap between high-income and low/middle-income countries.
Given the exponential growth of drone applications in agriculture (a 347% increase in publications between 2019 and 2025), the lack of integrative reviews that simultaneously address technological innovation, implementation barriers, and equity considerations constitutes a critical knowledge gap. Recent calls by leading scholars (Rejeb and collaborators [26]; Ndlovu and co-authors [14]) underscore the need for more holistic assessments that situate technical capabilities within real-world farming systems. Against this backdrop, this paper makes four main contributions. First, it provides a systematic literature review of 73 primary studies published between 2019 and July 2025 that positions drones as a central enabler of smart agriculture, covering key functions such as yield estimation, stress and disease detection, phenotyping, and water management. Second, it characterizes the dominant technical configurations (platforms, sensors, AI, and machine learning models) and methodological patterns, including experimental designs and evaluation metrics, thereby clarifying how drone-based pipelines are actually implemented in practice and where standardization is most urgently needed. Third, it combines systematic mapping and bibliometric analysis to chart the global distribution and impact of research, revealing marked asymmetries between high-income and low-/middle-income countries and documenting a persistent “technological equity gap” that constrains adoption in the Global South and raises questions about transferability. Fourth, it integrates thematic, co-citation, and keyword co-occurrence analyses to identify consolidated and emerging research fronts and to propose a structured agenda for future work, emphasizing standardized datasets, reproducible evaluation protocols, and the explicit integration of socioeconomic and environmental dimensions. 
By synthesizing high-quality evidence across these axes, the review delivers not only a comprehensive technological mapping but also a critical assessment of equity, scalability, and contextual fit—issues that are particularly urgent as global agricultural systems seek to balance productivity gains with environmental sustainability and social justice.
The objective of this paper is to determine the current state of scientific knowledge on the use of drones in smart agriculture. The remainder of this paper is structured as follows: Section 2 presents the theoretical background; Section 3 details the review method, including search, selection, and analysis procedures; Section 4 reports the bibliometric and thematic results and discusses their theoretical and practical implications; and finally, Section 5 outlines the main conclusions along with recommendations for future research.

2. Theoretical Background

This section provides the conceptual and theoretical foundations that frame the present review. First, it outlines the evolution from precision agriculture to smart agriculture as an integrated paradigm that combines sensing, connectivity, and data-driven decision support. Second, it presents drones or unmanned aerial vehicles (UAVs) as core enabling technologies within this paradigm, summarizing their main platforms, sensor types, and agronomic functions. Third, it briefly reviews the role of machine learning and deep learning as dominant analytical frameworks for extracting information from UAV-based imagery.

2.1. Theoretical Evolution: From Precision Agriculture to Smart Agriculture

Rather than being independent concepts, drones, precision agriculture, and smart agriculture form an evolving technological continuum. Precision agriculture initially focused on optimizing input management at the field scale through variable-rate applications guided by georeferenced data. Smart agriculture extends this paradigm by embedding connectivity, automation, and artificial intelligence into farming operations, turning fields into cyber–physical production systems. In this transition, drones play a pivotal role: they act as mobile sensing and actuation platforms that generate high-resolution spatial and temporal data, feed real-time analytics, and close the loop between monitoring, prediction, and on-farm intervention. Thus, drone technology is not merely an additional tool, but a key driver in the shift from static, prescription-based precision agriculture to adaptive, learning-oriented smart agriculture.

2.2. Drones as the Critical Enabling Layer in Agricultural Paradigm Shift

Unmanned aerial systems (UASs) or drones represent the pivotal technological enabler that bridges the theoretical gap between precision agriculture and smart agriculture. Their unique capabilities directly address three fundamental constraints of traditional precision approaches that previously hindered the evolution toward truly integrated smart systems.
Drones have become a key tool in smart agriculture, incorporating multispectral, hyperspectral, and high-resolution cameras that enable precise crop monitoring and the early detection of diseases [50,51]. Smart agriculture builds on the principles of precision agriculture by integrating sensing, connectivity, and advanced analytics into a continuous decision-support cycle; in this cyber–physical context, drones complement satellite and in situ sensing by providing flexible, high-resolution observations that can be integrated with IoT architectures, wireless sensor networks, and cloud-based analytics. Compared to satellite imagery, they offer higher spatial and temporal resolution, facilitating the acquisition of critical data for agricultural management [52]. Vegetation indices, such as NDVI, have been established as reliable indicators of growth, biomass, and water status [53], and have been successfully applied in crops such as almond trees to estimate water stress without invasive methods [54]. However, technical challenges remain, particularly in hyperspectral processing, where illumination conditions and sensor noise affect data quality [55]. Recently, the creation of annotated databases has strengthened segmentation algorithms adapted to real-world environments, promoting more informed decision-making, cost reduction, and more efficient use of resources [56,57].
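As a concrete illustration of the vegetation indices discussed above, the following minimal sketch computes NDVI from red and near-infrared (NIR) reflectance values such as those captured by a multispectral UAV camera. The band values are invented for demonstration and do not come from any reviewed study.

```python
# Illustrative sketch (not from any reviewed study): computing NDVI from
# co-registered red and near-infrared (NIR) reflectance values.
def ndvi(nir: float, red: float) -> float:
    """NDVI = (NIR - Red) / (NIR + Red), bounded in [-1, 1]."""
    return (nir - red) / (nir + red)

# Healthy vegetation reflects strongly in NIR and absorbs red light,
# yielding high NDVI; bare soil or stressed canopy yields lower values.
print(round(ndvi(0.60, 0.10), 2))  # 0.71  (vigorous canopy)
print(round(ndvi(0.30, 0.20), 2))  # 0.2   (sparse or stressed cover)
```

In practice this computation is applied per pixel over calibrated orthomosaics, which is what allows the non-invasive water-stress mapping reported for almond orchards.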

2.3. Smart Agriculture

Remote sensing platforms have become essential instruments for observing, monitoring, and mapping the spatial variability of crop growth, such as in maize [58]. Smart agriculture, regarded as a fundamental pillar of the digital transformation of the agricultural sector, integrates technological infrastructures based on the Internet of Things (IoT), big data, cloud computing, and artificial intelligence, allowing for automated and highly precise management [59]. Within smart agriculture, drones or unmanned aerial vehicles (UAVs) are positioned as key enabling technologies for rapid and repeated data acquisition at the field scale. Fixed-wing, multirotor, and hybrid platforms support a wide range of missions, from plot-scale crop scouting to large-area mapping, and, when coupled with RGB, multispectral, hyperspectral, or thermal sensors, they enable detailed characterization of canopy structure, water status, nutrient status, and stress signals, among other variables. These capabilities make drones particularly suitable for smart agriculture functions such as yield estimation, variable-rate input management, early detection of biotic and abiotic stress, and support for smallholder monitoring schemes. Traditionally, field inspection tasks depended on human observation or manual data processing; however, the incorporation of AI algorithms in UAVs has enabled real-time object identification and crop condition diagnosis with high accuracy [60]. This transition reflects global trends in agricultural digitalization, where global positioning systems (GPS), geographic information systems (GIS), automated processes, and robotic devices converge [61]. Among the most commonly used indicators, the Leaf Area Index (LAI) represents a valuable ecophysiological parameter for assessing plant condition and growth over different periods [62]. 
Additionally, the texture of images obtained through remote sensors contributes to improving the classification of agricultural objects and areas [63], while aboveground biomass (AGB) estimation serves as a fundamental basis for predicting crop productivity and guiding efficient resource management [64]. However, factors such as climate, topography, and management practices generate significant variability in the optimal harvest time, directly affecting the quality and yield of strategic agricultural products such as maize [65]. In general, smart agriculture is based on the articulated use of digital technologies—IoT, drones, advanced sensors, and artificial intelligence—that enable the collection, processing, and analysis of real-time data, thereby optimizing the productivity, sustainability, and profitability of agricultural systems [21].

2.4. AI/ML/DL on Drone Imagery

The analytical value of UAV-based data depends strongly on the algorithms used to extract actionable information from imagery. In recent years, machine learning and deep learning models have become the dominant frameworks for tasks such as plant counting, weed detection, yield prediction, biomass estimation, and disease or pest identification. Supervised learning approaches, including random forests, support vector machines, and convolutional neural networks, are frequently used to map spectral and textural features to agronomic targets, while segmentation and object-detection architectures (e.g., U-Net, Mask R-CNN, YOLO) support fine-grained spatial analysis. These models provide the theoretical basis for treating UAV imagery as a high-dimensional data source where patterns can be learned and generalized across plots, seasons, and, to a lesser extent, regions.
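The supervised-learning framing described above can be made concrete with a toy example: mapping per-pixel spectral features to an agronomic label. Here a simple 1-nearest-neighbour classifier stands in for the random forests, SVMs, and CNNs used in the reviewed studies; the features (NDVI, canopy temperature) and labels are invented for illustration only.

```python
# Toy sketch of supervised learning on UAV-derived features. A 1-nearest-
# neighbour rule substitutes for the RF/SVM/CNN models in the literature;
# all feature values and labels below are invented.
import math

def nearest_neighbor(train, query):
    """Return the label of the training sample closest to `query`."""
    label, _ = min(
        ((lab, math.dist(feat, query)) for feat, lab in train),
        key=lambda t: t[1],
    )
    return label

# (NDVI, canopy temperature in degrees C) -> crop condition label
train = [
    ((0.75, 24.0), "healthy"),
    ((0.70, 25.0), "healthy"),
    ((0.35, 31.0), "stressed"),
    ((0.30, 33.0), "stressed"),
]
print(nearest_neighbor(train, (0.68, 26.0)))  # healthy
```

The generalization caveat noted above applies directly: a model fitted to one plot and season may not transfer to another region without recalibration.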

3. Review Method

This research adopted the systematic literature review (SLR) methodology proposed by Kitchenham [47], complemented by the systematic mapping study (SMS) approach developed by Petersen and colleagues [48] and the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) 2020 guidelines [49], with the purpose of obtaining a comprehensive and comparative overview of trends in the use of drones in smart agriculture. Following Kitchenham’s guidelines, a methodological protocol was established that encompassed all stages, from the determination of the research topic to the preparation of the manuscript. The PRISMA 2020 checklist is provided in the Supplementary Materials. Figure 1 schematically summarizes this procedure.

3.1. Research Problems

To precisely define the scope and depth of this systematic review, a set of research questions (RQs) was formulated to guide each phase of the process—from the search and selection of studies to the synthesis of results—allowing a rigorous response to the stated objectives. These RQs were designed following a preliminary exploration of the relevant literature and in accordance with the approach proposed by [59]:
  • RQ1: What quartile levels are represented by the journals that have published research related to drones in smart agriculture?
  • RQ2: What countries lead research on drones applied to smart agriculture, and how is the global scientific cooperation network structured?
  • RQ3: What thematic categories can be identified in studies addressing the use of drones and their influence on smart agriculture?
  • RQ4: How is the frequency of co-citation manifested among the authors referenced in studies examining the use of drones and their impact on smart agriculture?
  • RQ5: In what ways do keywords show frequent co-occurrences in research on drones and their impact on smart agriculture?

3.2. Information Sources and Search Strategies

To identify and collect the primary studies, academic databases with broad coverage and high recognition in the fields of engineering, computer science, and technology were consulted. Specifically, Scopus, IEEE Xplore, Taylor & Francis Online, Google Scholar, and ProQuest were used, selected for their relevance, accessibility, and diversity of indexed publication types.
The search period was extended to 2025 (search conducted on 29 July 2025) to capture the most recent research developments in this rapidly evolving field. This inclusion encompasses early access articles that have passed editorial review but have not yet been formally assigned to a journal issue. These items were included only when they met all other quality criteria and were already indexed in our selected databases with assigned DOIs. Although we recognize that 2025 publications have a limited citation count (average of 0.3 citations per article as shown in Table 4), their inclusion provides critical information about emerging research directions and technological innovations.
The search strategy was designed to maximize both comprehensiveness and precision in retrieving scientific literature. Accordingly, a set of controlled descriptors and synonyms was established and applied using Boolean operators across the different databases. Table 1 presents the descriptors and their corresponding synonyms, which form the foundation of the systematic search protocol followed in this research.
Based on these elements, the following general search equation was used:
(drone OR “unmanned aerial vehicle” OR uav OR uas OR “agricultural drone” OR “monitoring drone” OR “surveillance drone” OR “mapping drone”) AND (“smart agriculture” OR “digital agriculture” OR “agriculture 4.0” OR “precision agriculture” OR agrotechnology OR “artificial intelligence agriculture” OR “machine learning agriculture”)
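The equation combines two synonym groups with OR and joins the groups with AND. The following sketch (an illustration of the construction, not part of the review protocol itself) shows how such a query string can be assembled programmatically before adapting it to each database’s syntax:

```python
# Sketch (illustrative, not the authors' script): assembling the Boolean
# search equation from the two descriptor groups in Section 3.2.
drone_terms = [
    'drone', '"unmanned aerial vehicle"', 'uav', 'uas',
    '"agricultural drone"', '"monitoring drone"',
    '"surveillance drone"', '"mapping drone"',
]
agriculture_terms = [
    '"smart agriculture"', '"digital agriculture"', '"agriculture 4.0"',
    '"precision agriculture"', 'agrotechnology',
    '"artificial intelligence agriculture"', '"machine learning agriculture"',
]

def build_query(group_a, group_b):
    """Join each synonym group with OR, then combine the groups with AND."""
    left = "(" + " OR ".join(group_a) + ")"
    right = "(" + " OR ".join(group_b) + ")"
    return left + " AND " + right

query = build_query(drone_terms, agriculture_terms)
print(query)
```

In practice each database (Scopus, IEEE Xplore, and so on) requires minor syntactic adaptations of this general equation, e.g., field codes restricting the search to title, abstract, and keywords.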

3.3. Identified Studies

As a result of applying the search equation across all selected sources, 34,594 preliminary studies were identified. Figure 2 shows the distribution of the results obtained from each database, highlighting the differential coverage among them and underscoring the importance of their complementarity in ensuring an exhaustive and representative collection of the existing literature.

3.4. Study Selection

The selection of studies was carried out by applying a set of exclusion criteria (EC) aimed at ensuring thematic relevance, methodological quality, and alignment of the publications with the objectives of this systematic review. The progressive application of these criteria reduced the initial universe of records and allowed for the retention of only those studies that demonstrated sufficient methodological robustness.
Database filters were applied before screening to restrict results to English, peer-reviewed journal and conference papers published in the last 7 years, excluding books, theses, and review papers (EC1–EC4). These filters ensured that only primary empirical studies potentially relevant to the review were retrieved for subsequent evaluation.
This process of identification, screening, retrieval, and full-text assessment was clearly and transparently documented using the PRISMA 2020 flow diagram (Preferred Reporting Items for Systematic Reviews and Meta-Analyses), shown in Figure 3, which summarizes the complete workflow followed during the selection process.

3.5. Quality Assessment

After establishing the selection criteria, the remaining 76 papers were individually reviewed following established methodological guidelines. To ensure transparency and rigor in the evaluation process, the quality-assessment (QA) procedure adopts the recommendations of Kitchenham [47] and Dybå & Dingsøyr [66], who emphasize methodological clarity, replicability, and evidence reliability in empirical studies. Accordingly, seven quality-assessment criteria were operationalized with explicit decision rules to minimize subjectivity and promote consistency among reviewers. Each criterion requires the presence of observable and reportable information in the paper, for example, clearly stated research objectives, reproducible data-collection procedures, or sufficiently detailed descriptions of UAV and sensor configurations.
The seven QA questions used in this review are defined as follows:
  • QA1: Is the thematic area addressed in the study clearly defined?
  • QA2: Are the results of the experiments accurately identified and reported?
  • QA3: Is the methodological pipeline clearly and reproducibly described (including data sources, sensors, flight parameters, and main processing steps)?
  • QA4: Are the methods employed appropriate for the analysis of the results?
  • QA5: Are the research objectives clearly formulated in the paper?
  • QA6: Are the results and limitations reported with sufficient detail to enable replication or re-use (e.g., access to datasets or code, or detailed reporting of experimental settings)?
  • QA7: Is the developed experiment valid and methodologically acceptable?
Operationalization of the QA criteria
To reduce subjectivity in the assessment and ensure consistency across reviewers, each question was translated into explicit observable conditions, following Kitchenham [47] and Dybå & Dingsøyr [66]. The operational definitions applied were as follows:
  • QA1—Clarity of the thematic area addressed
  • Yes: the study clearly defines the agricultural problem, context, or domain it addresses (e.g., crop, stress type, monitoring objective).
  • No: the thematic area is vague, overly general, or only implied.
  • QA2—Accuracy and transparency in reporting experimental results
  • Yes: the study reports quantitative results clearly (e.g., metrics, comparisons, tables, or figures) with sufficient detail.
  • No: results are incomplete, ambiguous, or missing essential information.
  • QA3—Clarity and reproducibility of the methodological pipeline
  • Yes: the paper clearly describes the methodological pipeline (data sources, sensors, flight parameters, and main processing steps) in a way that would allow reproduction.
  • No: the methodological description is incomplete, fragmented, or lacks sufficient detail for reproduction.
  • QA4—Appropriateness of methods for analyzing results
  • Yes: the analytical methods (ML/DL/statistics/indices) are clearly described and appropriate for the research question.
  • No: the methods are poorly explained, unjustified, or mismatched to the objectives.
  • QA5—Clarity of research objectives
  • Yes: the research goals are explicitly stated and connected with the methods and results.
  • No: objectives are absent, vague, or inconsistent.
  • QA6—Transparency and reproducibility of results and limitations
  • Yes: the results and limitations are reported with enough detail to enable replication or re-use (e.g., datasets or code are accessible, or experimental settings are thoroughly described).
  • No: results and limitations are reported superficially, lack key details, or do not provide enough information to support replication or re-use.
  • QA7—Validity and methodological soundness of the experiment
  • Yes: the study includes experimental validity elements (e.g., metrics, evaluation design, validation procedure).
  • No: experiments lack validity, justification, or methodological rigor.
Each criterion was scored using a three-point scale, where 1 = low quality, 2 = medium quality, and 3 = high quality. A minimum acceptable threshold of 11.5 points was established for inclusion in the review. To reduce subjectivity in items that require an overall judgment (QA3 and QA6), we also considered reproducibility-oriented elements, such as the availability of public datasets, source code, and detailed descriptions of sensors and flight parameters, as positive indicators when assigning scores, in addition to the narrative clarity of the methodological and results sections. To reduce potential subjectivity, two reviewers independently evaluated all primary studies using the same operational definitions, and disagreements were resolved through discussion. This structured approach increases consistency and ensures that the assessment remains aligned with accepted SLR standards.
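The scoring scheme above can be sketched as follows. This is an illustration of the decision rule, not the authors’ actual assessment script, and the example study and its scores are hypothetical; when two reviewers’ integer scores are averaged, half-point totals can arise, which is consistent with a non-integer threshold such as 11.5.

```python
# Illustrative sketch (not the authors' tooling): scoring one study on the
# seven QA criteria (1 = low, 2 = medium, 3 = high quality) and applying
# the 11.5-point inclusion threshold (out of a possible 21).
QA_CRITERIA = ["QA1", "QA2", "QA3", "QA4", "QA5", "QA6", "QA7"]
THRESHOLD = 11.5  # minimum total score for inclusion in the review

def assess(scores):
    """Return (total QA score, whether the study is retained)."""
    assert set(scores) == set(QA_CRITERIA), "one score per criterion"
    assert all(s in (1, 2, 3) for s in scores.values()), "three-point scale"
    total = sum(scores.values())
    return total, total >= THRESHOLD

# Hypothetical study: mostly medium-quality answers with two high marks.
total, included = assess(
    {"QA1": 3, "QA2": 2, "QA3": 2, "QA4": 2, "QA5": 3, "QA6": 1, "QA7": 2}
)
print(total, included)  # 15 True
```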
Although there is no universally mandated numeric cut-off in the agricultural domain, several recent systematic reviews adopt a similar strategy of combining explicit exclusion criteria with structured quality-assessment instruments to safeguard rigor. For instance, Darra and collaborators [36] on yield prediction in agriculture, Andrade-Mogollon and collaborators [21] on Generative AI and IoT for precision agriculture, Duarte and collaborators [67] on UAV-based monitoring of forest pests and diseases, and Zambrano and collaborators [32] on UAV remote sensing for crop monitoring all define formal quality-assessment criteria in addition to their inclusion/exclusion rules. While these works do not necessarily adopt the same numeric threshold, they reinforce the importance of using explicit multi-item quality checklists, rather than relying solely on indexing or impact metrics, to filter primary studies. Our 11.5/21 threshold follows this logic and was chosen to be consistent with this emerging practice in agricultural SLRs while being tailored to the specific objectives and QA questions of our review.
As a result of this process, 73 papers reached or exceeded the minimum score and thus satisfied the quality requirements. This assessment determined the final set of primary studies that constitute the analysis corpus of this research. The consolidated results of this evaluation are presented in Table 2.
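As a minimal illustration, the scoring logic described above can be sketched in a few lines. The seven-criterion count is an assumption inferred from the 21-point maximum (7 criteria × 3 points); the function name is hypothetical.

```python
# Hypothetical sketch of the quality-assessment scoring described in the text.
# Assumes seven criteria (QA1..QA7) scored 1-3, giving a 21-point maximum,
# and the stated 11.5-point inclusion threshold.

THRESHOLD = 11.5
NUM_CRITERIA = 7

def passes_quality_assessment(scores):
    """Return True if a study's summed QA scores meet the threshold."""
    if len(scores) != NUM_CRITERIA:
        raise ValueError(f"expected {NUM_CRITERIA} scores, got {len(scores)}")
    if any(s not in (1, 2, 3) for s in scores):
        raise ValueError("each criterion is scored 1 (low), 2 (medium) or 3 (high)")
    return sum(scores) >= THRESHOLD

# A study rated 'medium' on every criterion passes (total = 14 >= 11.5),
# while one rated 'low' throughout does not (total = 7).
print(passes_quality_assessment([2, 2, 2, 2, 2, 2, 2]))  # True
```

Under this scheme, a study must average roughly "medium" quality across the checklist to enter the corpus, which is consistent with the 11.5/21 threshold discussed above.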

3.6. Data Extraction Strategies

At this stage, data from the 73 documents included in the analysis were systematically extracted and organized around ten essential elements: year of publication, country of origin, language, research areas, type of publication, methodology used, authors, number of citations, abstract, and main terms. A consolidated database was built to collect the information needed to answer the research questions; for each document it recorded the identification number, title, access link, country, year, language, number of pages, document type, publication source, authors and affiliated institutions, number of citations, keywords, abstract, and sample size. Although not all studies contributed directly to every research question, together they provided a broad, coherent, and well-supported view of the field of study, strengthening the analytical rigor and validity of this systematic review.

3.7. Data Synthesis

Once all the information required to answer the research questions had been collected, a comprehensive review and systematization of the data were conducted. The selected studies were coded and organized into tables and graphical summaries to facilitate the identification of patterns, recurrences, and methodological trends over the past seven years. The analysis combined descriptive statistics (e.g., counts, percentages, and measures of central tendency and dispersion) with comparative analyses across time, regions, and application domains, enabling a structured synthesis of how drones are being used in smart agriculture.
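The descriptive side of this synthesis can be sketched with Python's standard library; the records below are purely illustrative and are not drawn from the actual corpus.

```python
# Minimal sketch of the descriptive synthesis step: counts and percentages
# per year, plus central tendency and dispersion of citation counts.
# Records are hypothetical (year, citations) pairs, not real corpus data.
from collections import Counter
from statistics import mean, stdev

records = [(2019, 60), (2020, 45), (2022, 18), (2023, 12), (2024, 9), (2024, 15)]

per_year = Counter(year for year, _ in records)
total = len(records)
for year in sorted(per_year):
    share = 100 * per_year[year] / total
    print(f"{year}: {per_year[year]} papers ({share:.1f}%)")

citations = [c for _, c in records]
print(f"mean citations = {mean(citations):.1f}, sd = {stdev(citations):.1f}")
```

The same pattern of counts, percentages, and dispersion measures underlies the comparative tables and charts presented in the Results section, regardless of the tooling used to produce them.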
Bibliographic records retrieved from the databases were first imported into Mendeley Desktop, which served as the central tool for organizing, managing, and standardizing the references (authors’ names, titles, journal information, and DOIs) and for removing duplicates. After this curation step, the cleaned dataset was processed with RAj, a research assistant developed by Dr. Javier Gamboa-Cruzado. Like other bibliometric tools such as Bibliometrix or VOSviewer, RAj was used to compute descriptive indicators (e.g., number of papers per year, country, journal quartile, and research topic) and to generate the visualizations presented in the Results and Discussion sections, including bar charts, thematic maps, and co-citation and keyword co-occurrence networks. RAj was employed strictly as a data management and visualization tool and does not introduce any additional theoretical assumptions beyond the review protocol. Figure 4 shows a screenshot of the tool used in this stage of the process.

4. Results and Discussion

As part of the present study, the review and analysis of unstructured texts were carried out manually, following a systematic and replicable process illustrated in Figure 5. This procedure ensured the traceability of each phase in the evaluation and classification of the selected documents. Upon completion of the selection and filtering process, a final corpus of 73 scientific papers was obtained and analyzed in depth, with the aim of rigorously addressing the previously formulated research questions (RQs). This corpus provided the empirical foundation for the development of the results and discussion sections presented below.

4.1. General Description of the Studies

Table 3 and Table 4, together with Figure 6, present the distribution, evolution, and impact of research on drones applied to smart agriculture. These visualizations make it possible to identify the main journals of publication, annual scientific productivity, as well as the citation level and relevance of studies conducted during the 2019–2025 period.
Results confirm the predominance of the journals Remote Sensing (31 studies) and Drones (10 studies), reflecting the leadership of specialized outlets in remote sensing and autonomous systems applied to agriculture. Scopus stands out as the most representative indexing source (48 records), highlighting its central role in academic visibility. Between 2022 and 2024, the highest growth in publications (62.4%) was observed, coinciding with the consolidation of digital agriculture and the expansion of intelligent monitoring systems. Regarding impact, the years 2019 and 2020 show the highest proportion of citations (27% and 21.6%), despite their lower publication volume, indicating the long-lasting influence of early seminal studies. The increase in the H-Index (average > 2000 between 2022 and 2024) and the mean of 22.3 citations per paper reflect sustained scientific maturity and the consolidation of a robust body of knowledge on the use of drones in smart agricultural systems.
According to the results, the combination of low publication volume and high average citations in 2019–2020 reflects the role of early contributions that framed drones as high-resolution sensing platforms within smart agriculture. These studies often introduced methodological blueprints, integrating multispectral imaging, vegetation indices, and machine learning models, that were subsequently reused and cited by later work. The surge in publications observed from 2022 onwards coincides with the wider availability of low-cost UAVs, off-the-shelf sensors, and open-source ML frameworks, which facilitated a rapid expansion of applied case studies across crops and regions. However, this quantitative growth is accompanied by a slight decline in average citations, suggesting a progressive fragmentation of the literature into more specialized and context-specific applications. Rather than indicating reduced impact, this pattern is consistent with a maturing field in which a small set of early methodological papers concentrate citations, while newer studies diversify topics, datasets, and experimental settings.
According to Zambrano and collaborators [32], Remote Sensing and Precision Agriculture stand out as the journals with the largest number of publications in the field of drones applied to agriculture. In contrast, Abdollahi and colleagues [41] identify Sensors as the most relevant, with 101 articles, followed by Computers and Electronics in Agriculture with 68 papers. Likewise, Slimani, El Mhamdi, and Jilbab [105] position Computers and Electronics in Agriculture first, with 85 articles and 1998 citations, followed by Sensors with 19 articles and 130 citations. Bento and collaborators [33] also place Computers and Electronics in Agriculture as the leading journal, ranking Remote Sensing third, confirming its thematic continuity and academic prestige. Conversely, Gokool and colleagues [106] highlight Drones as the most productive source for classified publications, followed by Remote Sensing and Agronomy, evidencing the recent diversification of leading journals in the field. Finally, Ndlovu and collaborators [14] report that the years with the highest number of papers were 2022, 2020, and 2021, which is relevant to understanding the temporal maturation of scientific production.
These findings reinforce the potential of smart agriculture as a replicable model for other sectors such as forestry, environmental management, mining, and logistics. They also highlight the need to expand research in Global South regions, where technological adoption remains limited. It is recommended to develop longitudinal and comparative studies assessing the evolution of scientific and technological impact across different geographic contexts, promoting an equitable transfer of innovation and knowledge toward sustainable and resilient agriculture.
Despite technological advances, our analysis reveals a persistent “technological equity gap”: socioeconomic factors, implementation costs, and digital literacy requirements are rarely addressed alongside technical performance metrics, which limits applicability in the smallholder and family farming systems that account for the majority of global agricultural production.
Figure 7 and Table 5 complementarily represent the most frequent topics identified in the analyzed papers using the bigram technique. The word cloud and bar chart visualize the intensity of the most cited terms, while the table—classifying bigrams according to journal quartiles (Q1, Q2, and NQ)—offers a comparative perspective linking thematic relevance with publication impact. This mixed-method approach strengthens the interpretation of trends in the literature on smart agriculture and drone applications.
The results show that “remote sensing” is the most frequent bigram (43 occurrences), confirming its central role as a cross-cutting technology in smart agriculture studies. In Q1 papers, “vegetation indices,” “machine learning,” and “precision agriculture” stand out, reflecting the integration of remote sensing, machine learning algorithms, and the optimization of agricultural processes. The presence of “deep learning” and “random forest” demonstrates the consolidation of predictive approaches based on artificial intelligence for crop analysis, pest detection, and yield estimation. Meanwhile, “spatial resolution,” “data collection,” and “ground truth” reveal a concern for the quality and accuracy of geospatial data, which are essential for model validation. In Q2 and non-quartile (NQ) studies, “growth stages,” “water stress,” and “linear regression” appear more prominently, referring to applied research focused on plant physiology and the use of traditional statistical methods. Overall, the figure and table demonstrate a methodological convergence that combines drones, multispectral remote sensing, and machine learning, consolidating precision agriculture as the dominant technological paradigm.
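A minimal sketch of the bigram technique used above, assuming simple lowercase tokenization; the two toy abstracts are illustrative, not drawn from the corpus.

```python
# Illustrative sketch of bigram counting over a small set of toy abstracts.
# The real input would be the titles/abstracts/keywords of the 73 studies.
from collections import Counter
import re

abstracts = [
    "Remote sensing and machine learning support precision agriculture.",
    "UAV remote sensing enables vegetation indices for precision agriculture.",
]

def bigrams(text):
    # Lowercase alphabetic tokens, then adjacent word pairs.
    tokens = re.findall(r"[a-z]+", text.lower())
    return zip(tokens, tokens[1:])

counts = Counter(bg for doc in abstracts for bg in bigrams(doc))
for (w1, w2), n in counts.most_common(3):
    print(f"{w1} {w2}: {n}")
```

Applied to the full corpus, this simple frequency count is what surfaces dominant pairs such as “remote sensing” and “precision agriculture” in the word cloud and quartile table.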
From an editorial perspective, the concentration of core bigrams in Q1 outlets is consistent with systematic differences in scope, audience, and resource requirements across journal tiers. Leading journals in remote sensing, precision agriculture, and artificial intelligence position themselves as methodological reference venues targeting a global research community; they tend to prioritize contributions that combine high-resolution sensing, advanced learning models, and multi-site or multi-year datasets, which naturally reinforces the central terminology of the field. By contrast, Q2 and non-indexed venues more often publish exploratory or context-specific applications with narrower data and infrastructure, which helps to explain their lower thematic centrality and citation impact. This stratification suggests that access to analytical capacity, sensing infrastructure, and interdisciplinary teams remains a key prerequisite for shaping the research agenda on drones in smart agriculture.
The observed trends can be extrapolated to other productive sectors such as forestry, mining, and environmental management, where the integration of remote sensors and artificial intelligence enhances decision-making efficiency. In the business domain, these results support the adoption of digital agriculture models and the implementation of automated monitoring systems for traceability and sustainability. Their application across different geographic regions would allow predictive models to be adapted to diverse climatic and soil conditions, contributing to food security. Finally, the comparative analysis by quartiles opens the possibility of projecting the temporal evolution of these research lines and their transfer to new technological and productive contexts.
Table 6 summarizes the most relevant methodological findings from research on drones applied to smart agriculture, classifying them by categories, employed techniques, data sources, performance metrics, and core contributions. This organization helps identify dominant technological and scientific trends, as well as methodological gaps that limit the generalization and transferability of results.
  • Method Category: The thematic distribution reveals a wide methodological diversity, with a predominance of applied lines focused on agricultural productivity and efficiency. The most representative categories are Yield & Biomass Estimation (22.8%), UAV Systems, Operations, Edge & Communications (15.2%), and Pest/Disease/Weed Detection (12.1%), which indicate a shift from experimental approaches toward integrated and field-based applications. In contrast, topics such as Cross-domain AI and Time Series Models (1.5% each) show emerging consolidation, suggesting methodological areas still under development.
  • Methods Used: Methodological convergence combines machine learning, remote sensing, and 3D modeling techniques, consolidating an interdisciplinary framework that spans from classical algorithms (RF, SVR, KNN) to deep networks (YOLOv5, UNet, LSTM). This pattern confirms a technical maturity in agricultural automation, where the combination of sensor fusion, MCDM, and edge AI signals a trend toward more autonomous and adaptive systems. Notably, hybrid methods are concentrated in the categories with the highest proportion of studies (≈50% of the total), demonstrating the widespread adoption of artificial intelligence applied to UAVs.
  • Datasets: The analyzed datasets reveal marked heterogeneity in scale, origin, and structure, with a predominance of data obtained from RGB, multispectral, and thermal UAVs, occasionally integrated with LiDAR and Sentinel-2 sensors. This diversity, represented in 70% of the investigations, reinforces the ecological validity of the results while limiting inter-context reproducibility. Consequently, studies with more rigorous field trials and ground truthing (≈20%) establish methodological benchmarks for the creation of standardized repositories.
  • Performance: The reported performance indicators—such as R² values above 0.9, IoU exceeding 70%, or reduced RMSE—demonstrate substantial advances in predictive accuracy and in the detection of pests, water stress, and biomass. However, the heterogeneity of metrics hinders cross-study comparisons and limits replicability. It is noteworthy that studies integrating deep learning and multitemporal analysis show the most significant improvements in accuracy and computational efficiency.
  • Key Contributions: The main contributions focus on innovations in sensor integration, analytical precision, and operational efficiency, highlighting the development of reproducible models, open frameworks, and interactive DSS systems. It is important to note that the most represented categories (22.8% and 15.2%) not only provide technical advances but also yield direct economic implications in productivity and cost reduction. Overall, these contributions consolidate a transition from a purely technological approach toward scalable and sustainable agro-industrial solutions.
  • Limitations: Beyond productivity and impact indicators, the synthesis of study-level limitations reveals several cross-cutting challenges for drone-based smart agriculture. Three constraints appear in more than 60% of the analyzed studies: weak model generalization (72% of works calibrated for specific crops, phenological stages, topographic conditions or management regimes and rarely validated under substantially different environments), data-standardization gaps (81% of studies lacking harmonized protocols for sensor calibration, flight parameters, ground-truth collection or evaluation metrics), and economic feasibility being neglected (89% of technical papers omitting cost–benefit analysis, ROI calculations, or consideration of implementation barriers for smallholder farmers). These patterns make it difficult to compare results or to reuse pipelines across contexts and systems, even when predictive performance is high in the original setting. Taken together, they indicate that reproducibility and scalability remain the most persistent methodological bottlenecks and constitute a central thread for the discussion developed in the next subsection.
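To make the comparability problem raised in the Performance item concrete, the three metric families most often reported (R², RMSE, IoU) can be sketched as follows; all data values and box coordinates are illustrative only.

```python
# Sketch of the heterogeneous metrics reported across studies.
# R2 and RMSE apply to regression (e.g., yield or biomass estimation),
# while IoU applies to detection/segmentation (e.g., weed or pest mapping),
# which is one reason cross-study comparison is difficult.
from math import sqrt

def rmse(y_true, y_pred):
    return sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def r_squared(y_true, y_pred):
    mean_y = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean_y) ** 2 for t in y_true)
    return 1 - ss_res / ss_tot

def iou(box_a, box_b):
    """Intersection over union of two axis-aligned boxes (x1, y1, x2, y2)."""
    x1, y1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    x2, y2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area = lambda b: (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area(box_a) + area(box_b) - inter)

y_true, y_pred = [2.0, 3.0, 4.0, 5.0], [2.1, 2.9, 4.2, 4.8]
print(round(r_squared(y_true, y_pred), 3), round(rmse(y_true, y_pred), 3))
print(round(iou((0, 0, 2, 2), (1, 1, 3, 3)), 3))
```

Because these families measure different things on different scales, a study reporting R² = 0.9 and one reporting IoU = 70% cannot be ranked against each other, which is precisely the standardization gap identified above.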
The results indicate that the field of drones in smart agriculture is in a phase of empirical consolidation, driven by the integration of AI, advanced sensors, and multiscale modeling. Consequently, future research should prioritize the creation of comparative protocols and standardized repositories to enable reproducible performance assessment. Furthermore, the identified trends can be extended to sectors such as environmental management, green mining, and intelligent reforestation, strengthening the cross-sector applicability of UAV technology. Finally, the strong focus on crop yield studies highlights the need to balance research efforts toward socioeconomic, ethical, and sustainability dimensions to ensure a comprehensive and equitable impact.

4.2. Responses to the Research Questions

This section presents the responses derived from the systematic analysis of the 73 selected studies, aiming to address the research questions (RQs) formulated in the methodological phase. Each RQ is examined individually, grounding the interpretation of results in empirical evidence, bibliometric analysis, and thematic review. This approach makes it possible to identify both the consolidated advances and the existing gaps within the literature.
RQ1: 
What quartile levels are represented by the journals in which research on drones applied to smart agriculture has been published?
Figure 8 and Table 7 illustrate the distribution and impact of the analyzed publications according to the quartile level (Q1, Q2, and NQ) of the journals in which they were published. The network and Sankey diagrams visualize the connection between quartiles, databases, and publication years, while the table quantifies the citations and average performance of each group, providing a comprehensive view of the prestige and scientific reach of studies on drones in smart agriculture.
The results indicate a clear concentration of scientific production in Q1 journals, which include 66 of the 73 papers (90%) and accumulate 1,598 citations, with an average of 24 citations per paper. This finding reflects that research on drones applied to smart agriculture is predominantly published in high-impact journals indexed in databases such as Scopus and IEEE Xplore. In contrast, Q2 and NQ levels comprise only seven papers with a low average citation rate (4), evidencing limited visibility and thematic consolidation. The dominance of Q1 journals also reveals a strong link with specialized outlets in precision agriculture, remote sensing, and artificial intelligence, confirming their interdisciplinary positioning. Moreover, the temporal continuity from 2019 to 2025 suggests sustained growth and scientific maturity in the field. Collectively, the metrics demonstrate that research in this domain has reached a stage of methodological consolidation and international recognition.
According to Duarte and collaborators [67], 88.8% of the papers analyzed were published in Q1 journals, indicating a high academic quality in the scientific output on drones applied to smart agriculture. Consistently, Nduku and colleagues [107] report that Q1 journals dominate both Web of Science and Scopus, making this quartile the main benchmark for scientific dissemination in the field. Similarly, Lalrochunga, Parida, and Choudhury [108] confirm that most of the reviewed journals belong to this quartile, supporting their assertion with graphical representations and tables that strengthen the robustness of their findings. Furthermore, Brenya, Zhu, and Sampene [44] highlight that in 2021, the most representative quartile levels corresponded primarily to Q1, consolidating a sustained trend toward publication in high-impact journals. Overall, the convergence among these studies reflects a clear consensus on the predominance of Q1 as an indicator of excellence, visibility, and scientific leadership in the field of drone-mediated smart agriculture.
The concentration of studies in Q1 journals demonstrates the robustness of this area and its potential for transfer to other sectors, such as mining, forestry, and environmental management. From a business perspective, these results support the integration of drones and artificial intelligence into advanced monitoring systems and process optimization. Finally, the observed trend can be replicated across other geographic regions and future periods, consolidating smart agriculture as a global model of technological innovation and sustainability.
RQ1a: 
How do research topics differ across journal quartiles, and what thematic patterns characterize higher- versus lower-impact outlets?
Table 8 presents a trigram distribution across quartiles showing that Q1 journals overwhelmingly concentrate on the core vocabulary of the field, encompassing vegetation indices, remote sensing concepts, and AI/ML/DL models applied to drone imagery. In contrast, Q2 journals account for a very small share of occurrences, mainly reflecting isolated contributions on advanced methods such as convolutional neural networks, without configuring a distinct thematic niche. Rather than a clean gradient of topics, the results indicate that both thematic centrality and methodological sophistication are strongly concentrated in Q1, while Q2 and NQ provide complementary but clearly secondary contributions.
The analysis of journal quartile distribution reveals clear thematic differences across Q1, Q2, and non-indexed (NQ) publications. As shown in the results, the overwhelming majority of trigrams identified in the corpus are concentrated in Q1 journals (2508 occurrences; ≈95%), indicating that the core topics in drone-based smart agriculture are predominantly developed within high-impact outlets. These topics include difference vegetation index, unmanned aerial vehicles, normalized difference vegetation, machine learning algorithms/models, remote sensing data, UAV remote sensing, vegetation index NDVI, and root mean square, which represent mature research lines that combine multispectral indices, quantitative modelling, and UAV-based imaging for crop monitoring, stress detection, and yield estimation. In this sense, Q1 journals define the dominant models and trends of the field, positioning drones as high-resolution remote sensing platforms tightly coupled with AI/ML/DL pipelines and rigorous error metrics, fully aligned with the “issues, models, trends, and challenges” framework of this review.
In contrast, Q2 journals account for only 41 occurrences (≈1.6%), generally capturing more specialized or emerging themes, such as convolutional neural networks or neural networks CNNs, but with very limited frequency. This pattern suggests that, while Q2 outlets do contribute to methodological developments, particularly in deep learning and advanced image analysis, they do not constitute the central publication venues where the conceptual and methodological core of drone-based smart agriculture is being shaped. Their role is better understood as complementary, reinforcing selected cutting-edge techniques without substantially redefining the thematic structure established by Q1 journals.
Meanwhile, NQ journals, with 93 occurrences (≈3.5%), tend to present broader and less technically dense topics, often focused on general UAV descriptions or high-level ML concepts rather than detailed vegetation indices, sensor configurations, or quantitative evaluation metrics. These outlets occupy a more peripheral position in the publication system: their contribution is useful for contextual or introductory discussions, for proof-of-concept implementations, or for highly local applications, but remains marginal when compared with Q1 journals in terms of setting research priorities and methodological standards. Taken together, the quartile-stratified analysis for RQ1 shows that the key research topics, modelling practices, and evaluative conventions in drones for smart agriculture are overwhelmingly anchored in Q1 journals, whereas Q2 and NQ outlets provide incremental and context-specific contributions without establishing a distinct thematic niche of their own.
These quartile-stratified trigram patterns are consistent with the structural asymmetries discussed in Section 4.1: Q1 journals operate as global methodological reference venues, targeting a broad international audience and typically requiring data- and compute-intensive study designs. By contrast, Q2 and NQ outlets more often host exploratory or context-specific applications with narrower datasets and infrastructure, which helps to explain their lower thematic centrality and more peripheral role in shaping the research agenda.
RQ2: 
What countries lead research on drones applied to smart agriculture, and how is the global scientific cooperation network structured?
Figure 9 and Table 9 and Table 10 present the geographical distribution, academic impact, and international cooperation patterns of research on drones applied to smart agriculture, reflecting the leadership of certain countries in the scientific production of the field. The choropleth world map shows the spatial concentration of publications, while the bar chart highlights the ten most productive countries in terms of the number of papers. The small cooperation panel in Figure 9 displays the only international co-authorship link identified in the corpus (United States–China), underscoring the extremely sparse nature of global collaboration.
The results indicate that China leads in production with 19 papers and an H-Index of 32, demonstrating its scientific maturity and sustained capacity for knowledge generation. It is followed by the United States, with 11 papers and an H-Index of 26, consolidating its position as a benchmark in innovation and technological application. Italy, despite having only eight papers, exhibits the highest average number of citations per paper (47), reflecting its strong influence and specialization in precision agriculture and remote sensing. Countries such as Spain, Australia, and Canada show intermediate values, maintaining a good balance between quantity and impact. Likewise, South Africa and Pakistan stand out for their notable citation efficiency (H-Index of 35 and 26, respectively), indicating focused yet high-quality research. The global distribution demonstrates that scientific powers concentrate both productivity and impact, while emerging countries are beginning to consolidate themselves as new innovation hubs in smart agriculture.
Although the cooperation network extracted from the corpus is sparse and reveals only a single recurring international link, between the United States and China, this dyad represents the dominant collaborative structure in the dataset. No additional multi-country clusters were detected, indicating that international collaboration remains highly concentrated and limited in scope.
The country distribution reported in Table 9 allows for a clear distinction between high-income countries (e.g., China, the United States, and other highly represented economies) and low- and middle-income countries (e.g., India, South Africa, and other emerging economies that appear with fewer publications). This comparison shows that the bulk of the scientific production on drones in smart agriculture is concentrated in a relatively small group of high-income or upper-middle-income countries, while contributions from low- and middle-income countries remain comparatively limited despite their strong dependence on agriculture. This pattern reinforces the existence of a technological and knowledge gap that affects the Global South and motivates the context-specific recommendations proposed in the discussion section.
According to Darra and collaborators [36], China and the United States lead scientific production in the field of drones applied to smart agriculture, an outcome explained by their high level of investment in research and development. In line with this, Singh and colleagues [42] expand this perspective by including Spain among the countries with the greatest number of publications, thereby evidencing a progressive expansion of research toward Europe. Similarly, Rejeb and colleagues [26] concur in identifying these nations as the main poles of scientific knowledge production and dissemination in this field. It is also worth noting that Ndour and colleagues [109] reinforce this view by confirming that China and the United States account for the highest number of published studies, further consolidating their position as global leaders in agricultural technology development. Although Guimarães and collaborators [38] offer a complementary approach by emphasizing the joint leadership of North America and China, their analysis converges in the same direction, identifying a concentration of knowledge in regions with high scientific and technological capacity. Comparative studies reveal a clear hegemony of China and the United States in academic production, accompanied by the growing participation of European countries, shaping a global landscape of leadership with well-established research centers.
These patterns have concrete implications for countries in the Global South. The concentration of research outputs in high-income countries highlights the need for targeted knowledge-transfer mechanisms rather than generic collaboration strategies. For low- and middle-income countries such as India, South Africa or Pakistan, whose scientific presence is smaller but growing, progress in drone-based smart agriculture depends on (i) localized capacity-building programs focused on UAV operation, agronomic data interpretation and digital skills; (ii) cooperative or service-based drone models that reduce acquisition and maintenance costs for smallholder farmers; and (iii) regulatory frameworks that simplify certification and flight permissions for agricultural UAVs. Strengthening South–South collaboration networks is also essential, as these countries share similar crop types, production constraints, and infrastructural challenges. These actions help mitigate the technological and knowledge gap observed across income groups and enhance the practical usefulness of drone technologies in resource-constrained agricultural systems.
RQ3: 
What thematic categories are identified in studies addressing the use of drones and their influence on smart agriculture?
Figure 10 and Table 11 present the thematic map and the metrics associated with the density and centrality of the topics identified in the literature on drones and smart agriculture. The chart illustrates the strategic position of each theme within the scientific domain, classifying them into driving, basic, specialized, and marginal themes, while the table details the level of development (density), relevance (centrality), and the number of citations and documents linked to each category.
The results show that “UAS Phenotyping” stands out as a motor theme, with high density (0.98) and centrality (0.72), reflecting its strong methodological development and transversal relevance in the application of drones for crop phenotyping analysis. The basic themes, represented by “UAV CropSensing” and “Smart Farming,” exhibit high centrality and citation counts, confirming their structural role in consolidating the field of smart agriculture. In contrast, marginal themes such as “Agricultural AI,” “Drone Agriculture,” “Precision UAVs,” and “Precision Agriculture” display low density and centrality, indicating areas still in expansion or with limited conceptual cohesion. Nevertheless, their high number of citations and documents suggests that they represent emerging research lines with potential for further maturation. The predominance of topics related to artificial intelligence and precision agriculture systems demonstrates the growing convergence between automation, data analytics, and agricultural sustainability. Overall, the thematic map reflects a consolidating research ecosystem where drones act as an articulating axis between technological innovation and efficient resource management.
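The quadrant logic behind the thematic map can be sketched as a simple classification by density and centrality. Theme names loosely echo those discussed above; the numeric values and the 0.5 cut-offs are hypothetical (strategic diagrams often use the median of each axis instead).

```python
# Illustrative sketch of the strategic-diagram logic behind the thematic map:
# themes fall into motor / basic / specialized / marginal quadrants according
# to their density (development) and centrality (relevance). Values are toy.

D_THR, C_THR = 0.5, 0.5  # illustrative quadrant cut-offs

def quadrant(density, centrality):
    if density >= D_THR and centrality >= C_THR:
        return "motor"        # well developed and transversally relevant
    if density < D_THR and centrality >= C_THR:
        return "basic"        # structurally important, less developed
    if density >= D_THR and centrality < C_THR:
        return "specialized"  # developed but peripheral
    return "marginal"         # emerging or weakly connected

themes = {
    "UAS Phenotyping": (0.98, 0.72),
    "UAV CropSensing": (0.35, 0.80),
    "Smart Farming": (0.30, 0.75),
    "Precision Agriculture": (0.20, 0.30),
}
for name, (d, c) in themes.items():
    print(f"{name}: {quadrant(d, c)}")
```

Under these illustrative cut-offs, a high-density, high-centrality theme such as “UAS Phenotyping” lands in the motor quadrant, while low-density, low-centrality themes fall into the marginal quadrant, mirroring the pattern described for Figure 10 and Table 11.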
While ‘precision agriculture’ and ‘smart farming’ appear as basic/marginal themes, they lack integration with socioeconomic feasibility frameworks that would enable practical adoption in resource-constrained settings, representing a significant opportunity for future interdisciplinary research.
It is worth noting that Escandón-Panchana and collaborators [39] also identify “precision agriculture,” associated with UAVs, DEM, and LiDAR, as a motor theme, reinforcing the relevance of remote sensing systems in the field’s evolution. Although their study emphasizes integration with geomatics, the present analysis positions it as a marginal yet significant category due to its wide interdisciplinary connectivity. Complementarily, Armenta-Medina and colleagues [110] describe a three-stage thematic evolution in which “remote sensing” and “precision agriculture” emerge as universal topics in the most recent periods, driven by technologies such as UAVs, NDVI, and GIS. In contrast, their focus on advanced tools, such as DGPS and geospatial kriging, broadens the technical understanding of the domain. Altogether, the reviewed studies reveal that research on drones and smart agriculture has shifted from instrumental topics toward integrated approaches encompassing automation, geospatial analysis, and artificial intelligence, thus consolidating an interdisciplinary and continuously expanding field.
The identified themes can be applied to sectors such as forestry, environmental management, and agribusiness, where the use of drones and advanced analytics enhances the optimization of production processes. Moreover, the expansion of marginal topics into new geographic regions would allow the adaptation of smart agriculture models to diverse climatic and socioeconomic contexts. Finally, the longitudinal monitoring of these themes will help project the temporal evolution of scientific knowledge, promoting the integration of autonomous and machine learning technologies in the digital transformation of the global agricultural sector.
RQ4: 
How does the co-citation frequency among authors manifest in studies addressing the use of drones and their impact on smart agriculture?
The co-citation structure is examined through a global network (total) and a focused subgraph highlighting the top five most co-cited authors (Figure 11), enabling a combined macro–micro perspective on the intellectual foundations of the field. Table 12 complements these visualizations by quantifying the strength of each author–author linkage, ensuring precise interpretation of the most influential relationships.
As shown in the results, the full network reveals an extremely dense co-citation core, indicating a relatively concentrated set of foundational papers that guide the field’s conceptual and methodological evolution. The strongest links, Li X.–Wang J., Li X.–Zhang Z., Yang G.–Yang X., and Zhang L.–Zhang Z. (weight = 17), suggest a dominant cluster that centralizes the intellectual trajectory of the domain. According to the patterns displayed in the top-five subgraph, Zhu Y., Wang J., Zhang Z., Li X., and Yang G. function as the key bridging authors with high intermediary roles across research lines. Intermediate-strength links (weights 14–16), such as Cao W.–Tian Y., Feng H.–Yang G., and Li Z.–Yang G., act as connectors between the dominant core and peripheral subgroups, revealing thematic and methodological convergence across different research communities. The recurring author surnames (Li, Wang, Zhang, Yang, Zhu) point to a strong geographical concentration of the knowledge base, likely centered in Asian institutions, suggesting dependence on a relatively homogeneous epistemic community.
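Edge weights in a co-citation network of this kind are conventionally obtained by counting, for every pair of cited authors, how many papers in the corpus cite both. The sketch below illustrates the counting step only; the author names and reference sets are invented, not the actual data behind Table 12:

```python
# Toy author co-citation counter: two authors are co-cited each time a single
# paper's reference list contains works by both. All data here are invented.
from collections import Counter
from itertools import combinations

def cocitation_weights(reference_lists):
    """reference_lists: one set of cited author names per citing paper."""
    weights = Counter()
    for refs in reference_lists:
        for pair in combinations(sorted(refs), 2):  # unordered author pairs
            weights[pair] += 1
    return weights

papers = [
    {"Li X.", "Wang J.", "Zhang Z."},  # paper 1 cites all three authors
    {"Li X.", "Wang J."},              # paper 2
    {"Wang J.", "Yang G."},            # paper 3
]
weights = cocitation_weights(papers)
print(weights[("Li X.", "Wang J.")])  # co-cited by papers 1 and 2 -> 2
```

In practice, such pair counts are then passed to a network tool (e.g., VOSviewer or a graph library) for clustering and visualization.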
In comparison with other studies, Rejeb and colleagues [26] and Slimani, El Mhamdi, and Jilbab [105] also identified clusters of key authors through co-citation analysis, setting thresholds of 20 and 25 joint citations, respectively, which further supports the methodological robustness in identifying intellectual networks within the field. It is noteworthy that authors such as Girshick, He, and Zhang X. were reported as highly influential, in clear agreement with the findings of this paper, which highlight strong relationships among recurrent researchers. Similarly, Díaz Lara and collaborators [35] and Mustafa and colleagues [37] emphasized the relevance of Zhang and Wang, thereby confirming the thematic consistency and persistence of their contributions throughout the literature. Although co-citation patterns exhibit slight variations among studies, all converge in recognizing a stable core of authors driving the scientific advancement of the area. In contrast, Wang and colleagues [43] adopted a more dynamic approach by linking co-citations with evolving thematic clusters, identifying nodes such as Colomina I. as central references. These comparisons reveal a consolidated and expanding academic network in which certain authors act as intellectual anchors, articulating the evolution of knowledge on drones and smart agriculture.
These findings indicate the need to systematically contrast the dominant cluster’s contributions with evidence from other sectors and business domains (e.g., logistics, manufacturing, energy, services) where similar technologies may be applied. The observed geographical concentration underscores the importance of expanding bibliographic searches and bibliometric analyses to other regions (e.g., Europe, Latin America, Africa) to capture contextual variations and mitigate potential regional bias. Moreover, this co-citation structure should be re-evaluated in future periods to assess whether the identified core stabilizes as a long-term intellectual canon or whether emerging authors and schools reshape the field’s knowledge architecture.
RQ5: 
How do keywords frequently co-occur in studies addressing drones and their impact on smart agriculture?
Figure 12 and Table 13 illustrate the most frequent conceptual relationships identified within the scientific literature on drones and smart agriculture. The “Weight” column represents the frequency (count) with which each pair of keywords co-occurs in the reviewed papers, indicating how often these concepts appear together within the same study. This analysis reveals the terms that cluster around central thematic axes, highlighting the synergies between techniques, technologies, and predominant applications across the reviewed studies.
The results show that “precision agriculture” is the main node of the network, exhibiting the highest co-occurrences with terms such as “ml” (“machine learning”), “uav” (“unmanned aerial vehicle”), and “remote sensing”, which reflects the integration between machine learning, unmanned aerial vehicles, and remote sensing in agricultural optimization. The connection between “deep learning” and “precision agriculture” confirms the expansion of neural networks in crop recognition, yield estimation, and early disease detection. Associations with “vegetation indices” and “plant detection” reinforce the importance of spectral indicators and computer vision in agronomic decision-making. Likewise, the appearance of terms such as “3D point cloud from photogrammetry” and “leaf area index (LAI)” suggests the incorporation of three-dimensional models and structural metrics in the analysis of plant morphology. Taken together, the network shows a clear convergence among artificial intelligence, remote sensing, and spatial modeling as technological pillars of smart agriculture.
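Many of the vegetation indices appearing in this network are simple per-pixel band ratios. As a minimal sketch using the standard NDVI formula, (NIR − Red)/(NIR + Red), applied to invented reflectance values rather than real UAV imagery:

```python
# Minimal NDVI sketch: NDVI = (NIR - Red) / (NIR + Red), per pixel.
# The 2x2 reflectance grids are invented examples, not real UAV data.
def ndvi(nir, red, eps=1e-9):
    return [[(n - r) / (n + r + eps)            # eps avoids divide-by-zero
             for n, r in zip(n_row, r_row)]
            for n_row, r_row in zip(nir, red)]

nir = [[0.60, 0.55],   # dense canopy reflects strongly in the near-infrared
       [0.10, 0.50]]
red = [[0.10, 0.08],   # healthy vegetation absorbs red light
       [0.09, 0.30]]
for row in ndvi(nir, red):
    print([round(v, 2) for v in row])  # values near 1 = vigorous vegetation
```

Thresholding or regressing such indices against field measurements is precisely what links the “vegetation indices” node to “yield prediction” and “plant detection” in the co-occurrence network.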
Figure 13 illustrates the frequency and relevance of the keywords used in the reviewed papers. Both complementary representations reveal the conceptual structure of the research field, highlighting the terms that articulate the convergence between emerging technologies, artificial intelligence, and precision agricultural processes.
The results confirm that “precision agriculture” is the dominant term, with 36 occurrences, reaffirming its role as the central axis in the application of drones for efficient crop management. It is followed by “ml” (machine learning) and “remote sensing,” both appearing 14 times, indicating a strong integration between machine learning and remote sensing as key analytical tools. The terms “UAV” and “unmanned aerial vehicle” (12 and 10 occurrences, respectively) highlight the importance of aerial platforms in agricultural data acquisition. Likewise, “deep learning” and “vegetation indices” suggest the use of neural networks and advanced spectral metrics for crop detection and health monitoring. Words such as “object detection,” “smart farming,” and “yield prediction” consolidate the predictive and automated approach that characterizes digital agriculture. Altogether, the frequency and co-occurrence of these terms reveal a mature field with a strong technological foundation and a clear orientation toward sustainability and data-driven decision-making.
The most frequent keywords in this study, “precision agriculture,” “ml,” “remote sensing,” “uav,” “unmanned aerial vehicle,” and “deep learning,” coincide with those identified by Rejeb and collaborators [26], confirming the centrality of these concepts in the literature on drones and smart agriculture. Similarly, Gokool and colleagues [106] highlight “remote sensing,” “uavs,” and “machine learning” as core terms, consolidating their relevance as the main axes of scientific production. Meanwhile, studies such as those by Abrahams and colleagues [31], together with Yacoob and colleagues [46], broaden this framework by incorporating agro-environmental terms like “ndvi” and “canopy temperature,” reflecting a conceptual evolution toward the integration of vegetation and water-stress indicators. Although each paper differs in the ranking of terms, all converge in the recurrent presence of sensing and machine learning technologies. In contrast, Abdollahi and co-authors [41] offer a more structured perspective by organizing the keywords into five thematic clusters, evidencing the diversity and cross-disciplinary scope of the field. Comparative results confirm that “uav,” “ml,” and “rs” represent the terminological pillars of the domain, around which the other emerging research lines in smart agriculture revolve.
The observed trends can be extrapolated to other sectors, such as forestry, environmental management, and smart mining, where the combination of drones, machine learning, and remote sensing improves operational efficiency. In the business domain, these findings foster the creation of digital innovation ecosystems that integrate sensors, AI, and predictive analytics to optimize processes. Finally, the expansion of these research lines into new geographic regions and future periods will strengthen global precision agriculture, adapting it to different climatic and productive realities.
  • Application trends and country-level implications
Taken together, the evidence gathered in RQ1–RQ5 delineates a clear application trajectory for drones in smart agriculture. Early studies in the field concentrated on experimental plots and single high-value crops, using vegetation indices derived from UAV or satellite imagery to monitor crop vigor and estimate yield. Subsequent work progressively integrated machine and deep learning models, multispectral and thermal sensors, and wireless sensor networks, leading to increasingly automated pipelines for yield prediction, water-stress assessment, and disease detection. The most recent contributions combine UAVs with IoT architectures, cyber–physical systems, and digital platforms for farm management, moving from isolated experiments towards more operational decision-support systems.
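The index-to-yield step in these pipelines can be reduced, in its simplest form, to regressing measured yield on a UAV-derived index. The following sketch fits an ordinary least-squares line on invented plot-level NDVI/yield pairs; real studies use far richer features and machine learning models:

```python
# Toy yield-regression sketch: fit yield = a * NDVI + b by ordinary least
# squares. All plot-level values are invented for illustration.
def fit_line(x, y):
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    slope = (sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
             / sum((xi - mean_x) ** 2 for xi in x))
    return slope, mean_y - slope * mean_x

ndvi_plots = [0.45, 0.55, 0.65, 0.75]  # mean NDVI per plot (invented)
yield_t_ha = [3.1, 3.9, 4.8, 5.5]      # measured yield in t/ha (invented)
a, b = fit_line(ndvi_plots, yield_t_ha)
print(f"yield = {a:.2f} * NDVI + {b:.2f}")
```

The model-generalization challenge noted in this review shows up precisely here: coefficients fitted on one crop, season, or sensor rarely transfer unchanged to another.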
When these trends are contrasted with country realities, differentiated adoption pathways emerge. In high-income and upper-middle-income countries with consolidated farm structures and robust digital infrastructure, the frontier lies in scaling multi-sensor DL models, edge computing, and digital twins to optimize inputs and reduce environmental impacts. By contrast, in agrarian economies with a high share of smallholder farmers, limited connectivity, and constrained budgets, the most viable short- and medium-term strategies consist of low-cost RGB UAVs, service-based models (e.g., cooperative or “drone-as-a-service” schemes), and cloud platforms that lower entry barriers. For these contexts, prioritizing applications such as water-stress monitoring, basic yield mapping, and early detection of pests and diseases can deliver tangible benefits with relatively modest technological complexity. In this way, the identified research trends translate into differentiated, country-sensitive roadmaps that support the progressive and equitable adoption of drones in smart agriculture.

5. Conclusions and Future Research

The results of this systematic review reveal a landscape of methodological consolidation and thematic expansion regarding the use of drones in smart agriculture. First, in relation to RQ1, there is a clear concentration of scientific production in Q1 journals, reflecting high methodological rigor and the growing relevance of the topic in prestigious international academic forums. This predominance indicates that research in this area has reached sufficient maturity to position itself at the core of the global scientific agenda focused on agricultural digitalization and productive sustainability. Concerning RQ2, the most productive countries, primarily China, the United States, and Spain, demonstrate strong scientific and technological leadership, supported by their investment in R&D and advanced digital infrastructure. This geographical dominance reveals an asymmetry in knowledge generation, where nations with greater technological resources drive the frontier of innovation, while developing regions advance more slowly in the effective adoption of these emerging technologies. RQ3, in turn, identifies a set of robust thematic categories centered on precision agriculture, remote sensing, and deep learning, which structure the conceptual map of the field. The evolution toward integrative topics, such as agricultural AI and UAV crop sensing, illustrates a trend toward the convergence of analytical models, intelligent automation, and predictive crop management. RQ4 and RQ5 further reveal consolidated co-citation structures and stable co-occurrence patterns in which authors such as Zhang, Wang, and Girshick, together with core topics like vegetation indices, machine learning, and UAV remote sensing, act as central nodes of the knowledge network in this domain. Taken together, the findings reflect a growing discipline that combines analytical rigor, technological applicability, and international synergy.
In summary, the reviewed papers confirm that research on drones in Smart Agriculture has moved beyond exploratory stages, reaching a phase of maturity characterized by a strong methodological foundation, empirical validation, and global articulation. However, challenges persist regarding metric standardization, sensor interoperability, and technological accessibility in small-scale agricultural contexts. These limitations highlight the urgent need for standardized datasets and benchmarking protocols that enable fair comparison of methodologies and promote technological transfer to Global South regions.
From a practical standpoint, the synthesis presented in this review suggests several priority lines of action for governments, funding agencies, and agricultural stakeholders. Countries seeking to accelerate drone adoption in Smart Agriculture should invest in shared data infrastructures, such as open, quality-controlled repositories of UAV imagery and ground-truth measurements, to lower entry barriers for researchers, start-ups, and cooperatives. In parallel, policy instruments that support cooperative or service-based models (for example, drone-as-a-service schemes operated by producer organizations, extension services, or local SMEs) can make UAV technologies accessible to smallholder farmers who cannot afford individual ownership. In contexts with limited connectivity and restricted budgets, it is also crucial to promote low-cost RGB platforms, simplified analytical workflows, and cloud-based processing services, prioritizing applications with immediate impact such as water-stress monitoring, basic yield mapping, and early detection of pests and diseases.
These equity-oriented recommendations should be interpreted in light of the national productivity and collaboration patterns identified in this review, where a small group of high-output countries concentrates most publications and occupies central positions in co-authorship networks, while many regions, particularly those dominated by smallholder agriculture, appear as marginal contributors. Based on this asymmetry, a more practical, tiered approach to technology dissemination and capacity-building can be envisaged. For high-capacity research hubs, priority actions include leading open data and code initiatives, coordinating multi-country UAV testbeds, and designing interoperable platforms that can be reused by partners with fewer resources. For intermediate economies, joint projects, co-supervised theses, and regional training schools can help translate methodological advances into locally adapted solutions. In smallholder-dominated regions, emphasis should be placed on low-cost service models (e.g., UAV service providers or cooperatives), training of extension agents and local technicians, and participatory schemes for data governance, ensuring that drones and AI are embedded in existing sociotechnical systems rather than imported as isolated technological artefacts. In all cases, funding agencies and international programs have a critical role in incentivizing collaborative consortia that include underrepresented regions as equal partners, rather than mere data sources.
From a theoretical perspective, this review contributes to the literature by positioning drones as central enabling technologies within a multi-layer sociotechnical system that integrates sensors, AI/ML/DL models, farm management platforms, and institutional arrangements. Our results show that research productivity, journal quartile, and topic centrality co-evolve with national digital capacity, making the technological equity gap between high-income and low/middle-income countries a structural property of the field rather than a marginal side effect. Moreover, the combined use of Kitchenham’s SLR guidelines, Petersen’s systematic mapping, and PRISMA 2020 reporting allows us to articulate a coherent framework that links methodological quality, bibliometric structure, and application domains. This framework provides a theoretically grounded lens through which future studies can analyze the evolution, diffusion, and governance of drone-based Smart Agriculture across diverse territorial and socioeconomic settings.
This review is subject to several limitations. First, the literature search was deliberately restricted to the period 2019–July 2025 and to five major academic databases. There is a substantial body of pre-2019 work on UAV-based remote sensing and digital agriculture, as well as studies indexed in additional specialized or regional databases. However, covering the entire historical trajectory and all possible sources would require a much broader and more heterogeneous corpus, together with a complete redesign of the search, screening, and quality-assessment procedures. In this study, we therefore focus on the most recent wave of AI/ML-enabled drone applications for smart agriculture, and we build on earlier reviews and bibliometric mappings that have already documented foundational developments. As a result, our findings should be interpreted as characterizing recent research dynamics within a clearly defined corpus, rather than providing an exhaustive census of all existing work on drones in agriculture.
Future research should focus on integrating advanced artificial intelligence, IoT sensors, and autonomous systems aimed at improving pest detection, water use efficiency, and yield prediction, while simultaneously embedding explicit ethical and data-governance frameworks. In addition to technical performance, future efforts should promote international collaborations and prioritize the creation of open, annotated, multi-crop, and multi-seasonal datasets that include raw sensor data (RGB, multispectral, thermal) and their corresponding reference measurements, enabling robust model benchmarking and transfer of learning across diverse agricultural contexts.
Advancing these agendas requires coordinated action from multiple stakeholders. Funding agencies and international programs can mandate open data, reproducible pipelines, and inclusive multi-country consortia. The academic community should design solid benchmarks, document methods and datasets, and provide interdisciplinary training linking agronomy, AI, and data governance. Industry associations and technology providers must offer interoperable platforms and business models (e.g., drone-as-a-service) adapted to smallholder realities. Policymakers should define clear rules on data ownership, privacy, and liability, while farmer organizations and cooperatives help set priorities, validate use cases in real conditions, and ensure fair benefit sharing. Through this shared responsibility, open datasets, consent mechanisms, and ethical governance can move from principles to concrete practice in Smart Agriculture.
Crucially, these developments must be accompanied by transparent rules for data sharing, participatory consent mechanisms, and policies that prevent the concentration of informational power in a small group of technology providers or large farms. To ensure effective and equitable adoption—particularly in family farming systems—research should also design integrated feasibility frameworks that combine technical performance metrics with cost–benefit analyses and assessments of digital literacy requirements, thereby aligning technological innovation with social, economic, and governance conditions in real-world agricultural settings. Additionally, interdisciplinary studies that integrate socioeconomic and environmental dimensions are essential to strengthen the sustainable and equitable adoption of drone technologies in agriculture globally, ensuring that the benefits of smart agriculture are not confined to a small group of technologically advanced countries but become accessible to the Global South as well.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/su18010507/s1, PRISMA 2020 checklist [111].

Author Contributions

Conceptualization, J.G.-C., J.E.-G. and C.B.-R.; methodology, J.G.-C., C.A.R. and J.N.V.; software, J.E.-G., C.A.T.R., J.G.-M. and F.A.-G.; validation, J.E.-G., C.B.-R., J.G.-C., F.A.-G. and C.A.T.R.; investigation, J.G.-C., J.N.V. and C.A.R.; resources, C.A.R. and J.G.-M.; writing—original draft preparation, J.E.-G. and C.B.-R.; writing—review and editing, J.G.-C., C.A.T.R. and C.A.R.; project administration, J.N.V., J.G.-M. and F.A.-G.; supervision, J.G.-M. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Universidad Nacional Federico Villarreal.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

No new data were created or analyzed in this study.

Conflicts of Interest

The authors declare no conflicts of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

  1. Afsar, M.M.; Iqbal, M.S.; Bakhshi, A.D.; Hussain, E.; Iqbal, J. MangiSpectra: A multivariate phenological analysis framework leveraging UAV imagery and LSTM for tree health and yield estimation in mango orchards. Remote Sens. 2025, 17, 703. [Google Scholar] [CrossRef]
  2. Ali, N.; Mohammed, A.; Bais, A.; Berraies, S.; Ruan, Y.; Cuthbert, R.D.; Sangha, J.S. Field-scale precision: Predicting grain yield of diverse wheat breeding lines using high-throughput UAV multispectral imaging. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2024, 17, 11419–11433. [Google Scholar] [CrossRef]
  3. Amarasingam, N.; Gonzalez, F.; Salgadoe, A.S.A.; Sandino, J.; Powell, K. Detection of white leaf disease in sugarcane crops using UAV-derived RGB imagery with existing deep learning models. Remote Sens. 2022, 14, 6137. [Google Scholar] [CrossRef]
  4. Bah, M.D.; Hafiane, A.; Canals, R. CRowNet: Deep network for crop row detection in UAV images. IEEE Access 2019, 8, 5189–5200. [Google Scholar] [CrossRef]
  5. Dorbu, F.; Hashemi-Beni, L. Detection of individual corn crop and canopy delineation from unmanned aerial vehicle imagery. Remote Sens. 2024, 16, 2679. [Google Scholar] [CrossRef]
  6. Guimarães, N.; Sousa, J.J.; Couto, P.; Bento, A.; Pádua, L. Combining UAV-based multispectral and thermal infrared data with regression modeling and SHAP analysis for predicting stomatal conductance in almond orchards. Remote Sens. 2024, 16, 2467. [Google Scholar] [CrossRef]
  7. Htun, N.-N.; Rojo, D.; Ooge, J.; De Croon, R.; Kasimati, A.; Verbert, K. Developing visual-assisted decision support systems across diverse agricultural use cases. Agriculture 2022, 12, 1027. [Google Scholar] [CrossRef]
  8. Gupta, R.; Bhatnagar, V.; Kumar, G.; Singh, G. Selection of suitable IoT-based end-devices, tools, and technologies for implementing smart farming: Issues and challenges. Int. J. Stud. Res. Technol. Manag. 2022, 10, 28–35. [Google Scholar] [CrossRef]
  9. Chiu, M.S.; Wang, J. Evaluation of machine learning regression techniques for estimating winter wheat biomass using biophysical, biochemical, and UAV multispectral data. Drones 2024, 8, 287. [Google Scholar] [CrossRef]
  10. Comba, L.; Biglia, A.; Ricauda Aimonino, D.; Tortia, C.; Mania, E.; Guidoni, S.; Gay, P. Leaf area index evaluation in vineyards using 3D point clouds from UAV imagery. Precis. Agric. 2020, 21, 881–896. [Google Scholar] [CrossRef]
  11. Di Gennaro, S.; Dainelli, R.; Palliotti, A.; Toscano, P.; Matese, A. Sentinel-2 validation for spatial variability assessment in overhead trellis system viticulture versus UAV and agronomic data. Remote Sens. 2019, 11, 2573. [Google Scholar] [CrossRef]
  12. Guo, Y.; Guo, J.; Liu, C.; Xiong, H.; Chai, L.; He, D. Precision landing test and simulation of the agricultural UAV on apron. Sensors 2020, 20, 3369. [Google Scholar] [CrossRef]
  13. Brewer, K.; Clulow, A.; Sibanda, M.; Gokool, S.; Odindi, J.; Mutanga, O.; Naiken, V.; Chimonyo, V.G.P.; Mabhaudhi, T. Estimation of maize foliar temperature and stomatal conductance as indicators of water stress based on optical and thermal imagery acquired using an unmanned aerial vehicle (UAV) platform. Drones 2022, 6, 169. [Google Scholar] [CrossRef]
  14. Ndlovu, H.S.; Odindi, J.; Sibanda, M.; Mutanga, O. A systematic review on the application of UAV-based thermal remote sensing for assessing and monitoring crop water status in crop farming systems. Int. J. Remote Sens. 2024, 45, 4923–4960. [Google Scholar] [CrossRef]
  15. Bukowiecki, J.; Rose, T.; Holzhauser, K.; Rothardt, S.; Rose, M.; Komainda, M.; Herrmann, A.; Kage, H. UAV-based canopy monitoring: Calibration of a multispectral sensor for green area index and nitrogen uptake across several crops. Precis. Agric. 2024, 25, 1556–1580. [Google Scholar] [CrossRef]
  16. Flores Peña, P.; Ale Isaac, M.S.; Gîfu, D.; Pechlivani, E.M.; Ragab, A.R. Unmanned aerial vehicle-based hyperspectral imaging and soil texture mapping with robust AI algorithms. Drones 2025, 9, 129. [Google Scholar] [CrossRef]
  17. Gao, J.; Bambrah, C.K.; Parihar, N.; Kshirsagar, S.; Mallarapu, S.; Yu, H.; Wu, J.; Yang, Y. Analysis of various machine learning algorithms for using drone images in livestock farms. Agriculture 2024, 14, 522. [Google Scholar] [CrossRef]
  18. Chin, R.; Catal, C.; Kassahun, A. Plant disease detection using drones in precision agriculture. Precis. Agric. 2023, 24, 1663–1682. [Google Scholar] [CrossRef]
  19. Gao, M.; Yang, F.; Wei, H.; Liu, X. Automatic monitoring of maize seedling growth using unmanned aerial vehicle-based RGB imagery. Remote Sens. 2023, 15, 3671. [Google Scholar] [CrossRef]
  20. Silva, J.A.O.S.; de Siqueira, V.S.; Mesquita, M.; Vale, L.S.R.; da Silva, J.L.B.; da Silva, M.V.; Lemos, J.P.B.; Lacerda, L.N.; Ferrarezi, R.S.; de Oliveira, H.F.E. Artificial intelligence applied to support agronomic decisions for the automatic aerial analysis images captured by UAV: A systematic review. Agronomy 2024, 14, 2697. [Google Scholar] [CrossRef]
  21. Andrade-Mogollon, T.; Gamboa-Cruzado, J.; Amayo-Gamboa, F. Systematic literature review of generative AI and IoT as key technologies for precision agriculture. Comput. Sist. 2025, 29, 857–882. [Google Scholar] [CrossRef]
  22. Vasileiou, M.; Kyrgiakos, L.S.; Kleisiari, C.; Kleftodimos, G.; Vlontzos, G.; Belhouchette, H.; Pardalos, P.M. Transforming weed management in sustainable agriculture with artificial intelligence: A systematic literature review towards weed identification and deep learning. Crop Prot. 2024, 176, 106522. [Google Scholar] [CrossRef]
  23. Balyan, S.; Jangir, H.; Tripathi, S.N.; Tripathi, A.; Jhang, T.; Pandey, P. Seeding a sustainable future: Navigating the digital horizon of smart agriculture. Sustainability 2024, 16, 475. [Google Scholar] [CrossRef]
  24. Huang, L.; Tan, J.; Chen, Z. Mamba-UAV-SegNet: A multi-scale adaptive feature fusion network for real-time semantic segmentation of UAV aerial imagery. Drones 2024, 8, 671. [Google Scholar] [CrossRef]
  25. Sott, M.K.; Nascimento, L.d.S.; Foguesatto, C.R.; Furstenau, L.B.; Faccin, K.; Zawislak, P.A.; Mellado, B.; Kong, J.D.; Bragazzi, N.L. A bibliometric network analysis of recent publications on digital agriculture to depict strategic themes and evolution structure. Sensors 2021, 21, 7889. [Google Scholar] [CrossRef]
  26. Rejeb, A.; Abdollahi, A.; Rejeb, K.; Treiblmaier, H. Drones in agriculture: A review and bibliometric analysis. Comput. Electron. Agric. 2022, 198, 107017. [Google Scholar] [CrossRef]
  27. Kabir, M.S.; Pervez, A.K.M.K.; Jahan, M.N.; Khan, M.T.A.; Kabiraj, U.K. Unmanned aerial vehicles in agricultural sciences: A bibliometric analysis study based on Scopus database. Afr. J. Biol. Sci. 2024, 6, 14335–14357. [Google Scholar]
  28. Moraes, H.M.F.E.; Júnior, M.R.F.; Da Vitória, E.L.; Martins, R.N. A bibliometric and scientometric analysis on the use of UAVs in agriculture, livestock and forestry. Cienc. Rural 2023, 53, e20220130. [Google Scholar] [CrossRef]
  29. Mühl, D.D.; Oliveira, L. A bibliometric and thematic approach to agriculture 4.0. Heliyon 2022, 8, e09369. [Google Scholar] [CrossRef]
  30. Bertoglio, R.; Corbo, C.; Renga, F.M.; Matteucci, M. The digital agricultural revolution: A bibliometric analysis literature review. IEEE Access 2021, 9, 134762–134782. [Google Scholar] [CrossRef]
  31. Abrahams, M.; Sibanda, M.; Dube, T.; Chimonyo, V.G.P.; Mabhaudhi, T. A systematic review of UAV applications for mapping neglected and underutilised crop species’ spatial distribution and health. Remote Sens. 2023, 15, 4672. [Google Scholar] [CrossRef]
  32. Zambrano, P.; Calderon, F.; Villegas, H.; Paillacho, J.; Pazmiño, D.; Realpe, M. UAV remote sensing applications and current trends in crop monitoring and diagnostics: A systematic literature review. In Proceedings of the 2023 IEEE 13th International Conference on Pattern Recognition Systems (ICPRS), Guayaquil, Ecuador, 4–7 July 2023. [Google Scholar] [CrossRef]
  33. Bento, N.L.; Ferraz, G.A.E.S.; Santana, L.S.; Silva, M.D.L.O.E. Coffee growing with remotely piloted aircraft system: Bibliometric review. AgriEngineering 2023, 5, 2458–2477. [Google Scholar] [CrossRef]
  34. Marques, P.; Pádua, L.; Sousa, J.J.; Fernandes-Silva, A. Advancements in remote sensing imagery applications for precision management in olive growing: A systematic review. Remote Sens. 2024, 16, 1324. [Google Scholar] [CrossRef]
  35. de Jesus Diaz Lara, M.; Bernabe, J.G.; Benitez, R.A.G.; Toxqui, J.M.; Huerta, M.K. Bibliometric analysis of the use of the internet of things in precision agriculture. In Proceedings of the 2021 IEEE International Conference on Engineering Veracruz (ICEV), Boca del Río, Veracruz, Mexico, 25–28 October 2021. [Google Scholar] [CrossRef]
  36. Darra, N.; Anastasiou, E.; Kriezi, O.; Lazarou, E.; Kalivas, D.; Fountas, S. Can yield prediction be fully digitilized? A systematic review. Agronomy 2023, 13, 2441. [Google Scholar] [CrossRef]
  37. Mustafa, G.; Liu, Y.; Khan, I.H.; Hussain, S.; Jiang, Y.; Liu, J.; Arshad, S.; Osman, R. Establishing a knowledge structure for yield prediction in cereal crops using unmanned aerial vehicles. Front. Plant Sci. 2024, 15, 1401246. [Google Scholar] [CrossRef]
  38. Guimarães, N.; Sousa, J.J.; Pádua, L.; Bento, A.; Couto, P. Remote sensing applications in almond orchards: A comprehensive systematic review of current insights, research gaps, and future prospects. Appl. Sci. 2024, 14, 1749. [Google Scholar] [CrossRef]
  39. Escandón-Panchana, P.; Herrera-Franco, G.; Jaya-Montalvo, M.; Martínez-Cuevas, S. Geomatic tools used in the management of agricultural activities: A systematic review. Environ. Dev. Sustain. 2024, 27, 15275–15309. [Google Scholar] [CrossRef]
  40. Joice, A.; Tufaique, T.; Tazeen, H.; Igathinathane, C.; Zhang, Z.; Whippo, C.; Hendrickson, J.; Archer, D. Applications of Raspberry Pi for precision agriculture: A systematic review. Agriculture 2025, 15, 227. [Google Scholar] [CrossRef]
  41. Abdollahi, A.; Rejeb, K.; Rejeb, A.; Mostafa, M.M.; Zailani, S. Wireless sensor networks in agriculture: Insights from bibliometric analysis. Sustainability 2021, 13, 12011. [Google Scholar] [CrossRef]
  42. Singh, A.P.; Yerudkar, A.; Mariani, V.; Iannelli, L.; Glielmo, L. A bibliometric review of the use of unmanned aerial vehicles in precision agriculture and precision viticulture for sensing applications. Remote Sens. 2022, 14, 1604. [Google Scholar] [CrossRef]
  43. Wang, J.; Wang, S.; Zou, D.; Chen, H.; Zhong, R.; Li, H.; Zhou, W.; Yan, K. Social network and bibliometric analysis of unmanned aerial vehicle remote sensing applications from 2010 to 2021. Remote Sens. 2021, 13, 2912. [Google Scholar] [CrossRef]
  44. Brenya, R.; Zhu, J.; Sampene, A.K. Can agriculture technology improve food security in low- and middle-income nations? A systematic review. Sustain. Food Technol. 2023, 1, 484–499. [Google Scholar] [CrossRef]
  45. Wang, G.; Li, S.; Yi, Y.; Wang, Y.; Shin, C. Digital technology increases the sustainability of cross-border agro-food supply chains: A review. Agriculture 2024, 14, 900. [Google Scholar] [CrossRef]
  46. Yacoob, A.; Gokool, S.; Clulow, A.; Mahomed, M.; Mabhaudhi, T. Leveraging unmanned aerial vehicle technologies to facilitate precision water management in smallholder farms: A scoping review and bibliometric analysis. Drones 2024, 8, 476. [Google Scholar] [CrossRef]
  47. Kitchenham, B. Guidelines for Performing Systematic Literature Reviews in Software Engineering; Software Engineering Group, School of Computer Science and Mathematics, Keele University: Keele, UK; Department of Computer Science, University of Durham: Durham, UK, 2007. [Google Scholar]
  48. Petersen, K.; Vakkalanka, S.; Kuzniarz, L. Guidelines for conducting systematic mapping studies in software engineering: An update. Inf. Softw. Technol. 2015, 64, 1–18. [Google Scholar] [CrossRef]
  49. PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses. Available online: https://www.prisma-statement.org (accessed on 28 July 2025).
  50. Ivezić, A.; Trudić, B.; Stamenković, Z.; Kuzmanović, B.; Perić, S.; Ivošević, B.; Buđen, M.; Petrović, K. Drone-related agrotechnologies for precise plant protection in Western Balkans: Applications, possibilities, and legal framework limitations. Agronomy 2023, 13, 2615. [Google Scholar] [CrossRef]
  51. Jasim, A.N.; Fourati, L.C.; Albahri, O.S. Evaluation of unmanned aerial vehicles for precision agriculture based on integrated fuzzy decision-making approach. IEEE Access 2023, 11, 75037–75062. [Google Scholar] [CrossRef]
  52. Jindo, K.; Teklu, M.G.; van Boheeman, K.; Njehia, N.S.; Narabu, T.; Kempenaar, C.; Molendijk, L.P.G.; Schepel, E.; Been, T.H. Unmanned aerial vehicle (UAV) for detection and prediction of damage caused by potato cyst nematode G. pallida on selected potato cultivars. Remote Sens. 2023, 15, 1429. [Google Scholar] [CrossRef]
  53. Jorge, J.; Vallbé, M.; Soler, J.A. Detection of irrigation inhomogeneities in an olive grove using the NDRE vegetation index obtained from UAV images. Eur. J. Remote Sens. 2019, 52, 169–177. [Google Scholar] [CrossRef]
  54. Jurado, J.M.; Ortega, L.; Cubillas, J.J.; Feito, F.R. Multispectral mapping on 3D models and multi-temporal monitoring for individual characterization of olive trees. Remote Sens. 2020, 12, 1106. [Google Scholar] [CrossRef]
  55. Kakamoukas, G.A.; Lagkas, T.D.; Argyriou, V.; Goudos, S.K.; Grammatikis, P.R.; Bibi, S.; Sarigiannidis, P.G. A novel air-to-ground communication scheme for advanced big data collection in smart farming using UAVs. IEEE Access 2025, 13, 16564–16583. [Google Scholar] [CrossRef]
  56. Kapari, M.; Sibanda, M.; Magidi, J.; Mabhaudhi, T.; Nhamo, L.; Mpandeli, S. Comparing machine learning algorithms for estimating the maize crop water stress index (CWSI) using UAV-acquired remotely sensed data in smallholder croplands. Drones 2024, 8, 61. [Google Scholar] [CrossRef]
  57. Khaliq, A.; Comba, L.; Biglia, A.; Ricauda Aimonino, D.; Chiaberge, M.; Gay, P. Comparison of satellite and UAV-based multispectral imagery for vineyard variability assessment. Remote Sens. 2019, 11, 436. [Google Scholar] [CrossRef]
  58. Khun, K.; Tremblay, N.; Panneton, B.; Vigneault, P.; Lord, E.; Cavayas, F.; Codjia, C. Use of oblique RGB imagery and apparent surface area of plants for early estimation of above-ground corn biomass. Remote Sens. 2021, 13, 4032. [Google Scholar] [CrossRef]
  59. Killeen, P.; Kiringa, I.; Yeap, T.; Branco, P. Corn grain yield prediction using UAV-based high spatiotemporal resolution imagery, machine learning, and spatial cross-validation. Remote Sens. 2024, 16, 683. [Google Scholar] [CrossRef]
  60. Koubaa, A.; Ammar, A.; Abdelkader, M.; Alhabashi, Y.; Ghouti, L. AERO: AI-enabled remote sensing observation with onboard edge computing in UAVs. Remote Sens. 2023, 15, 1873. [Google Scholar] [CrossRef]
  61. Kovalev, I.V.; Kovalev, D.I.; Voroshilova, A.A.; Podoplelova, V.A.; Borovinsky, D.A. GERT analysis of UAV transport technological cycles when used in precision agriculture. IOP Conf. Ser. Earth Environ. Sci. 2022, 1076, 012055. [Google Scholar] [CrossRef]
  62. Li, M.; Shamshiri, R.R.; Schirrmann, M.; Weltzien, C.; Shafian, S.; Laursen, M.S. UAV oblique imagery with an adaptive micro-terrain model for estimation of leaf area index and height of maize canopy from 3D point clouds. Remote Sens. 2022, 14, 585. [Google Scholar] [CrossRef]
  63. Li, Z.; Zhou, X.; Cheng, Q.; Fei, S.; Chen, Z. A machine-learning model based on the fusion of spectral and textural features from UAV multi-sensors to analyse the total nitrogen content in winter wheat. Remote Sens. 2023, 15, 2152. [Google Scholar] [CrossRef]
  64. Liu, Y.; Feng, H.; Yue, J.; Fan, Y.; Jin, X.; Song, X.; Yang, H.; Yang, G. Estimation of potato above-ground biomass based on vegetation indices and green-edge parameters obtained from UAVs. Remote Sens. 2022, 14, 5323. [Google Scholar] [CrossRef]
  65. Liu, Z.; Li, H.; Ding, X.; Cao, X.; Chen, H.; Zhang, S. Estimating maize maturity by using UAV multi-spectral images combined with a CCC-based model. Drones 2023, 7, 586. [Google Scholar] [CrossRef]
  66. Dybå, T.; Dingsøyr, T. Empirical studies of agile software development: A systematic review. Inf. Softw. Technol. 2008, 50, 833–859. [Google Scholar] [CrossRef]
  67. Duarte, A.; Borralho, N.; Cabral, P.; Caetano, M. Recent advances in forest insect pests and diseases monitoring using UAV-based data: A systematic review. Forests 2022, 13, 911. [Google Scholar] [CrossRef]
  68. Alok, K.; Varshney, M. A study of the smart drone with an artificial neural network for precision farming. NeuroQuantology 2022, 20, 4454–4458. [Google Scholar]
  69. Vijayakumar, S.; Shanmugapriya, P.; Saravanane, P.; Ramesh, T.; Murugaiyan, V.; Ilakkiya, S. Precision weed control using unmanned aerial vehicles and robots: Assessing feasibility, bottlenecks, and recommendations for scaling. New Dev. Technol. 2025, 3, 10. [Google Scholar] [CrossRef]
  70. Farhad, M.M.; Kurum, M.; Gurbuz, A.C. A ubiquitous GNSS-R methodology to estimate surface reflectivity using spinning smartphone onboard a small UAS. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2023, 16, 6568–6578. [Google Scholar] [CrossRef]
  71. Hosseiny, B.; Rastiveis, H.; Homayouni, S. An automated framework for plant detection based on deep simulated learning from drone imagery. Remote Sens. 2020, 12, 3521. [Google Scholar] [CrossRef]
  72. Mia, M.S.; Tanabe, R.; Habibi, L.N.; Hashimoto, N.; Homma, K.; Maki, M.; Matsui, T.; Tanaka, T.S.T. Multimodal deep learning for rice yield prediction using UAV-based multispectral imagery and weather data. Remote Sens. 2023, 15, 2511. [Google Scholar] [CrossRef]
  73. Moresi, F.V.; Cirigliano, P.; Rengo, A.; Brunori, E.; Biasi, R.; Mugnozza, G.S.; Maesano, M. Monitoring abiotic stressors in rainfed vineyards involves combining UAV and field monitoring techniques to enhance precision management. Remote Sens. 2025, 17, 803. [Google Scholar] [CrossRef]
  74. Moysiadis, V.; Siniosoglou, I.; Kokkonis, G.; Argyriou, V.; Lagkas, T.; Goudos, S.K.; Sarigiannidis, P. Cherry tree crown extraction using machine learning based on images from UAVs. Agriculture 2024, 14, 322. [Google Scholar] [CrossRef]
  75. Nahrstedt, K.; Reuter, T.; Trautz, D.; Waske, B.; Jarmer, T. Classifying stand compositions in clover grass based on high-resolution multispectral UAV images. Remote Sens. 2024, 16, 2684. [Google Scholar] [CrossRef]
  76. Nanavati, R.V.; Meng, Y.; Coombes, M.; Liu, C. Generalized data-driven optimal path planning framework for uniform coverage missions using crop spraying UAVs. Precis. Agric. 2023, 24, 1497–1525. [Google Scholar] [CrossRef]
  77. Ndlovu, H.S.; Odindi, J.; Sibanda, M.; Mutanga, O.; Clulow, A.; Chimonyo, V.G.P.; Mabhaudhi, T. A comparative estimation of maize leaf water content using machine learning techniques and unmanned aerial vehicle (UAV)-based proximal and remotely sensed data. Remote Sens. 2021, 13, 4091. [Google Scholar] [CrossRef]
  78. Nnadozie, E.C.; Iloanusi, O.N.; Ani, O.A.; Yu, K. Detecting cassava plants under different field conditions using UAV-based RGB images and deep learning models. Remote Sens. 2023, 15, 2322. [Google Scholar] [CrossRef]
  79. Nuijten, R.J.G.; Kooistra, L.; De Deyn, G.B. Using unmanned aerial systems (UAS) and object-based image analysis (OBIA) for measuring plant-soil feedback effects on crop productivity. Drones 2019, 3, 54. [Google Scholar] [CrossRef]
  80. Ortenzi, L.; Violino, S.; Pallottino, F.; Figorilli, S.; Vasta, S.; Tocci, F.; Antonucci, F.; Imperi, G.; Costa, C. Early estimation of olive production from light drone orthophoto through canopy radius. Drones 2021, 5, 118. [Google Scholar] [CrossRef]
  81. De Padua, E.P.; Amongo, R.C.; Quilloy, E.P.; Suministrado, D.C.; Elauria, J.C. Development of a local unmanned aerial vehicle (UAV) pesticide sprayer for rice production system in the Philippines. IOP Conf. Ser. Mater. Sci. Eng. 2021, 1109, 12022. [Google Scholar] [CrossRef]
  82. Panjaitan, S.D.; Dewi, Y.S.K.; Hendri, M.I.; Wicaksono, R.A.; Priyatman, H. A drone technology implementation approach to conventional paddy fields application. IEEE Access 2022, 10, 120650–120658. [Google Scholar] [CrossRef]
  83. Pantos, C.; Hildmann, H.; Valente, J. Experimental connectivity analysis for drones in greenhouses. Drones 2022, 7, 24. [Google Scholar] [CrossRef]
  84. Quille-Mamani, J.; Ramos-Fernández, L.; Huanuqueño-Murillo, J.; Quispe-Tito, D.; Cruz-Villacorta, L.; Pino-Vargas, E.; del Pino, L.F.; Heros-Aguilar, E.; Ángel Ruiz, L. Rice yield prediction using spectral and textural indices derived from UAV imagery and machine learning models in Lambayeque, Peru. Remote Sens. 2025, 17, 632. [Google Scholar] [CrossRef]
  85. Rahman, M.F.F.; Fan, S.; Zhang, Y.; Chen, L. A comparative study on application of unmanned aerial vehicle systems in agriculture. Agriculture 2021, 11, 22. [Google Scholar] [CrossRef]
  86. Ren, J.; Zhang, N.; Liu, X.; Wu, S.; Li, D. Dynamic harvest index estimation of winter wheat based on UAV hyperspectral remote sensing considering crop aboveground biomass change and the grain filling process. Remote Sens. 2022, 14, 1955. [Google Scholar] [CrossRef]
  87. Ren, P.; Li, H.; Han, S.; Chen, R.; Yang, G.; Yang, H.; Feng, H.; Zhao, C. Estimation of soybean yield by combining maturity group information and unmanned aerial vehicle multi-sensor data using machine learning. Remote Sens. 2023, 15, 4286. [Google Scholar] [CrossRef]
  88. Ronchetti, G.; Mayer, A.; Facchi, A.; Ortuani, B.; Sona, G. Crop row detection through UAV surveys to optimize on-farm irrigation management. Remote Sens. 2020, 12, 1967. [Google Scholar] [CrossRef]
  89. Senyurek, V.; Farhad, M.M.; Gurbuz, A.C.; Kurum, M.; Adeli, A. Fusion of reflected GPS signals with multispectral imagery to estimate soil moisture at subfield scale from small UAS platforms. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2022, 15, 6843–6855. [Google Scholar] [CrossRef]
  90. Shahi, T.B.; Dahal, S.; Sitaula, C.; Neupane, A.; Guo, W. Deep learning-based weed detection using UAV images: A comparative study. Drones 2023, 7, 624. [Google Scholar] [CrossRef]
  91. Singh, K.; Huang, Y.; Young, W.; Harvey, L.; Hall, M.; Zhang, X.; Lobaton, E.; Jenkins, J.; Shankle, M. Sweet potato yield prediction using machine learning based on multispectral images acquired from a small unmanned aerial vehicle. Agriculture 2025, 15, 420. [Google Scholar] [CrossRef]
  92. Sofia, S.; Agosta, M.; Asciuto, A.; Crescimanno, M.; Galati, A. Unleashing profitability of vineyards through the adoption of unmanned aerial vehicles technology systems: The case of two Italian wineries. Precis. Agric. 2025, 26, 41. [Google Scholar] [CrossRef]
  93. Sunoj, S.; Cho, J.; Guinness, J.; van Aardt, J.; Czymmek, K.J.; Ketterings, Q.M. Corn grain yield prediction and mapping from unmanned aerial system (UAS) multispectral imagery. Remote Sens. 2021, 13, 3948. [Google Scholar] [CrossRef]
  94. Lara-Molina, F.A. Optimization of coverage path planning for agricultural drones in weed-infested fields using semantic segmentation. Agriculture 2025, 15, 1262. [Google Scholar] [CrossRef]
  95. Vélez, S.; Vacas, R.; Martín, H.; Ruano-Rosa, D.; Álvarez, S. A novel technique using planar area and ground shadows calculated from UAV RGB imagery to estimate pistachio tree (Pistacia vera L.) canopy volume. Remote Sens. 2022, 14, 6006. [Google Scholar] [CrossRef]
  96. Wang, L.; Lan, Y.; Zhang, Y.; Zhang, H.; Tahir, M.N.; Ou, S.; Liu, X.; Chen, P. Applications and prospects of agricultural unmanned aerial vehicle obstacle avoidance technology in China. Sensors 2019, 19, 642. [Google Scholar] [CrossRef] [PubMed]
  97. Wang, Y.; Xiao, C.; Wang, Y.; Li, K.; Yu, K.; Geng, J.; Li, Q.; Yang, J.; Zhang, J.; Zhang, M.; et al. Monitoring of cotton boll opening rate based on UAV multispectral data. Remote Sens. 2023, 16, 132. [Google Scholar] [CrossRef]
  98. Wei, L.; Yu, M.; Zhong, Y.; Zhao, J.; Liang, Y.; Hu, X. Spatial–spectral fusion based on conditional random fields for the fine classification of crops in UAV-borne hyperspectral remote sensing imagery. Remote Sens. 2019, 11, 780. [Google Scholar] [CrossRef]
  99. Wu, P.; Lei, X.; Zeng, J.; Qi, Y.; Yuan, Q.; Huang, W.; Ma, Z.; Shen, Q.; Lyu, X. Research progress in mechanized and intelligentized pollination technologies for fruit and vegetable crops. Int. J. Agric. Biol. Eng. 2024, 17, 11–21. [Google Scholar] [CrossRef]
  100. Xu, R.; Li, C.; Bernardes, S. Development and testing of a UAV-based multi-sensor system for plant phenotyping and precision agriculture. Remote Sens. 2021, 13, 3517. [Google Scholar] [CrossRef]
  101. Yan, P.; Han, Q.; Feng, Y.; Kang, S. Estimating LAI for cotton using multisource UAV data and a modified universal model. Remote Sens. 2022, 14, 4272. [Google Scholar] [CrossRef]
  102. Ye, N.; Walker, P.; Gao, Y.; PopStefanija, I.; Hills, J. Comparison between thermal-optical and L-band passive microwave soil moisture remote sensing at farm scales: Towards UAV-based near-surface soil moisture mapping. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2024, 17, 633–642. [Google Scholar] [CrossRef]
  103. Zhang, F.; Hassanzadeh, A.; Kikkert, J.; Pethybridge, S.J.; Van Aardt, J. Evaluation of leaf area index (LAI) of broadacre crops using UAS-based LiDAR point clouds and multispectral imagery. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2022, 15, 4027–4044. [Google Scholar] [CrossRef]
  104. Zheng, Z.; Yuan, J.; Yao, W.; Yao, H.; Liu, Q.; Guo, L. Crop classification from drone imagery based on lightweight semantic segmentation methods. Remote Sens. 2024, 16, 4099. [Google Scholar] [CrossRef]
  105. Slimani, H.; El Mhamdi, J.; Jilbab, A. Assessing the advancement of artificial intelligence and drones’ integration in agriculture through a bibliometric study. Int. J. Electr. Comput. Eng. 2024, 14, 878–890. [Google Scholar] [CrossRef]
  106. Gokool, S.; Mahomed, M.; Kunz, R.; Clulow, A.; Sibanda, M.; Naiken, V.; Chetty, K.; Mabhaudhi, T. Crop monitoring in smallholder farms using unmanned aerial vehicles to facilitate precision agriculture practices: A scoping review and bibliometric analysis. Sustainability 2023, 15, 3557. [Google Scholar] [CrossRef]
  107. Nduku, L.; Munghemezulu, C.; Mashaba-Munghemezulu, Z.; Kalumba, A.M.; Chirima, G.J.; Masiza, W.; De Villiers, C. Global research trends for unmanned aerial vehicle remote sensing application in wheat crop monitoring. Geomatics 2023, 3, 115–136. [Google Scholar] [CrossRef]
  108. Lalrochunga, D.; Parida, A.; Choudhury, S. Systematic review on capacity building through renewable energy enabled IoT–unmanned aerial vehicle for smart agroforestry. Clean. Circ. Bioeconomy 2024, 8, 100094. [Google Scholar] [CrossRef]
  109. Ndour, A.; Blasch, G.; Valente, J.; Gebrekidan, B.H.; Sida, T.S. Optimal machine learning algorithms and UAV multispectral imagery for crop phenotypic trait estimation: A comprehensive review and meta-analysis. Environ. Res. Commun. 2025, 7, 072002. [Google Scholar] [CrossRef]
  110. Armenta-Medina, D.; Ramirez-del Real, T.A.; Villanueva-Vásquez, D.; Mejia-Aguirre, C. Trends on advanced information and communication technologies for improving agricultural productivities: A bibliometric analysis. Agronomy 2020, 10, 1989. [Google Scholar] [CrossRef]
  111. Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E.; et al. The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. BMJ 2021, 372, n71. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Systematic literature review process.
Figure 2. Number of documents by source.
Figure 3. PRISMA 2020 flow diagram.
Figure 4. Management and standardization of bibliographic records in Mendeley Desktop.
Figure 5. Flow of processing and analysis of the selected studies.
Figure 6. Distribution of publications by year.
Figure 7. Topics (bigrams) in the papers.
Figure 8. Number of papers by quartile and source.
Figure 9. World map of total papers per country (top), top 10 most productive countries (bottom left), and country-level co-authorship link between the United States and China (bottom right).
Figure 10. Thematic map.
Figure 11. Bibliometric network by co-citations: (a) total; (b) top 5.
Figure 12. Bibliometric network of keywords.
Figure 13. Keywords per paper.
Table 1. Search descriptors and their synonyms organized by conceptual groups.

| Conceptual Group | Descriptor |
|---|---|
| Drone Technologies | drone / unmanned aerial vehicle / uav / unmanned aircraft system / uas / remotely piloted aircraft system / rpas / aerial vehicle / unpiloted aircraft / agricultural drone / spraying drone / monitoring drone / surveillance drone / mapping drone |
| Smart Agriculture | agriculture 4.0 / digital agriculture / smart agriculture / precision agriculture / agroindustry 4.0 / agrotechnology / connected agriculture / data-driven agriculture / automated agriculture / iot agriculture / artificial intelligence agriculture / machine learning agriculture |
Table 2. Results of the quality assessment.

| Reference | Type | QA1 | QA2 | QA3 | QA4 | QA5 | QA6 | QA7 | Score |
|---|---|---|---|---|---|---|---|---|---|
| [1] | Journal | 2 | 1 | 1 | 1 | 3 | 2 | 3 | 13 |
| [2] | Journal | 2 | 3 | 1 | 2 | 1 | 2 | 1 | 12 |
| [68] | Journal | 2 | 2 | 1 | 2 | 3 | 1 | 1 | 12 |
| [3] | Journal | 3 | 1 | 1 | 2 | 1 | 3 | 2 | 13 |
| [69] | Journal | 2 | 3 | 1 | 2 | 2 | 2 | 1 | 13 |
| [4] | Journal | 1 | 1 | 1 | 3 | 3 | 3 | 1 | 13 |
| [23] | Journal | 3 | 1 | 3 | 1 | 1 | 3 | 1 | 13 |
| [13] | Journal | 2 | 1 | 2 | 2 | 1 | 2 | 2 | 12 |
| [15] | Journal | 2 | 2 | 2 | 2 | 1 | 1 | 2 | 12 |
| [18] | Journal | 1 | 1 | 3 | 2 | 1 | 2 | 2 | 12 |
| [9] | Journal | 2 | 2 | 2 | 2 | 1 | 2 | 2 | 13 |
| [10] | Journal | 1 | 1 | 2 | 2 | 3 | 2 | 1 | 12 |
| [11] | Journal | 3 | 2 | 2 | 3 | 1 | 2 | 1 | 14 |
| [5] | Journal | 1 | 2 | 2 | 2 | 2 | 2 | 1 | 12 |
| [70] | Journal | 3 | 3 | 1 | 1 | 3 | 1 | 1 | 13 |
| [16] | Journal | 3 | 2 | 2 | 1 | 2 | 3 | 2 | 15 |
| [17] | Journal | 1 | 3 | 1 | 3 | 3 | 1 | 1 | 13 |
| [19] | Journal | 2 | 2 | 2 | 3 | 1 | 1 | 2 | 13 |
| [6] | Journal | 2 | 1 | 3 | 3 | 2 | 1 | 1 | 13 |
| [12] | Journal | 1 | 1 | 2 | 3 | 3 | 1 | 3 | 14 |
| [8] | Journal | 2 | 2 | 1 | 2 | 2 | 2 | 2 | 13 |
| [71] | Journal | 2 | 2 | 3 | 1 | 2 | 2 | 1 | 13 |
| [7] | Journal | 1 | 2 | 2 | 2 | 1 | 2 | 2 | 12 |
| [24] | Journal | 3 | 1 | 1 | 2 | 2 | 2 | 1 | 12 |
| [50] | Journal | 2 | 2 | 1 | 3 | 1 | 2 | 2 | 13 |
| [51] | Journal | 1 | 3 | 2 | 2 | 2 | 1 | 2 | 13 |
| [52] | Journal | 2 | 2 | 2 | 2 | 2 | 1 | 2 | 13 |
| [53] | Journal | 3 | 2 | 1 | 2 | 2 | 2 | 1 | 13 |
| [54] | Journal | 1 | 1 | 3 | 2 | 2 | 2 | 2 | 13 |
| [55] | Journal | 2 | 1 | 2 | 2 | 2 | 2 | 1 | 12 |
| [56] | Journal | 3 | 3 | 1 | 1 | 1 | 1 | 2 | 12 |
| [57] | Journal | 2 | 2 | 3 | 1 | 2 | 2 | 3 | 15 |
| [58] | Journal | 1 | 3 | 2 | 2 | 3 | 2 | 2 | 15 |
| [59] | Journal | 3 | 2 | 2 | 3 | 1 | 2 | 2 | 15 |
| [60] | Journal | 2 | 1 | 2 | 2 | 2 | 1 | 2 | 12 |
| [61] | Journal | 3 | 1 | 2 | 1 | 1 | 2 | 2 | 12 |
| [62] | Journal | 2 | 1 | 2 | 2 | 1 | 2 | 2 | 12 |
| [63] | Journal | 3 | 2 | 1 | 2 | 2 | 1 | 2 | 13 |
| [64] | Journal | 1 | 3 | 2 | 2 | 3 | 1 | 2 | 14 |
| [65] | Journal | 2 | 2 | 3 | 1 | 1 | 1 | 3 | 13 |
| [72] | Journal | 3 | 2 | 2 | 2 | 1 | 1 | 3 | 14 |
| [73] | Journal | 2 | 3 | 2 | 1 | 1 | 2 | 2 | 13 |
| [74] | Journal | 1 | 2 | 3 | 2 | 2 | 3 | 2 | 15 |
| [75] | Journal | 2 | 3 | 1 | 2 | 2 | 3 | 2 | 15 |
| [76] | Journal | 2 | 2 | 2 | 1 | 2 | 1 | 2 | 12 |
| [77] | Journal | 2 | 2 | 1 | 2 | 1 | 3 | 3 | 14 |
| [78] | Journal | 3 | 3 | 2 | 1 | 2 | 1 | 2 | 14 |
| [79] | Journal | 1 | 2 | 3 | 3 | 2 | 2 | 2 | 15 |
| [80] | Journal | 1 | 1 | 2 | 2 | 2 | 2 | 2 | 12 |
| [81] | Journal | 2 | 2 | 2 | 2 | 1 | 2 | 2 | 13 |
| [82] | Journal | 1 | 1 | 2 | 2 | 2 | 1 | 3 | 12 |
| [83] | Journal | 3 | 3 | 2 | 2 | 1 | 2 | 2 | 15 |
| [84] | Journal | 1 | 3 | 3 | 1 | 2 | 2 | 2 | 14 |
| [85] | Journal | 2 | 2 | 2 | 2 | 1 | 1 | 2 | 12 |
| [86] | Journal | 3 | 1 | 3 | 2 | 2 | 2 | 2 | 15 |
| [87] | Journal | 1 | 1 | 2 | 2 | 2 | 2 | 2 | 12 |
| [88] | Journal | 2 | 2 | 2 | 2 | 2 | 1 | 2 | 13 |
| [89] | Journal | 3 | 2 | 2 | 1 | 2 | 1 | 2 | 13 |
| [90] | Journal | 2 | 3 | 2 | 2 | 2 | 1 | 1 | 13 |
| [91] | Journal | 1 | 2 | 1 | 2 | 3 | 2 | 2 | 13 |
| [92] | Journal | 3 | 2 | 1 | 2 | 1 | 3 | 1 | 13 |
| [93] | Journal | 2 | 2 | 3 | 1 | 1 | 1 | 2 | 12 |
| [94] | Journal | 1 | 3 | 2 | 1 | 2 | 2 | 2 | 13 |
| [95] | Journal | 3 | 2 | 1 | 2 | 1 | 2 | 3 | 14 |
| [96] | Journal | 2 | 2 | 2 | 2 | 2 | 2 | 1 | 13 |
| [97] | Journal | 1 | 3 | 1 | 2 | 2 | 1 | 2 | 12 |
| [98] | Journal | 2 | 1 | 2 | 1 | 2 | 2 | 2 | 12 |
| [99] | Journal | 3 | 2 | 1 | 1 | 3 | 2 | 1 | 13 |
| [100] | Journal | 2 | 2 | 2 | 2 | 1 | 3 | 1 | 13 |
| [101] | Journal | 2 | 2 | 2 | 1 | 2 | 1 | 2 | 12 |
| [102] | Journal | 1 | 3 | 1 | 2 | 2 | 1 | 2 | 12 |
| [103] | Journal | 3 | 1 | 2 | 1 | 2 | 2 | 2 | 13 |
| [104] | Journal | 2 | 3 | 2 | 1 | 1 | 2 | 3 | 14 |
Table 3. Distribution of publications by year.

| Publication Name | 2019 | 2020 | 2021 | 2022 | 2023 | 2024 | 2025 | Total |
|---|---|---|---|---|---|---|---|---|
| Remote Sensing | 3 | 3 | 4 | 6 | 7 | 5 | 3 | 31 |
| Drones | 1 | 0 | 2 | 2 | 2 | 3 | 0 | 10 |
| IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing | 0 | 0 | 0 | 2 | 1 | 2 | 0 | 5 |
| Precision Agriculture | 0 | 1 | 0 | 0 | 2 | 1 | 1 | 5 |
| Agriculture | 0 | 0 | 0 | 1 | 0 | 2 | 1 | 4 |
| IEEE Access | 0 | 1 | 0 | 1 | 1 | 0 | 1 | 4 |
| Sensors | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
| Acta Agriculturae Scandinavica, Section B-Soil and Plant Science | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
| Agronomy | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
| European Journal of Agronomy | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
| European Journal of Remote Sensing | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
| Indian Journal of Computer Science and Engineering | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 |
| International Journal of Agricultural and Biological Engineering | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 |
| International Journal of Students’ Research in Technology and Management | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 |
| IOP Conference Series: Earth and Environmental Science | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 |
| IOP Conference Series: Materials Science and Engineering | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
| Total | 6 | 6 | 7 | 16 | 15 | 15 | 8 | 73 |
Table 4. Research impact by year.

| Year | N° Papers | Papers (%) | N° Citations | Citations (%) | H-Index | H-Index (%) | Citations/Paper |
|---|---|---|---|---|---|---|---|
| 2022 | 16 | 21.9 | 269 | 16.5 | 2181 | 19.2 | 16.8 |
| 2023 | 15 | 20.5 | 295 | 18.1 | 2593 | 22.8 | 19.7 |
| 2024 | 15 | 20.5 | 133 | 8.2 | 2071 | 18.2 | 8.9 |
| 2025 | 8 | 11.0 | 2 | 0.1 | 1182 | 10.4 | 0.3 |
| 2021 | 7 | 9.6 | 137 | 8.4 | 1011 | 8.9 | 19.6 |
| 2019 | 6 | 8.2 | 439 | 27.0 | 1039 | 9.1 | 73.2 |
| 2020 | 6 | 8.2 | 352 | 21.6 | 1312 | 11.5 | 58.7 |
| Total | 73 | 100.0 | 1627 | 100.0 | 11,389 | 100.0 | 22.3 |
Table 5. Topics by journal quartile.

| Bigram | Q1 | Q2 | NQ | Total |
|---|---|---|---|---|
| remote sensing | 43 | 0 | 0 | 43 |
| vegetation indices | 24 | 0 | 0 | 24 |
| machine learning | 21 | 0 | 2 | 23 |
| precision agriculture | 20 | 0 | 1 | 21 |
| vegetation index | 16 | 0 | 0 | 16 |
| deep learning | 12 | 1 | 1 | 14 |
| random forest | 12 | 0 | 1 | 13 |
| spatial resolution | 13 | 0 | 0 | 13 |
| growth stages | 11 | 0 | 0 | 12 |
| unmanned aerial | 9 | 0 | 3 | 12 |
| data collection | 10 | 0 | 0 | 10 |
| ground truth | 9 | 1 | 0 | 10 |
| study area | 10 | 0 | 0 | 10 |
| growth stage | 9 | 0 | 0 | 9 |
| linear regression | 9 | 0 | 0 | 9 |
| red edge | 9 | 0 | 0 | 9 |
| Total | 3214 | 99 | 192 | 3505 |
Table 6. Relevant results from the reviewed publications.

| Method Category | Methods Used | Datasets | Performance | Key Contributions | Limitations | Refs. | Qty. (%) |
|---|---|---|---|---|---|---|---|
| Calibration & Multi-sensor Integration | UAV multispectral calibration; multi-sensor fusion (RGB; multispectral; thermal; hyperspectral); AI integration for soil/texture | Multi-year field trials; cotton field ground data; UAV images (RGB/MS/TIR) | MAE(GAI) = 0.19–0.48 m2/m2; MAE(N) = 0.80–1.21 g/m2; NDVI error = 6.6% (canopy); thermal Δ = 1.02 °C | Crop/season-specific calibration frameworks; open-sourced multi-sensor UAV design; improved fusion pipelines | Limited generalizability across growth stages/environments; calibration complexity; compute/resource demands | [15,16,100] | 3 (4.5) |
| Crop Row & Field-Structure Mapping | CNN + Hough (CRowNet); thresholding/segmentation; OBIA/MIRS; computational geometry | UAV RGB/MS orthomosaics; public beet field images; UAS imagery (NL) | Detection rate = 93.6%; IoU > 70%; OA > 90%; R (crop volume) = 0.71 | Robust pipelines for row/canopy extraction; fast canopy volume estimation; cross-crop heuristics | Shadows and weeds degrade accuracy; crop/phenology specificity; DEM/terrain error propagation | [4,5,79,88] | 4 (6.1) |
| Cross-domain AI (Non-agri, outlier) | LSTM; Seq2Seq; Attention; BLEU evaluation | Dialogue dataset | BLEU-4 = 0.8537 | Benchmarks sequential models and attention | Domain mismatch vs. agri; usability not assessed | [99] | 1 (1.5) |
| Economic Evaluation & DSS/MCDM | Web-based DSS; usability tests; FWZIC/FDOSM (fuzzy MCDM) | Various agri datasets; user studies; vineyard interviews | High usability; mental demand “slightly high”; profit ↑ 12.4%; cost −€6.1/ha | Six interactive DSS modules (open code); UAV profitability evidence; criteria weights for UAV selection | Cognitive load; sample/context specificity; unmodeled features/costs | [7,51,92] | 3 (4.5) |
| Remote Sensing Methodology (UAV vs. Satellite/Index Studies) | NDVI/NDRE/GNDVI/EVI2 comparison; photogrammetry; S2 vs. UAV correlation | UAV RGB/MS; Sentinel-2; field sampling | r = 0.60–0.80 (UAV vs. S2); NDRE improves irrigation issue detection | Validates UAV superiority for vigor; NDRE efficacy; multi-scale comparability | Inter-row bias in satellites; crop/terrain specificity | [11,53,57] | 3 (4.5) |
| Reviews & Surveys (Systematic/Scoping) | Systematic literature review; comparative analysis; digitalization reviews | Literature corpora (155+ studies; last decade) | NR | Comprehensive maps of diseases/tech; barriers and interventions; legal/regulatory landscape | Time-window bias; heterogeneous data quality; limited economic evidence | [8,18,23,50,69,85] | 6 (9.1) |
| Soil/Water Status & Moisture Retrieval | GNSS-R; L-band passive microwave; thermal-optical; RF/ML for EWT/FMC/SLA | Ground SM; airborne L-band; UAV TIR/MS; smallholder plots | RMSE(SM) = 0.05–0.09 m3/m3; rRMSE(EWT) = 3.13%; rRMSE(FMC) = 1% | Sub-field SM at low cost; CWSI tracking across phenology; water-status monitoring framework | Site-specific calibration; sensing-depth mismatch; vegetation cover effects | [70,73,77,89,102] | 5 (7.6) |
| Time Series & Sequence Models (Agri) | LSTM (phenology-aware); decision tree; UAV spectral time series | UAV MS + weather + tree attributes | Accuracy = 93%; AUC = 0.85/0.96/0.92; R2(yield) = 0.21; RMSE = 50.18 | Phenology-conditioned health model; new mango indices; cumulative health index | Yield sub-model underfits; dependence on UAV pipelines | [1] | 1 (1.5) |
| UAV Systems, Operations, Edge & Communications | Precision landing; low-cost sprayers; path-planning (NN/optimization); FANET (AERO-FL); edge AI (YOLOv4/7, DeepSort); risk (GERT); comms (RSS/RTT) | Field tests (paddy/rice/PH); simulations; greenhouse comms | Landing error ≈ 6.8–13.3 cm; PDR = 98.5%; delay = 45 ms; FPS ≈ 15.5; capacity = 0.45 ha/h | Operational readiness (autopilot sprayers); real-time on-board AI; robust FANET routing; downtime risk modeling | Battery limits; comms loss; scenario-specific tuning; legal/operational barriers | [12,55,60,61,68,76,81,82,83,96] | 10 (15.2) |
| Pest/Disease/Weed & Individual-Plant Detection | XGBoost/RF/KNN (WLD); Faster-R-CNN/YOLOv5/8; OTSU; Detectron2; crown/mask extraction | UAV RGB/MS; cherry orchards; cassava sets; field RT images | Acc = 94% (WLD); F1 = 94.85%; IoU = 85.30%; mAP@0.5:0.95 = 0.960 | First UAV-ML WLD pipeline; real-time weed/cassava detection; precise crown delineation | Crop-specific tuning; small-object/occlusion errors; environment variability | [3,17,52,71,74,75,78,94] | 8 (12.1) |
| Yield & Biomass Estimation (Supervised/Ensemble) | RF/SVR/PLSR/GPR/Ridge/ElasticNet; XGBoost; stacking; linear models | UAV RGB/MS/HS; ground truth yield/biomass; multi-temporal plots | R2 up to 0.92; RMSE = 12.5; CC(RF) = 0.81; RPD = 1.867; mAPE/NRMSE reported | Ensemble/fusion boosts accuracy; optimal timing (flowering/dough); RGB vs. MS trade-offs | Spatial autocorrelation/overfitting; crop/variety specificity; stage dependence | [2,9,58,59,63,64,65,72,80,84,86,87,91,93,97] | 15 (22.8) |
| Phenotyping & 3D/LAI Canopy Modeling | 3D point clouds; LiDAR/MSI fusion; oblique UAV geometry; LAI/height mapping; seedling growth (MCDI) | UAV RGB/MS; LiDAR; ground LAI/height; high-res seedling images | R2(LAI) = 0.78–0.89; R2 > 0.8 (OBIA canopy volume); F1 > 98.5% (seedlings) | Cost-effective LAI/height maps; 3D morphology tracking; automated phenotyping | Cultivar/altitude/occlusion sensitivity; dense canopies | [10,19,54,62,95,101,103] | 7 (10.6) |
Table 7. Impact of papers by quartile.

| Quartile | Total Citations | No. of Papers | Citations per Paper |
|---|---|---|---|
| Q1 | 1598 | 66 | 24 |
| Q2 | 7 | 2 | 4 |
| NQ | 22 | 5 | 4 |
| Total | 1627 | 73 | 22 |
Table 8. Most frequent trigrams in drones and smart agriculture research by journal quartile.

| Trigram | NQ | Q1 | Q2 | Total |
|---|---|---|---|---|
| difference vegetation index | 0 | 13 | 0 | 13 |
| unmanned aerial vehicles | 2 | 9 | 0 | 11 |
| normalized difference vegetation | 0 | 10 | 0 | 10 |
| machine learning algorithms | 0 | 9 | 0 | 9 |
| remote sensing data | 0 | 8 | 0 | 8 |
| uav remote sensing | 0 | 8 | 0 | 8 |
| machine learning models | 0 | 7 | 0 | 7 |
| root mean square | 0 | 7 | 0 | 7 |
| vegetation index ndvi | 0 | 7 | 0 | 7 |
| convolutional neural networks | 1 | 4 | 1 | 6 |
| leaf area index | 0 | 6 | 0 | 6 |
| mean square error | 0 | 6 | 0 | 6 |
| aerial vehicles uavs | 0 | 5 | 0 | 5 |
| crop water stress | 0 | 5 | 0 | 5 |
| different growth stages | 0 | 5 | 0 | 5 |
| multiple linear regression | 0 | 5 | 0 | 5 |
| satellite remote sensing | 0 | 5 | 0 | 5 |
| deep learning models | 0 | 4 | 0 | 4 |
| high spatial resolution | 0 | 4 | 0 | 4 |
| machine learning model | 0 | 4 | 0 | 4 |
| neural networks cnns | 1 | 3 | 0 | 4 |
| partial least squares | 0 | 4 | 0 | 4 |
| remote sensing technologies | 0 | 4 | 0 | 4 |
| support vector regression | 0 | 4 | 0 | 4 |
| vegetation indices vis | 0 | 4 | 0 | 4 |
| yield prediction model | 0 | 4 | 0 | 4 |
| Total | 93 | 2508 | 41 | 2642 |
Table 9. Number of papers by country and source.

| Country | Google Scholar | IEEE Xplore | ProQuest | Scopus | Taylor & Francis Online | Total |
|---|---|---|---|---|---|---|
| China | 1 | 0 | 3 | 15 | 0 | 19 |
| US | 0 | 3 | 2 | 6 | 0 | 11 |
| Italy | 1 | 0 | 1 | 6 | 0 | 8 |
| Canada | 0 | 0 | 0 | 5 | 0 | 5 |
| India | 3 | 0 | 1 | 0 | 1 | 5 |
| Spain | 0 | 0 | 0 | 4 | 1 | 5 |
| … | | | | | | |
| Total | 9 | 15 | 12 | 68 | 2 | 106 |
Table 10. Research impact by country.
Country | No. of Papers | Papers (%) | No. of Citations | Citations (%) | H-Index | Citations/Paper
China | 19 | 17.9 | 292 | 12.5 | 3216 | 15.4
US | 11 | 10.4 | 262 | 11.2 | 1999 | 23.8
Italy | 8 | 7.5 | 376 | 16.0 | 1257 | 47.0
Canada | 5 | 4.7 | 82 | 3.5 | 769 | 16.4
India | 5 | 4.7 | 51 | 2.2 | 258 | 10.2
Spain | 5 | 4.7 | 174 | 7.4 | 766 | 34.8
Australia | 4 | 3.8 | 91 | 3.9 | 705 | 22.8
Germany | 4 | 3.8 | 51 | 2.2 | 749 | 12.8
Greece | 4 | 3.8 | 13 | 0.6 | 517 | 3.3
Netherlands | 3 | 2.8 | 38 | 1.6 | 535 | 12.7
Pakistan | 3 | 2.8 | 79 | 3.4 | 629 | 26.3
South Africa | 3 | 2.8 | 127 | 5.4 | 335 | 42.3
UK | 3 | 2.8 | 25 | 1.1 | 566 | 8.3
Bangladesh | 2 | 1.9 | 37 | 1.6 | 356 | 18.5
Iran | 2 | 1.9 | 32 | 1.4 | 356 | 16.0
Japan | 2 | 1.9 | 41 | 1.7 | 434 | 20.5
Total | 106 | 100.0 | 2345 | 100.0 | 16733 | 22.1
Table 11. Key topics by density and centrality by thematic category.
Topic | Density | Centrality | Total Citations | Total Documents | Category
UAS Phenotyping | 0.98 | 0.72 | 603 | 12 | Motor
UAV CropSensing | 0.49 | 0.84 | 1091 | 24 | Basic
Agricultural AI | 0.12 | 0.14 | 2097 | 61 | Marginal
Precision UAVs | 0.10 | 0.23 | 1924 | 53 | Marginal
Drone Agriculture | 0.07 | 0.18 | 2347 | 71 | Marginal
Precision Agriculture | 0.06 | 0.13 | 1729 | 46 | Marginal
Smart Farming | 0.06 | 0.66 | 1928 | 45 | Basic
Data-Driven Agriculture | 0.02 | 0.09 | 1874 | 55 | Marginal
Precision AgriDrones | 0.02 | 0.18 | 2106 | 61 | Marginal
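In thematic maps of this kind, density measures the strength of the links inside a theme (its internal cohesion) and centrality the strength of its links to other themes (Callon's measures). A toy sketch with invented keywords and co-occurrence weights, using the average internal weight for density and the summed external weight for centrality (exact formulations vary across tools, e.g. some scale by 10 or 100):

```python
# Toy computation of Callon-style density and centrality for one theme.
# Keywords and co-occurrence weights below are invented for illustration.
links = {
    ("uas", "phenotyping"): 9,
    ("uas", "lai"): 7,
    ("phenotyping", "lai"): 8,
    ("uas", "ndvi"): 3,    # link reaching out of the theme
    ("ndvi", "yield"): 6,  # link entirely outside the theme
}
theme = {"uas", "phenotyping", "lai"}  # one thematic cluster

internal = [w for (a, b), w in links.items() if a in theme and b in theme]
external = [w for (a, b), w in links.items() if (a in theme) != (b in theme)]

density = sum(internal) / len(internal)  # cohesion inside the theme
centrality = sum(external)               # interaction with other themes
print(density, centrality)               # 8.0 3
```

High density with high centrality yields a "motor" theme; low values on both axes place a theme in the marginal quadrant, as in Table 11.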
Table 12. Co-citation between authors.
Citation 1 | Citation 2 | Weight | Citation 1 | Citation 2 | Weight
li x. | wang j. | 17 | wang j. | zhang z. | 15
li x. | zhang z. | 17 | wang x. | wang y. | 15
yang g. | yang x. | 17 | wang x. | zhang l. | 15
zhang l. | zhang z. | 17 | wang x. | zhang y. | 15
cao w. | tian y. | 16 | zhang j. | zhang l. | 15
cao w. | zhu y. | 16 | zhang l. | zhang y. | 15
feng h. | yang g. | 16 | zhang y. | zhu y. | 15
li x. | zhang l. | 16 | feng h. | li z. | 14
tian y. | zhu y. | 16 | feng h. | yang x. | 14
wang j. | zhang l. | 16 | li j. | wang j. | 14
li z. | yang g. | 15 | li x. | li y. | 14
liu j. | zhang l. | 15 | li x. | wang x. | 14
liu j. | zhang y. | 15 | li x. | wang y. | 14
wang j. | zhang y. | 15 | liu j. | wang j. | 14
wang j. | zhang z. | 15 | liu j. | zhang z. | 14
Table 13. Co-occurrence table between keywords.
Keyword 1 | Keyword 2 | Weight
ml | precision agriculture | 8
precision agriculture | uav | 7
ml | remote sensing | 5
precision agriculture | remote sensing | 5
precision agriculture | vegetation indices | 5
deep learning | precision agriculture | 4
drones | precision agriculture | 4
remote sensing | unmanned aerial vehicle | 4
plant detection | precision agriculture | 3
precision agriculture | uav remote sensing | 3
3d point cloud from photogrammetry | crop phenotyping | 2
3d point cloud from photogrammetry | leaf area index (lai) | 2
3d point cloud from photogrammetry | precision agriculture | 2
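Co-occurrence weights of this kind are obtained by counting, for every unordered pair of keywords, the number of papers on which both appear. A minimal sketch with invented per-paper keyword lists:

```python
# Count keyword co-occurrence weights as in Table 13.
# The per-paper keyword lists below are illustrative only.
from collections import Counter
from itertools import combinations

papers = [
    ["ml", "precision agriculture", "uav"],
    ["ml", "precision agriculture", "remote sensing"],
    ["precision agriculture", "uav"],
]

pairs = Counter()
for keywords in papers:
    # sort so each unordered pair is counted under one canonical key
    for a, b in combinations(sorted(set(keywords)), 2):
        pairs[(a, b)] += 1

print(pairs[("ml", "precision agriculture")])   # 2
print(pairs[("precision agriculture", "uav")])  # 2
```

The author co-citation weights in Table 12 follow the same pattern, with each paper's list of cited authors in place of its keyword list.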

Share and Cite

MDPI and ACS Style

Gamboa-Cruzado, J.; Estrada-Gutierrez, J.; Bustos-Romero, C.; Alzamora Rivero, C.; Valenzuela, J.N.; Tavera Romero, C.A.; Gamarra-Moreno, J.; Amayo-Gamboa, F. A Review of Drones in Smart Agriculture: Issues, Models, Trends, and Challenges. Sustainability 2026, 18, 507. https://doi.org/10.3390/su18010507