Review

Vegetation Indices from UAV Imagery: Emerging Tools for Precision Agriculture and Forest Management

1 Department of Bioengineering of Horti-Viticultural Systems, Faculty of Horticulture, University of Agronomic Sciences and Veterinary Medicine of Bucharest, 011464 Bucharest, Romania
2 National Institute for Research and Development in Forestry “Marin Dracea”, Eroilor 128, 077190 Voluntari, Romania
3 Department of Chemistry, Physics and Environment, Faculty of Sciences and Environment, “Dunarea de Jos” University of Galati, 47 Domneasca Street, 800008 Galati, Romania
4 Rexdan Research Infrastructure, “Dunarea de Jos” University of Galati, 800008 Galati, Romania
* Authors to whom correspondence should be addressed.
AgriEngineering 2025, 7(12), 431; https://doi.org/10.3390/agriengineering7120431
Submission received: 25 October 2025 / Revised: 5 December 2025 / Accepted: 12 December 2025 / Published: 14 December 2025

Abstract

Unmanned Aerial Vehicles (UAVs) have become essential instruments for precision agriculture and forest monitoring, offering rapid, high-resolution data collection over wide areas. This review synthesizes global advances (2015–2024) in UAV-derived vegetation indices (VIs), combining bibliometric and content analyses of 472 peer-reviewed publications. The study identifies key research trends, dominant indices, and technical progress achieved through RGB, multispectral, hyperspectral, and thermal sensors. Results show an exponential growth of scientific output, led by China, the USA, and Europe, with NDVI, NDRE, and GNDVI remaining the most widely applied indices. New indices such as GSI, RBI, and MVI demonstrate enhanced sensitivity for stress and disease detection in both crops and forests. UAV-based monitoring has proven effective for yield prediction, water-stress evaluation, pest identification, and biomass estimation. Despite significant advances, challenges persist regarding illumination correction, soil background influence, and limited forestry applications. The paper concludes that UAV-derived vegetation indices—when integrated with machine learning and multi-sensor data—represent a transformative approach for the sustainable management of agricultural and forest ecosystems.

1. Introduction

Unmanned Aerial Vehicles (UAVs), commonly referred to as drones, have emerged as cost-effective and time-efficient alternatives to traditional methods for agricultural monitoring and management. By enabling the acquisition, processing, and analysis of high-resolution spatial and temporal crop data at the field scale, UAVs support improved crop productivity and water management, especially in the climate change context [1,2,3,4,5]. Equipped with multispectral and thermal sensors, drones allow continuous crop monitoring throughout the growth cycle, making it possible to detect anomalies at early stages and implement timely interventions. Their application in smallholder agriculture is particularly relevant, as UAVs can contribute to household food security and strengthen water management practices in developing regions [6].
In forestry, UAVs are increasingly being adopted for periodic monitoring to prevent forest degradation [7,8]. While conventional methods rely on manual field surveys, drones provide a powerful alternative through aerial image-based remote sensing, offering efficient, large-scale data collection [9].
The introduction of hyperspectral cameras has greatly expanded vegetation analysis, as their numerous spectral bands enable the development of various Spectral Vegetation Indices (SVIs). These indices help reduce the influence of complex background elements—such as soil color, reflections, shadows, or plastic cover brightness—thereby isolating vegetation characteristics more effectively. Over the past decade, several SVIs have been designed to identify plant diseases at different stages, as many symptoms manifest through distinct variations in spectral reflectance. For instance, changes in pigments or chlorophyll content can alter reflectance in the green band (500–600 nm range). SVIs can be calculated through diverse approaches, including band ratios, differences, and linear combinations, ultimately providing values that indicate plant health status [10].
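As a minimal illustration of the three computation approaches mentioned above (band ratio, band difference, and linear combination), the sketch below uses hypothetical reflectance values, not data from any cited study:

```python
import numpy as np

# Hypothetical per-pixel reflectance values for two spectral bands
# (fractions in [0, 1]; in practice these come from calibrated UAV imagery).
green = np.array([0.12, 0.18, 0.25])
red = np.array([0.08, 0.10, 0.20])

ratio_index = green / red                              # simple band ratio
difference_index = green - red                         # band difference
linear_combination = 0.5 * green - 0.3 * red           # weighted linear combination
normalized_difference = (green - red) / (green + red)  # normalized-difference form
```

The normalized-difference form is the template for many of the indices discussed in this review (NDVI, GNDVI, NDRE), because dividing by the band sum partially compensates for illumination differences between pixels.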
Another important application of drones in agriculture is yield prediction, often performed through the computation of single or multiple vegetation indices [11]. Much of the research in this field focuses on correlations between imaging data and crop properties. Two key areas of emphasis are: (1) biomass or crop yield prediction using vegetation indices, and (2) plant or weed detection. In both domains, vegetation indices frequently serve as the primary indicators for crop assessment [12].
Among these indices, the Normalized Difference Vegetation Index (NDVI), derived from radiometric data, has proven particularly useful for assessing plant physiological status. NDVI serves as a decision-support tool for determining optimal pesticide application timing and evaluating thresholds of economic damage. With the expansion of drone technology and the integration of digital cameras, NDVI has been increasingly applied in agriculture by analyzing digital images with specialized software to estimate canopy cover [13].
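A minimal sketch of NDVI computation and threshold-based canopy cover estimation follows; the band values and the 0.4 threshold are hypothetical illustrations, not parameters from the cited work:

```python
import numpy as np

# Hypothetical NIR and red reflectance rasters (2 x 3 pixels).
nir = np.array([[0.45, 0.50, 0.10],
                [0.40, 0.08, 0.55]])
red = np.array([[0.10, 0.08, 0.09],
                [0.12, 0.07, 0.05]])

# NDVI = (NIR - Red) / (NIR + Red), per pixel.
ndvi = (nir - red) / (nir + red)

# Estimate fractional canopy cover as the share of pixels whose NDVI
# exceeds a threshold; 0.4 is an illustrative value only.
canopy_cover = float(np.mean(ndvi > 0.4))
```

Here four of the six pixels exceed the threshold, so the estimated canopy cover is 2/3; in practice the threshold is calibrated against ground observations.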
However, retrieving biochemical parameters such as chlorophyll content remains challenging, as canopy reflectance is influenced not only by biochemical composition but also by structural properties of the crops. Crops with similar chlorophyll content may exhibit different reflectance patterns depending on canopy architecture. To minimize errors in chlorophyll mapping, structural factors must be incorporated into vegetation index models. Empirical relationships between indices and chlorophyll content are already well established in precision agriculture, and UAV-based remote sensing has further enhanced the capacity to retrieve both structural and biochemical crop parameters [14].
In forestry, increasing tree dieback and mortality—particularly in forest parks—has highlighted the need for reliable monitoring techniques [15,16,17,18]. Remote sensing, using high-quality aerial and satellite data, is increasingly adopted as an alternative or complement to traditional field-based surveys for forest inventories [19].
Although numerous reviews have examined drones in agriculture and forestry [20,21,22,23] or vegetation dynamics more generally [24,25,26,27,28,29], there is still a lack of comprehensive studies focused specifically on vegetation indices derived from drone-based imaging. Existing reviews typically treat UAVs, vegetation analysis, or remote sensing as broad topics, but they do not systematically analyze how different types of UAV-mounted sensors (RGB, multispectral, hyperspectral, thermal) contribute to the development, performance, and limitations of vegetation indices across both agricultural and forestry environments. Moreover, no prior review integrates bibliometric trends with technical, methodological, and environmental factors affecting UAV-derived vegetation index reliability. This gap is significant given the rapid expansion of vegetation index applications, sensor technologies, and analytical methods.
Given the growing scientific and practical significance of this topic, the present article aims to address this gap by analyzing the use of vegetation indices in both agricultural and forestry applications based on UAV-collected imagery.
This article provides a comprehensive review of the application of drones (Unmanned Aerial Vehicles, UAVs) in agriculture and forestry with a specific focus on vegetation indices derived from UAV-based imaging. Unlike broader reviews on UAV use in precision agriculture and forestry, this work emphasizes the development, application, and performance of vegetation indices for monitoring crop health, stress detection, yield prediction, and forest condition assessment. The scope includes: (1) a bibliometric analysis of global research trends (2015–2024) in UAV-based vegetation index studies; (2) a systematic review of vegetation indices developed using RGB, multispectral, hyperspectral, and thermal sensors; (3) an evaluation of technical achievements in UAV-derived vegetation indices for both agricultural and forestry applications; (4) case studies demonstrating the utility of vegetation indices across different crop systems, forest types, and environmental conditions; and (5) a critical examination of environmental and methodological factors that influence vegetation index performance, such as soil background, canopy structure, and solar illumination. By explicitly addressing these aspects, this review provides a consolidated and structured understanding of UAV-based vegetation indices and highlights current challenges, knowledge gaps, and opportunities for future research.

2. Materials and Methods

This systematic literature review was carried out in two complementary phases. The first phase involved a bibliometric analysis aimed at examining global research trends related to drones in agricultural and forest vegetation index studies. Relevant publications were retrieved from two leading bibliographic databases: Scopus and the Science Citation Index Expanded (SCI-Expanded) within the Web of Science (WoS), as in other studies [30,31,32]. The search query used was “drones and agricultural vegetation index OR drones and forest vegetation index”, ensuring broad coverage of the literature aligned with the research focus. Bibliometric methods were chosen because of their reliability in assessing scientific output, identifying research frontiers, and detecting emerging thematic areas.
Following the initial search, the records were screened and refined according to the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines [33].
Search strings and exact queries. To ensure reproducibility, the exact search strings used in each database are reported below. Boolean operators, quotation marks, and field tags were used.
These strings are designed to capture both general references to drones/UAVs and common vegetation-index terminology. Where database syntax differed (e.g., field tags), the query was adapted accordingly while preserving logical structure. Wildcards (*) were used to capture plural and morphological variants.
Time span, document types, and language. The search time span was unrestricted (all years indexed by each database up to the search date). Document types were limited to peer-reviewed articles and reviews. Only documents in English were included, because (1) English is the dominant language of the indexed literature in the selected databases and (2) machine translations were not performed for non-English articles. Conference proceedings, editorials, correspondence, book chapters, and theses were excluded at the screening stage.
De-duplication and data quality control. De-duplication was carried out in two steps. First, automated duplicate detection was performed by matching DOI and exact title strings in Microsoft Excel. Second, remaining potential duplicates were screened manually (title/first author/publication year/journal) to identify records with missing or inconsistent DOI fields. Data quality control steps included verification of exported metadata fields (title, abstract, authorship, year), correction of obvious OCR errors in titles/abstracts, and standardization of author and institution names (see normalization below).
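The two-step de-duplication described above (exact DOI matching followed by title-based screening) can be sketched as follows; the record values and field names are hypothetical, and the manual screening step is reduced here to automated normalized-title matching:

```python
import re

# Hypothetical exported records; field names are illustrative.
records = [
    {"doi": "10.1000/x1", "title": "UAV Vegetation Index Study", "year": 2020},
    {"doi": "10.1000/x1", "title": "UAV vegetation index study", "year": 2020},
    {"doi": "",           "title": "UAV  Vegetation Index Study", "year": 2020},
    {"doi": "10.1000/x2", "title": "Another Paper", "year": 2021},
]

def normalize_title(title):
    """Lowercase and collapse whitespace so formatting variants match."""
    return re.sub(r"\s+", " ", title.strip().lower())

def deduplicate(records):
    """Two-pass screening: drop exact DOI duplicates first, then drop
    records whose normalized title was already seen (covering items
    with missing or inconsistent DOI fields)."""
    seen_dois, seen_titles, unique = set(), set(), []
    for rec in records:
        title_key = normalize_title(rec["title"])
        if (rec["doi"] and rec["doi"] in seen_dois) or title_key in seen_titles:
            continue
        if rec["doi"]:
            seen_dois.add(rec["doi"])
        seen_titles.add(title_key)
        unique.append(rec)
    return unique

unique = deduplicate(records)
```

In this toy input, the second record is caught by the DOI match and the third (missing DOI) by the normalized-title match, leaving two unique records.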
Inclusion and exclusion criteria (applied in screening). The inclusion and exclusion criteria were predefined and applied in two screening stages (title/abstract screening followed by full-text screening). The criteria were:
Inclusion: peer-reviewed articles or reviews, English language, primary focus on UAV/drone/UAS applications for estimating, mapping, or analyzing vegetation indices in agricultural or forest ecosystems, or methodological development directly related to UAV-derived vegetation indices, with sufficient metadata (abstract and bibliographic fields) available.
Exclusion: non-peer-reviewed items (editorials, correspondence, posters, theses, patents), studies focusing solely on helicopter/manned aircraft remote sensing without UAVs, studies where vegetation indices were only mentioned in passing without UAV-derived data, studies using UAVs for wildlife detection or other non-vegetation objectives, inaccessible full text, and items lacking an abstract.
Items excluded at full-text stage were tagged with one of the following structured exclusion reasons: (A) out of scope (topic mismatch), (B) non-peer reviewed/editorial, (C) no UAV data (manned aircraft only), (D) full text inaccessible, (E) language other than English, or (F) insufficient methodological detail (e.g., only conceptual commentary). These structured reasons were recorded for each excluded item, and a representative sample of excluded items with reasons is provided in the Supporting Information.
Screening procedure, reviewer roles, and disagreement handling. Screening was performed by two independent reviewers (Reviewer A and Reviewer B). In the first stage, titles and abstracts were read independently by both reviewers. Records judged as potentially relevant by either reviewer advanced to the second stage. In the second stage, full texts were retrieved and read independently by both reviewers. If the full text was unavailable, the item was marked as inaccessible (reason D). When reviewers disagreed on inclusion/exclusion at either stage, a third senior reviewer (Reviewer C) adjudicated after joint discussion.
Final selection and dataset. After applying the two-stage screening and resolving disagreements, 472 publications met the inclusion criteria and were selected for bibliometric and qualitative content analysis (Table 1 and Figure 1).
The bibliometric analysis was conducted across nine dimensions: (1) publication type, (2) research discipline, (3) year of publication, (4) geographic distribution of contributions, (5) authorship, (6) institutional affiliations, (7) journals, (8) publishers, and (9) keywords. Data processing and visualization were performed using Web of Science Core Collection (version 5.35, Clarivate) [34], Scopus [35], Microsoft Excel (version 2024) [36], and Geochart [37]. To further explore bibliometric relationships, VOSviewer (version 1.6.20) [38] was employed to map co-authorship networks, co-citation patterns, and keyword co-occurrence clusters.
Classification of research areas: To characterize the disciplinary distribution of the publications included in this review, we assigned each article to one or more research areas based on the subject categories provided in the Web of Science (WoS) Core Collection. Each journal indexed in WoS is associated with one or more subject categories (e.g., Remote Sensing, Forestry, Agriculture, Ecology, Environmental Sciences). When an article was retrieved, all WoS categories attributed to its source journal were recorded; a single article may therefore appear in more than one research area if the journal itself spans multiple WoS categories. This approach ensures that the classification reflects the standard bibliometric structure used by WoS rather than subjective manual assignment. For example, an article published in a journal indexed simultaneously under Remote Sensing and Environmental Sciences—Ecology is counted once in each of these two categories. Similarly, papers published in journals indexed in Forestry are grouped under the forestry category, while those appearing in journals indexed under Agriculture are grouped under agriculture, regardless of overlapping classifications with Remote Sensing or Environmental Sciences. Because many journals in the fields of drone-based monitoring and vegetation analysis are inherently interdisciplinary, the sum of articles across categories exceeds the total number of publications reviewed. The figure showing the distribution of primary research areas therefore represents the frequency of research-area assignments, not mutually exclusive article counts. The second phase involved a qualitative content analysis of the 472 screened publications.
This step provided a deeper understanding of knowledge production on the topic and enabled the categorization of results into four thematic domains: (1) UAV-derived vegetation indices in Agriculture and Forestry; (2) Technical achievements in UAV-based vegetation index development; (3) Application of drone-derived vegetation indices in Agriculture and Forestry: case study outcomes; (4) Effects of environmental and field conditions on vegetation indices (Figure 2).
The expanded methodological description is intended to enhance transparency and reproducibility, ensuring that readers can accurately replicate the bibliometric and qualitative procedures applied in this review.

3. Results

3.1. A Bibliometric Review

The inventory of published documents on this topic from 2015 to 2024 resulted in a total of 472 publications. Of these, most are research articles (375, representing 79% of the total), followed by 61 proceedings papers (13%), 26 reviews (6%), and 10 book chapters (2%) (Figure 3).
Although this topic emerged relatively recently (the first article in our dataset was published in 2016), the number of publications has increased year by year, with the 2024 output reaching twice that of 2023 (Figure 4).
The classification of published articles by research area highlights the predominance of three of the 37 categories: Environmental Sciences—Ecology (126 articles), Remote Sensing (121 articles), and Agriculture (100 articles) (Figure 5).
The authors of these publications come from 86 countries across all inhabited continents (Figure 6). China has published the highest number of articles (61), followed by the USA (57), Germany (32), and India and Spain (25 articles each).
The countries of origin of the authors who have published on this topic can be organized into four clusters. Cluster 1 includes Canada, Finland, Germany, Italy, South Africa, Sweden, and Switzerland; Cluster 2 consists of Brazil, Denmark, England, France, and Mexico; Cluster 3 consists of Australia, Indonesia, Japan, and Sri Lanka; and Cluster 4 includes Belgium, Morocco, Portugal, and Spain (Figure 7). Since each cluster includes countries from different continents, no clear geographical pattern of grouping could be identified; connections between authors appear to be the main factor driving these associations. The exception is Cluster 3, which includes only countries from the Asia-Pacific region.
The publications on this topic have appeared in a wide range of scientific journals (184 in total). The most prominent among them were Remote Sensing (47 articles), Drones (17 articles), Agriculture (17 articles), and Agronomy (14 articles) (Table 2 and Figure 8).
In addition, total link strength refers to the cumulative strength of the bibliographic coupling links of each journal as calculated by VOSviewer; higher values indicate that the journal’s articles share a larger number of cited references with articles from other journals in the network, reflecting stronger connectivity within the research field. In all the network visualizations produced with VOSviewer (Figure 7, Figure 8 and Figure 9), the size of each node represents the number of documents associated with that country, journal, or keyword, while the thickness of the connecting lines (links) indicates the strength of the co-authorship, co-occurrence, or bibliographic coupling relationships among items. Thicker lines represent stronger relationships based on VOSviewer’s link strength calculations.
The co-authorship network (Figure 7) and the bibliographic coupling of journals (Figure 8) were also generated using VOSviewer with the following configuration: Type of analysis: Co-authorship (for Figure 7) and Bibliographic coupling (for Figure 8); Unit of analysis: Countries (Figure 7) and Sources/Journals (Figure 8); Counting method: Full counting; Minimum number of documents per item: two for countries and two for journals; Maximum network size: 500 items; Clustering method: VOSviewer default (LinLog/modularity optimization); Normalization: Association strength.
In terms of institutional affiliation, the most representative institutions for authors publishing on this topic were: the Chinese Academy of Agricultural Sciences, the National Land Survey of Finland, and Wageningen University (each with eight articles published), followed by the Chinese Academy of Sciences, the Finnish Geospatial Research Institute and Swedish University of Agricultural Sciences (each with seven articles published). The leading publishers in this research domain included MDPI (135 articles), Elsevier (70 articles), Springer Nature (35 articles), and IEEE (25 articles).
From the analysis carried out on the keywords that appear in the articles published on this topic, it resulted that the most frequently used keywords were: drones, vegetation indices, UAV, and remote sensing (Table 3).
When grouped into clusters, the keywords formed three main categories with more than 10 terms each. Cluster 1 mainly includes terms related to algorithms and models: algorithms, leaf area index, leaf chlorophyll content, machine learning, model, prediction, spectral reflectance. Cluster 2 generally includes terms related to agriculture and vegetation: agriculture, crop, precision agriculture, vegetation indexes, vegetation indices. Cluster 3 includes keywords related to indices and satellite imagery: imagery, indexes, NDVI, multispectral, random forest, Sentinel-2 (Figure 9).
For the keyword co-occurrence network (Figure 9), the analysis was performed in VOSviewer (version 1.6.20) using the following configuration: Type of analysis: Co-occurrence; Unit of analysis: Author keywords only; Counting method: Full counting; Minimum number of occurrences for inclusion: 10; Maximum network size: 500 items; Clustering method: VOSviewer default (LinLog/modularity optimization); Normalization: Association strength. These settings were applied to ensure full reproducibility of the visualizations. The same general configuration principles (network type, full counting, and default clustering) were also applied in the co-authorship and bibliographic coupling networks presented in Figure 7 and Figure 8.

3.2. The Literature Review

3.2.1. UAV-Derived Vegetation Indices in Agriculture and Forestry

A broad range of vegetation indices has been investigated in recent studies, using UAV-mounted RGB, multispectral, hyperspectral, and thermal sensors. These indices have been applied in diverse agricultural and forestry contexts, ranging from stress detection in crops to forest health monitoring. The indices and their applications are summarized in Table 4.

3.2.2. Technical Achievements in UAV-Based Vegetation Index Development

Performance of UAV-Derived Vegetation Indices in Agricultural Monitoring
The application of drone-based RGB imagery for NDVI prediction demonstrated significant improvements compared with conventional machine learning approaches. Diykh reported that the integration of empirical curvelet transform with DenseNet achieved superior performance, producing an SSIM value of 0.98 and an MSE as low as 120 [104]. These results indicate that the proposed model not only improves prediction robustness under atmospheric variability but also enhances crop monitoring and early problem identification in agricultural settings [105].
Similarly, UAV-derived fractional vegetation cover (FVC) showed a strong correlation with wheat plant density during the reviving period. Du et al. [77] achieved an R2 value of 0.97, RMSE of 1.86 plants/m2, and RRMSE of 0.677%, indicating high inversion accuracy through mixed pixel decomposition. This outcome demonstrates the suitability of FVC as a proxy for variable-rate nitrogen fertilization management.
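FVC retrieval by linear mixed-pixel decomposition is commonly expressed as FVC = (NDVI − NDVI_soil) / (NDVI_veg − NDVI_soil), where the two endmembers represent bare soil and full vegetation. The sketch below uses illustrative endmember values and synthetic NDVI inputs, not the parameters of Du et al.:

```python
import numpy as np

def fractional_vegetation_cover(ndvi, ndvi_soil=0.05, ndvi_veg=0.86):
    """Linear mixed-pixel decomposition: each pixel's NDVI is modeled as a
    mixture of a bare-soil endmember and a full-vegetation endmember.
    The endmember values here are illustrative defaults, not those used
    in the cited study."""
    fvc = (ndvi - ndvi_soil) / (ndvi_veg - ndvi_soil)
    # Clamp to [0, 1]: pixels outside the endmember range are treated
    # as pure soil or pure vegetation.
    return np.clip(fvc, 0.0, 1.0)

ndvi = np.array([0.05, 0.455, 0.86, 0.90])
fvc = fractional_vegetation_cover(ndvi)
```

In practice the endmember NDVI values are estimated per scene (e.g., from histogram percentiles), which is why illumination and soil-background correction matter for FVC accuracy.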
In wheat rust disease assessment, the Random Forest Classifier (RFC) using spectral vegetation indices achieved the highest classification accuracy of 76%. Among the 14 indices evaluated, the Green NDVI (GNDVI), Photochemical Reflectance Index (PRI), Red-Edge Vegetation Stress Index (RVS1), and Chlorophyll Green (Chl green) emerged as the most reliable indicators [8].
For maize crop damage detection, UAV-acquired RGB images processed through vegetation indices (ExG, GLI, MGRVI) combined with unsupervised k-means clustering within QGIS proved effective. The semi-automated method demonstrated strong applicability in integrating vegetation indices with open-source software for crop damage monitoring [106].
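The general idea of that workflow (RGB-based indices fed to unsupervised k-means) can be sketched as follows, with synthetic pixel values and scikit-learn standing in for the QGIS toolchain used in the cited study:

```python
import numpy as np
from sklearn.cluster import KMeans

# Synthetic normalized RGB reflectance per pixel (rows: pixels).
rgb = np.array([[0.20, 0.55, 0.15],   # vigorous vegetation
                [0.22, 0.50, 0.18],
                [0.40, 0.35, 0.30],   # damaged / soil-dominated
                [0.45, 0.33, 0.28]])
r, g, b = rgb[:, 0], rgb[:, 1], rgb[:, 2]

exg = 2 * g - r - b                      # Excess Green index
gli = (2 * g - r - b) / (2 * g + r + b)  # Green Leaf Index
mgrvi = (g**2 - r**2) / (g**2 + r**2)    # Modified Green-Red Vegetation Index

# Unsupervised grouping of pixels into "vegetation" vs "damage" clusters.
features = np.column_stack([exg, gli, mgrvi])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
```

Because k-means is unsupervised, the cluster labels still need to be interpreted (e.g., by inspecting mean ExG per cluster) before mapping damaged areas.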
Vegetation Indices and Forest Health Monitoring
Allen [104] reported that crown dieback assessment from UAV RGB data combined with deep learning (Mask R-CNN) achieved a segmentation accuracy of mAP 0.519. Furthermore, color-coordinate-based vegetation index estimations correlated closely with expert field assessments, suggesting that UAV and deep learning-based vegetation index approaches are robust and reliable in forest monitoring.
Ansari [107] developed a framework integrating LANDSAT 8, UAV imagery, and GIS data, employing a change vector analysis method with composite spectral indices. This approach successfully generated forest disturbance maps using NDVI, Tasseled Cap indices, and SAVI, demonstrating the effectiveness of combined datasets in forest monitoring.
Baloloy [108] introduced the Mangrove Vegetation Index (MVI), based on Sentinel-2 spectral bands. The index achieved a classification accuracy of 92%, enabling robust separation of mangrove forests from terrestrial vegetation and other land cover types.
New Vegetation Indices for Disease and Stress Detection
Several novel indices were proposed for enhanced disease and stress detection. Heim developed the Lemon Myrtle–Myrtle Rust Index (LMMR), which outperformed classical indices (PRI, MCARI, NBNDVI) with improved classification accuracies of 58–67% [109].
For early detection of spruce bark beetle infestations, Huo et al. [86] introduced the Green Shoulder Indices (GSI). Detection rates ranged between 0.24 and 0.31 at early infestation (T3) and 0.76–0.83 at later stages (T4), surpassing traditional indices such as PRI and REIP. This highlights the potential of GSI for early detection of forest stress prior to pest emergence.
The health status of chestnut forests and orchards (intense dieback due to chestnut blight and ink disease, and recovery after biological control) may be monitored with high-resolution UAV multispectral imagery [54,110,111,112]. The vegetation indices GNDVI and RdNDVI, combined with the machine learning algorithms SVM and GNB, achieved >95% classification accuracy [55].
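The general pattern of pairing vegetation-index features with SVM and GNB classifiers can be sketched as below; the per-crown feature values and labels are synthetic, invented for illustration, and not data from the cited study:

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC

# Synthetic per-crown feature vectors: two vegetation-index-like values
# per crown, with health labels (1 = healthy, 0 = declining).
rng = np.random.default_rng(0)
healthy = rng.normal([0.75, 0.45], 0.03, size=(30, 2))
declining = rng.normal([0.45, 0.20], 0.03, size=(30, 2))
X = np.vstack([healthy, declining])
y = np.array([1] * 30 + [0] * 30)

# Fit both classifier families mentioned in the text.
svm = SVC(kernel="rbf").fit(X, y)
gnb = GaussianNB().fit(X, y)
```

On well-separated synthetic classes like these, both models fit the training data almost perfectly; real studies of course report accuracy on held-out validation crowns.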
Monitoring elms threatened by Dutch elm disease (DED) is particularly important in protected habitats and cultural gardens [113,114,115]. Changes in NDVI derived from Landsat data can detect DED tree canopies with 71% precision [116]. Deep learning models using high-resolution spectral leaf images (multispectral ResNet18 models) have improved the efficiency of DED detection and can distinguish susceptible genotypes from resistant ones [115].
For the identification of ash dieback (caused by the invasive pathogen Hymenoscyphus fraxineus), partial least squares discriminant analysis (PLS-DA) and linear discriminant analysis (LDA) were the most accurate methods for individual tree crown (ITC) species identification (accuracy >90%), while random forest (RF) was the most accurate for ash dieback severity (77%). Reflectance in narrow blue (415 nm), red-edge (680 nm), and NIR (760 nm) bands improved the efficiency of ash disease monitoring [117].
Early prediction of pine diseases caused by the quarantine pathogen Fusarium circinatum and the nematode Bursaphelenchus xylophilus has been successfully performed with remote sensing methods such as Fast Principal Component Analysis (Fast-PCA) and Partial Least Squares Discriminant Analysis (PLS-DA). VIS-NIR hyperspectral imaging is useful for both disease detection and genetic susceptibility assessment [118,119], and discolored trees can be identified from RGB and multispectral data sources [120].
Biophysical parameters derived from Sentinel satellites showed satisfactory accuracy in predicting forest leaf cover. The fraction of vegetation cover (FCover) index showed the strongest correlation with leaf phenology in European beech forests. Because FCover quantifies the spatial extent of green vegetation, it is a potential alternative to classical indices such as NDVI for predicting forest tree species phenology and other physiological processes, and for forest ecosystem surveys [121].
Remote sensing data have also been combined with artificial intelligence (the Adaptive Neuro-Fuzzy Inference System, ANFIS) to estimate NDVI dynamics; by combining fuzzy logic with Artificial Neural Network (ANN) capabilities, the vegetation growth of palm trees was predicted with a highly acceptable error [122].
Li [123] proposed the Rice Blast Index (RBI) based on UAV hyperspectral imagery, which achieved superior detection accuracy compared with 30 established vegetation indices. RBI yielded classification accuracies of 95.0% (KNN, κ = 0.938) and 95.1% (RF, κ = 0.925), confirming its effectiveness in detecting rice blast disease.

3.2.3. Application of Drone-Derived Vegetation Indices in Agriculture and Forestry: Case Study Outcomes

Agricultural Crops
Wheat
Hyperspectral imaging proved effective for detecting wheat stem rust at multiple severity levels, enabling faster identification of resistant varieties and supporting timely field management. The study also highlighted the potential for developing cost-effective multispectral sensors tailored for rust detection [8]. In rainfed wheat systems, UAV-based photogrammetry combined with VIs such as NDVI, NDWI, NDRI, and ExG facilitated early diagnosis of water stress [39].
Rice
Drone-based multispectral imagery integrated with machine learning enabled improved prediction of rice yield and its components, with time-series indices—particularly NDRE, LCI, and NDVI—providing the highest accuracy [124]. Miniature multispectral sensors were also effective for estimating above-ground biomass (AGB) during mid-late growth stages using both VIs and texture metrics [125]. In upland rice intercropped with Brachiaria in Brazil, NDVI, SR, SAVI, and MPRI were correlated with carbon stock and biomass [96]. Another study found that WDRVI, MCARI, and MSAVI showed the strongest correlation with LAI, chlorophyll, and biomass, while PVI, MCARI, and GNDVI were the best predictors during the Navarai season [126].
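NDRE follows the same normalized-difference form as NDVI but substitutes the red-edge band for the red band, making it more sensitive in dense canopies where red reflectance saturates. A sketch with hypothetical reflectances for three rice plots:

```python
import numpy as np

# Hypothetical band reflectances for three rice plots.
nir = np.array([0.52, 0.48, 0.30])
red_edge = np.array([0.30, 0.28, 0.25])
red = np.array([0.06, 0.07, 0.12])

ndvi = (nir - red) / (nir + red)
ndre = (nir - red_edge) / (nir + red_edge)  # red-edge analogue of NDVI
```

Because vegetation reflects more strongly in the red edge than in the red, NDRE values sit lower than NDVI for the same canopy and retain contrast later into the season, which is consistent with its strong performance in time-series yield prediction.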
Maize
UAS-derived VIs combined with canopy height measurements enabled the prediction of grain yield and flowering traits such as anthesis and silking [47]. High-throughput phenotyping using UAV RGB sensors demonstrated that VARI correlated strongly with yield (r = 0.99) and was associated with flowering, plant height, grain weight, and anthocyanin concentration [127]. On Madeira Island, maize productivity and AGB were significantly associated with NDVI (p < 0.05), as well as NDRE and GNDVI (p < 0.01) [128].
Soybean
In soybean breeding, UAV-derived RGB imagery supported high-throughput phenotyping, with VIs showing strong correlations with agronomic traits [129].
Potato
In potato crops, aerial and ground-based VIs jointly explained up to 88.4% of yield variation, demonstrating the value of UAV-based phenotyping for optimizing planting and management conditions [130].
Vegetables and Legumes
In Chinese cabbage, the ExG index was identified as the most reliable for separating soil, mulch, and plants [41]. In chickpea grown in arid environments, NDVI and SAVI derived from UAV and satellite data were suitable for spatio-temporal crop monitoring [131].
Cotton
For Peruvian cotton, NDVI, RVI, and NDRE were used to evaluate crop performance [59]. Early seedling emergence was best detected through multispectral indices, including NDVI, RVI, SAVI, EVI2, OSAVI, and MCARI [33]. Additional research showed that ARVI, MCARI, WDRVI, NGRDI, ExG, RGBVI, and VARI effectively quantified biophysical traits and yield [70]. Leaf chlorophyll content was best estimated using NDVI, NDRE, GNDVI, REDVI, OSAVI, and MCARI [51].
Sugarcane
RGB-based VIs (GLI, VARI, GRVI, and MGRVI) were evaluated for growth monitoring in sugarcane. MGRVI achieved classification accuracies exceeding 94%, whereas VARI was the least sensitive [49]. Another study showed that GRVI outperformed LAI in predicting crop yield across different planting arrangements [132].
Fruit Orchards and Other Crops
In citrus orchards, NDVI, GNDVI, and SIPI2 were the most effective indices for assessing water stress [100]. Thermal and multispectral UAV surveys in Sicily further revealed spatial differences in photosynthetic activity and pre-harvest stress [133]. Peach orchards were reliably monitored using NDVI, GNDVI, NDRE, and REGNDVI [95]. In peanut crops, chlorophyll content was best predicted using regression models based on NDVI and GNDVI [134].
Weeds and Non-traditional Crops
UAV-derived indices, including NGRDI, NDVI, GLI, NDRE, and GNDVI, were used for weed detection in vineyards [126] and maize fields [135]. For sugar beet, NDVI, ReNDVI, and SAVI were tested for assessing weed impact, although these indices tended to overestimate healthy vegetation cover [136]. In seaweed aquaculture, indices such as RBNDVI, GLI, Hue, Green–Red ratio, and NGRDI improved disease-classification accuracy [43].
Forest and Tree Systems
Machine learning algorithms (GBTA, RF, CART) are useful for predicting forest attributes from Sentinel-2 satellite imagery [137]. The Canopy Cover Estimation Method (CCEM) achieved accuracies of 77% to 86%, demonstrating its efficacy in estimating forest canopy cover (FCC) without the need for fieldwork, and integrating annual updates from the Esri Sentinel-2 Land Cover dataset as auxiliary data greatly enhanced the reliability and accuracy of the analyses. Combining ICESat-2 segment data with Landsat imagery is an efficient way to monitor FCC dynamics in fire-prone areas [138,139].
Gray Level Co-occurrence Matrix (GLCM) texture metrics derived from multispectral orthomosaics provide adequate explanatory variables for predicting canopy characteristics and stand structure in poplar forests [140].
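Texture metrics of this kind can be computed directly from a gray-quantized orthomosaic band. The following minimal NumPy sketch uses hypothetical gray levels and a single horizontal pixel offset (production workflows would typically use a library such as scikit-image with multiple offsets and angles) to illustrate the co-occurrence counting behind two classic Haralick features:

```python
import numpy as np

def glcm_features(img, levels=4):
    """Compute a horizontal-offset Gray Level Co-occurrence Matrix (GLCM)
    and two classic Haralick features (contrast, homogeneity).

    img: 2-D array of integer gray levels in [0, levels).
    """
    img = np.asarray(img)
    glcm = np.zeros((levels, levels), dtype=float)
    # Count co-occurrences of pixel pairs one step to the right
    for i, j in zip(img[:, :-1].ravel(), img[:, 1:].ravel()):
        glcm[i, j] += 1
    glcm /= glcm.sum()                       # normalize counts to probabilities
    ii, jj = np.indices(glcm.shape)
    contrast    = float(((ii - jj) ** 2 * glcm).sum())
    homogeneity = float((glcm / (1.0 + np.abs(ii - jj))).sum())
    return contrast, homogeneity

# A smooth canopy patch vs. a striped (high-texture) patch, hypothetical gray levels
smooth  = np.full((4, 4), 2)
striped = np.tile([0, 3], (4, 2))
print(glcm_features(smooth))    # uniform patch: zero contrast, perfect homogeneity
print(glcm_features(striped))   # alternating gray levels: high contrast
```

The smooth patch yields contrast 0 and homogeneity 1, while the striped patch yields high contrast, which is the kind of separation that makes GLCM features useful stand-structure predictors.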
General forest health status (including forest decline) has been a continuous subject of national and European monitoring [141,142,143]. A combination of field observations and spectral imaging in the red and infrared bands has been used to assess forest health [144]. The tandem use of UAV-mounted multispectral and thermal sensors, together with radiative transfer modeling and machine learning, helps predict oak decline [145].
Drones capable of capturing high-resolution images in selected spectral bands (especially near-infrared, NIR, to generate CIR composites and NDVI indices) are particularly useful for surveying tree health status (especially dieback and decline) in urban forestry, reducing fieldwork time and labor costs [146]. Urban sprawl in Europe has been assessed using Random Forest (RF) and Support Vector Machine (SVM) classifiers together with spatial metrics (Shannon’s Entropy, Patch Density, Urban Compactness Ratio, and buffer analysis) [147].
Long-term vegetation dynamics have been assessed using time-series remote sensing data (vegetation indices and biophysical parameters: NDVI, EVI, LAI, and FPAR). FPAR, alone or combined with LAI, detected seasonal ecological stress in some seasons, while NDVI revealed long-term vegetation degradation [148].
Deep learning models (CNN-BiLSTM-AM, CNN-BiLSTM, LSTM, BiLSTM-AM) and vegetation indices (NDVI, EVI, and kNDVI) were used to evaluate spatiotemporal forest trends. The CNN-BiLSTM-AM model combined with EVI achieved the best performance (R2 = 0.981) [149].
In forest monitoring, NDVI was effective in identifying pest-induced tree mortality [150]. Early bark beetle infestations in spruce were detected using NDVI, SAVI, and NDRE [56]. Scots pine mortality can be monitored by remote sensing (with overall accuracy up to 90%), although differentiating among the causes (fire, drought, bark beetles, disease, insects) is more difficult. A proper band combination (higher-resolution 10 m bands and red-edge bands) was crucial for good results [151].
The canopy water content (CWC) index is highly suitable for assessing defoliation caused by Lymantria dispar in oak forests [147]. Remote sensing techniques (vegetation indices DVI, PVI, VREI) are particularly valuable for assessing large forested areas or inaccessible regions where classical field observations are difficult [152].
In Norway spruce under simulated wind stress, indices such as NDVI, SIPI, DVI, and red-edge metrics successfully captured physiological changes over time [63].
Biophysical parameters derived from Sentinel satellites showed satisfactory accuracy in predicting forest leaf cover. The fraction of vegetation cover (FCover) index showed the strongest correlation with leaf phenology in European beech forests [121].
Sentinel-2 bands B1, B2, and B3 can predict soil shear strength in Arctic soils of Alaska’s forested areas, and a multivariate model improved the fit over simple univariate models [153].
A combination of high-resolution multispectral imagery from a multirotor UAS platform with machine learning and ground assessment proved highly efficient for identifying Pacific madrone leaf blight (85% accuracy, 92% true positive rate) [154].
Invasive woody plants (Triadica sebifera and Ligustrum sinense) were monitored using orthoimagery integrated with vegetation structure and topography parameters derived from airborne light detection and ranging (LiDAR), with a maximum overall accuracy of 87.5% [155]. High-resolution CBERS-4A satellite imagery combined with drone images was used to estimate the impact of the invasive Hovenia dulcis on local biodiversity [156].
In oil palm plantations, VARI was more accurate than NDVI for health assessments [157]. In buckwheat, ExG, ExGR, and GLI showed negative correlations with biomass, whereas ExR was weakly positive. In hazelnut orchards, GNDVI, GCI, NDREI, NRI, and GI were reliable predictors, while NDVI, SAVI, RECI, and TCARI were less effective [83].
In agroforestry areas, NDVI, NDRE, GNDVI, OSAVI, and LCI were valuable predictors of woody biomass (Mekonen). Urban forest conservation was enhanced through correlations of NDVI, NDRE, GNDVI, and GRVI with ecological and structural parameters [158].
Sentinel-2 satellite imagery combined with machine learning algorithms (MLAs), applied to historical vegetation indices and phenological metrics, was used to evaluate drought impacts on fruit-tree systems in the arid zone of Morocco (North Africa). Tree-based MLAs performed best, with Random Forest (RF) and Gradient Tree Boost achieving 96% and 94% accuracy, respectively, using phenological metrics [159].

3.2.4. Effects of Environmental and Field Conditions on Vegetation Indices

Vegetation indices are constructed by combining reflectance values from multiple spectral bands to minimize disturbances caused by external factors [160,161,162]. While small atmospheric aerosols may affect multispectral images by absorbing and scattering incoming solar radiation, such atmospheric effects are minimal in drone-based imaging conducted at flight altitudes below 100 m [163].
The evaluation of vegetation indices under different field conditions showed varying degrees of sensitivity. Specifically, NDVI (Normalized Difference Vegetation Index) was the least affected by field conditions [10]. The coefficient of variation (CV) analysis indicated that NDVI and TVI (Transformed Vegetation Index) both maintained CV values below 5%, while GNDVI (Green Normalized Difference Vegetation Index) showed CV values below 10% [10]. Overall, vegetation indices that incorporated the near-infrared (NIR) bands demonstrated greater stability against variations in field conditions.
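The stability comparison can be illustrated with a short sketch. The index definitions below are standard, but the reflectance values are hypothetical and only stand in for repeated observations of one canopy under varying field conditions:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red)

def gndvi(nir, green):
    """Green NDVI: substitutes the green band for the red band."""
    return (nir - green) / (nir + green)

def tvi(nir, red):
    """Transformed Vegetation Index: sqrt(NDVI + 0.5)."""
    return np.sqrt(ndvi(nir, red) + 0.5)

def cv_percent(x):
    """Coefficient of variation in percent: 100 * std / mean."""
    x = np.asarray(x, dtype=float)
    return 100.0 * x.std() / x.mean()

# Hypothetical reflectances of the same canopy under five field conditions
nir   = np.array([0.52, 0.55, 0.50, 0.53, 0.54])
red   = np.array([0.08, 0.09, 0.07, 0.08, 0.09])
green = np.array([0.12, 0.14, 0.11, 0.13, 0.13])

print(f"CV(NDVI)  = {cv_percent(ndvi(nir, red)):.1f}%")
print(f"CV(TVI)   = {cv_percent(tvi(nir, red)):.1f}%")
print(f"CV(GNDVI) = {cv_percent(gndvi(nir, green)):.1f}%")
```

With these synthetic inputs all three indices stay well within the CV thresholds reported above; the square-root transform in TVI additionally compresses variability relative to plain NDVI.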
Other agronomic and environmental factors, such as soil exposure, crop residues, and nitrogen fertilization levels, were also found to influence the stability of vegetation indices [164]. This has important implications for the universality of indices across crop species. For instance, NDVI has been identified as a reliable index for estimating N status in corn but is less effective in rice [165,166,167].
Drone-based NDVI measurements further revealed a dependency on sunlight conditions. Plots with lower NDVI values were particularly sensitive to solar altitude and time of day [168]. By applying correction parameters adapted to NDVI values, adjustments successfully mitigated declines caused by changes in solar illumination, improving both NDVI values and image consistency in experimental plots [168].
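The correction parameters of the cited study are not reproduced here, but the general idea of an illumination adjustment can be sketched. The following hypothetical model assumes observed NDVI declines roughly in proportion to the shortfall of sin(solar elevation) relative to a reference acquisition, scaled by an empirical factor `k` (both the functional form and `k` are illustrative assumptions, not the published parameters):

```python
import math

def correct_for_solar_elevation(ndvi_obs, elev_deg, ref_elev_deg=60.0, k=0.15):
    """Hypothetical illumination correction (illustrative only).

    Adds back a fraction of the illumination shortfall relative to a
    reference solar elevation; k would need empirical calibration.
    """
    shortfall = math.sin(math.radians(ref_elev_deg)) - math.sin(math.radians(elev_deg))
    return ndvi_obs + k * max(shortfall, 0.0)

# Same plot imaged at high sun (reference) and late afternoon (low sun)
print(correct_for_solar_elevation(0.72, elev_deg=60.0))  # reference flight: unchanged
print(correct_for_solar_elevation(0.66, elev_deg=25.0))  # low-sun flight: value raised
```

In practice such a factor would be fitted per sensor and site, consistent with the finding that corrections must be adapted to the NDVI range of the plot.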

4. Discussion

4.1. Bibliometric Review

As in other reviews, the majority of publications in this case are journal articles (79%). What is particularly noteworthy, however, is the relatively high proportion of conference proceedings (13%), a fact that can be explained by the nature of the topic, which has been examined and debated at numerous national and international conferences.
A major difference compared to other reviews is that, in the case of drones (a subject that has only recently emerged in research), the published articles are relatively recent: here, publications begin in 2016, as opposed to the 1970s–1980s in other fields.
Although the topic under analysis is recent, scholarly interest in it is considerable. This is reflected both in the large number of journals in which articles have been published and in the broad international authorship. Out of 86 contributing countries, the most strongly represented are those with extensive agricultural and forestry areas, coupled with a strong inclination toward modern technologies: China, the United States, Germany, India, and Spain.
The journals in which the articles appeared, the keywords employed, as well as the research areas to which they are assigned, correspond to the domains that define the scope of the study: Environmental Sciences—Ecology, Remote Sensing, Agriculture, and Forestry.

4.2. Applications and Advances in UAV-Based Vegetation Indices

The results from the literature (Table 3) highlight the wide applicability of UAV-derived vegetation indices across agricultural and forestry systems.
Indices based on RGB sensors (e.g., ExG, ExGR, MGRVI, NGRDI) are particularly valuable for low-cost and rapid vegetation detection. They have been successfully used for identifying crop types such as Chinese cabbage [36] and assessing maize growth [40]. Although RGB indices are sensitive to illumination and background soil variability, their simplicity makes them attractive for practical applications.
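Because these RGB-only indices are simple band arithmetic, they can be computed in a few lines. A minimal sketch using the standard formulas (the pixel values below are hypothetical):

```python
import numpy as np

def rgb_indices(r, g, b):
    """Compute common RGB-only vegetation indices from camera digital numbers.

    ExG uses normalized chromatic coordinates, so it is scale-invariant;
    NGRDI and MGRVI are ratios and likewise need no radiometric calibration.
    """
    r, g, b = (np.asarray(x, dtype=float) for x in (r, g, b))
    total = r + g + b                                      # assumed positive
    rn, gn, bn = r / total, g / total, b / total           # chromatic coordinates
    exg   = 2.0 * gn - rn - bn                             # Excess Green
    ngrdi = (g - r) / (g + r)                              # Normalized Green-Red Difference
    mgrvi = (g**2 - r**2) / (g**2 + r**2)                  # Modified Green-Red VI
    return {"ExG": exg, "NGRDI": ngrdi, "MGRVI": mgrvi}

# Hypothetical pixels: one vegetated (green-dominant), one bare soil
veg  = rgb_indices(r=[60.0],  g=[140.0], b=[50.0])
soil = rgb_indices(r=[120.0], g=[110.0], b=[90.0])
print(veg["ExG"][0], soil["ExG"][0])   # vegetation pixel scores much higher
```

The green-dominant pixel scores strongly positive on all three indices while the soil pixel sits near zero, which is exactly the soil/plant separation exploited in the Chinese cabbage and maize studies cited above.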
Multispectral indices, such as NDVI, NDRE, and GNDVI, remain the most widely applied in UAV-based remote sensing. One of the most used vegetation indices is NDVI (Normalized Difference Vegetation Index) [169]. NDVI has been studied as a reliable indicator of vegetation health; it can also be used in estimating crop density, since vegetation and bare soil have distinct NDVI values [170,171]. Beyond NDVI, indices such as SARVI (Soil Adjusted and Atmospherically Resistant Vegetation Index) [160,161] have been developed to mitigate atmospheric and soil-related disturbances, improving robustness under variable environmental conditions. The addition of red-edge bands (NDRE, ReNDVI, CIRE) further increases sensitivity to subtle changes in chlorophyll, which is particularly useful for early stress detection and monitoring forest pest outbreaks [56].
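NDVI, GNDVI, and NDRE all share the same normalized-difference form and differ only in which band is paired with NIR; red-edge reflectance saturates later than red, which is why NDRE stays sensitive in dense, chlorophyll-rich canopies where NDVI plateaus. A small sketch with hypothetical reflectances:

```python
import numpy as np

def norm_diff(a, b):
    """Generic normalized difference (a - b) / (a + b), guarding against zero sums."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return (a - b) / np.clip(a + b, 1e-9, None)

# Hypothetical canopy reflectances per band
nir, red, green, red_edge = 0.55, 0.06, 0.10, 0.30

ndvi  = norm_diff(nir, red)        # classic red/NIR contrast
gndvi = norm_diff(nir, green)      # green band: chlorophyll sensitivity
ndre  = norm_diff(nir, red_edge)   # red edge: early-stress sensitivity
print(f"NDVI={float(ndvi):.3f}  GNDVI={float(gndvi):.3f}  NDRE={float(ndre):.3f}")
```

NDVI sits near saturation here while NDRE retains headroom, illustrating why red-edge variants are preferred for detecting subtle chlorophyll changes.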
Hyperspectral indices (e.g., MSR, MTVI, SIPI) offer more detailed physiological information by exploiting narrow spectral bands. These indices have been used to monitor chlorophyll dynamics, leaf area index, and biomass in crops such as rice and strawberries [53,78]. However, their operational deployment is limited by the high cost and data processing requirements of hyperspectral sensors.
Thermal and water-related indices (e.g., CWSI, NDWI, NDDI) demonstrate strong potential for stress detection and irrigation management. The CWSI has been used for irrigation scheduling in soybean fields [63], while NDWI and NDDI have been applied to assess water stress in wheat and drought impacts on oilseed rape [39,90]. These indices provide a complementary perspective to spectral vegetation indices by linking canopy temperature and water content to plant stress.
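The CWSI can be computed from UAV thermal data using the standard empirical form, which normalizes canopy temperature between a well-watered (wet) and a non-transpiring (dry) reference. A minimal sketch with hypothetical temperatures:

```python
def cwsi(t_canopy, t_wet, t_dry):
    """Empirical Crop Water Stress Index.

    t_wet: canopy temperature of a fully transpiring (well-watered) reference
    t_dry: temperature of a non-transpiring (fully stressed) reference
    Returns a value from 0 (no stress) to 1 (maximum stress), clamped.
    """
    raw = (t_canopy - t_wet) / (t_dry - t_wet)
    return min(max(raw, 0.0), 1.0)

# Hypothetical UAV thermal readings (degrees C) for three plots
t_wet, t_dry = 24.0, 36.0
for name, tc in [("irrigated", 25.2), ("moderate", 30.0), ("stressed", 34.8)]:
    print(f"{name}: CWSI = {cwsi(tc, t_wet, t_dry):.2f}")
```

The wet and dry baselines are typically obtained from reference surfaces or energy-balance models; their quality determines how well CWSI maps onto actual irrigation need.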
A few machine learning–based approaches, such as Random Forest Classifiers (RFC), have been integrated with vegetation indices for disease detection [8]. These approaches highlight the increasing role of data-driven models in enhancing the predictive power of UAV remote sensing.
Overall, the reviewed studies demonstrate that UAV-derived vegetation indices are versatile tools for monitoring plant health, diagnosing stress, and supporting precision management in both agricultural and forestry systems. The choice of index is highly dependent on sensor availability, crop type, and the specific stress or physiological trait under investigation.

4.3. Performance of UAV-Derived Vegetation Indices in Agricultural and Forest Monitoring

Our results from the literature inventory demonstrate the increasing reliability of UAV-based remote sensing and vegetation indices for precision agriculture and forest monitoring. Diykh’s integration of curvelet transform and DenseNet for NDVI prediction illustrates the power of deep learning models to overcome atmospheric variability, suggesting scalability for large agricultural monitoring programs [101]. Similarly, the strong correlation of FVC with wheat plant density [82] highlights UAV imaging as a cost-effective alternative to ground sampling, with direct implications for site-specific fertilizer management.
In crop disease monitoring, Abdulridha’s findings [10] confirm that specific indices such as GNDVI and PRI provide higher accuracy for rust detection compared to traditional spectral datasets. The combination of UAV imagery with open-source tools such as QGIS [100] further indicates a pathway for democratizing crop monitoring technologies, making them accessible for non-specialist users.
For forest applications, Allen [104] demonstrated that deep learning combined with UAV-derived vegetation indices can produce results comparable to expert assessments, emphasizing the robustness of automated monitoring pipelines. Ansari’s integration of UAV and satellite imagery provides evidence that multi-sensor data fusion enhances forest change detection accuracy [107]. The high classification accuracy of the MVI [108] emphasizes that vegetation indices designed for specific ecosystems (mangroves in this case) can significantly outperform generalized indices, pointing to the importance of tailored spectral approaches. UAVs are used in forest genetics research to map the positions of different provenances and families in common-garden experiments [172,173,174,175,176,177,178,179], in order to correlate the results with microenvironmental factors such as edge effects, slope position, and gap position and influence.
Novel indices developed in recent studies further expand the capacity of vegetation monitoring. Heim’s LMMR [117] and Li’s RBI [123] both illustrate the advantages of designing indices tailored to specific diseases, as they significantly outperform conventional SVIs. Likewise, the GSI [86] demonstrated exceptional promise for early forest stress detection, which is critical for proactive forest health management against bark beetle outbreaks. These findings collectively highlight that the next phase of UAV-based vegetation monitoring lies in the development of condition-specific indices, optimized for crop type, disease, or environmental stress.
Overall, the combination of UAV imagery, deep learning, and newly developed vegetation indices offers a transformative potential for both agriculture and forestry. The studies reviewed confirm that targeted spectral index design and advanced analytical methods not only enhance monitoring accuracy but also reduce costs and increase accessibility.
Recommended and non-recommended indices for UAV-based monitoring
Although the performance of vegetation indices is strongly dependent on crop type, forest structure, sensor configuration, and environmental conditions, our synthesis allows for several general recommendations. Indices that incorporate the NIR and red-edge regions (e.g., NDRE, CIgreen, CIRE, RECI, MTCI, GSI) consistently outperform broad-band RGB indices when the goal is to detect early stress, quantify chlorophyll content, or assess subtle physiological changes in both crops and forests. These indices are recommended for applications that require high sensitivity, such as early disease detection, nitrogen monitoring, and bark-beetle infestation surveys.
Conversely, RGB-only indices (e.g., ExG, VARI, GLI), while attractive for their low cost and ease of deployment, are generally less robust under variable illumination or heterogeneous canopies. They perform well for tasks such as canopy cover estimation or vigor mapping but are not recommended for detecting fine physiological changes or for monitoring dense or multi-layered forest stands.
Indices sensitive to soil background (e.g., DVI, simple ratio SR/RVI) may also be less reliable in sparse vegetation or heterogeneous understory conditions unless corrected (e.g., OSAVI, SAVI). For drought-related applications, indices that integrate water-sensitive bands (e.g., NDWI, NDDI, CWSI) are preferable.
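The soil corrections mentioned here are simple closed-form adjustments: SAVI adds a soil brightness factor L to the NDVI denominator, and OSAVI fixes that factor at 0.16 so no canopy-cover estimate is needed. A brief sketch with hypothetical reflectances:

```python
def savi(nir, red, L=0.5):
    """Soil-Adjusted Vegetation Index. L = 0.5 suits intermediate canopy
    cover; L = 0 reduces SAVI exactly to NDVI."""
    return (1.0 + L) * (nir - red) / (nir + red + L)

def osavi(nir, red):
    """Optimized SAVI: fixed soil factor 0.16, no cover estimate required."""
    return (1.0 + 0.16) * (nir - red) / (nir + red + 0.16)

# Hypothetical sparse-canopy reflectances where soil background matters
nir, red = 0.45, 0.10
print(savi(nir, red), osavi(nir, red), savi(nir, red, L=0.0))
```

The (1 + L) factor rescales the result back toward the NDVI range, so SAVI and OSAVI values remain comparable to uncorrected indices while damping soil brightness effects.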
Finally, condition-specific or newly developed indices—such as GSI for bark beetle stress or LMMR for disease detection—show strong promise but should be recommended with caution until validated in additional ecosystems and UAV platforms.
Collectively, our review suggests that red-edge and NIR-based multispectral/hyperspectral indices are broadly recommended, RGB indices are recommended only for structural or vigor-oriented applications, and soil-sensitive or unvalidated narrow-band indices should be used cautiously.

4.4. Interpreting UAV-Based Vegetation Indices: Insights for Crop and Forest Monitoring

The body of studies reviewed demonstrates the growing role of UAV-based vegetation indices (VIs) in advancing precision agriculture, phenotyping, and forest monitoring.
Agricultural Crops:
In cereals such as wheat, rice, and maize, UAV-derived VIs consistently revealed strong correlations with biomass, disease resistance, yield, and water stress. For instance, NDVI and NDRE were among the most frequently reported indices, reflecting their robustness across environments [39,124,126]. Hyperspectral approaches even allowed disease detection at low severity levels [8], supporting their role in breeding and crop protection.
Soybean, potato, and cotton studies confirmed that UAV phenotyping offers not only rapid trait assessment but also predictive capabilities. Importantly, both multispectral and RGB-derived indices showed high accuracy, suggesting that low-cost RGB sensors can be effectively employed [129,130].
Vegetable and orchard crops highlighted the adaptability of VIs to different conditions. ExG was reliable in Chinese cabbage discrimination [41], while citrus orchards benefited from SIPI2 and GNDVI for irrigation management [100]. The use of UAVs for peanuts, peaches, and chickpeas further illustrates the potential of this technology for both staple and specialty crops.
Weeds and Aquatic Systems:
UAV-based indices also proved useful in weed detection and aquaculture. However, certain indices (e.g., NDVI and SAVI) risk overestimation of healthy vegetation in complex systems such as sugar beet fields [136], indicating a need for index refinement. The extension of VIs to seaweed farming [43] demonstrates the transferability of spectral approaches to marine agriculture.
Forests and Agroforestry:
In forestry, NDVI and red-edge indices were widely effective in detecting tree stress caused by pests, wind damage, or water deficits [56,63]. Notably, VARI outperformed NDVI for oil palm health monitoring (Fajar), showing that visible-light indices can sometimes surpass traditional red–NIR metrics. In hazelnuts and agroforestry systems, indices such as GNDVI and OSAVI emerged as strong predictors of biomass [83].
General trends:
Across all studies, NDVI remained the most commonly used vegetation index, but its limitations—such as sensitivity to soil background and canopy structure—were evident. Red-edge indices (NDRE, RECI), chlorophyll-related indices (MCARI, LCI), and green-band indices (GNDVI, ExG) consistently provided added sensitivity to biophysical parameters. Furthermore, the application of machine learning improved yield and biomass predictions, especially when multiple VIs were combined with temporal dynamics [124,126].

4.5. Vegetation Index Responses to Environmental Influences

The results we found in the literature demonstrate that vegetation indices are differentially influenced by environmental and field conditions, with implications for their application in precision agriculture and forestry. NDVI emerged as the most robust index, displaying minimal variability across changing field conditions [10]. Its stability compared with TVI and GNDVI suggests that indices incorporating near-infrared (NIR) reflectance are less sensitive to short-term fluctuations in environmental parameters. This highlights NDVI’s utility as a reliable indicator of vegetation status, particularly when consistent monitoring is required.
Nevertheless, the universality of NDVI as a diagnostic tool is limited. While effective for nitrogen estimation in corn, its reduced accuracy for rice underscores the crop-specific nature of vegetation index performance [33,165,166]. This suggests that a one-size-fits-all approach to vegetation monitoring may be inadequate, and that index selection must be tailored to both crop species and the traits under investigation.
Drone-based assessments revealed another important factor: the influence of sunlight conditions on NDVI measurements. The pronounced sensitivity of lower NDVI values to solar altitude emphasizes the need for correction protocols when acquiring data at different times of day [168]. The successful adjustment of NDVI values to account for sunlight variability indicates that such corrections are feasible and can enhance the reliability of drone-based monitoring systems.
Taken together, these findings suggest that while NDVI remains a strong candidate for general vegetation assessment, its use should be carefully contextualized. For species-specific or trait-specific applications, alternative indices may be preferable, and for drone operations, sunlight correction should be integrated into analytical workflows. By accounting for these influencing factors, drone-based vegetation monitoring can provide more accurate and reliable information for managing crop and forest systems.

4.6. Research Gaps and Future Directions

Although UAV-based vegetation monitoring has advanced rapidly, our synthesis highlights several important gaps that remain insufficiently addressed in the current literature.
Limited methodological standardization.
Existing studies differ widely in flight parameters, illumination conditions, calibration targets, radiometric correction approaches, and ground-control workflows. This lack of methodological coherence makes it difficult to compare results across ecosystems, platforms, or sensor types. Standardized protocols for reflectance calibration, sunlight normalization, and soil/atmospheric correction are essential, particularly when integrating data from multiple dates or sites.
Challenges in cross-sensor and cross-scale integration.
While combining UAV, satellite, and ground-based data has strong potential, most studies still evaluate these systems independently. Only a small number explore cross-scale consistency (e.g., UAV vs. Sentinel-2 NDVI or red-edge indices), leaving open questions about transferability, accuracy thresholds, and cost–resolution trade-offs. A coordinated effort to validate UAV-derived VIs against satellite products would help establish interoperability guidelines for operational monitoring programs.
Crop-, species-, and stress-specific limitations.
Many vegetation indices continue to show inconsistent performance across different crops or forest types. For example, indices robust in cereals often fail under dense forest canopies, and NDVI remains highly crop-dependent. Our review identifies an emerging shift toward condition-specific indices (e.g., RBI, GSI, LMMR), but these require broader ecological testing. Future research should prioritize the systematic validation of such indices across multiple species, phenological stages, and climatic zones.
Underrepresentation of low-cost sensors and open-source workflows.
Despite evidence that RGB-based indices can perform well for canopy cover, vigor estimation, and early growth monitoring, the majority of studies rely on expensive multispectral or hyperspectral devices. This technological bias limits accessibility for smallholder farmers and forestry practitioners. Research is needed to optimize and validate low-cost, open-source, and smartphone-based systems, ensuring wider adoption of UAV monitoring outside research institutions.
Environmental sensitivity and operational limitations.
Spectral vegetation indices remain vulnerable to soil brightness, understory heterogeneity, shadowing, and variations in canopy architecture. Even indices designed for soil-adjustment (e.g., SAVI, OSAVI) show inconsistent performance in multi-layered forests. Few studies rigorously test how environmental variability propagates into index calculations or develop universal correction models. Simulated flight experiments and physics-based radiative transfer modeling could help quantify these influences.
Gaps in forestry applications and long-term monitoring.
Compared to agricultural systems, forestry applications remain limited in scale and diversity. Existing studies often focus on case-specific scenarios (e.g., bark beetle outbreaks, mangrove health), but comprehensive frameworks for biomass estimation, biodiversity assessment, or climate-driven tree stress remain scarce. Long-term, multi-season UAV campaigns are particularly lacking, preventing robust evaluation of forest resilience and recovery patterns.
Future directions.
To address these gaps, future research should:
  • Develop harmonized acquisition and calibration standards for UAV vegetation monitoring, enabling reproducibility and interoperability.
  • Strengthen multi-scale fusion approaches combining UAV, satellite, and proximal sensing data to enhance temporal resolution and regional scalability.
  • Validate crop- and stress-specific indices across multiple environments and develop next-generation indices tailored to forest structure, water stress, and disease phenotyping.
  • Expand the use of RGB-based and other low-cost sensors together with open-source analytical platforms to democratize UAV use and reduce dependency on high-end equipment.
  • Conduct long-term ecological and forestry monitoring campaigns to evaluate vegetation responses to disturbances, pests, and climate extremes.
  • Incorporate socio-economic, environmental, and regulatory considerations to bridge the gap between technical research and practical, policy-relevant implementation.
Collectively, these directions highlight where current knowledge remains fragmented and illustrate how UAV-based vegetation indices can evolve toward standardized, scalable, and accessible tools for both agriculture and forestry—addressing the interdisciplinary scope that distinguishes this review from previous studies.

5. Conclusions

This review shows that UAV-derived vegetation indices (VIs) have progressed from simple RGB-based measurements to sophisticated multispectral, hyperspectral, and thermal approaches integrated with machine learning, making them indispensable tools for precision agriculture and sustainable forest management. The synthesis of 472 publications demonstrates clear scientific advances: the consolidation of widely used indices such as NDVI, NDRE, and GNDVI; the emergence of new, condition-specific indices (e.g., GSI, RBI, MVI) that substantially improve early detection of diseases, pests, and physiological stress; and the growing evidence that UAV-based monitoring provides reliable estimates of crop yield, biomass, water status, and forest health across diverse environmental conditions.
From a practical standpoint, the review highlights that UAV platforms enable high-resolution, rapid, and cost-effective monitoring that can complement or replace traditional field surveys. Their ability to detect fine-scale spatial variability supports site-specific management, reduces input waste, and enhances early intervention strategies in both agriculture and forestry. Moreover, the integration of UAV imagery with machine learning and multi-sensor datasets expands the analytical capacity of VIs, offering new opportunities for operational decision-making, forest disturbance detection, and high-throughput phenotyping.
Despite these advances, several constraints remain: the lack of standardized illumination and soil-background corrections; limited adoption of low-cost sensors; insufficient operational workflows for forestry; and the need for crop- and stress-specific indices that can outperform generalized VIs across diverse ecosystems. Addressing these gaps will require coordinated efforts toward multi-scale data integration, open-source analytical tools, and methodological standardization.
Overall, the review underscores that UAV-derived vegetation indices constitute a rapidly maturing technology with significant scientific and practical value. By enabling accurate, efficient, and scalable monitoring, these tools provide a pathway toward adaptive, data-driven management of agricultural and forest ecosystems, supporting long-term sustainability and resilience in a context of environmental change.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/agriengineering7120431/s1, Table S1: PRISMA 2020 Checklist.

Author Contributions

Conceptualization, A.P. and L.D.; methodology, A.P. and L.D.; software, L.D. and G.M.; validation, P.G.I. and G.M.; formal analysis, G.M.; investigation, A.-S.P.; resources, A.-S.P.; data curation, P.G.I.; writing—original draft preparation, A.P. and L.D.; writing—review and editing, A.P. and L.D.; visualization, G.M.; supervision, A.P.; project administration, L.D.; funding acquisition, A.P. All authors have read and agreed to the published version of the manuscript.

Funding

The work of Gabriel Murariu was supported by “Grant intern de cercetare in domeniul Ingineriei Mediului privind studierea distribuției factorilor poluanți in zona de Sud Est a Europei”—Contract de finantare nr. 14886/11.05.2022 Universitatea Dunarea de Jos din Galati—“Internal research grant in the field of Environmental Engineering regarding the study of the distribution of polluting factors in the South-Eastern area of Europe”—Financing contract no. 14886/11.05.2022 Dunarea de Jos University of Galati. Also, this research work was carried out with the support of the Romanian Ministry of Education and Research, within the FORCLIMSOC Nucleu Program (Contract no. 12N/2023)/Project PN23090203 with the title New scientific contributions for the sustainable management of torrent control structures, degraded lands, shelter-belts and other agroforestry systems in the context of climate change.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

No new data were created or analyzed in this study.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Vashishth, T.K.; Sharma, V.; Sharma, K.K.; Chaudhary, S.; Kumar, B.; Panwar, R. Integration of unmanned aerial vehicles (UAVs) and IoT for crop monitoring and spraying. In Internet of Things Applications and Technology; Auerbach Publications: Boca Raton, FL, USA, 2024; pp. 95–117. [Google Scholar]
  2. Marin, S.O.; Clinciu, I.; Tudose, N.C.; Ungurean, C. An evaluating methodology for hydrotechnical torrent-control structures condition. Ann. For. Res. 2012, 55, 125–143. [Google Scholar] [CrossRef]
  3. Mihalache, A.L.; Marin, M.; Davidescu, Ș.O.; Ungurean, C.; Adorjani, A.; Tudose, N.C.; Davidescu, A.A.; Clinciu, I. Physical status of torrent control structures in Romania. Environ. Eng. Manag. J. 2020, 19, 861–872. [Google Scholar] [CrossRef]
  4. Marin, M.; Clinciu, I.; Tudose, N.C.; Ungurean, C.; Mihalache, A.L.; Mărțoiu, N.E.; Tudose, O.N. Assessment of seasonal surface runoff under climate and land use change scenarios for a small forested watershed: Upper Tarlung Watershed (Romania). Water 2022, 14, 2860. [Google Scholar] [CrossRef]
  5. Marin, M.; Tudose, N.C.; Ungurean, C.; Mihalache, A.L. Application of life cycle assessment for torrent control structures: A review. Land 2024, 13, 1956. [Google Scholar] [CrossRef]
  6. Nhamo, L.; Magidi, J.; Nyamugama, A.; Clulow, A.D.; Sibanda, M.; Chimonyo, V.G.; Mabhaudhi, T. Prospects of improving agricultural and water productivity through unmanned aerial vehicles. Agriculture 2020, 10, 256. [Google Scholar] [CrossRef]
  7. Mohan, M.; Richardson, G.; Gopan, G.; Aghai, M.M.; Bajaj, S.; Galgamuwa, G.A.P.; Vastaranta, M.; Arachchige, P.S.P.; Amorós, L.; Corte, A.P.D.; et al. UAV-supported forest regeneration: Current trends, challenges and implications. Remote Sens. 2021, 13, 2596. [Google Scholar] [CrossRef]
  8. Berie, H.T.; Burud, I. Application of unmanned aerial vehicles in earth resources monitoring: Focus on evaluating potentials for forest monitoring in Ethiopia. Eur. J. Remote Sens. 2018, 51, 326–335. [Google Scholar] [CrossRef]
  9. Nursaputra, M.; Larekeng, S.H.; Hamzah, A.S. The NDVI algorithm utilization on the Google Earth Engine platform to monitor changes in forest density in mining area. IOP Conf. Ser. Earth Environ. Sci. 2021, 886, 012100. [Google Scholar] [CrossRef]
  10. Abdulridha, J.; Min, A.; Rouse, M.N.; Kianian, S.; Isler, V.; Yang, C. Evaluation of stem rust disease in wheat fields by drone hyperspectral imaging. Sensors 2023, 23, 4154. [Google Scholar] [CrossRef]
  11. Hassan, M.A.; Yang, M.; Rasheed, A.; Yang, G.; Reynolds, M.; Xia, X.; Xiao, Y.; He, Z. A rapid monitoring of NDVI across the wheat growth cycle for grain yield prediction using a multi-spectral UAV platform. Plant Sci. 2019, 282, 95–103. [Google Scholar] [CrossRef]
  12. Änäkkälä, M.; Lajunen, A.; Hakojärvi, M.; Alakukku, L. Evaluation of the influence of field conditions on aerial multispectral images and vegetation indices. Remote Sens. 2022, 14, 4792. [Google Scholar] [CrossRef]
  13. Michels, R.N.; Bertozzi, J.; Dal Bosco, T.C.; Aguiar e Silva, M.A.; Gnoatto, E.; Soares, C.H.E. Use of drone with digital photographic machine embedded for determination of leaf cover. Rev. Agrogeoambient. 2019, 11, 17–25. [Google Scholar] [CrossRef]
  14. Simic Milas, A.; Romanko, M.; Reil, P.; Abeysinghe, T.; Marambe, A. The importance of leaf area index in mapping chlorophyll content of corn under different agricultural treatments using UAV images. Int. J. Remote Sens. 2018, 39, 5415–5431. [Google Scholar] [CrossRef]
  15. Vasile, D.; Petritan, A.-M.; Tudose, N.C.; Toiu, F.L.; Scarlatescu, V.; Petritan, I.C. Structure and spatial distribution of dead wood in two temperate old-growth mixed European beech forests. Not. Bot. Horti Agrobot. 2017, 45, 639–645. [Google Scholar] [CrossRef]
  16. Mustățea, M.; Clius, M.; Tudose, N.C.; Cheval, S. An enhanced Machado Index of naturalness. Catena 2022, 212, 106091. [Google Scholar] [CrossRef]
  17. Oprică, R.; Tudose, N.C.; Davidescu, S.O.; Zup, M.; Marin, M.; Comanici, A.N.; Crit, M.N.; Pitar, D. Gender inequalities in Transylvania’s largest peri-urban forest usage. Ann. For. Res. 2022, 65, 57–69. [Google Scholar] [CrossRef]
  18. Tudose, N.C.; Petritan, I.C.; Toiu, F.L.; Petritan, A.-M.; Marin, M. Relation between topography and gap characteristics in a mixed sessile oak–beech old-growth forest. Forests 2023, 14, 188. [Google Scholar] [CrossRef]
  19. Naseri, M.H.; Shataee Jouibary, S.; Habashi, H. Analysis of forest tree dieback using UltraCam and UAV imagery. Scand. J. For. Res. 2023, 38, 392–400. [Google Scholar] [CrossRef]
  20. Ayamga, M.; Akaba, S.; Nyaaba, A.A. Multifaceted applicability of drones: A review. Technol. Forecast. Soc. Change 2021, 167, 120677. [Google Scholar] [CrossRef]
  21. Floreano, D.; Wood, R.J. Science, technology and the future of small autonomous drones. Nature 2015, 521, 460–466. [Google Scholar] [CrossRef] [PubMed]
  22. Dutta, G.; Goswami, P. Application of drone in agriculture: A review. Int. J. Chem. Stud. 2020, 8, 181–187. [Google Scholar] [CrossRef]
  23. Scudder, M.G.; Sampson, L.E. Cost-effective forestry? A systematic literature review of drone applications (2014–2024). J. For. 2025, 123, 523–545. [Google Scholar] [CrossRef]
  24. Achim, F.; Dinca, L.; Chira, D.; Raducu, R.; Chirca, A.; Murariu, G. Sustainable management of willow forest landscapes: A review of ecosystem functions and conservation strategies. Land 2025, 14, 1593. [Google Scholar] [CrossRef]
  25. Dincă, L.; Crișan, V.; Murariu, G.; Țupu, E. Oxalis acetosella in forests: A systematic bibliometric study over the last 47 years. Sci. Pap. Ser. B Hortic. 2025, 69, 756–761. [Google Scholar]
  26. Bratu, I.; Dinca, L.; Constandache, C.; Murariu, G. Resilience and decline: The impact of climatic variability on temperate oak forests. Climate 2025, 13, 119. [Google Scholar] [CrossRef]
  27. Budău, R.; Timofte, C.S.C.; Mirisan, L.V.; Bei, M.; Dinca, L.; Murariu, G.; Racz, K.A. Living landmarks: A review of monumental trees and their role in ecosystems. Plants 2025, 14, 2075. [Google Scholar] [CrossRef]
  28. Dincă, L.; Crișan, V.; Ienașoiu, G.; Murariu, G.; Drășovean, R. Environmental indicator plants in mountain forests: A review. Plants 2024, 13, 3358. [Google Scholar] [CrossRef] [PubMed]
  29. Enescu, C.M.; Mihalache, M.; Ilie, L.; Dinca, L.; Constandache, C.; Murariu, G. Agricultural benefits of shelterbelts and windbreaks: A bibliometric analysis. Agriculture 2025, 15, 1204. [Google Scholar] [CrossRef]
  30. Dincă, L.; Coca, A.; Tudose, N.C.; Marin, M.; Murariu, G.; Munteanu, D. The Role of Trees in Sand Dune Rehabilitation: Insights from Global Experiences. Appl. Sci. 2025, 15, 7358. [Google Scholar] [CrossRef]
  31. Murariu, G.; Stanciu, S.; Dincă, L.; Munteanu, D. GIS Applications in Monitoring and Managing Heavy Metal Contamination of Water Resources. Appl. Sci. 2025, 15, 10332. [Google Scholar] [CrossRef]
  32. Slepetiene, A.; Belova, O.; Fastovetska, K.; Dincă, L.; Murariu, G. Managing Boreal Birch Forests for Climate Change Mitigation. Land 2025, 14, 1909. [Google Scholar] [CrossRef]
  33. Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E.; et al. The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. BMJ 2021, 372, n71. [Google Scholar] [CrossRef]
  34. Clarivate. Web of Science Core Collection. Available online: https://clarivate.com/academia-government/lp/pivot-rp/ (accessed on 20 June 2025).
  35. Elsevier. Scopus. Available online: https://www.elsevier.com/products/scopus (accessed on 20 June 2025).
  36. Microsoft Corporation. Microsoft Excel. Available online: https://www.microsoft.com/en-us/microsoft-365/excel (accessed on 26 June 2025).
  37. Google. Geochart. Available online: https://developers.google.com/chart/interactive/docs/gallery/geochart (accessed on 23 June 2025).
  38. VOSviewer. Available online: https://www.vosviewer.com/ (accessed on 22 June 2025).
  39. Ali, M.; Athar, U.; Zafar, Z.; Berns, K.; Fraz, M.M. Water stress diagnosis in rainfed wheat through UAV multispectral imagery and IoT data. In Proceedings of the 2024 19th International Conference on Emerging Technologies (ICET), Topi, Pakistan, 19–20 November 2024; IEEE: New York, NY, USA, 2024; pp. 1–7. [Google Scholar]
  40. Woebbecke, D.M.; Meyer, G.E.; Bargen, K.V.; Mortensen, D.A. Color indices for weed identification under various soil, residue, and lighting conditions. Am. Soc. Agric. Eng. 1995, 38, 259–269. [Google Scholar] [CrossRef]
  41. Du, X.; Zhou, Z.; Huang, D. Influence of spatial scale effect on UAV remote sensing accuracy in identifying Chinese cabbage (Brassica rapa subsp. Pekinensis) plants. Agriculture 2024, 14, 1871. [Google Scholar] [CrossRef]
  42. Meyer, G.E.; Neto, J.C. Verification of color vegetation indices for automated crop imaging applications. Comput. Electron. Agric. 2008, 63, 282–293. [Google Scholar] [CrossRef]
  43. Alevizos, E.; Nurdin, N.; Aris, A.; Barillé, L. Proximal sensing for characterising seaweed aquaculture crop conditions: Optical detection of ice-ice disease. Remote Sens. 2024, 16, 3502. [Google Scholar] [CrossRef]
  44. Escadafal, R.; Belghit, A.; Ben-Moussa, A. Indices spectraux pour la télédétection de la dégradation des milieux naturels en Tunisie aride. In Actes du 6ème Symposium International sur les Mesures Physiques et Signatures en Télédétection, Proceedings of the 6th International Symposium on Physical Measurements and Signatures in Remote Sensing, Val d’Isère, France, 17–24 January 1994; Guyot, G., Ed.; CNES: Paris, France, 1994; pp. 253–259. [Google Scholar]
  45. Atanasov, A.I.; Stoyanov, H.P.; Atanasov, A.Z. Differentiating growth patterns in winter wheat cultivars via unmanned aerial vehicle imaging. AgriEngineering 2024, 6, 3652–3671. [Google Scholar] [CrossRef]
  46. Bendig, J.; Yu, K.; Aasen, H.; Bolten, A.; Bennertz, S.; Broscheit, J.; Gnyp, M.L.; Bareth, G. Combining UAV-based plant height from crop surface models, visible, and near-infrared vegetation indices for biomass monitoring in barley. Int. J. Appl. Earth Obs. Geoinf. 2015, 39, 79–87. [Google Scholar] [CrossRef]
  47. Adak, A.; Murray, S.C.; Božinović, S.; Lindsey, R.; Nakasagga, S.; Chatterjee, S.; Wilde, S. Temporal vegetation indices and plant height from remotely sensed imagery can predict grain yield and flowering time breeding value in maize via machine learning regression. Remote Sens. 2021, 13, 2141. [Google Scholar] [CrossRef]
  48. Ahamed, T.; Tian, L.; Zhang, Y.; Ting, K.C. A review of remote sensing methods for biomass feedstock production. Biomass Bioenergy 2011, 35, 2455–2469. [Google Scholar] [CrossRef]
  49. D’Odorico, P.; Besik, A.; Wong, C.Y.; Isabel, N.; Ensminger, I. High-throughput drone-based remote sensing reliably tracks phenology in thousands of conifer seedlings. New Phytol. 2020, 226, 1667–1681. [Google Scholar] [CrossRef]
  50. Barnes, E.M.; Clarke, T.R.; Richards, S.E.; Colaizzi, P.D.; Haberland, J.; Kostrzewski, M.; Waller, P.; Choi, C.; Riley, E.; Thompson, T.; et al. Coincident detection of crop water stress, nitrogen status, and canopy density using ground-based multispectral data. In Proceedings of the Fifth International Conference on Precision Agriculture, Bloomington, MN, USA, 16–19 July 2000. [Google Scholar]
  51. Chatraei Azizabadi, E.; El-Shetehy, M.; Cheng, X.; Youssef, A.; Badreldin, N. In-season potato nitrogen prediction using multispectral drone data and machine learning. Remote Sens. 2025, 17, 1860. [Google Scholar] [CrossRef]
  52. Gitelson, A.A.; Viña, A.; Arkebauer, T.J.; Rundquist, D.C.; Keydan, G.; Leavitt, B. Remote estimation of leaf area index and green leaf biomass in maize canopies. Geophys. Res. Lett. 2003, 30, 1248. [Google Scholar] [CrossRef]
  53. Zheng, C.; Abd-Elrahman, A.; Whitaker, V.; Dalid, C. Prediction of strawberry dry biomass from UAV multispectral imagery using multiple machine learning methods. Remote Sens. 2022, 14, 4511. [Google Scholar] [CrossRef]
  54. Arcidiaco, L.; Danti, R.; Corongiu, M.; Emiliani, G.; Frascella, A.; Mello, A.; Bonora, L.; Barberini, S.; Pellegrini, D.; Sabatini, N.; et al. Preliminary machine learning-based classification of ink disease in chestnut orchards using high-resolution multispectral imagery from unmanned aerial vehicles: A comparison of vegetation indices and classifiers. Forests 2025, 16, 754. [Google Scholar] [CrossRef]
  55. Buschmann, C.; Nagel, E. In vivo spectroscopy and internal optics of leaves as basis for remote sensing of vegetation. Int. J. Remote Sens. 1993, 14, 711–722. [Google Scholar] [CrossRef]
  56. Bozzini, A.; Brugnaro, S.; Morgante, G.; Santoiemma, G.; Deganutti, L.; Finozzi, V.; Battisti, A.; Faccoli, M. Drone-based early detection of bark beetle infested spruce trees differs in endemic and epidemic populations. Front. For. Glob. Change 2024, 7, 1385687. [Google Scholar] [CrossRef]
  57. Maccioni, A.; Agati, G.; Mazzinghi, P. New vegetation indices for remote measurement of chlorophylls based on leaf directional reflectance spectra. J. Photochem. Photobiol. B Biol. 2001, 61, 52–61. [Google Scholar] [CrossRef]
  58. Buschmann, C. Fernerkundung von Pflanzen. Naturwissenschaften 1993, 80, 439–453. [Google Scholar] [CrossRef]
  59. Cruz-Grimaldo, C.; Nieves, M.; Vera, E.; Duran, M.; Morales, A.; Salazar, W.; Arbizu, C.I. Yield predictions of ‘Del Cerro’ cotton (Gossypium hirsutum L.) germplasm by multispectral monitoring in the north coast of Peru. Chil. J. Agric. Res. 2025, 85, 15–26. [Google Scholar] [CrossRef]
  60. Baret, F.; Guyot, G. Potentials and limits of vegetation indices for LAI and APAR assessment. Remote Sens. Environ. 1991, 35, 161–173. [Google Scholar] [CrossRef]
  61. Guo, Y.; Wang, N.; Wei, X.; Zhou, M.; Wang, H.; Bai, Y. Desert oasis vegetation information extraction by PLANET and unmanned aerial vehicle image fusion. Ecol. Indic. 2024, 166, 112516. [Google Scholar] [CrossRef]
  62. Herrmann, I.; Karnieli, A.; Bonfil, D.J.; Cohen, Y.; Alchanatis, V. SWIR-based spectral indices for assessing nitrogen content in potato fields. Int. J. Remote Sens. 2010, 31, 5127–5143. [Google Scholar] [CrossRef]
  63. Bāders, E.; Seipulis, A.; Kaupe, D.; Champion, J.J.C.; Krišāns, O.; Elferts, D. UAV-based multispectral assessment of wind-induced damage in Norway spruce crowns. Forests 2025, 16, 1348. [Google Scholar] [CrossRef]
  64. Richardson, A.J.; Everitt, J.H. Using spectral vegetation indices to estimate rangeland productivity. Geocarto Int. 1992, 1, 63–69. [Google Scholar] [CrossRef]
  65. Sims, D.A.; Gamon, J.A. Relationships between leaf pigment content and spectral reflectance across a wide range of species, leaf structures and developmental stages. Remote Sens. Environ. 2002, 81, 337–354. [Google Scholar] [CrossRef]
  66. Chen, J.M.; Cihlar, J. Retrieving leaf area index of boreal conifer forests using Landsat TM images. Remote Sens. Environ. 1996, 55, 153–162. [Google Scholar] [CrossRef]
  67. Blackburn, G.A. Spectral indices for estimating photosynthetic pigment concentrations: A test using senescent tree leaves. Int. J. Remote Sens. 1998, 19, 657–675. [Google Scholar] [CrossRef]
  68. Nielsen, D.C. Scheduling irrigations for soybeans with the crop water stress index (CWSI). Field Crops Res. 1990, 23, 103–116. [Google Scholar] [CrossRef]
  69. Lee, W.S.; Alchanatis, V.; Yang, C.; Hirafuji, M.; Moshou, D.; Li, C. Sensing technologies for precision specialty crop production. Comput. Electron. Agric. 2010, 74, 2–33. [Google Scholar] [CrossRef]
  70. Pazhanivelan, S.; Kumaraperumal, R.; Shanmugapriya, P.; Sudarmanian, N.S.; Sivamurugan, A.P.; Satheesh, S. Quantification of biophysical parameters and economic yield in cotton and rice using drone technology. Agriculture 2023, 13, 1668. [Google Scholar] [CrossRef]
  71. Bannari, A.; Morin, D.; Bonn, F.; Huete, A.R. A review of vegetation indices. Remote Sens. Rev. 1995, 13, 95–120. [Google Scholar] [CrossRef]
  72. Ecke, S.; Stehr, F.; Dempewolf, J.; Frey, J.; Klemmt, H.J.; Seifert, T.; Tiede, D. Species-specific machine learning models for UAV-based forest health monitoring: Revealing the importance of the BNDVI. Int. J. Appl. Earth Obs. Geoinf. 2024, 135, 104257. [Google Scholar] [CrossRef]
  73. Yang, C.; Everitt, J.H.; Bradford, J.M. Airborne hyperspectral imagery and yield monitor data for mapping cotton yield variability. Precis. Agric. 2004, 5, 445–461. [Google Scholar] [CrossRef]
  74. Matyukira, C.; Mhangara, P. Utilising RGB drone imagery and vegetation indices for accurate above-ground biomass estimation: A case study of the Cradle Nature Reserve, Gauteng Province, South Africa. Geocarto Int. 2024, 39, 2390512. [Google Scholar] [CrossRef]
  75. Atanasov, A.; Mihova, G.; Stoyanov, S.; Bankova, A.; Mihaylova, D. Use of unmanned aircraft for assessment of maize vegetation in Southern Dobruja. In Proceedings of the 2024 9th International Conference on Energy Efficiency and Agricultural Engineering (EE&AE), Ruse, Bulgaria, 27–29 June 2024; IEEE: New York, NY, USA, 2024; pp. 1–6. [Google Scholar]
  76. Datt, B.; McVicar, T.R.; Van Niel, T.G.; Jupp, D.L.B.; Pearlman, J.S. Preprocessing EO-1 Hyperion hyperspectral data to support the application of agricultural indexes. IEEE Trans. Geosci. Remote Sens. 2003, 41, 1246–1259. [Google Scholar] [CrossRef]
  77. Nagler, P.L.; Scott, R.L.; Westenburg, C.; Cleverly, J.R.; Glenn, E.P.; Huete, A.R. Evapotranspiration on western U.S. rivers estimated using the Enhanced Vegetation Index from MODIS and data from eddy covariance and Bowen ratio flux towers. Remote Sens. Environ. 2005, 97, 337–351. [Google Scholar] [CrossRef]
  78. Goigochea-Pinchi, D.; Justino-Pinedo, M.; Vega-Herrera, S.S.; Sanchez-Ojanasta, M.; Lobato-Galvez, R.H.; Santillan-Gonzales, M.D.; Ganoza-Roncal, J.J.; Ore-Aquino, Z.L.; Agurto-Piñarreta, A.I. Yield prediction models for rice varieties using UAV multispectral imagery in the Amazon lowlands of Peru. AgriEngineering 2024, 6, 2955–2969. [Google Scholar] [CrossRef]
  79. Jiang, Z.; Huete, A.R.; Didan, K.; Miura, T. Development of a two-band enhanced vegetation index without a blue band. Remote Sens. Environ. 2008, 112, 3833–3845. [Google Scholar]
  80. Adak, A.; DeSalvio, A.J.; Murray, S.C. A computational framework for modeling and predicting maize senescence: Integrating UAV phenotyping, logistic growth, and genomics. Comput. Electron. Agric. 2025, 237, 110471. [Google Scholar] [CrossRef]
  81. Neto, J.C. A Combined Statistical-Soft Computing Approach for Classification and Mapping Weed Species in Minimum-Tillage Systems. Doctoral Dissertation, University of Nebraska-Lincoln, Lincoln, NE, USA, 2004. Available online: http://digitalcommons.unl.edu/dissertations/AAI3147135 (accessed on 8 May 2025).
  82. Du, M.; Li, M.; Noguchi, N.; Ji, J.; Ye, M. Retrieval of fractional vegetation cover from remote sensing image of unmanned aerial vehicle based on mixed pixel decomposition method. Drones 2023, 7, 43. [Google Scholar] [CrossRef]
  83. Morisio, M.; Noris, E.; Pagliarani, C.; Pavone, S.; Moine, A.; Doumet, J.; Ardito, L. Characterization of hazelnut trees in open field through high-resolution UAV-based imagery and vegetation indices. Sensors 2025, 25, 288. [Google Scholar] [CrossRef] [PubMed]
  84. Gitelson, A.A.; Kaufman, Y.J.; Stark, R.; Rundquist, D. Novel algorithms for remote estimation of vegetation fraction. Remote Sens. Environ. 2002, 80, 76–87. [Google Scholar] [CrossRef]
  85. Gobron, N.; Pinty, B.; Verstraete, M.M.; Widlowski, J.-L. Advanced vegetation indices optimized for upcoming sensors: Design, performance, and applications. IEEE Trans. Geosci. Remote Sens. 2000, 38, 2489–2505. [Google Scholar]
  86. Huo, L.; Koivumäki, N.; Oliveira, R.A.; Hakala, T.; Markelin, L.; Näsi, R.; Suomalainen, J.; Polvivaara, A.; Junttila, S.; Honkavaara, E. Bark beetle pre-emergence detection using multi-temporal hyperspectral drone images: Green shoulder indices can indicate subtle tree vitality decline. ISPRS J. Photogramm. Remote Sens. 2024, 216, 200–216. [Google Scholar] [CrossRef]
  87. Datt, B. Remote sensing of water content in eucalyptus leaves. Aust. J. Bot. 1999, 47, 909–923. [Google Scholar] [CrossRef]
  88. Eitel, J.U.H.; Long, D.S.; Gessler, P.E.; Smith, A.M.S. Using in-situ measurements to evaluate the new RapidEyeTM satellite series for prediction of wheat nitrogen status. Int. J. Remote Sens. 2007, 28, 4183–4190. [Google Scholar] [CrossRef]
  89. Pereira, J.S.; Ferraz, G.A.e.S.; Santana, L.S. Aerial images to monitor grapevine vegetative growth. Rev. Eng. Agric. 2022, 30, 166–174. [Google Scholar] [CrossRef]
  90. Liu, H.; Xiang, Y.; Chen, J.; Wu, Y.; Du, R.; Tang, Z.; Yang, N.; Shi, H.; Li, Z.; Zhang, F. A new spectral index for monitoring leaf area index of winter oilseed rape (Brassica napus L.) under different coverage methods and nitrogen treatments. Plants 2024, 13, 1901. [Google Scholar] [CrossRef] [PubMed]
  91. Dash, J.; Curran, P.J. The MERIS terrestrial chlorophyll index. Int. J. Remote Sens. 2004, 25, 5403–5413. [Google Scholar] [CrossRef]
  92. Carbonell-Rivera, J.P.; Moran, C.J.; Seielstad, C.A.; Parsons, R.A.; Hoff, V.; Ruiz, L.Á.; Torralba, J.; Estornell, J. Relationships of fire rate of spread with spectral and geometric features derived from UAV-based photogrammetric point clouds. Fire 2024, 7, 132. [Google Scholar] [CrossRef]
  93. Gao, B.-C. NDWI—A normalized difference water index for remote sensing of vegetation liquid water from space. Remote Sens. Environ. 1996, 58, 257–266. [Google Scholar] [CrossRef]
  94. Malthus, T.J.; Andrieu, B.; Danson, F.M.; Jaggard, K.W.; Steven, M.D. Candidate high spectral resolution infrared indices for crop cover. Remote Sens. Environ. 1993, 46, 204–212. [Google Scholar] [CrossRef]
  95. Cunha, J.; Gaspar, P.D.; Assunção, E.; Mesquita, R. Prediction of the vigor and health of peach tree orchard. In Computational Science and Its Applications—ICCSA 2021, Proceedings of the International Conference on Computational Science and Its Applications, Cagliari, Italy, 13–16 September 2021; Springer International Publishing: Cham, Switzerland, 2021; pp. 541–551. [Google Scholar]
  96. Gitelson, A.A.; Kaufman, Y.J.; Merzlyak, M.N. Use of a green channel in remote sensing of global vegetation from EOS-MODIS. Remote Sens. Environ. 1996, 58, 289–298. [Google Scholar] [CrossRef]
  97. Moran, M.S.; Inoue, Y.; Barnes, E.M. Opportunities and limitations for image-based remote sensing in precision crop management. Remote Sens. Environ. 1997, 61, 319–346. [Google Scholar] [CrossRef]
  98. Chandel, N.S.; Jat, D.; Chakraborty, S.K.; Upadhyay, A.; Subeesh, A.; Chouhan, P.; Manjhi, M.; Dubey, K. Deep learning assisted real-time nitrogen stress detection for variable rate fertilizer applicator in wheat crop. Comput. Electron. Agric. 2025, 237, 110545. [Google Scholar] [CrossRef]
  99. Wang, F.M.; Huang, J.F.; Tang, Y.L.; Wang, X.Z. New vegetation index and its application in estimating leaf area index of rice. Rice Sci. 2007, 14, 195–203. [Google Scholar] [CrossRef]
  100. Peres, D.J.; Cancelliere, A. Analysis of multi-spectral images acquired by UAVs to monitor water stress of citrus orchards in Sicily, Italy. In World Environmental and Water Resources Congress 2021; American Society of Civil Engineers: Reston, VA, USA, 2021; pp. 270–278. [Google Scholar]
  101. de Lima, G.S.A.; Ferreira, M.E.; Madari, B.E.; de Melo Carvalho, M.T. Carbon estimation in an integrated crop-livestock system with imaging sensors aboard unmanned aerial platforms. Remote Sens. Appl. Soc. Environ. 2022, 28, 100867. [Google Scholar] [CrossRef]
  102. Haboudane, D.; Miller, J.R.; Tremblay, N.; Zarco-Tejada, P.J.; Dextraze, L. Integrated narrow-band vegetation indices for prediction of crop chlorophyll content for application to precision agriculture. Remote Sens. Environ. 2002, 81, 416–426. [Google Scholar] [CrossRef]
  103. Gitelson, A.A. Wide dynamic range vegetation index for remote quantification of biophysical characteristics of vegetation. J. Plant Physiol. 2004, 161, 165–173. [Google Scholar] [CrossRef]
  104. Allen, M.J.; Moreno-Fernández, D.; Ruiz-Benito, P.; Grieve, S.W.; Lines, E.R. Low-cost tree crown dieback estimation using deep learning-based segmentation. Environ. Data Sci. 2024, 3, e18. [Google Scholar] [CrossRef]
  105. Diykh, M.; Ali, M.; Jamei, M.; Abdulla, S.; Uddin, P.; Farooque, A.A.; Labban, A.H.; Alabdally, H. Empirical curvelet transform based deep DenseNet model to predict NDVI using RGB drone imagery data. Comput. Electron. Agric. 2024, 221, 108964. [Google Scholar] [CrossRef]
  106. Banaszek, S.; Szota, M. A semi-automated RGB-based method for wildlife crop damage detection using QGIS-integrated UAV workflow. Sensors 2025, 25, 4734. [Google Scholar] [CrossRef] [PubMed]
  107. Ansari, R.A.; Esimaje, T.; Ibrahim, O.M.; Mulrooney, T. Analysis of forest change detection induced by Hurricane Helene using remote sensing data. Forests 2025, 16, 788. [Google Scholar] [CrossRef]
  108. Baloloy, A.B.; Blanco, A.C.; Ana, R.R.C.S.; Nadaoka, K. Development and application of a new mangrove vegetation index (MVI) for rapid and accurate mangrove mapping. ISPRS J. Photogramm. Remote Sens. 2020, 166, 95–117. [Google Scholar] [CrossRef]
  109. Heim, R.H.J.; Wright, I.J.; Allen, A.P.; Geedicke, I.; Oldeland, J. Developing a spectral disease index for myrtle rust (Austropuccinia psidii). Plant Pathol. 2019, 68, 738–745. [Google Scholar] [CrossRef]
  110. Chira, D.; Bolea, V.; Chira, F.; Mantale, C.; Tăut, I.; Șimonca, V.; Diamandis, S. Biological control of Cryphonectria parasitica in Romanian protected sweet chestnut forests. Not. Bot. Horti Agrobot. 2017, 45, 632–638. [Google Scholar] [CrossRef]
  111. Pádua, L.; Marques, P.; Martins, L.; Sousa, A.; Peres, E.; Sousa, J.J. Monitoring of chestnut trees using machine learning techniques applied to UAV-based multispectral data. Remote Sens. 2020, 12, 3032. [Google Scholar] [CrossRef]
  112. Sebastiani, A.; Bertozzi, M.; Vannini, A.; Morales-Rodriguez, C.; Calfapietra, C.; Laurin, G.V. Monitoring ink disease epidemics in chestnut and cork oak forests in central Italy with remote sensing. Remote Sens. Appl. Soc. Environ. 2024, 36, 101329. [Google Scholar] [CrossRef]
  113. Chira, D.; Borlea, F.G.; Chira, F.; Mantale, C.Ș.; Ciocîrlan, M.I.C.; Turcu, D.O.; Cadar, N.; Trotta, V.; Camele, I.; Marcone, C.; et al. Selection of elms tolerant to Dutch elm disease in south-west Romania. Diversity 2022, 14, 980. [Google Scholar] [CrossRef]
  114. Wilson, B.A.; Luther, J.E.; Stuart, T.D.T. Spectral reflectance characteristics of Dutch elm disease. Can. J. Remote Sens. 1998, 24, 200–205. [Google Scholar] [CrossRef]
  115. Wei, X.; Zhang, J.; Conrad, A.O.; Flower, C.E.; Pinchot, C.C.; Hayes-Plazolles, N.; Chen, Z.; Song, Z.; Fei, S.; Jin, J. Machine learning-based spectral and spatial analysis of hyper- and multi-spectral leaf images for Dutch elm disease detection and resistance screening. Artif. Intell. Agric. 2023, 10, 26–34. [Google Scholar] [CrossRef]
  116. Hocknell, J.; Morso, I.; Graziano, J.; Paramore, K. Assessing tree health conditions in New York City’s Central Park with Earth observation data. In Proceedings of the AG24 Annual Meeting, Washington, DC, USA, 9–13 December 2024; NASA: Washington, DC, USA, 2024. Available online: https://ntrs.nasa.gov/api/citations/20240014521/downloads/2024AGU_Poster_GC51N_Hocknell.pdf (accessed on 8 May 2025).
  117. Chan, A.H.; Barnes, C.; Swinfield, T.; Coomes, D.A. Monitoring ash dieback (Hymenoscyphus fraxineus) in British forests using hyperspectral remote sensing. Remote Sens. Ecol. Conserv. 2021, 7, 306–320. [Google Scholar] [CrossRef]
  118. Fernández-Fernández, M.; Naves, P.; Witzell, J.; Musolin, D.L.; Selikhovkin, A.V.; Paraschiv, M.; Chira, D.; Martínez-Álvarez, P.; Martín-García, J.; Muñoz-Adalia, E.J.; et al. Pine pitch canker and insects: Relationships and implications for disease spread in Europe. Forests 2019, 10, 627. [Google Scholar] [CrossRef]
  119. Bravo-Arrepol, M.; Sanfuentes, E.; Amigo, J.; Hasbún, R.; Fuentes, C.; Navarro, A.; Sanhueza, P.; Castillo, R.D.P. Early detection of Fusarium circinatum in Pinus radiata cuttings using VIS–NIR hyperspectral imaging and multivariate analysis. Spectrochim. Acta A Mol. Biomol. Spectrosc. 2025, 345, 126778. [Google Scholar] [CrossRef]
  120. Shi, H.; Chen, L.; Chen, M.; Zhang, D.; Wu, Q.; Zhang, R. Advances in global remote sensing monitoring of discolored pine trees caused by pine wilt disease: Platforms, methods, and future directions. Forests 2024, 15, 2147. [Google Scholar] [CrossRef]
  121. Ciocîrlan, M.I.C.; Curtu, A.L.; Radu, G.R. Predicting leaf phenology in forest tree species using UAVs and satellite images: A case study for European beech (Fagus sylvatica L.). Remote Sens. 2022, 14, 6198. [Google Scholar] [CrossRef]
  122. Al-Juboury, I.A.M.; Aljuboury, A.B.T.; Talib, Z.A.; Atya, A.K.; Ahmed, E.; Ali, S.M.H. Prediction of vegetation dynamics in Fadak palm forests utilizing remote sensing data and artificial intelligence techniques. AIP Conf. Proc. 2025, 3303, 080007. [Google Scholar]
  123. Li, G.; Zhao, D.; Li, J.; Feng, S. Unmanned aerial vehicle hierarchical detection of leaf blast in rice crops based on a specific spectral vegetation index. Front. Agric. Sci. Eng. 2025, 12, 2. [Google Scholar] [CrossRef]
  124. Bak, H.-J.; Kim, E.-J.; Lee, J.-H.; Chang, S.; Kwon, D.; Im, W.-J.; Kim, D.-H.; Lee, I.-H.; Lee, M.-J.; Hwang, W.-H.; et al. Canopy-level rice yield and yield component estimation using NIR-based vegetation indices. Agriculture 2025, 15, 594. [Google Scholar] [CrossRef]
  125. Adeluyi, O.; Harris, A.; Foster, T.; Clay, G.D. Exploiting centimetre resolution of drone-mounted sensors for estimating mid–late season above-ground biomass in rice. Eur. J. Agron. 2022, 132, 126411. [Google Scholar] [CrossRef]
  126. Tamilmounika, R.; Muthumanickam, D.; Pazhanivelan, S.; Ragunath, K.P.; Kumaraperumal, R.; Sivamurugan, A.P. Rice yield prediction through drone-derived vegetation indices: A case study in Tamil Nadu, India. Plant Sci. Today 2024, 11, 853–863. [Google Scholar] [CrossRef]
  127. Coswosk, G.G.; Gonçalves, V.M.L.; de Lima, V.J.; de Souza, G.A.R.; Junior, A.T.D.A.; Pereira, M.G.; de Oliveira, E.C.; Leite, J.T.; Kamphorst, S.H.; de Oliveira, U.A.; et al. Utilizing visible band vegetation indices from unmanned aerial vehicle images for maize phenotyping. Remote Sens. 2024, 16, 3015. [Google Scholar] [CrossRef]
  128. Macedo, F.L.; Nóbrega, H.; de Freitas, J.G.; Ragonezi, C.; Pinto, L.; Rosa, J.; Pinheiro de Carvalho, M.A. Estimation of productivity and above-ground biomass for corn (Zea mays) via vegetation indices in Madeira Island. Agriculture 2023, 13, 1115. [Google Scholar] [CrossRef]
  129. Alves, A.K.; Araújo, M.S.; Chaves, S.F.; Dias, L.A.S.; Corrêdo, L.P.; Pessoa, G.G.; Bezerra, A.R. High throughput phenotyping in soybean breeding using RGB image vegetation indices based on drone. Sci. Rep. 2024, 14, 32055. [Google Scholar] [CrossRef]
  130. Abrougui, K.; Khemis, C.; Guebsi, R.; Ouni, A.; Mohammadi, A.; Amami, R.; Kefauver, S.; Ben Mansour, H.; Chehaibi, S. Efficient management of potato fields: Integrating ground and UAV vegetation indexes for optimal mechanical planting parameters. Euro-Mediterr. J. Environ. Integr. 2025, 10, 2033–2048. [Google Scholar] [CrossRef]
  131. Ahmad, N.; Iqbal, J.; Shaheen, A.; Ghfar, A.; Al-Anazy, M.M.; Ouladsmane, M. Spatio-temporal analysis of chickpea crop in arid environment by comparing high-resolution UAV image and LANDSAT imagery. Int. J. Environ. Sci. Technol. 2022, 19, 6595–6610. [Google Scholar] [CrossRef]
  132. Sanches, G.M.; Duft, D.G.; Kölln, O.T.; Luciano, A.C.D.S.; De Castro, S.G.Q.; Okuno, F.M.; Franco, H.C.J. The potential for RGB images obtained using unmanned aerial vehicles to assess and predict yield in sugarcane fields. Int. J. Remote Sens. 2018, 39, 5402–5414. [Google Scholar] [CrossRef]
  133. Aldrighettoni, J.; D’Urso, M.G. Advances in precision farming: A contribute for estimating crop health and water stress by comparing UAV multispectral and thermal imagery. ISPRS Ann. Photogramm. Remote Sens. Spatial Inf. Sci. 2025, X-G-2025, 33–40. [Google Scholar] [CrossRef]
  134. Qi, H.; Wu, Z.; Zhang, L.; Li, J.; Zhou, J.; Jun, Z.; Zhu, B. Monitoring of peanut leaves chlorophyll content based on drone-based multispectral image feature extraction. Comput. Electron. Agric. 2021, 187, 106292. [Google Scholar] [CrossRef]
  135. Pinargote, C.K.; Pacheco Gil, H.A. Weed Discrimination Based on the Spectral Response of the Corn Crop, Manabí, Ecuador. 2021. Available online: https://redi.cedia.edu.ec/document/498056 (accessed on 16 February 2023).
  136. Stephen, S.; Kumar, V. Detection and analysis of weed impact on sugar beet crop using drone imagery. J. Indian Soc. Remote Sens. 2023, 51, 2577–2597. [Google Scholar] [CrossRef]
  137. Bai, X.; Yang, C.; Fang, L.; Chen, J.; Wang, X.; Gao, N.; Zheng, P.; Wang, G.; Wang, Q.; Ren, S. Identification of salt marsh vegetation in the Yellow River Delta using UAV multispectral imagery and deep learning. Drones 2025, 9, 235. [Google Scholar] [CrossRef]
  138. Keskes, M.I.; Mohamed, A.H.; Borz, S.A.; Niță, M.D. Improving national forest mapping in Romania using machine learning and Sentinel-2 multispectral imagery. Remote Sens. 2025, 17, 715. [Google Scholar] [CrossRef]
  139. Akturk, E. Monitoring forest canopy cover change with ICESat-2 data in fire-prone areas: A case study in Antalya, Türkiye. Ann. For. Res. 2023, 66, 87–98. [Google Scholar] [CrossRef]
  140. Romano, E.; Brambilla, M.; Chianucci, F.; Tattoni, C.; Puletti, N.; Chirici, G.; Travaglini, D.; Giannetti, F. Estimating canopy and stand structure in hybrid poplar plantations from multispectral UAV imagery. Ann. For. Res. 2024, 67, 143–153. [Google Scholar] [CrossRef]
  141. UNECE ICP Forests. Manual on Methods and Criteria for Harmonized Sampling, Assessment, Monitoring and Analysis of the Effects of Air Pollution on Forests; Thünen Institute of Forest Ecosystems: Eberswalde, Germany, 2022; 12 p. + Annexes; Available online: https://www.icp-forests.net/fileadmin/icp_forests/Dateien/Manual_Versions/2020-22/2022_ICP_Forests_Manual_Part_I_-_ICP_Manual_part01_2022_Objectives_version_2022-2.pdf (accessed on 8 May 2025).
  142. Şimonca, V.; Oroian, I.; Chira, D.; Tăut, I. Methods for the Quantification of Decline Phenomenon and Determination of the Vulnerability Degree for the Oak Stands in Northwestern Transylvania. Not. Bot. Horti Agrobot. 2017, 45, 623–631. [Google Scholar] [CrossRef]
  143. Ciceu, A.; Popa, I.; Leca, S.; Pitar, D.; Chivulescu, S.; Badea, O. Climate change effects on tree growth from Romanian forest monitoring Level II plots. Sci. Total Environ. 2020, 698, 134129. [Google Scholar] [CrossRef] [PubMed]
  144. Hernández-Lambraño, R.E.; de la Cruz, D.R.; Sánchez-Agudo, J.Á. Spatial oak decline models to inform conservation planning in the Central-Western Iberian Peninsula. For. Ecol. Manag. 2019, 441, 115–126. [Google Scholar] [CrossRef]
  145. Hornero, A.; Zarco-Tejada, P.J.; Marengo, I.; Faria, N.; Hernández-Clemente, R. Detection of oak decline using radiative transfer modelling and machine learning from multispectral and thermal RPAS imagery. Int. J. Appl. Earth Obs. Geoinf. 2024, 127, 103679. [Google Scholar] [CrossRef]
  146. Mazurek, A.C.; Hill, A.J.; Schumacher, R.S.; McDaniel, H.J. Can Ingredients-Based Forecasting Be Learned? Disentangling a Random Forest’s Severe Weather Predictions. Weather Forecast. 2025, 40, 237–258. [Google Scholar] [CrossRef]
  147. Kalogiannidis, S.; Spinthiropoulos, K.; Kalfas, D.; Chatzitheodoridis, F.; Tziampazi, F. Integration of Remote Sensing and GIS for Urban Sprawl Monitoring in European Cities. Eur. J. Geogr. 2025, 16, 75–90. [Google Scholar]
  148. Posite, V.R.; Ahana, B.S.; Abdelbaki, C.; Saber, M.; Kantoush, S.; Khaldoon, M.; Guadie, A.; Kumar, N. Decoding vegetation dynamics in high-altitude tropical ecosystems: A spatio-temporal assessment using multi-index and biophysical remote sensing products (2002–2024). Earth Syst. Environ. 2025, 1–21. [Google Scholar] [CrossRef]
  149. Wang, Y.; Zhang, N.; Chen, M.; Zhao, Y.; Guo, F.; Huang, J.; Peng, D.; Wang, X. Prediction and Spatiotemporal Dynamics of Vegetation Index Based on Deep Learning and Environmental Factors in the Yangtze River Basin. Forests 2025, 16, 460. [Google Scholar] [CrossRef]
  150. Avdagić, A.; Lojo, A.; Balić, B.; Fazlić, I. Detection of Dry Trees Using NDVI Images Taken by a Drone. In New Technologies, Development and Application VIII: Volume 3; Springer: London, UK, 2025; p. 336. [Google Scholar]
  151. Skydan, O.V.; Fedoniuk, T.P.; Mozharovskii, O.S.; Zhukov, O.V.; Zymaroieva, A.A.; Pazych, V.M.; Hurelia, V.M.; Melnychuk, T. Monitoring tree mortality in Ukrainian Pinus sylvestris L. forests using remote sensing data from earth observing satellites. Ann. For. Res. 2022, 65, 91–101. [Google Scholar] [CrossRef]
  152. Buzatu, A.; Nețoiu, C.; Apostol, B.; Badea, O. The use of remote sensing indices derived from Sentinel 2 satellite images for the defoliation damage assessment of Lymantria dispar. Ann. For. Res. 2023, 66, 123–138. [Google Scholar] [CrossRef]
  153. Wall, W.A.; Busby, R.; Bosche, L. Vegetation predicts soil shear strength in Arctic Soils: Ground-based and remote sensing techniques. Ann. For. Res. 2024, 67, 155–166. [Google Scholar] [CrossRef]
  154. Barker, M.; Burnett, J.D.; Haddad, T.; Hirsch, W.; Kang, D.K.; Pawlak-Kjolhaug, K.; Wing, M. Multi-temporal Pacific madrone leaf blight assessment with unoccupied aircraft systems. Ann. For. Res. 2023, 66, 109–118. [Google Scholar] [CrossRef]
  155. Thapa, N.; Narine, L.L.; Fan, Z.; Yang, S.; Tiwari, K. Detection of invasive plants using NAIP imagery and airborne LiDAR in coastal Alabama and Mississippi, USA. Ann. For. Res. 2023, 66, 63–77. [Google Scholar] [CrossRef]
  156. Wiesel, P.G.; Schroeder, M.H.; Deprá, B.; Salgueiro, B.J.; Barreto, B.M.; de Santana, E.R.R.; Köhler, A.; Lobo, E.A. Integrating remote sensing and UAV imagery for detection of invasive Hovenia dulcis Thumb. (Rhamnaceae) in urban Atlantic Forest remnants. Environ. Monit. Assess. 2024, 197, 55. [Google Scholar] [CrossRef] [PubMed]
  157. Fajar, M.M.; Natalie, D.; Sales, B.P.G.; Sihotang, E.F.A.; Irwansyah, E. Deep Learning Classification Model for Oil Palm Tree Health Assessment. In Proceedings of the 2024 IEEE Asia-Pacific Conference on Geoscience, Electronics and Remote Sensing Technology (AGERS), Manado, Indonesia, 13–14 December 2024; IEEE: New York, NY, USA, 2024; pp. 32–37. [Google Scholar]
  158. Wavrek, M.T.; Carr, E.; Jean-Philippe, S.; McKinney, M.L. Drone remote sensing in urban forest management: A case study. Urban For. Urban Green. 2023, 86, 127978. [Google Scholar] [CrossRef]
  159. Oussaoui, S.; Boudhar, A.; Hadri, A.; Lebrini, Y.; Houmma, I.H.; Karaoui, I.; El Khalki, E.M.; Ouzemou, J.-E.; Kinnard, C. Mapping drought severity impact on arboriculture systems over Tadla and lower Tassaout plains in Morocco using Sentinel-2 data and machine learning approaches. Geocarto Int. 2025, 40, 2471104. [Google Scholar] [CrossRef]
  160. Kaufman, Y.; Tanre, D. Atmospherically resistant vegetation index (ARVI) for EOS-MODIS. IEEE Trans. Geosci. Remote Sens. 1992, 30, 261–270. [Google Scholar] [CrossRef]
  161. Huete, A.; Justice, C.; Liu, H. Development of vegetation and soil indices for MODIS-EOS. Remote Sens. Environ. 1994, 49, 224–234. [Google Scholar] [CrossRef]
  162. Fitzgerald, G.; Rodriguez, D.; O’Leary, G. Measuring and predicting canopy nitrogen nutrition in wheat using a spectral index—The canopy chlorophyll content index (CCCI). Field Crops Res. 2010, 116, 318–324. [Google Scholar] [CrossRef]
  163. Guo, Y.; Senthilnath, J.; Wu, W.; Zhang, X.; Zeng, Z.; Huang, H. Radiometric Calibration for Multispectral Camera of Different Imaging Conditions Mounted on a UAV Platform. Sustainability 2019, 11, 978. [Google Scholar] [CrossRef]
  164. Yin, C.; Lv, X.; Zhang, L.; Ma, L.; Wang, H.; Zhang, L.; Zhang, Z. Hyperspectral UAV Images at Different Altitudes for Monitoring the Leaf Nitrogen Content in Cotton Crops. Remote Sens. 2022, 14, 2576. [Google Scholar] [CrossRef]
  165. Lee, H.; Wang, J.; Leblon, B. Intra-Field Canopy Nitrogen Retrieval from Unmanned Aerial Vehicle Imagery for Wheat and Corn Fields. Can. J. Remote Sens. 2020, 46, 454–472. [Google Scholar] [CrossRef]
  166. Li, G.-S.; Wu, D.-H.; Su, Y.-C.; Kuo, B.-J.; Yang, M.-D.; Lai, M.-H.; Lu, H.-Y.; Yang, C.-Y. Prediction of plant nutrition state of rice under water-saving cultivation and panicle fertilization application decision making. Agronomy 2021, 11, 1626. [Google Scholar] [CrossRef]
  167. Chen, P.; Wang, F. Effect of crop spectra purification on plant nitrogen concentration estimations performed using high-spatial-resolution images obtained with unmanned aerial vehicles. Field Crops Res. 2022, 288, 108708. [Google Scholar] [CrossRef]
  168. Hama, A.; Tanaka, K.; Chen, B.; Kondoh, A. Examination of appropriate observation time and correction of vegetation index for drone-based crop monitoring. J. Agric. Meteorol. 2021, 77, 200–209. [Google Scholar] [CrossRef]
  169. Rouse, J.W.; Haas, R.H.; Schell, J.A.; Deering, D.W. Monitoring Vegetation Systems in the Great Plains with ERTS. In Third ERTS-1 Symposium; NASA SP-351: Washington, DC, USA, 1974; pp. 309–317. [Google Scholar]
  170. Carlson, T.; Ripley, D.A. On the relation between NDVI, fractional vegetation cover, and leaf area index. Remote Sens. Environ. 1997, 62, 241–252. [Google Scholar] [CrossRef]
  171. Suab, S.A.; Syukur, M.S.B.; Avtar, R.; Korom, A. Unmanned aerial vehicle (UAV) derived normalized difference vegetation index (NDVI) and crown projection area (CPA) to detect health conditions of young oil palm trees for precision agriculture. In Proceedings of the 6th International Conference on Geomatics and Geospatial Technology, Kuala Lumpur, Malaysia, 1–3 October 2019. [Google Scholar]
  172. Apostol, E.N.; Stuparu, E.; Scarlatescu, V.; Budeanu, M. Testing Hungarian oak (Quercus frainetto Ten.) provenances in Romania. iForest 2020, 13, 9–15. [Google Scholar] [CrossRef]
  173. Besliu, E.; Budeanu, M.; Apostol, E.N.; Radu, R.G. Microenvironment impact on survival rate, growth and stability traits in a half-sib test of pendula and pyramidalis varieties of Norway spruce. Forests 2022, 13, 1691. [Google Scholar] [CrossRef]
  174. Besliu, E.; Curtu, A.L.; Apostol, E.N.; Budeanu, M. Using adapted and productive European beech (Fagus sylvatica L.) provenances as future solutions for sustainable forest management in Romania. Land 2024, 13, 183. [Google Scholar] [CrossRef]
  175. Budeanu, M.; Şofletea, N.; Petriţan, I.C. Among-population variation in quality traits in two Romanian provenance trials with Picea abies L. Balt. For. 2014, 20, 37–47. [Google Scholar]
  176. Budeanu, M.; Popescu, F.; Beşliu, E.; Apostol, N.E. Diallel crossing (10 × 10) in Swiss stone pine. Juvenile–adult correlations and genetic gain for predicting forward selection. Ann. For. Res. 2024, 67, 109–120. [Google Scholar] [CrossRef]
  177. Budeanu, M.; Besliu, E.; Pepelea, D. Testing the radial increment and climate–growth relationship between Swiss stone pine European provenances in the Romanian Carpathians. Forests 2025, 16, 391. [Google Scholar] [CrossRef]
  178. Murariu, G.; Hahuie, V.; Murariu, A.; Georgescu, L.; Iticescu, C.; Calin, M.; Preda, C.; Buruiana, D.L.; Carp, G.B. Forest monitoring method using combinations of satellite and UAV aerial images. Case study—Bălăbăneşti forest. Int. J. Conserv. Sci. 2017, 8, 703–714. [Google Scholar]
  179. Mekonen, A.A.; Accardo, D.; Renga, A. Above-Ground Biomass Prediction in Agroforestry Areas Using Machine Learning and Multispectral Drone Imagery. In Proceedings of the IEEE 12th International Workshop on Metrology for AeroSpace (MetroAeroSpace), Naples, Italy, 18–20 June 2025; IEEE: New York, NY, USA, 2025; pp. 63–68. [Google Scholar]
Figure 1. Selection process of the eligible reports based on the PRISMA 2020 flow diagram.
Figure 2. Schematic presentation of the workflow used in our research.
Figure 3. Types of publications on drones and agricultural and forest vegetation indices.
Figure 4. Number of publications per year on drones and agricultural and forest vegetation indices.
Figure 5. Distribution of the primary research areas in publications on drones and agricultural and forest vegetation indices.
Figure 6. Countries with authors of articles on drones and agricultural and forest vegetation indices.
Figure 7. Clusters of countries with authors of articles on drones and agricultural and forest vegetation indices.
Figure 8. The leading journals that have published articles on drones and agricultural and forest vegetation indices.
Figure 9. Keywords used by authors in relation to drones and agricultural and forest vegetation indices.
Table 1. Summary table of main methodological characteristics.

Item | Details
Databases consulted | Scopus; Web of Science—SCI-Expanded
Exact search strings | Scopus (TITLE-ABS-KEY): (“drone” OR “drones” OR “UAV” OR “UAS” OR “unmanned aerial vehicle*” OR “unmanned aerial system*”) AND (“vegetation index” OR “vegetation indices” OR “NDVI” OR “EVI” OR “VARI” OR “GNDVI” OR “SAVI”); WoS—SCI-Expanded (TS): TS = (drone OR drones OR UAV OR UAS OR “unmanned aerial vehicle*” OR “unmanned aerial system*”) AND TS = (“vegetation index” OR “vegetation indices” OR NDVI OR EVI OR VARI OR GNDVI OR SAVI)
Date of record export | 12 August 2025 (searches and exports performed on this date)
Time span | All years indexed up to the search date (no lower year limit)
Document types included | Peer-reviewed articles and reviews
Language | English only
Initial records retrieved | 646 (265 Scopus; 381 WoS)
Duplicates removed | 70 (automated DOI/title matching plus manual checks)
Records after de-duplication | 576 (prior to title/abstract screening)
Records screened (title/abstract) | 576 (after initial automated QC and duplicate removal)
Records after full-text screening | 472 (included in the final analysis)
Screening procedure | Two independent reviewers for title/abstract; two independent reviewers for full text; a third reviewer adjudicated disagreements
Exclusion reasons recorded | Structured tags: A—out of scope; B—non-peer-reviewed/editorial; C—no UAV data; D—inaccessible full text; E—non-English; F—insufficient methodological detail
Data quality control and normalization | DOI/title matching, manual metadata correction, author/affiliation normalization, keyword harmonization, fractional counting for network metrics; audit log maintained
Software used | Web of Science Core Collection v5.35 [34]; Scopus exports [35]; Microsoft Excel 2024 [36]; Geochart [37]; VOSviewer 1.6.20 [38]; ad hoc processing in R (bibliometrix, tidyverse)
PRISMA compliance | PRISMA flow diagram and checklist included; structured exclusion reasons and examples provided in Supplementary Table S1
Qualitative analysis | Iterative codebook; two-coder calibration; remainder coded with quality checks; themes described in the main text
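The de-duplication step summarized in Table 1 (70 of 646 records dropped via automated DOI/title matching plus manual checks) can be sketched in a few lines of Python. The record structure and function names below are illustrative assumptions, not the authors' actual pipeline: duplicates are detected first by case-insensitive DOI and then by a punctuation-stripped title key.

```python
import re

def normalize_title(title: str) -> str:
    """Lowercase and strip punctuation/whitespace so minor formatting
    differences between Scopus and WoS exports do not block a match."""
    return re.sub(r"[^a-z0-9]+", "", title.lower())

def deduplicate(records):
    """Keep the first occurrence of each record; later records with a
    previously seen DOI (preferred key) or normalized title are dropped.
    Each record is a dict with an optional 'doi' and a 'title'."""
    seen_dois, seen_titles, unique = set(), set(), []
    for rec in records:
        doi = (rec.get("doi") or "").strip().lower()
        title_key = normalize_title(rec["title"])
        if (doi and doi in seen_dois) or title_key in seen_titles:
            continue  # duplicate of an earlier record
        if doi:
            seen_dois.add(doi)
        seen_titles.add(title_key)
        unique.append(rec)
    return unique
```

Records lacking a DOI still fall back to title matching, which mirrors the "automated DOI/title + manual checks" approach described in Table 1; borderline title matches would still need the manual review the authors report.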
Table 2. The leading journals that have published articles on drones and agricultural and forest vegetation indices.

No. | Journal | Documents | Citations | Total Link Strength
1 | Remote Sensing | 47 | 1224 | 58
2 | Drones | 17 | 432 | 20
3 | Agriculture | 15 | 386 | 19
4 | Agronomy | 14 | 105 | 10
5 | Remote Sensing of Environment | 9 | 509 | 23
6 | Computers and Electronics in Agriculture | 9 | 429 | 11
7 | Sensors | 9 | 212 | 4
8 | Korean Journal of Remote Sensing | 8 | 21 | 2
9 | AgriEngineering | 7 | 21 | 9
10 | Frontiers in Plant Science | 5 | 39 | 6
11 | Applied Sciences | 5 | 59 | 5
12 | Forests | 5 | 7 | 5
13 | Agricultural and Forest Meteorology | 4 | 19 | 6
14 | International Journal of Remote Sensing | 3 | 138 | 12
Table 3. The most frequently occurring keywords in studies on drones and agricultural and forest vegetation indices.

No. | Keyword | Occurrences | Total Link Strength
1 | UAV | 101 | 413
2 | Remote sensing | 81 | 302
3 | Machine learning | 58 | 279
4 | Precision agriculture | 53 | 205
5 | Random forest | 42 | 189
6 | Classification | 45 | 184
7 | Biomass | 34 | 180
8 | Yield | 32 | 159
9 | Reflectance | 35 | 158
10 | Imagery | 33 | 152
11 | NDVI | 38 | 139
Table 4. Overview of UAV-derived vegetation indices applied in agriculture and forestry.

No. | Sensor Type | Vegetation Index | Definition/Observation | Application | References
1 | RGB | ExG (Excess Green) | ExG = 2∙G − (R + B); higher values indicate more vegetation | Crop vigor assessment | Ali et al., 2024 [39]; Woebbecke et al., 1995 [40]
2 | RGB | ExG−ExR (Excess Green − Excess Red) | ExG−ExR = ExG − ExR; ExR = (1.4∙R − G)/(R + G + B) | Chinese cabbage identification | Du et al., 2024 [41]; Meyer et al., 2008 [42]
3 | RGB | H (Hue Index) | H = arctan((2∙R − G − B)/(√3∙(G − B))) | Ice-ice disease in seaweeds | Alevizos et al., 2024 [43]; Escadafal et al., 1994 [44]
4 | Multispectral | MGRVI (Modified Green-Red VI) | MGRVI = (G² − R²)/(G² + R²) | Maize vegetation assessment | Atanasov et al., 2024a [45]; Bendig et al., 2015 [46]
5 | RGB | NGRDI (Normalized Green-Red Difference Index) | NGRDI = (G − R)/(G + R) | Predicting maize senescence | Adak et al., 2021 [47]; Ahamed et al., 2011 [48]
6 | Multispectral (satellite) | CCI (Chlorophyll:Carotenoid Index) | CCI = (Band11 − Band1)/(Band11 + Band1), or using NIR/RedEdge bands | Phenology in conifers | D’Odorico et al., 2020 [49]; Barnes et al., 2000 [50]
7 | Multispectral VIS-IR | CIgreen (Green Chlorophyll Index) | CIgreen = NIR/G − 1; estimates chlorophyll content | Potato nitrogen prediction | Chatraei Azizabadi et al., 2025 [51]; Gitelson et al., 2003 [52]
8 | Multispectral VIS-IR | CIRE (Red-edge Chlorophyll Index) | CIRE = NIR/RedEdge − 1 | Strawberry dry biomass prediction | Zheng et al., 2022 [53]; Gitelson et al., 2003 [52]
9 | Multispectral VIS-IR | GNDVI (Green NDVI) | GNDVI = (NIR − G)/(NIR + G) | Ink disease in chestnut orchards | Arcidiaco et al., 2025 [54]; Buschmann and Nagel, 1993 [55]
10 | Multispectral VIS-IR | NDRE (Normalized Difference Red-edge) | NDRE = (NIR − RedEdge)/(NIR + RedEdge); measures chlorophyll | Bark beetle detection in spruce | Bozzini et al., 2024 [56]; Maccioni et al., 2001 [57]
11 | Multispectral VIS-IR | NDVI (Normalized Difference VI) | NDVI = (NIR − R)/(NIR + R); ranges from −1 to 1 | Water stress diagnosis in wheat | Ali et al., 2024 [39]; Buschmann, 1993 [58]
12 | Multispectral VIS-IR | RVI (Simple Ratio) | RVI = NIR/R; alternative: R800/R670 | Cotton performance evaluation | Cruz–Grimaldo et al., 2025 [59]; Baret and Guyot, 1991 [60]
13 | Multispectral VIS-IR | OSAVI (Optimized Soil-Adjusted VI) | OSAVI = (NIR − R)/(NIR + R + L); L = 0.16 | Vegetation in desert oases | Guo et al., 2024 [61]; Herrmann et al., 2010 [62]
14 | Multispectral VIS-IR | ReNDVI (Red-edge NDVI) | ReNDVI = (NIR − RedEdge)/(NIR + RedEdge) | Ink disease in chestnut orchards | Arcidiaco et al., 2025 [54]; Ahamed et al., 2011 [48]
15 | Multispectral VIS-IR | SAVI (Soil-Adjusted VI) | SAVI = (1 + L)∙(NIR − R)/(NIR + R + L) | Wind-induced damage in spruce | Bāders et al., 2025 [63]; Richardson and Everitt, 1992 [64]
16 | Hyperspectral | MSR (Modified Simple Ratio) | MSR = (R_NIR/R_Red − 1)/√(R_NIR/R_Red + 1) | Strawberry dry biomass prediction | Zheng et al., 2022 [53]; Sims and Gamon, 2002 [65]
17 | Hyperspectral | MSR-RedEdge | MSR-RedEdge = (R_NIR/R_RE − 1)/√(R_NIR/R_RE + 1) | Strawberry dry biomass prediction | Zheng et al., 2022 [53]; Chen and Cihlar, 1996 [66]
18 | Hyperspectral | SIPI (Structure Insensitive Pigment Index) | SIPI = (R800 − R445)/(R800 − R680) | Wind-induced damage in spruce | Bāders et al., 2025 [63]; Blackburn, 1998 [67]
19 | Thermal | CWSI (Crop Water Stress Index) | CWSI = ((Tc − Ta) − (Tc − Ta)min)/((Tc − Ta)max − (Tc − Ta)min); measures canopy water stress | Soybean irrigation | Nielsen, 1990 [68]; Lee et al., 2010 [69]
20 | Multispectral VIS-IR | ARVI (Atmospherically Resistant VI) | ARVI = (NIR − R + γ(B − R))/(NIR + R − γ(B − R)) | Cotton and rice yield | Pazhanivelan et al., 2023 [70]; Bannari et al., 1995 [71]
21 | Multispectral VIS-IR | BNDVI (Blue NDVI) | BNDVI = (NIR − B)/(NIR + B) | Forest health monitoring | Ecke et al., 2024 [72]; Yang et al., 2004 [73]
22 | Hyperspectral | BGVI (Blue-Green VI) | Formula not provided; measures street-side greenery | Biomass estimation | Matyukira et al., 2024 [74]
23 | Multispectral VIS-IR | CVI (Chlorophyll VI) | CVI = NIR∙R/G² | Maize vegetation assessment | Atanasov et al., 2024b [75]; Datt et al., 2003 [76]
24 | Multispectral VIS-IR | DVI (Difference VI) | DVI = NIR − R | Wind damage assessment in spruce | Bāders et al., 2025 [63]; Nagler et al., 2005 [77]
25 | Multispectral VIS-IR | EVI (Enhanced VI) | EVI = G∙(NIR − R)/(NIR + C1∙R − C2∙B + L) | Rice yield prediction | Goigochea–Pinchi et al., 2024 [78]; Barnes et al., 2000 [50]
26 | Multispectral VIS-IR | EVI2 (Enhanced VI 2) | EVI2 = G∙(NIR − R)/(L + NIR + C∙R) | Winter wheat growth patterns | Atanasov et al., 2024a [45]; Huete et al., 2008 [79]
27 | RGB | ExR (Excess Red) | ExR = (1.4∙R − G)/(R + G + B) | Predicting maize senescence | Adak et al., 2025 [80]; Neto, 2004 [81]
28 | Multispectral VIS-IR | FVC (Fractional Vegetation Cover) | Proportion of area covered by vegetation; linked to NDVI and LAI | Wheat density estimation | Du et al., 2023 [82]
29 | Multispectral VIS-IR | GCI (Green Chlorophyll Index) | GCI = NIR/G | Hazelnut monitoring | Morisio et al., 2025 [83]; Gitelson et al., 2002 [84]
30 | Hyperspectral | GI (Greenness Index) | GI = R554/R677 | Hazelnut monitoring | Morisio et al., 2025 [83]
31 | RGB | GLI (Green Leaf Index) | GLI = (2∙G − R − B)/(2∙G + R + B) | Ice-ice disease in seaweeds | Alevizos et al., 2024 [43]; Gobron et al., 2000 [85]
32 | RGB | GRVI (Green-Red VI) | GRVI = R560/R658 | Ice-ice disease in seaweeds | Alevizos et al., 2024 [43]; Gitelson et al., 2002 [84]
33 | Hyperspectral | GSI (Green Shoulder Index) | Uses the 490–550 nm range to detect tree vitality | Spruce bark beetle detection | Huo et al., 2024 [86]
34 | Hyperspectral | LCI (Leaf Chlorophyll Index) | LCI = (R850 − R710)/(R850 + R680) | Rice yield prediction | Goigochea–Pinchi et al., 2024 [78]; Datt, 1999 [87]
35 | Hyperspectral | MCARI (Modified Chlorophyll Absorption Ratio Index) | MCARI = ((R700 − R670) − 0.2∙(R700 − R550))∙(R700/R670) | Rice yield prediction | Goigochea–Pinchi et al., 2024 [78]; Eitel et al., 2007 [88]
36 | RGB | MPRI (Modified Photochemical Reflectance Index) | MPRI = (G − R)/(G + R) | Maize vegetation assessment | Atanasov et al., 2024a [45]; Pereira et al., 2022 [89]
37 | Hyperspectral | MTCI (MERIS Terrestrial Chlorophyll Index) | MTCI = (R850 − R730)/(R730 − R675) | Winter oilseed rape LAI monitoring | Liu et al., 2024 [90]; Dash and Curran, 2004 [91]
38 | Hyperspectral | MTVI (Modified Triangular VI) | MTVI = 1.5∙(1.2∙(R800 − R550) − 2.5∙(R670 − R550))/√((2∙R800 + 1)² − (6∙R800 − 5∙√R670) − 0.5) | Strawberry dry biomass prediction | Zheng et al., 2022 [53]; Eitel et al., 2007 [88]
39 | Hyperspectral and SWIR | NBRDI (Normalized Burn Ratio Index) | NBRDI = (NIR − SWIR)/(NIR + SWIR) | Fire rate assessment | Carbonell–Rivera et al., 2024 [92]
40 | Multispectral VIS-IR | NDDI (Normalized Difference Drought Index) | NDDI = (NDVI − NDWI)/(NDVI + NDWI) | Winter oilseed rape LAI monitoring | Liu et al., 2024 [90]
41 | Multispectral VIS-IR | NDWI (Normalized Difference Water Index) | NDWI = (G − NIR)/(G + NIR) or (NIR − SWIR)/(NIR + SWIR) | Water stress diagnosis | Ali et al., 2024 [39]; Gao, 1996 [93]
42 | Multispectral VIS-IR | NDRI (Natural Disaster Risk Index) | Measures disaster impact (deaths, frequency) | Water stress risk assessment | Ali et al., 2024 [39]; Malthus et al., 1993 [94]
43 | Multispectral VIS-IR | OPIVI (Observation Perspective Insensitivity VI) | NDVI-based; reduces viewing-angle sensitivity | Winter oilseed rape LAI | Liu et al., 2024 [90]
44 | Multispectral VIS-IR | RECI (Red-edge Chlorophyll Index) | RECI = NIR/RedEdge − 1 | Hazelnut monitoring | Barnes et al., 2000 [50]
45 | Multispectral VIS-IR | REGNDVI (Green-Red NDVI) | REGNDVI = (G − R)/(G + R) | Peach tree health prediction | Cunha et al., 2021 [95]; Gitelson et al., 1996 [96]
46 | Multispectral VIS-IR | RBVI (Red-Blue VI) | NIR-RGB-based VI for chlorophyll | Biomass estimation | Matyukira et al., 2024 [74]; Moran et al., 1997 [97]
47 | RGB | RGRI (Red-Green Ratio Index) | RGRI = R/G; indicates anthocyanin vs. chlorophyll | Nitrogen stress detection | Chandel et al., 2025 [98]
48 | RGB | RBNDVI | RBNDVI = (R − B)/(R + B) or (NIR − (R + B))/(NIR + (R + B)) | Ice-ice disease | Alevizos et al., 2024 [43]; Wang et al., 2007 [99]
49 | Multispectral VIS-IR | RFC (Random Forest Classifier) | Machine-learning-based selection of VIs | Wheat stem rust disease evaluation | Abdulridha et al., 2023 [10]
50 | Spectral and mechanical | RI-dB (Redness Index—decibels) | Measures redness using reflectance and dB | Winter oilseed rape LAI | Liu et al., 2024 [90]
51 | Hyperspectral | SIPI2 (Structure Intensive Pigment Index) | SIPI2 = (R800 − R505)/(R800 − R690) | Citrus water stress monitoring | Peres and Cancelliere, 2021 [100]; Blackburn, 1998 [67]
52 | Multispectral VIS-IR | SR (Simple Ratio) | SR = NIR/R; also called RVI | Rice carbon stock | de Lima et al., 2022 [101]; Bannari et al., 1995 [71]
53 | Multispectral VIS-IR | SR-RedEdge | SR-RedEdge = NIR/RedEdge | Strawberry dry biomass | Zheng et al., 2022 [53]
54 | Hyperspectral | TCARI (Transformed Chlorophyll Absorption Ratio) | TCARI = 3∙((R700 − R670) − 0.2∙(R700 − R550)∙(R700/R670)); often used as TCARI/OSAVI | Hazelnut monitoring | Haboudane et al., 2002 [102]
55 | RGB | VARI (Visible Atmospherically Resistant Index) | VARI = (G − R)/(G + R − B) | Nitrogen stress detection | Chandel et al., 2025 [98]; Gitelson et al., 2002 [84]
56 | Hyperspectral | WDRVI (Wide Dynamic Range VI) | WDRVI = (a∙NIR − R)/(a∙NIR + R); a = 0.1–0.2 | Cotton and rice biophysical parameter quantification | Pazhanivelan et al., 2023 [70]; Gitelson, 2004 [103]
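To make the band arithmetic in Table 4 concrete, the short Python sketch below evaluates a few of the per-pixel indices on reflectance values scaled to [0, 1]. The function names are ours, not from any of the cited studies; in practice the same arithmetic is applied band-wise to whole orthomosaics (e.g., with NumPy arrays in place of scalars).

```python
def ndvi(nir: float, red: float) -> float:
    """NDVI = (NIR - R) / (NIR + R); ranges from -1 to 1."""
    return (nir - red) / (nir + red)

def ndre(nir: float, red_edge: float) -> float:
    """NDRE swaps the red band for the red edge, tracking chlorophyll."""
    return (nir - red_edge) / (nir + red_edge)

def osavi(nir: float, red: float, soil_l: float = 0.16) -> float:
    """OSAVI dampens soil background with the constant L = 0.16."""
    return (nir - red) / (nir + red + soil_l)

def exg(red: float, green: float, blue: float) -> float:
    """Excess Green for RGB-only sensors: ExG = 2G - (R + B)."""
    return 2 * green - (red + blue)

# Healthy canopy reflects strongly in the NIR and absorbs red light,
# so its NDVI is high; bare soil gives a much lower value.
canopy = ndvi(nir=0.55, red=0.06)  # about 0.80
soil = ndvi(nir=0.30, red=0.20)    # 0.20
```

The soil-adjusted variant illustrates why OSAVI sits below NDVI over sparse canopies: the extra L term in the denominator shrinks the ratio most where reflectances are low, which is exactly where soil background dominates the signal.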

Share and Cite

MDPI and ACS Style

Peticilă, A.; Iliescu, P.G.; Dinca, L.; Popa, A.-S.; Murariu, G. Vegetation Indices from UAV Imagery: Emerging Tools for Precision Agriculture and Forest Management. AgriEngineering 2025, 7, 431. https://doi.org/10.3390/agriengineering7120431

