Review

Use of Unmanned Aerial Vehicles for Monitoring Pastures and Forages in Agricultural Sciences: A Systematic Review

by Wagner Martins dos Santos 1, Lady Daiane Costa de Sousa Martins 1, Alan Cezar Bezerra 2, Luciana Sandra Bastos de Souza 2, Alexandre Maniçoba da Rosa Ferraz Jardim 3, Marcos Vinícius da Silva 4,*, Carlos André Alves de Souza 2 and Thieres George Freire da Silva 1,2

1 Department of Agricultural Engineering, Federal Rural University of Pernambuco, Recife 52171-900, Pernambuco, Brazil
2 Academic Unit of Serra Talhada, Federal Rural University of Pernambuco, Serra Talhada 56909-535, Pernambuco, Brazil
3 Department of Biodiversity, Institute of Biosciences, São Paulo State University—UNESP, Rio Claro 13506-900, São Paulo, Brazil
4 Postgraduate in Forestry Sciences, Federal University of Campina Grande, Patos 58708-110, Paraíba, Brazil
* Author to whom correspondence should be addressed.
Drones 2024, 8(10), 585; https://doi.org/10.3390/drones8100585
Submission received: 26 August 2024 / Revised: 10 October 2024 / Accepted: 15 October 2024 / Published: 17 October 2024
(This article belongs to the Special Issue Recent Advances in Crop Protection Using UAV and UGV)

Abstract

With the growing demand for efficient solutions to face the challenges posed by population growth and climate change, unmanned aerial vehicles (UAVs) emerge as a promising tool for monitoring biophysical and physiological parameters in forage crops due to their ability to collect high-frequency and high-resolution data. This review addresses the main applications of UAVs in monitoring forage crop characteristics and evaluates advanced data processing techniques, including machine learning, to optimize the efficiency and sustainability of agricultural production systems. The Scopus and Web of Science databases were used to identify applications of UAVs in forage assessment. Based on inclusion and exclusion criteria, the search returned 590 articles, of which 463 remained after duplicate removal and 238 were selected after screening. An analysis of the data revealed an annual growth rate of 35.50% in the production of articles, evidencing the growing interest in the theme. The selected articles involved 1086 authors, were published in 93 journals, and accumulated 4740 citations. Finally, our results contribute to the scientific community by consolidating information on the use of UAVs in precision farming, offering a solid basis for future research and practical applications.

1. Introduction

Faced with a scenario of continuous population growth, climate change, and the need for more environmentally conscious activities, the importance of precision farming becomes increasingly evident [1]. These sectors face the challenge of increasing productivity without expanding cultivated areas, while also needing to mitigate environmental impacts [2,3]. The integration of advanced technologies, such as sensors, unmanned aerial vehicles (UAVs), georeferencing systems, and artificial intelligence, enables real-time data collection, optimizing the use of inputs like water, fertilizers, and agricultural pesticides, improving the monitoring of crops and animals, and promoting more efficient management of natural resources [4,5].
Forage management, a fundamental component of sustainable animal production, involves challenges that directly affect the nutritional quality and the quantity of biomass available for animal consumption [6,7]. Climatic variability, water requirements, soil characteristics, and nutritional needs influence forage productivity and quality [8,9]. Inadequate forage production and nutritional deficiencies can lead to problems with animal feed supply, negatively impacting animal performance and health, thereby reducing meat and milk production and consequently causing economic losses [6,10]. Additionally, biotic factors, such as pests and diseases, combined with soil degradation due to improper practices, further compromise production [11,12]. Rigorous monitoring of nutritional quality, soil moisture, and forage availability is crucial for optimizing management and ensuring the sustainability of production systems [13,14,15].
To ensure proper management and monitoring of production systems, especially when implementing more complex strategies, it is essential to carry out accurate monitoring of crop characteristics. Generally, the collection and analysis of the biophysical and physiological characteristics of crops are performed in a site-specific and/or destructive manner, which can become even more exhausting or inaccurate according to the size, complexity, and sensitivity of the system [16,17,18,19]. Thus, there is a trend in the literature of studies focused on remote sensing, especially for the application of UAV systems, due to the better temporal and spatial resolution of these instruments, which make them more suitable for capturing the high variability that occurs in these systems [18,20,21].
There are several applications using UAVs, such as methods aimed at monitoring biomass [22], leaf area index (LAI) [23], nutrients [24], chlorophyll [25], and nutritional quality of forages [26]. Different sensors are applicable, such as red, green, and blue (RGB), multispectral and hyperspectral cameras, light detection and ranging (LiDAR), and thermal sensors, which allow for the capture of detailed and accurate data about crop characteristics [22,27,28,29]. In addition, image processing and data analysis techniques have been applied to improve the accuracy and efficiency of monitoring. For instance, [21] used structure from motion (SfM (version 0.9.8.15)), a 3D reconstruction algorithm, combined with the high resolution of RGB sensors to monitor pasture biomass as an alternative to LiDAR sensors, which are more suitable for depth data but end up being more expensive, and [16] found that the combination of multispectral vegetation indices (VIs) and RGB with texture indices could increase the accuracy of the LAI estimate.
Faced with the growing demand for efficient solutions in precision farming, UAVs emerge as crucial tools due to their capacity to generate high-frequency and high-resolution data. This review aims to describe in detail the main applications of UAVs for monitoring biophysical and physiological parameters in forage crops. In addition, it seeks to evaluate the most advanced data processing techniques, highlighting the role of data analysis and machine learning in optimizing the efficiency and sustainability of agricultural production systems. Furthermore, this paper is structured as follows: Section 2 describes the methodological steps and data collection; Section 3 provides our results and discusses the findings; and Section 4 shows the conclusions and contributions of the research, implications, and future directions.

2. Methodology

In this review, the Scopus and Web of Science databases were used as the main sources of scientific articles. We used these two databases because they are widely considered the most reliable and provide comprehensive coverage of key scientific research from around the world. The search was conducted using the following combination of terms: TITLE-ABS-KEY: “unmanned aerial vehicles” OR “unmanned aerial system” OR “remoted piloted aerial” AND “pasture” OR “grassland” OR “forage”. The following filters were applied: document type: article; publication stage: final; and language: English. No year restriction was applied during the search. This strategy aimed to identify relevant studies using UAVs on pasture and forage. The search yielded 590 results, including 392 from Scopus and 198 from Web of Science. After removing duplicate articles, the total was reduced to 463, followed by a secondary screening. The inclusion criteria considered studies investigating the use of UAVs in forage species, both in pasture systems and under other cultivation conditions. The exclusion criteria covered articles that did not directly address the use of UAVs or did not focus on forage species, literature reviews, and articles whose methodology was inadequately described or whose data were incomplete. The procedures for selecting and filtering the articles follow the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) principles, particularly regarding the systematic search and the inclusion and exclusion criteria for studies [30]. In this way, a further 225 articles were screened out, leaving 238 articles for in-depth analysis. This process was undertaken to identify the key points discussed in the review. When necessary, studies outside these databases were consulted to supplement certain sections, particularly when additional information was required on specific topics.

3. Findings and Discussion

In this section, we present our findings based on the most relevant research found.

3.1. Scientometric Analysis

The search resulted in a total of 238 articles from 93 journals, written by 1086 authors and drawing on more than 12,250 references, with the first publications appearing in 2014. International co-authorship represented about 13% of the collaborations, reflecting a trend toward the internationalization of research. Moreover, the data demonstrate a significant increase in production, with an annual growth rate of 21.64%. The articles accumulate 4740 citations, averaging about 19.92 citations per article (Table 1).
There has been an increase in both the number of articles published and the number of citations received. The years 2021 and 2022 stand out with the highest number of articles published (43 and 52, respectively), and 2019 and 2021 stand out with the highest number of citations received (941 and 890, respectively). It is important to note that more recent years, such as 2023 and 2024, show a decrease in the number of citations, which is expected, as these articles need more time to influence the academic community (Figure 1).
A survey was conducted on the 100 most cited articles, highlighting the diversity of species studied regarding the use of UAVs for monitoring biophysical and physiological parameters in pasture and forage systems. The species were ranked by the number of occurrences in the articles (Figure 2). Lolium perenne L. was the most frequently addressed species, mentioned in sixteen articles, followed by Lolium multiflorum Lam. with eight articles and Medicago sativa L. with seven articles. Additionally, species such as Trifolium pratense L. and Trifolium repens L., legumes of high nutritional value, appear in six articles. Other species, such as Phleum pratense L. and Festuca arundinacea Schreb., appear less frequently (five and three articles, respectively). In total, more than 41 plant species contributing to the composition of pastures and forage systems were identified. However, the species with lower representation (only one article each) indicate the need for further studies involving emerging technologies for less studied species.

3.2. Image Processing

Image processing is a broad term that encompasses orthomosaic generation, corrections (e.g., radiometric, geometric, noise removal), information extraction (e.g., vegetation indices, texture, color spaces), and modeling (regression, classification, or identification), with steps varying according to the application. This stage is very important because it defines the final quality of the images obtained by the UAVs, which is decisive in the decision process [31,32,33,34]. Accordingly, this topic considers only the steps prior to modeling and information extraction, with a dedicated section for information extraction and the main machine learning algorithms.

3.2.1. Radiometric Correction

The radiometric correction process aims to normalize variations in the pixels recorded by the sensor due to factors such as the influence of lighting, topography, and atmospheric characteristics, typically by converting raw values (digital numbers—DNs) into reflectance values (dimensionless) [35]. Reflectance is the fraction of light energy reflected by a surface relative to the incident energy. This ratio between radiance (output) and irradiance (input) enables the comparison of different images collected by UAV-mounted sensors over time [31]. Although initially disregarded due to the lower altitudes at which UAV flights are conducted, the importance of radiometric correction in UAV usage has been widely demonstrated, and procedures aimed at addressing this issue have been the focus of recent studies [31,32,33,35,36,37]. Radiometric correction methods can be broadly classified into the following two types: absolute radiometric correction, which uses auxiliary data based on atmospheric conditions (e.g., relative humidity, air temperature, atmospheric pressure) through complex calculations, making this task more complicated and less commonly adopted, and relative or empirical methods, which correlate field-measured data with image data collected during the flight [31,38,39].
Calibration targets are among the primary solutions employed, establishing a relationship between the reflectance of ground targets and the digital numbers captured by sensors [38]. The targets and methods applied may vary, but they generally involve reflectance calibration panels, reflectance plates, and pseudo-invariant features (PIFs) [22,37,40]. Panels and plates are conceptually the same; we will use the term “panels” to refer to the targets provided by manufacturers for point data collection.
Calibration panels consist of a surface with known reflectance values within a specific range of the electromagnetic spectrum, allowing them to be used as reference standards for calibration during processing in software such as Pix4D (version 4.5.6) [35]. This method offers key advantages, including simplicity and efficiency, as it only requires capturing images of a target with known reflectance before and after the flight, optimizing field time [36,37]. Additionally, integrating this method with automated software significantly reduces manual effort in processing, making it ideal for high-frequency workflows [35,39]. However, the method has limitations, primarily because it assumes consistent lighting throughout the flight; under more variable conditions, it may not function adequately [36,41]. Additionally, the effectiveness of the calibration depends directly on the quality and maintenance of the reflectance target, and its deterioration can compromise the accuracy of the results [37]. As a complement, some UAVs are equipped with a downwelling light sensor (DLS), which measures irradiance as images are captured, allowing for the dynamic correction of lighting variations. The DLS can be used either in conjunction with or independently of the reference panel, aiming to address the shortcomings of the previous method [41,42]. However, this method works best when the camera sensors and the DLS share the same alignment, ensuring better correspondence of the collected data. In multirotor UAVs, this condition may be compromised by vibrations and tilt during flight, requiring the angles between the sensors to be considered in the calculations [35,41].
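To make the empirical approach concrete, the sketch below fits an empirical line between panel digital numbers and their known reflectances and applies it to a single band. All values (panel DNs, reflectances, image size) are hypothetical placeholders rather than figures from the cited studies; real workflows typically let photogrammetric software perform this step.

```python
import numpy as np

# Hypothetical inputs: mean DN sampled over each calibration panel in one band,
# paired with the manufacturer-supplied reflectance of each panel for that band.
panel_dn = np.array([8200.0, 21500.0, 43800.0])    # dark, grey, white panels
panel_refl = np.array([0.05, 0.23, 0.51])          # known reflectances (dimensionless)

# Empirical line: reflectance = gain * DN + offset, fitted by least squares.
gain, offset = np.polyfit(panel_dn, panel_refl, deg=1)

def dn_to_reflectance(dn):
    """Convert raw digital numbers of this band into surface reflectance."""
    return np.clip(gain * dn + offset, 0.0, 1.0)

# Apply to a raw band (a random array standing in for a real image).
raw_band = np.random.randint(5000, 50000, size=(200, 200)).astype(float)
reflectance_band = dn_to_reflectance(raw_band)
```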
The use of reflectance plates can serve as a multipoint alternative compared to a single calibration panel. These plates have the advantage of being captured simultaneously with the images by the sensors and at the same altitude. Additionally, they can be manually constructed or purchased in various materials, colors, and sizes [42]. The plates must exhibit behavior as close as possible to that of a Lambertian object, similar to calibration panels, so that the reflected light is isotropic, allowing for consistent and reproducible measurements over time [31,39]. For example, [41] constructed 60 cm × 60 cm wooden calibration panels, each individually painted with matte black, light gray, dark gray, and white paints, while [42] described and evaluated different calibration target materials with Lambertian or near-Lambertian characteristics that are durable, commercially available, cost-effective, and easy to transport. Despite the advantages, one of the challenges is determining the number of plates to use and their distribution across the area. Studies have used anywhere from 8 to over 100 plates [43]. Daniels et al. [37] found that using multiple plates to apply corrections to individual images enhances the accuracy of the corrections. Alternatively, Wang et al. [41] optimized the radiometric block adjustment (RBA) method developed by [44], which consists of adjusting linear regression coefficients between pairs of images based on tie points (the same point detected in two images). In this way, even with a small set of panels, the correction can be extended between image pairs using tie points as a reference. The authors also implemented a solution called RBA-Plant, which only considers tie points for vegetation, further improving calibration. Additionally, the reflectance determination of the plate surfaces must be performed regularly. Since the plates are collected at the same altitude as the images, their dimensions need to be larger than those of the panels and proportionally adjusted to the sampling distance relative to the ground to ensure data accuracy and avoid pixel contamination by adjacent objects [41,42]. However, this requirement increases the likelihood of damage or contamination to the plates, which compromises the quality of the results. Therefore, the use of equipment such as spectroradiometers becomes necessary, and they must cover at least the same wavelengths as the UAV sensors [42].
PIFs are areas or features in the image that theoretically remain constant over time, such as water bodies, roads, and urban structures, which facilitates image calibration and correction [45,46]. This method is similar to the one using reflectance plates, the primary difference being the targets used in the process. It is a simple method, as it draws on resources directly from the image, making it particularly useful when images have already been collected and require calibration. However, the points must be selected carefully, as they are subject to variations caused by seasonal or climate-related changes. Monitoring the spectral response of these targets is recommended to reduce the risk of error accumulation in large datasets [46,47,48].
Thermal image collection also faces challenges, such as biases caused by sudden temperature changes, vignetting effects, and interactions among the sensor, the environment, and surfaces, particularly ground temperature [49,50,51]. Thermal sensors can be either radiometric or non-radiometric. Radiometric sensors convert the detected signal into temperature and record it in the image, while non-radiometric sensors convert the thermal radiation signal into an RGB visual representation [49].
In radiometric sensors, one possibility is to use an externally heated shutter installed on the UAV-mounted thermal sensor to provide a uniform calibration target, as evaluated by [52], which demonstrated an improvement in the quality of thermal images. Another approach is to perform measurements in a controlled environment using a high-precision thermometer, collecting data over a wide temperature range and applying regression techniques or machine learning to calibrate the sensors [53]. The authors of [53] found that thermal calibration using XGBoost (version 2.1.1) significantly reduced the root mean square error (RMSE) and increased the coefficient of determination (R2) for Micasense Altum (Micasense Inc., Seattle, WA, USA) and FLIR Duo Pro-R (FLIR Systems Inc., Wilsonville, OR, USA) sensors. For non-radiometric sensors, calibration can be based on panels that serve as temperature samples of materials with constant reflectance, acting as thermal control points (TCPs). These TCPs should have different colors and materials, ensuring characteristics similar to the object of interest [54]. Extracting the DN of the thermal control points and associating it with the temperature measurements collected during the flight enables the application of the radiometric calibration equation [55]. The calibration curve can be generated linearly or by using other regression methods, such as Gaussian regression [49,56,57]. However, radiometric calibration does not resolve all sources of uncertainty in the conversion of thermal infrared images into temperature values, owing to systematic biases caused by thermal drift, vignetting effects, and interactions between the sensor and environmental and operational field conditions, which can result in errors of up to 8 °C [49], necessitating appropriate techniques to address each of these factors.
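Following the TCP-based procedure described above, a minimal sketch of a linear calibration curve might look like the following; the DN and temperature pairs are invented for illustration, and Gaussian or machine learning regressions can replace the linear fit where the response is nonlinear.

```python
import numpy as np

# Hypothetical thermal control points (TCPs): mean DN of each target in the
# thermal image paired with its surface temperature measured during the flight.
tcp_dn = np.array([30110.0, 30560.0, 31020.0, 31480.0])
tcp_temp_c = np.array([18.4, 24.1, 29.8, 35.6])     # deg C

# Linear calibration curve: temperature = slope * DN + intercept.
slope, intercept = np.polyfit(tcp_dn, tcp_temp_c, deg=1)

def dn_to_celsius(dn):
    """Convert thermal DN values into temperature using the fitted curve."""
    return slope * dn + intercept

# Residuals at the TCPs give a first check on calibration quality.
rmse = np.sqrt(np.mean((dn_to_celsius(tcp_dn) - tcp_temp_c) ** 2))
print("calibration RMSE (deg C):", round(rmse, 3))
```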

3.2.2. Geometric Correction

Geometric correction is important for correcting distortions in images, which can result in inaccurate representations of the targets captured by the sensors. Through advanced mathematical techniques, such as polynomial functions and orthorectification models, distortions are mapped and corrected, ensuring accuracy in their applications, especially in multi-temporal applications [32,34,46,58].
For these corrections, ground control points (GCPs) are commonly used; GCPs are placed at easily visible locations in the collection area and georeferenced with high-precision equipment, such as real-time kinematic global navigation satellite systems (RTK-GNSS) or theodolites. The quality of corrections using GCPs is primarily related to the quality of the survey, as well as to the quantity and distribution of the GCPs [59,60,61]. Properly determining the number and distribution of GCPs is important for optimizing both field time and computational processing [62,63]. Although it may be assumed that more GCPs yield better accuracy, some studies have reported that beyond a certain number, a threshold is reached without significant improvement in results [59,60,61,64]. Poorly distributed GCPs, or GCPs concentrated in a small region, can result in uneven corrections, compromising accuracy in areas farther from the points. The general recommendation is to distribute the GCPs evenly across the entire image, covering the edges and the center [59,61,65,66]. The key point is that the number and distribution of GCPs are influenced by various factors, such as the sampling grid, study area, correction method, terrain relief, flight configuration, and purpose. For example, [65] evaluated different distributions (edge, central, corner, stratified, and random) and found the best overall results with 20 GCPs around the edge, yielding a horizontal error of 0.035 m and a vertical error of 0.062 m. Vertical accuracy was further improved to 0.048 m by adding points to the center, following the aforementioned recommendation. However, where even better vertical accuracy is required, the stratified distribution with 30 GCPs stood out in this study, yielding errors of 0.043 m. In contrast, [59] evaluated different distributions (random, edge, central, diagonal, parallel, and stratified) across different altitudes and found good results with a maximum of 14 GCPs for any distribution, with the parallel distribution achieving the lowest RMSE (0.03 to 0.07 m). This performance was attributed to the parallel distribution consistently covering the image from the center to the edges. Regarding altitude, they found that as altitude increased, the density of GCPs per image also increased, thereby reducing the error, even for the worst distributions evaluated. Therefore, there is no uniform recommendation for the number or density of points (GCPs per area or GCPs per image), making prior planning crucial to ensure the best configuration [61,65,67].
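As a simplified illustration of polynomial correction with GCPs, the sketch below fits a first-order (affine) transform from image pixels to map coordinates by least squares and reports the residual error at the GCPs. The coordinates are hypothetical, and production workflows would rely on photogrammetric or GIS software rather than this hand-rolled fit.

```python
import numpy as np

# Hypothetical GCPs: pixel positions (col, row) in the uncorrected image and
# the corresponding map coordinates (E, N) surveyed with RTK-GNSS.
pixel = np.array([[120, 340], [1850, 410], [230, 1710],
                  [1960, 1820], [1040, 1060]], dtype=float)
world = np.array([[500120.4, 8254310.2], [500492.1, 8254301.7],
                  [500138.9, 8254019.5], [500510.6, 8254008.3],
                  [500318.2, 8254160.1]])

# First-order polynomial: E = a0 + a1*col + a2*row (and likewise for N).
A = np.column_stack([np.ones(len(pixel)), pixel])
coef_e, *_ = np.linalg.lstsq(A, world[:, 0], rcond=None)
coef_n, *_ = np.linalg.lstsq(A, world[:, 1], rcond=None)

def pixel_to_map(col, row):
    """Map an image pixel to (E, N) coordinates with the fitted transform."""
    p = np.array([1.0, col, row])
    return float(p @ coef_e), float(p @ coef_n)

# Horizontal RMSE at the GCPs, the accuracy figure typically reported.
pred = A @ np.column_stack([coef_e, coef_n])
rmse = np.sqrt(np.mean(np.sum((pred - world) ** 2, axis=1)))
print("horizontal RMSE (m):", round(rmse, 3))
```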
Alternatives to conventional GCP collection have been gaining ground with the advancement of high-precision positioning techniques, such as post-processed kinematics (PPK) and real-time kinematics (RTK). These technologies allow georeferenced coordinates to be obtained with high accuracy, reducing the number of GCPs needed and minimizing dependence on manual field collection. However, a smaller number of GCPs is still necessary for higher-precision work due to inconsistencies and signal distortions from satellites [64,68]. Both PPK and RTK use information from a fixed reference station to improve coordinate accuracy; the difference is that PPK corrections are applied afterward in post-processing, while RTK adjustments occur in real time [69]. Note that these methods are not mutually exclusive; in fact, PPK can be used to correct possible RTK errors. The authors of [70] achieved a horizontal accuracy of 0.1 m and a vertical accuracy of 0.2 m by combining just one GCP with PPK to correct RTK data, considering a radius of up to 1 km from the GCP. Both techniques significantly reduce positioning errors, but they may increase operational costs (reference bases and RTK modules mounted on the UAV), which must be weighed against the benefits in terms of accuracy [71]. A more cost-effective alternative involves performing manual corrections using software such as QGIS to select GCPs from a reference image and their corresponding points in other images. This method depends on the accuracy of the reference image and the availability of easily identifiable and stable points between scenes. Additionally, it is a time-consuming, repetitive process that is prone to human error [72,73].

3.3. Methods for Image Feature Extraction

The process of information extraction is understood as deriving additional data from the images captured by UAVs. After the flights, the collected images are combined to form an orthomosaic with a certain number of bands, which varies according to the sensor used. However, the bands alone may not be sufficient for regression or classification. Thus, methods such as vegetation indices, texture analysis, color spaces, and three-dimensional point clouds are applied to enrich the data obtained from the UAVs.

3.3.1. Vegetation Indices (VIs)

VIs are quantitative measures derived from arithmetic operations between bands of the electromagnetic spectrum, usually simple ratios or normalized differences, designed to highlight certain characteristics of targets [74,75]. VIs are constructed based on the different responses of vegetation to various wavelengths, such as greater absorption in the visible region (specifically in the red and blue regions) due to pigments and higher reflectance in the green region [4]. Another example is the near-infrared (NIR) and shortwave infrared (SWIR) regions, which are influenced by factors such as leaf structure and canopy water content [42,76].
In recent studies, it is common to use multiple VIs, as each index typically combines spectral bands differently, allowing specific crop characteristics to be captured at various growth stages. It is important to consider the complementarity between indices due to their limitations [77,78]. For example, the Normalized Difference Vegetation Index (NDVI), which is widely used and associated with biomass, presents saturation issues in dense vegetation, where it reaches a threshold and fails to capture significant variations, and is affected by soil reflectance in sparse vegetation [77,79]. These limitations are not unique to NDVI; [25] also identified saturation issues in combinations of VIs with the MERIS Terrestrial Chlorophyll Index (MTCI), such as NDVI-MTCI and the Optimized Soil-Adjusted Vegetation Index (OSAVI)-MTCI, when LAI and leaf chlorophyll content (Cab) values were high. Naturally, indices have been developed to compensate for these issues, such as the Enhanced Vegetation Index (EVI), which added the L coefficient to adjust for the canopy background and the C1 and C2 coefficients, along with the blue band, to correct for aerosol scattering in the red band [80]. The Soil-Adjusted Vegetation Index (SAVI) was developed considering soil-vegetation interactions, using the adjustment coefficient L in the NDVI equation to reduce soil-induced variations [81]. Other examples include the Normalized Difference Red Edge Index (NDRE), Red Normalized Vegetation Index (RNDVI), and Ratio Vegetation Index (RVI). The authors of [78] used different principles of index construction (slope-based, spectral feature depth, orthogonal axes, and soil-line) to select vegetation indices for modeling soybean (Glycine max L.) productivity, identifying promising results for this approach. Furthermore, they found that the contribution of a vegetation index in explaining data variability is as important as its correlation with crop response.
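For reference, the indices discussed above reduce to simple band arithmetic. The sketch below computes NDVI, SAVI, and EVI from reflectance bands using the standard published formulas, with random arrays standing in for a radiometrically corrected orthomosaic.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red + 1e-9)

def savi(nir, red, L=0.5):
    """Soil-Adjusted Vegetation Index; L is the soil adjustment coefficient [81]."""
    return (1 + L) * (nir - red) / (nir + red + L)

def evi(nir, red, blue, G=2.5, C1=6.0, C2=7.5, L=1.0):
    """Enhanced Vegetation Index; C1, C2 and the blue band correct aerosol effects [80]."""
    return G * (nir - red) / (nir + C1 * red - C2 * blue + L)

# Random stand-ins for reflectance bands of a corrected multispectral mosaic.
nir = np.random.uniform(0.30, 0.60, (256, 256))
red = np.random.uniform(0.03, 0.12, (256, 256))
blue = np.random.uniform(0.02, 0.08, (256, 256))
print(ndvi(nir, red).mean(), savi(nir, red).mean(), evi(nir, red, blue).mean())
```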
The selection of VIs can also vary according to the sensors used, such as RGB indices, multispectral indices, and hyperspectral indices (Table 2). However, it is important to be aware of the limitations that may arise depending on the sensors used. RGB sensors are the most common and capture images in visible colors; they are useful for mapping and visual monitoring of crops, allowing for the analysis of soil cover and the identification of different types of vegetation [82,83]. Their main benefits include ease of use, relatively low cost, and high-resolution images that are easily interpretable [83,84]. However, RGB sensors have significant limitations: they cannot capture information outside the visible spectrum, which restricts vegetation analysis, and they are less effective in low-light environments or under highly variable shadow conditions [84]. In contrast, multispectral sensors capture at least one additional band of the electromagnetic spectrum, such as NIR, enabling the derivation of a greater number of vegetation indices [85]. This makes these sensors particularly valuable for monitoring plant health, water stress, and agricultural productivity [86]. Their benefits include the ability to detect features not visible to the naked eye and to provide more detailed and accurate analyses compared to RGB sensors [87]. However, higher costs and the need for more complex data preprocessing are disadvantages to be considered [86]. Hyperspectral sensors represent a significant advancement, capturing data in hundreds of narrow spectral bands and allowing for a highly detailed analysis of soil and vegetation composition [86,88]. Although their high spectral resolution makes them essential tools for cutting-edge research, their use faces important limitations, including high costs, the complexity of data interpretation, and the demand for intensive computational processing, often requiring specialized software [86,89].

3.3.2. Texture Analysis

Texture analysis refers to the extraction of information from the visual patterns of images, which are generally characterized by intensity variations across groups of pixels [102,103]. As a predictive resource for remote sensing products, texture analysis has been applied in recent studies to predict various characteristics in the agricultural environment, such as biomass [22], height [104], LAI [16], yield [103], soil-plant analysis development (SPAD) values [79], nitrogen fixation [105], and grass classification [106]. Among the different methods used for texture analysis, the gray level co-occurrence matrix (GLCM) is one of the most widely used techniques [79,107,108,109]. Benco et al. [107] define the GLCM as a two-dimensional matrix that represents the joint probabilities between pairs of pixels separated by a given distance and direction (Figure 3).
From this matrix, different metrics are extracted to quantitatively characterize the spatial texture of the image, with the following eight characteristics commonly explored: mean (ME), variance (VAR), homogeneity (HO), contrast (CO), dissimilarity (DI), entropy (EN), second moment (SM), and correlation (CC) (Table 3). These metrics can also be combined to form indices, analogous to VIs, through differences, ratios, and normalized differences between bands [108,111].
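A minimal GLCM extraction with scikit-image might look like the following; the tile is random, and the entropy metric is derived manually because graycoprops exposes only a core set of properties in most releases.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

# 8-bit single band, e.g., one channel of an RGB orthomosaic tile.
band = (np.random.rand(128, 128) * 255).astype(np.uint8)

# Joint probabilities of pixel pairs at distance 1 px in four directions.
glcm = graycomatrix(band, distances=[1],
                    angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                    levels=256, symmetric=True, normed=True)

for prop in ("contrast", "dissimilarity", "homogeneity", "energy", "correlation"):
    print(prop, graycoprops(glcm, prop).mean())   # averaged over directions

# Entropy (EN), computed directly from the direction-averaged matrix.
p = glcm.mean(axis=(2, 3))
entropy = -np.sum(p * np.log2(p + 1e-12))
print("entropy", entropy)
```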

3.3.3. Color Space

A color space is a mathematical structure that describes color information in terms of channels or components, usually three or four, such as RGB and CMYK (i.e., cyan, magenta, yellow, and black). These spaces are essential to ensure the accurate representation and reproduction of colors, which is important in image processing tasks such as object recognition, segmentation, and image retrieval [112]. Ganesan et al. [113] emphasized the importance of different spaces in image segmentation, describing several color spaces used for this purpose (CIEXYZ, CMY, HSV, HSL, HSI, I1I2I3, CIELab, CIELuv, CIELch, YIQ, YUV, YCbCr, and LMS; Table 4), along with the conversions from RGB to each of these spaces. Gracia-Romero et al. [114] used the HSI and CIELab spaces in the processing of RGB images to evaluate the early growth of maize (Zea mays L.) under phosphorus fertilization and found good performance for indices derived from the HSI hue, green area (GA), and greener area (GGA), and for the a* channel of CIELab.
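As an illustration of moving between color spaces, the sketch below converts an RGB tile to HSV and CIELab with scikit-image and derives hue-based green fractions in the spirit of the GA and GGA indices; the hue thresholds shown are illustrative assumptions and should be checked against the cited formulation.

```python
import numpy as np
from skimage import color

rgb = np.random.rand(64, 64, 3)        # RGB tile scaled to [0, 1]

hsv = color.rgb2hsv(rgb)               # hue, saturation, value
lab = color.rgb2lab(rgb)               # CIELab: L*, a*, b*

hue_deg = hsv[..., 0] * 360.0
# Green area (GA) and greener area (GGA) as fractions of pixels whose hue
# falls in a green range; the 60-180 and 80-180 degree windows are assumptions.
ga = np.mean((hue_deg >= 60) & (hue_deg <= 180))
gga = np.mean((hue_deg >= 80) & (hue_deg <= 180))

a_star = lab[..., 1]                   # more negative a* indicates greener pixels
print(ga, gga, a_star.mean())
```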

3.3.4. Three-Dimensional (3D) Point Clouds

The capacity to capture and represent information in three dimensions makes 3D modeling an important application of UAVs for advancing plant phenotyping, especially because high-resolution images can be collected in far less time than manual measurements would require, and because the resulting structural characteristics can be used to estimate other traits, such as biomass and productivity [116,117,118,119]. Three-dimensional reconstruction can be carried out with LiDAR sensors or with algorithms such as SfM [117]. SfM is a technique for the three-dimensional reconstruction of an environment from a sequence of two-dimensional images collected from multiple views. The process involves identifying corresponding points or features across multiple images, determining the relative geometry between images by estimating the positions and orientations of the camera relative to a common coordinate system, and refining the result by minimizing reprojection errors [120,121].
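Once a point cloud or surface model has been produced by SfM or LiDAR, the structural traits used downstream often reduce to simple raster differences. A minimal sketch follows, assuming a digital surface model (DSM) from the vegetated flight and a digital terrain model (DTM) of the bare ground are already available as aligned arrays; the values are synthetic placeholders.

```python
import numpy as np

# Hypothetical aligned elevation rasters (m): DSM from the SfM point cloud
# and DTM from a bare-ground survey.
dsm = np.random.uniform(102.0, 102.6, (200, 200))
dtm = np.full((200, 200), 101.9)

chm = np.clip(dsm - dtm, 0.0, None)    # canopy height model, noise clamped at 0

# Plot-level structural metrics commonly regressed against biomass or yield.
mean_height = chm.mean()
p90_height = np.percentile(chm, 90)
cover = np.mean(chm > 0.05)            # fraction of pixels above a 5 cm threshold
print(mean_height, p90_height, cover)
```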
Although SfM is a lower-cost alternative to LiDAR sensors, owing to its use with RGB cameras, its application in agricultural fields is challenging due to the distribution of crops [103]. Thus, studies such as those conducted by [103], to improve soybean yield forecasting, and by [118], to enhance LAI and biomass estimation in maize, proposed the use of cross-circling oblique (CCO) photography and found promising results. CCO is a flight plan that follows circular routes, with high image overlap within each circle and overlap between adjacent circles (Figure 4), enabling the capture of multiple images from different viewing angles [116,118,122]. The application of CCO to UAVs is recent, so few studies have used this approach.

3.4. Machine Learning

Machine learning (ML) is a field of artificial intelligence aimed at developing algorithms capable of learning and improving from a dataset, detecting complex patterns or rules, and making predictions, classifications, identifications, and decisions based on this information without being explicitly programmed [123]. ML encompasses a wide range of algorithms, including generalized linear models, decision trees, random forests, gradient boosting algorithms, dimensionality reduction techniques, probabilistic models, and neural networks, which can be applied to regression, classification, and clustering problems [124,125,126]. These algorithms operate under different types of learning (supervised, semi-supervised, unsupervised, and reinforcement), offering diverse approaches to solving complex problems in various contexts [127,128].
These algorithms have been applied in several areas and are one of the central points in the application of UAVs in agriculture, being effective because of their ability to handle nonlinear relationships, partially visible or non-visible patterns, and the high dimensionality of the data. For example, partial least squares regression (PLSR) has been used to predict biomass, LAI, the physiological characteristics of leaves (e.g., chlorophyll, carotenoids, carbon, nitrogen), and forage quality [26,27,129]; the generalized additive model (GAM) has been applied to estimate biomass and forage quality [22,130,131]; and random forest (RF) has been applied to estimate biomass, LAI, and nitrogen content and to classify crops [16,24,40,132].
In this study, we identified 32 algorithms from an analysis of the 100 most cited articles in the database. These algorithms were classified based on their frequency of use in the studies examined (Figure 5). Simple linear regression (LR) leads with 37 articles, closely followed by RF with 32, along with methods such as PLSR, support vector machines (SVMs), artificial neural networks (ANNs), multiple linear regression (MLR), convolutional neural networks (CNNs), and nonlinear regression (NLR), indicating a clear preference for algorithms that balance simplicity, interpretability, and predictive power (Figure 5). The main approaches used in the analyzed articles are detailed below, along with a discussion of their advantages and limitations.
LR is widely recognized as one of the most fundamental and frequently used statistical methods for understanding the relationship between a dependent variable and one or more independent variables [127,133,134]. In simple linear regression, this relationship is modeled using a single predictor variable, whereas MLR incorporates multiple predictors, allowing for the analysis of how several variables together explain the dependent variable [128]. Both approaches are efficient when the linearity assumption holds. In scenarios where this assumption is not met, NLR becomes more relevant, and the form of the equation can be exponential, polynomial, or logarithmic, among others [128,135]. These simpler regression models tend to be easier to implement, interpret, and reuse than more complex models. However, while nonlinear regression offers greater flexibility, it is more prone to overfitting, and its accuracy depends on the complexity of the interactions between variables [127,136,137].
RF can be used for both classification and regression tasks and has been widely adopted across various fields due to its accuracy and generalization capability [124,138]. RF combines multiple simple and independent decision trees, fitting each tree on a random sample of the training data to mitigate the risk of overfitting and thus provide greater generalization. The algorithm offers valuable insights into variable importance, which facilitates the interpretation of results. Additionally, it is fast, stable, can model nonlinear relationships, and works well even with small datasets [138,139,140,141]. However, because it combines numerous trees, the model's interpretability is limited compared to simpler models, and it may overfit noise in large datasets [33,142,143]. These characteristics make RF a powerful tool, but it requires careful tuning in large-scale applications or when dealing with imbalanced data [144].
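A compact random forest regression in scikit-learn, on synthetic plot-level features standing in for VIs and texture metrics, might look like the following; it also prints the variable importances mentioned above. All data and parameter values are illustrative.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, (120, 6))        # e.g., 6 VIs / texture metrics per plot
y = 2.0 * X[:, 0] + 0.8 * X[:, 2] ** 2 + rng.normal(0.0, 0.1, 120)  # synthetic biomass

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
rf = RandomForestRegressor(n_estimators=500, random_state=0).fit(X_tr, y_tr)

pred = rf.predict(X_te)
print("R2:", r2_score(y_te, pred))
print("RMSE:", mean_squared_error(y_te, pred) ** 0.5)
print("importances:", rf.feature_importances_.round(3))
```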
PLSR is a regression method that utilizes concepts similar to principal component analysis (PCA), with the difference that its components are constructed to maximize the covariance between the predictor variables and the response variable, rather than only the variance of the predictors [126,145]. By maximizing this covariance, PLSR can handle high-dimensional and multicollinear predictors, estimate multiple dependent variables, and work with datasets that have more predictors than observations [126,146]. However, with large datasets, in terms of both observations and variables, the process can become computationally expensive due to the iterative calculations involved in constructing the latent components. Additionally, challenges such as interpreting the components, managing outliers, and the risk of overfitting may arise [146,147,148,149].
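The setting described above (many collinear bands, few plots) is where PLSR is typically chosen. A sketch with synthetic data, selecting the number of latent components by cross-validated R2, follows; the latent structure is fabricated purely to mimic band collinearity.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
# 60 plots, 200 collinear "bands" driven by 5 hidden factors.
latent = rng.normal(size=(60, 5))
X = latent @ rng.normal(size=(5, 200)) + 0.1 * rng.normal(size=(60, 200))
y = latent[:, 0] + rng.normal(0.0, 0.1, 60)

# Choose the number of latent components by cross-validated performance.
for n_comp in (2, 5, 10):
    pls = PLSRegression(n_components=n_comp)
    score = cross_val_score(pls, X, y, cv=5, scoring="r2").mean()
    print(n_comp, round(score, 3))
```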
SVM is a widely used algorithm in classification and regression tasks, whose main characteristic is the search for a hyperplane that maximizes the separation between distinct classes or minimizes errors in regression predictions, optimizing the margin between the data points closest to this hyperplane, known as support vectors [125,150,151]. To handle non-linearly separable data, SVM uses kernel functions (e.g., polynomial kernel, radial basis function—RBF, and sigmoid) to map the data into a higher-dimensional space, allowing for more effective separation [125]. SVM excels in its robustness in high-dimensional scenarios and its generalization capability, mitigating the risk of overfitting. However, the algorithm faces challenges, such as high computational costs in large datasets, especially in the presence of outliers [125,133,152,153].
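A corresponding SVM regression sketch is below; feature scaling is included because margin-based methods are sensitive to it, and the RBF kernel handles the nonlinear response. All data and hyperparameter values are synthetic placeholders.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(2)
X = rng.uniform(-1.0, 1.0, (150, 4))
y = np.sin(3.0 * X[:, 0]) + X[:, 1] ** 2 + rng.normal(0.0, 0.1, 150)

# The RBF kernel implicitly maps the data into a higher-dimensional space;
# scaling first keeps all features on comparable ranges.
svr = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.05))
print("5-fold R2:", cross_val_score(svr, X, y, cv=5, scoring="r2").mean())
```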
ANNs stand out as powerful tools for modeling complex patterns, with applications in classification, regression, computer vision, and natural language processing. Their structure, which is composed of interconnected layers of neurons, allows for the detection of nonlinear relationships [154,155]. ANNs include an input layer, an output layer, and intermediate layers, whose dimensions and numbers can vary. Learning occurs through error backpropagation, adjusting the weights between layers, and using various activation functions [156,157]. However, their performance depends on large volumes of data and robust computational resources, and they are often considered a “black box”, which limits their interpretability [157,158].
CNNs represent a significant evolution in the field of computer vision, with widespread applications in tasks such as image recognition and object detection [159,160]. CNNs use convolutional layers with kernels to automatically extract hierarchical patterns from visual data, providing high efficiency in capturing spatial features and making them highly accurate in identifying local patterns. Additionally, weight sharing reduces computational complexity compared to fully connected networks, and the convolution process reduces the number of parameters through subsampling [157,159]. Like ANNs, CNNs face challenges related to interpretability, data requirements, and computational costs [161].
In addition to the challenges of interpretability and computational cost, the high dimensionality of the data underscores the need for effective techniques to handle large volumes of variables [138]. In this context, feature selection methods are frequently applied to reduce the number of predictive variables used, identifying the most relevant ones and removing those that are highly correlated (redundant) and do not significantly impact model performance [125,162]. The application of these methods allows models to focus on the most important features, resulting in reduced training time and potentially improving model performance. Moreover, using a smaller set of variables facilitates model interpretation, making it more understandable [132,163,164]. Methods such as sequential selection or feature elimination, Boruta, Regression ReliefF, recursive feature elimination, permutation importance, and correlation have been applied in the studies included in this work and have been shown to contribute to both computational and predictive performance [132,151,154,165,166].
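Of the feature selection methods listed, recursive feature elimination is straightforward to sketch with scikit-learn; on synthetic data with only two informative predictors, it might look like this. The feature count and data are assumptions for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.feature_selection import RFE

rng = np.random.default_rng(3)
X = rng.uniform(size=(100, 20))            # 20 candidate VIs / texture metrics
y = 3.0 * X[:, 0] - 2.0 * X[:, 3] + rng.normal(0.0, 0.1, 100)

# Recursively drop the least important feature until five remain.
selector = RFE(RandomForestRegressor(n_estimators=200, random_state=0),
               n_features_to_select=5)
selector.fit(X, y)
print("kept feature indices:", np.flatnonzero(selector.support_))
```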
Other crucial aspects include hyperparameter tuning and model training. Hyperparameters are variables defined before model training that control the learning process, directly impacting its performance and generalization ability [161,167]. The types and quantity of hyperparameters vary between algorithms, and the search for the best combination becomes more complex as the number of hyperparameters increases [168]. This search can be conducted across all possible combinations using grid search; however, this process may become impractical due to the time required, especially for slower models. Alternatively, random search or more advanced methods, such as Bayesian optimization and genetic algorithms, can be employed [167,168]. During the model training phase, it is essential to carry out a validation process to assess performance on new data, reducing issues of overfitting or underfitting [169]. To this end, the data are divided into the following three subsets: training, validation, and testing. Training adjusts the model's parameters, validation optimizes hyperparameters and prevents overfitting, and testing evaluates the final performance on unseen data [170]. To improve the performance of machine learning models, cross-validation becomes an essential tool, allowing for a more robust model evaluation by using different data subsets for training and testing [171]. A commonly used method is k-fold cross-validation, in which the data are divided into a specified number of subsets (k, or folds) and the model is trained k times, each time using k − 1 folds for training and the remaining fold for validation [172,173]. This ensures that each data instance is used for both testing and training at least once. However, for small datasets, this method may not be suitable, as the divisions reduce the amount of training data. In such cases, an alternative is leave-one-out cross-validation (LOOCV), which works similarly but leaves only one data point out for testing while the others are used for training [171,174].
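Tying the tuning and validation steps together, the sketch below runs a small grid search under 5-fold cross-validation and then scores the chosen model with LOOCV, as would suit a small plot dataset; parameter grids and data are illustrative only, and MAE is used for LOOCV because R2 is undefined on a single held-out sample.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import (GridSearchCV, KFold, LeaveOneOut,
                                     cross_val_score)

rng = np.random.default_rng(4)
X = rng.uniform(size=(80, 8))
y = X[:, 0] + 0.5 * X[:, 1] ** 2 + rng.normal(0.0, 0.05, 80)

# Exhaustive grid search over two hyperparameters, validated by 5-fold CV.
grid = GridSearchCV(RandomForestRegressor(random_state=0),
                    param_grid={"n_estimators": [100, 300],
                                "max_depth": [None, 10]},
                    cv=KFold(n_splits=5, shuffle=True, random_state=0),
                    scoring="r2")
grid.fit(X, y)
print(grid.best_params_, round(grid.best_score_, 3))

# LOOCV: every sample is held out once; scores are negated MAE values.
mae = -cross_val_score(grid.best_estimator_, X, y, cv=LeaveOneOut(),
                       scoring="neg_mean_absolute_error").mean()
print("LOOCV MAE:", round(mae, 3))
```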

3.5. Applications of UAVs in Pastures and Forage Crops

3.5.1. Biomass and Crop Height

Aboveground biomass estimation is among the main applications of UAVs (Figure 6) and is often adopted as an indicator of yield, growth, and vegetation health [18,22,27]. Height is another widely studied parameter (Figure 6), since in addition to serving as an indicator of the same attributes as biomass, it is commonly correlated with it [20,21,27].
Pan et al. [22] used texture characteristics from RGB images and multispectral VIs to estimate aboveground biomass (AGB) in Central Asian pastures. Chen et al. [27] used RGB, hyperspectral, and LiDAR sensors to classify shrub and herbaceous vegetation in different pastures and subsequently estimate the biomass of both, in addition to combining the information collected from the UAVs with Sentinel satellite data for estimation at the landscape level. Rueda-Ayala et al. [20] estimated pasture height and biomass using an RGB-Depth (RGB-D) camera carried on a terrestrial device for crop height modeling, together with an RGB camera aboard a UAV. The authors noted that performance may be lower for crops whose biomass is expressed more through tillering than through height, although the results were still good. Grüner et al. [21] used SfM on RGB images to generate a digital elevation model (DEM), which proved to be a promising alternative for biomass estimation in heterogeneous pastures containing grasses and legumes.

3.5.2. Chlorophyll

Chlorophyll is the main pigment involved in photosynthesis, allowing plants to transform light energy into chemical energy, and its estimation is very important in monitoring nutritional status, health, senescence, stress, and yield [25,175]. It is responsible for the green color in plants, and the change in this color is one of the main points when it comes to remote sensing of vegetation, especially with RGB sensors [176] (Figure 7). Specifically, NIR wavelengths are not affected by plant pigments but are instead reflected by the spongy mesophyll in the leaf.
Zhu et al. [25], considering the challenge that different combinations of LAI and chlorophyll content can produce the same spectral signature, employed a two-dimensional matrix method using hyperspectral data and data simulated by the PROSAIL model to estimate these parameters simultaneously in pastures. This approach outperformed PLSR and RF models, improving both efficiency and accuracy. Zhang et al. [177] related hyperspectral images to 10 functional characteristics, including the pigments chlorophyll a and chlorophyll b and the carotenoid content, of a heterogeneous alpine meadow ecosystem and found good results for the genetic algorithm integrated with PLSR (GA-PLSR). Lu et al. [17] used a modified camera, replacing the original red band with a near-infrared band, to estimate chlorophyll and leaf area index using vegetation indices and texture parameters in a PLSR model. The model performed well on the initial data, especially for LAI; however, as senescence advanced, its performance declined because changing leaf pigmentation increased canopy heterogeneity, with a greater impact on the chlorophyll models.

3.5.3. Leaf Area Index (LAI)

LAI is a measure that expresses the total leaf area of a plant relative to the amount of soil area it covers, enabling the characterization of leaf density. Determining this parameter is important because of its association with photosynthesis, energy balance, and the ability to compete for light, nutrients, and water [16,17].
The authors of [23] used a commercial digital camera modified to capture red (R), green (G), and NIR light to estimate the biomass and LAI of a ryegrass (Lolium multiflorum Lam.) field with an MLR model. The authors of [16] estimated LAI in a heterogeneous pasture containing alfalfa (Medicago sativa L., “WL366HQ”), tall fescue (Festuca elata Keng ex E.B. Alexeev, “K31”), and lamb's tongue (Plantago lanceolata L., “Giant lizard”) with a multispectral UAV. To improve performance, the following elements were combined: multispectral and visible vegetation indices, GLCM texture indices, and meteorological data (average daily temperature, maximum daily temperature, minimum daily temperature, and daily temperature range). Different models were evaluated, and RF obtained the best results.

3.5.4. Nutrients

Inadequate concentrations of nutrients can result in nutritional deficiencies, compromising the yield, quality, and health of crops. Given the nutritional requirements of each crop and the influence of weather conditions, monitoring nutritional status is essential for balanced fertilization, promoting higher yields and a lower use of inputs [178,179]. López-Calderón et al. [24] used a multispectral UAV to determine the total nitrogen content in forage maize (Zea mays L.) with an RF model, which essentially combined different vegetation indices, and found good results. Pereira et al. [141] evaluated the spatial distribution of nitrogen in pastures in an integrated crop-livestock system using multispectral and RGB UAV imagery, in addition to the PlanetScope and Sentinel-2A platforms and combinations of these sources. Using the RF algorithm, the results based on multispectral data were better than those obtained with the other options. Alternatively, combining RGB with PlanetScope or Sentinel-2A led to better results than any of the three individually.

3.5.5. Soil Moisture (SM)

Several studies have used UAVs equipped with different sensors to estimate soil moisture (SM) or evapotranspiration (ET). The determination of soil moisture plays a crucial role in sustainable agricultural management, influencing plant growth, nutrient absorption, soil temperature regulation, and microbial activity [180,181,182]. Evapotranspiration also plays an important role in water management in agriculture, as it represents the sum of soil surface evaporation and plant transpiration, directly affecting irrigation needs and, consequently, water planning [82,183].
In the database, a variety of sensors have been used for soil moisture estimation, including thermal, hyperspectral, multispectral, and RGB sensors, as well as their combinations. The authors of [184] estimated SM over different patches (native vegetation, isolated vegetation, and three levels of bare soil) using various sensors on a UAV. Thermal and multispectral sensors were used to model SM using land surface temperature (LST) and the GNDVI index, while RGB images were used to distinguish vegetation from soil. Overall, the authors found good estimates (R2 = 0.89 and RMSE = 0.036) for soil moisture (0–40 cm). Meanwhile, [181] modeled soil moisture in more superficial layers (0–10 cm) using only RGB images. The authors based their approach on brightness changes and their relationship with the water content of vegetation and soil. The study was conducted in sparse pasture vegetation (with exposed soil) and achieved positive results, with better fits in grazing areas where the vegetation was thinner and the soil more compacted by animals, considering both stable moisture scenarios and conditions after rainfall events.
Conversely, in the studies reviewed, ET or crop transpiration was largely determined using thermal imaging sensors, although other sensors were also used to estimate this variable accurately [82,183,185]. For example, Morgan et al. [185] used a UAV equipped with a multispectral camera (six bands), a thermal camera, two pyranometers, a 3D sonic anemometer, and a climate sensor to estimate latent heat flux (the energy equivalent of transpiration). The two approaches, one based on the surface energy balance (SEB) and the other on the Bowen ratio, provided independent estimates of latent heat flux that were within 20% to 30% of Eddy covariance tower measurements. Brenner et al. [82] applied thermal and RGB imagery to estimate ET using the following three models: two simpler models, Deriving Atmosphere Turbulent Transport Useful To Dummies Using Temperature (DATTUTDUT) and the triangle method, and a physics-based surface energy balance model, the two-source energy balance (TSEB). As in the previously mentioned studies, RGB images were used to provide vegetation information, in this case for the TSEB and the triangle method through the NGRDI and NGBDI indices. With mean absolute errors of around 20 to 40 W m−2, the models proved satisfactory for ET modeling.

3.5.6. Forage Quality

Production indicators of forage crops, such as acid detergent fiber and crude protein, are essential for assessing the nutritional quality of forages. Understanding these indicators allows producers to make informed decisions in the management of forage crops, aiming to maximize production and the feed efficiency of the animals [26,186,187]. Geipel et al. [26] used hyperspectral images to estimate forage production and quality in a mixture of grasses and legumes, applying PLSR fitted to reflectance data. In general, good results were found for the estimation of fresh and dry matter, crude protein, dry matter digestibility, neutral detergent fiber, and indigestible neutral detergent fiber content. Wijesingha et al. [187] applied hyperspectral imaging to different pastures to estimate crude protein and acid detergent fiber, comparing the following algorithms: PLSR, GPR, CBR, RF, and SVM. The predictive algorithms SVM and CBR were the best for crude protein and acid detergent fiber. Giraldo et al. [130] applied multispectral images to the estimation of crude protein, neutral detergent fiber, acid detergent fiber, and lignin in pastures of Urochloa humidicola cv. Llanero through a GAM.

3.5.7. Challenges in the Use of UAVs

The use of UAVs in agriculture results in the generation of large volumes of data, which require sophisticated processing and analysis to extract useful information [188,189]. Applying machine learning algorithms and advanced data analysis techniques becomes essential for obtaining these insights [127,189]. However, the complexity of these processes represents a significant barrier, especially for farmers who lack the technological resources or specialized knowledge to interpret the collected data [190]. Furthermore, the use of specific software and the integration of different types of sensors, such as RGB, multispectral, hyperspectral, and LiDAR cameras, further increases the complexity of the workflow [191,192]. In addition, environmental conditions, such as weather and topography, pose challenges to using UAVs [31,193]. The presence of clouds, strong winds, rainfall, or rough terrain, for instance, can compromise the quality of the captured images [31,68]. The continuous adaptation of UAV technologies to address these environmental variables remains a significant challenge.
Despite the efficiency and monitoring precision that UAVs offer, the initial cost of acquisition, drone maintenance, and investment in data analysis software can be high [193,194]. For small-scale farmers, this investment may not be feasible, making it essential to conduct a robust cost–benefit analysis highlighting how UAVs can generate long-term savings through improvements in productivity and resource management [194,195]; a simple worked example follows this paragraph. The adoption of UAVs also requires farmers and agricultural workers to be trained to operate the drones and interpret the resulting data [194,196]. A lack of proper training can undermine the effectiveness of this technology, leading to its underutilization [197]. Thus, training programs and technical support are essential to ensure that users can maximize the benefits offered by UAVs. Finally, the use of UAVs in agriculture is subject to regulatory restrictions that vary by region. Flight license requirements, altitude restrictions, and no-fly zones may limit UAV operations [198]. While these regulations are crucial for ensuring safety, privacy, and the proper use of UAVs, they can become obstacles to the widespread adoption of the technology when overly restrictive or when current regulations lack clarity [196,199,200].
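A first-pass payback calculation can make such a cost–benefit analysis concrete. Every figure in the sketch below is hypothetical and would need to be replaced with farm-specific estimates.

```python
# Hypothetical figures for a first-pass payback calculation; every number
# is an assumption to be replaced with farm-specific estimates.
uav_and_sensors = 12_000.0   # acquisition cost of platform and cameras
training_once = 1_000.0      # one-off operator training
software_per_year = 1_500.0  # data-processing subscriptions
maintenance_per_year = 800.0
savings_per_year = 6_500.0   # input savings plus productivity gains

net_annual_benefit = savings_per_year - software_per_year - maintenance_per_year
payback_years = (uav_and_sensors + training_once) / net_annual_benefit
print(f"simple payback period: {payback_years:.1f} years")
```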

4. Conclusions

Technologies such as unmanned aerial vehicles (UAVs), together with machine learning algorithms, show promising potential to improve the monitoring and management of production areas. Research in this field is evolving constantly, with a significant increase in the number of articles published and citations received. International collaboration and the pursuit of technological innovation reflect the academic community's commitment to driving the development of these technologies.
The use of UAVs for monitoring forage crops has proven to be an effective and promising tool, enabling the collection of biophysical and physiological data with high frequency and resolution. Throughout this review, we highlighted the main applications of UAVs, from biomass estimation to the monitoring of parameters such as chlorophyll content, leaf area index, and soil moisture. Furthermore, we emphasized the crucial role of data processing techniques, such as machine learning, which have improved analysis accuracy and contributed to the sustainability of agricultural systems. While these technological advances still face challenges, such as the need for greater standardization and accessibility of sensors, they point to a promising future for precision farming.
Therefore, the integration of advanced technologies, data analysis methods, innovative algorithms, and feature selection techniques is essential to promote sustainability, efficiency, and productivity in agricultural and livestock activities. The growing use of UAVs and the continued development of advanced machine learning algorithms suggest that these technologies will keep evolving, playing a central role in improving productive efficiency and sustainable resource management. Future research should consider these aspects, along with deeper exploration of less studied species, broader application of machine learning algorithms, and different interactions and management strategies.

Author Contributions

Conceptualization, A.C.B., T.G.F.d.S. and W.M.d.S.; methodology, A.C.B., C.A.A.d.S., L.D.C.d.S.M., M.V.d.S., T.G.F.d.S. and W.M.d.S.; software, A.C.B., A.M.d.R.F.J., T.G.F.d.S. and W.M.d.S.; validation, A.M.d.R.F.J., L.S.B.d.S. and M.V.d.S.; formal analysis, C.A.A.d.S., L.S.B.d.S., M.V.d.S., T.G.F.d.S. and W.M.d.S.; investigation, A.C.B., L.D.C.d.S.M., L.S.B.d.S., M.V.d.S., T.G.F.d.S. and W.M.d.S.; resources, A.C.B., C.A.A.d.S., L.S.B.d.S., M.V.d.S. and T.G.F.d.S.; data curation, A.C.B., L.D.C.d.S.M., L.S.B.d.S., T.G.F.d.S. and W.M.d.S.; writing—original draft preparation, A.C.B., L.D.C.d.S.M., L.S.B.d.S., T.G.F.d.S. and W.M.d.S.; writing—review and editing, A.M.d.R.F.J., C.A.A.d.S., A.C.B. and M.V.d.S.; visualization, A.C.B., A.M.d.R.F.J., T.G.F.d.S. and W.M.d.S.; supervision, A.C.B., A.M.d.R.F.J. and T.G.F.d.S.; project administration, A.C.B., T.G.F.d.S. and W.M.d.S.; funding acquisition, M.V.d.S. and T.G.F.d.S. All authors have read and agreed to the published version of the manuscript.

Funding

Financial support was provided by the Coordination for the Improvement of Higher Education Personnel (CAPES—Finance Code 001).

Data Availability Statement

Not applicable.

Acknowledgments

The authors would like to thank the Postgraduate Program in Agricultural Engineering (PGEA) and the Federal Rural University of Pernambuco (UFRPE), Brazil, for their support and provision of equipment for the development of this research. We acknowledge the Coordination for the Improvement of Higher Education Personnel (CAPES—Finance Code 001), Brazil, and the São Paulo Research Foundation—FAPESP (grant number 2023/05323-4), Brazil, for fellowships and financial support for this study. We also extend our gratitude to the anonymous reviewers for their insightful and constructive feedback, which has significantly enhanced the quality of this paper.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. da Rocha Fernandes, M.H.M.; de Souza Fernandes Junior, J.; Adams, J.M.; Lee, M.; Reis, R.A.; Tedeschi, L.O. Using Sentinel-2 Satellite Images and Machine Learning Algorithms to Predict Tropical Pasture Forage Mass, Crude Protein, and Fiber Content. Sci. Rep. 2024, 14, 8704. [Google Scholar] [CrossRef]
  2. Sanyaolu, M.; Sadowski, A. The Role of Precision Agriculture Technologies in Enhancing Sustainable Agriculture. Sustainability 2024, 16, 6668. [Google Scholar] [CrossRef]
  3. Papakonstantinou, G.I.; Voulgarakis, N.; Terzidou, G.; Fotos, L.; Giamouri, E.; Papatsiros, V.G. Precision Livestock Farming Technology: Applications and Challenges of Animal Welfare and Climate Change. Agriculture 2024, 14, 620. [Google Scholar] [CrossRef]
  4. Sishodia, R.P.; Ray, R.L.; Singh, S.K. Applications of Remote Sensing in Precision Agriculture: A Review. Remote Sens. 2020, 12, 3136. [Google Scholar] [CrossRef]
  5. Řezník, T.; Lukas, V.; Charvát, K.; Křivánek, Z.; Kepka, M.; Herman, L.; Řezníková, H. Disaster Risk Reduction in Agriculture through Geospatial (Big) Data Processing. ISPRS Int. J. Geo-Inf. 2017, 6, 238. [Google Scholar] [CrossRef]
  6. Manoj, K.N.; Shekara, B.G.; Sridhara, S.; Jha, P.K.; Prasad, P.V.V. Biomass Quantity and Quality from Different Year-Round Cereal–Legume Cropping Systems as Forage or Fodder for Livestock. Sustainability 2021, 13, 9414. [Google Scholar] [CrossRef]
  7. Blaix, C.; Chabrerie, O.; Alard, D.; Catterou, M.; Diquelou, S.; Dutoit, T.; Lacoux, J.; Loucougaray, G.; Michelot-Antalik, A.; Pacé, M.; et al. Forage Nutritive Value Shows Synergies with Plant Diversity in a Wide Range of Semi-Natural Grassland Habitats. Agric. Ecosyst. Environ. 2023, 347, 108369. [Google Scholar] [CrossRef]
  8. Tlahig, S.; Neji, M.; Atoui, A.; Seddik, M.; Dbara, M.; Yahia, H.; Nagaz, K.; Najari, S.; Khorchani, T.; Loumerem, M. Genetic and Seasonal Variation in Forage Quality of Lucerne (Medicago sativa L.) for Resilience to Climate Change in Arid Environments. J. Agric. Food Res. 2024, 15, 100986. [Google Scholar] [CrossRef]
  9. Fraser, M.D.; Vallin, H.E.; Roberts, B.P. Animal Board Invited Review: Grassland-Based Livestock Farming and Biodiversity. Animal 2022, 16, 100671. [Google Scholar] [CrossRef]
  10. Cheng, M.; McCarl, B.; Fei, C. Climate Change and Livestock Production: A Literature Review. Atmosphere 2022, 13, 140. [Google Scholar] [CrossRef]
  11. Cabrita, A.R.J.; Valente, I.M.; Monteiro, A.; Sousa, C.; Miranda, C.; Almeida, A.; Cortez, P.P.; Castro, C.; Maia, M.R.G.; Trindade, H.; et al. Environmental Conditions Affect the Nutritive Value and Alkaloid Profiles of Lupinus Forage: Opportunities and Threats for Sustainable Ruminant Systems. Heliyon 2024, 10, e28790. [Google Scholar] [CrossRef]
  12. Vermelho, A.B.; Moreira, J.V.; Teixeira Akamine, I.; Cardoso, V.S.; Mansoldo, F.R.P. Agricultural Pest Management: The Role of Microorganisms in Biopesticides and Soil Bioremediation. Plants 2024, 13, 2762. [Google Scholar] [CrossRef]
  13. Subhashree, S.N.; Igathinathane, C.; Akyuz, A.; Borhan, M.; Hendrickson, J.; Archer, D.; Liebig, M.; Toledo, D.; Sedivec, K.; Kronberg, S.; et al. Tools for Predicting Forage Growth in Rangelands and Economic Analyses—A Systematic Review. Agriculture 2023, 13, 455. [Google Scholar] [CrossRef]
  14. Carella, A.; Fischer, B.; Massenti, P.T.; Lo Bianco, R.; Carella, A.; Tomas, P.; Massenti, R.; Lo Bianco, R. Continuous Plant-Based and Remote Sensing for Determination of Fruit Tree Water Status. Horticulturae 2024, 10, 516. [Google Scholar] [CrossRef]
  15. Cárceles Rodríguez, B.; Durán-Zuazo, V.H.; Soriano Rodríguez, M.; García-Tejero, I.F.; Gálvez Ruiz, B.; Cuadros Tavira, S. Conservation Agriculture as a Sustainable System for Soil Health: A Review. Soil Syst. 2022, 6, 87. [Google Scholar] [CrossRef]
  16. Wang, X.; Yan, S.; Wang, W.; Liubing, Y.; Li, M.; Yu, Z.; Chang, S.; Hou, F. Monitoring Leaf Area Index of the Sown Mixture Pasture through UAV Multispectral Image and Texture Characteristics. Comput. Electron. Agric. 2023, 214, 108333. [Google Scholar] [CrossRef]
  17. Lu, B.; He, Y.; Liu, H.H.T. Mapping Vegetation Biophysical and Biochemical Properties Using Unmanned Aerial Vehicles-Acquired Imagery. Int. J. Remote Sens. 2018, 39, 5265–5287. [Google Scholar] [CrossRef]
  18. Avneri, A.; Aharon, S.; Brook, A.; Atsmon, G.; Smirnov, E.; Sadeh, R.; Abbo, S.; Peleg, Z.; Herrmann, I.; Bonfil, D.J.; et al. UAS-Based Imaging for Prediction of Chickpea Crop Biophysical Parameters and Yield. Comput. Electron. Agric. 2023, 205, 107581. [Google Scholar] [CrossRef]
  19. Júnior, G.D.N.A.; da Silva, T.G.F.; de Souza, L.S.B.; de Araújo, G.G.L.; de Moura, M.S.B.; Alves, C.P.; da Silva Salvador, K.R.; de Souza, C.A.A.; de Assunção Montenegro, A.A.; da Silva, M.J. Phenophases, Morphophysiological Indices and Cutting Time in Clones of the Forage Cacti under Controlled Water Regimes in a Semiarid Environment. J. Arid Environ. 2021, 190, 104510. [Google Scholar] [CrossRef]
  20. Rueda-Ayala, V.P.; Peña, J.M.; Höglind, M.; Bengochea-Guevara, J.M.; Andújar, D. Comparing UAV-Based Technologies and RGB-D Reconstruction Methods for Plant Height and Biomass Monitoring on Grass Ley. Sensors 2019, 19, 535. [Google Scholar] [CrossRef]
  21. Grüner, E.; Astor, T.; Wachendorf, M. Biomass Prediction of Heterogeneous Temperate Grasslands Using an SfM Approach Based on UAV Imaging. Agronomy 2019, 9, 54. [Google Scholar] [CrossRef]
  22. Pan, T.; Ye, H.; Zhang, X.; Liao, X.; Wang, D.; Bayin, D.; Safarov, M.; Okhonniyozov, M.; Majid, G. Estimating Aboveground Biomass of Grassland in Central Asia Mountainous Areas Using Unmanned Aerial Vehicle Vegetation Indices and Image Textures—A Case Study of Typical Grassland in Tajikistan. Environ. Sustain. Indic. 2024, 22, 100345. [Google Scholar] [CrossRef]
  23. Fan, X.; Kawamura, K.; Xuan, T.D.; Yuba, N.; Lim, J.; Yoshitoshi, R.; Minh, T.N.; Kurokawa, Y.; Obitsu, T. Low-Cost Visible and near-Infrared Camera on an Unmanned Aerial Vehicle for Assessing the Herbage Biomass and Leaf Area Index in an Italian Ryegrass Field. Grassl. Sci. 2018, 64, 145–150. [Google Scholar] [CrossRef]
  24. López-Calderón, M.J.; Estrada-ávalos, J.; Rodríguez-Moreno, V.M.; Mauricio-Ruvalcaba, J.E.; Martínez-Sifuentes, A.R.; Delgado-Ramírez, G.; Miguel-Valle, E. Estimation of Total Nitrogen Content in Forage Maize (Zea mays L.) Using Spectral Indices: Analysis by Random Forest. Agriculture 2020, 10, 451. [Google Scholar] [CrossRef]
  25. Zhu, X.; Yang, Q.; Chen, X.; Ding, Z. An Approach for Joint Estimation of Grassland Leaf Area Index and Leaf Chlorophyll Content from UAV Hyperspectral Data. Remote Sens. 2023, 15, 2525. [Google Scholar] [CrossRef]
  26. Geipel, J.; Bakken, A.K.; Jørgensen, M.; Korsaeth, A. Forage Yield and Quality Estimation by Means of UAV and Hyperspectral Imaging. Precis. Agric. 2021, 22, 1437–1463. [Google Scholar] [CrossRef]
  27. Chen, A.; Xu, C.; Zhang, M.; Guo, J.; Xing, X.; Yang, D.; Xu, B.; Yang, X. Cross-Scale Mapping of above-Ground Biomass and Shrub Dominance by Integrating UAV and Satellite Data in Temperate Grassland. Remote Sens. Environ. 2024, 304, 114024. [Google Scholar] [CrossRef]
  28. Messina, G.; Modica, G. Applications of UAV Thermal Imagery in Precision Agriculture: State of the Art and Future Research Outlook. Remote Sens. 2020, 12, 1491. [Google Scholar] [CrossRef]
  29. Xu, C.; Zhao, D.; Zheng, Z.; Zhao, P.; Chen, J.; Li, X.; Zhao, X.; Zhao, Y.; Liu, W.; Wu, B.; et al. Correction of UAV LiDAR-Derived Grassland Canopy Height Based on Scan Angle. Front. Plant Sci. 2023, 14, 1108109. [Google Scholar] [CrossRef]
  30. Yepes-Nuñez, J.J.; Urrútia, G.; Romero-García, M.; Alonso-Fernández, S. The PRISMA 2020 Statement: An Updated Guideline for Reporting Systematic Reviews. Rev. Esp. Cardiol. 2021, 74, 790–799. [Google Scholar] [CrossRef]
  31. Jenerowicz, A.; Wierzbicki, D.; Kedzierski, M. Radiometric Correction with Topography Influence of Multispectral Imagery Obtained from Unmanned Aerial Vehicles. Remote Sens. 2023, 15, 2059. [Google Scholar] [CrossRef]
  32. Jakob, S.; Zimmermann, R.; Gloaguen, R. The Need for Accurate Geometric and Radiometric Corrections of Drone-Borne Hyperspectral Data for Mineral Exploration: MEPHySTo—A Toolbox for Pre-Processing Drone-Borne Hyperspectral Data. Remote Sens. 2017, 9, 88. [Google Scholar] [CrossRef]
  33. Lu, B.; He, Y. Species Classification Using Unmanned Aerial Vehicle (UAV)-Acquired High Spatial Resolution Imagery in a Heterogeneous Grassland. ISPRS J. Photogramm. Remote Sens. 2017, 128, 73–85. [Google Scholar] [CrossRef]
  34. Kim, J.I.; Kim, T.; Shin, D.; Kim, S.H. Fast and Robust Geometric Correction for Mosaicking UAV Images with Narrow Overlaps. Int. J. Remote Sens. 2017, 38, 2557–2576. [Google Scholar] [CrossRef]
  35. Zhu, H.; Huang, Y.; An, Z.; Zhang, H.; Han, Y.; Zhao, Z.; Li, F.; Zhang, C.; Hou, C. Assessing Radiometric Calibration Methods for Multispectral UAV Imagery and the Influence of Illumination, Flight Altitude and Flight Time on Reflectance, Vegetation Index and Inversion of Winter Wheat AGB and LAI. Comput. Electron. Agric. 2024, 219, 108821. [Google Scholar] [CrossRef]
  36. Xue, B.; Ming, B.; Xin, J.; Yang, H.; Gao, S.; Guo, H.; Feng, D.; Nie, C.; Wang, K.; Li, S. Radiometric Correction of Multispectral Field Images Captured under Changing Ambient Light Conditions and Applications in Crop Monitoring. Drones 2023, 7, 223. [Google Scholar] [CrossRef]
  37. Daniels, L.; Eeckhout, E.; Wieme, J.; Dejaegher, Y.; Audenaert, K.; Maes, W.H. Identifying the Optimal Radiometric Calibration Method for UAV-Based Multispectral Imaging. Remote Sens. 2023, 15, 2909. [Google Scholar] [CrossRef]
  38. Jiang, J.; Zhang, Q.; Wang, W.; Wu, Y.; Zheng, H.; Yao, X.; Zhu, Y.; Cao, W.; Cheng, T. MACA: A Relative Radiometric Correction Method for Multiflight Unmanned Aerial Vehicle Images Based on Concurrent Satellite Imagery. IEEE Trans. Geosci. Remote Sens. 2022, 60, 1–14. [Google Scholar] [CrossRef]
  39. Poncet, A.M.; Knappenberger, T.; Brodbeck, C.; Fogle, M.; Shaw, J.N.; Ortiz, B.V. Multispectral UAS Data Accuracy for Different Radiometric Calibration Methods. Remote Sens. 2019, 11, 1917. [Google Scholar] [CrossRef]
  40. Andrade, O.B.d.; Montenegro, A.A.d.A.; Silva Neto, M.A.d.; Sousa, L.d.B.d.; Almeida, T.A.B.; de Lima, J.L.M.P.; Carvalho, A.A.d.; Silva, M.V.d.; Medeiros, V.W.C.d.; Soares, R.G.F.; et al. UAV-Based Classification of Intercropped Forage Cactus: A Comparison of RGB and Multispectral Sample Spaces Using Machine Learning in an Irrigated Area. AgriEngineering 2024, 6, 509–525. [Google Scholar] [CrossRef]
  41. Wang, Y.; Yang, Z.; Khan, H.A.; Kootstra, G. Improving Radiometric Block Adjustment for UAV Multispectral Imagery under Variable Illumination Conditions. Remote Sens. 2024, 16, 3019. [Google Scholar] [CrossRef]
  42. Cao, S.; Danielson, B.; Clare, S.; Koenig, S.; Campos-Vargas, C.; Sanchez-Azofeifa, A. Radiometric Calibration Assessments for UAS-Borne Multispectral Cameras: Laboratory and Field Protocols. ISPRS J. Photogramm. Remote Sens. 2019, 149, 132–145. [Google Scholar] [CrossRef]
  43. Nigon, T.; Paiao, G.D.; Mulla, D.J.; Fernández, F.G.; Yang, C. The Influence of Aerial Hyperspectral Image Processing Workflow on Nitrogen Uptake Prediction Accuracy in Maize. Remote Sens. 2021, 14, 132. [Google Scholar] [CrossRef]
  44. Honkavaara, E.; Hakala, T.; Markelin, L.; Rosnell, T.; Saari, H.; Mäkynen, J. A Process for Radiometric Correction of UAV Image Blocks. Photogramm. Fernerkund. Geoinf. 2012, 2012, 115–127. [Google Scholar] [CrossRef] [PubMed]
  45. Khadka, N.; Teixeira Pinto, C.; Leigh, L.; Petropoulos, G.P.; Pavlides, A.; Nocerino, E. Detection of Change Points in Pseudo-Invariant Calibration Sites Time Series Using Multi-Sensor Satellite Imagery. Remote Sens. 2021, 13, 2079. [Google Scholar] [CrossRef]
  46. Mei, A.; Bassani, C.; Fontinovo, G.; Salvatori, R.; Allegrini, A. The Use of Suitable Pseudo-Invariant Targets for MIVIS Data Calibration by the Empirical Line Method. ISPRS J. Photogramm. Remote Sens. 2016, 114, 102–114. [Google Scholar] [CrossRef]
  47. Ryadi, G.Y.I.; Syariz, M.A.; Lin, C.H. Relaxation-Based Radiometric Normalization for Multitemporal Cross-Sensor Satellite Images. Sensors 2023, 23, 5150. [Google Scholar] [CrossRef]
  48. Liu, K.; Ke, T.; Tao, P.; He, J.; Xi, K.; Yang, K. Robust Radiometric Normalization of Multitemporal Satellite Images Via Block Adjustment without Master Images. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2020, 13, 6029–6043. [Google Scholar] [CrossRef]
  49. Redana, M.; Lancaster, L.T.; Chong, X.Y.; Lip, Y.Y.; Gibbins, C. An Open-Source Method for Producing Reliable Water Temperature Maps for Ecological Applications Using Non-Radiometric Sensors. Remote Sens. Appl. Soc. Environ. 2024, 34, 101184. [Google Scholar] [CrossRef]
  50. Malbéteau, Y.; Johansen, K.; Aragon, B.; Al-Mashhawari, S.K.; McCabe, M.F. Overcoming the Challenges of Thermal Infrared Orthomosaics Using a Swath-Based Approach to Correct for Dynamic Temperature and Wind Effects. Remote Sens. 2021, 13, 3255. [Google Scholar] [CrossRef]
  51. Aragon, B.; Johansen, K.; Parkes, S.; Malbeteau, Y.; Al-mashharawi, S.; Al-amoudi, T.; Andrade, C.F.; Turner, D.; Lucieer, A.; McCabe, M.F. A Calibration Procedure for Field and UAV-Based Uncooled Thermal Infrared Instruments. Sensors 2020, 20, 3316. [Google Scholar] [CrossRef] [PubMed]
  52. Virtue, J.; Turner, D.; Williams, G.; Zeliadt, S.; McCabe, M.; Lucieer, A. Thermal Sensor Calibration for Unmanned Aerial Systems Using an External Heated Shutter. Drones 2021, 5, 119. [Google Scholar] [CrossRef]
  53. Tunca, E.; Köksal, E.S.; Çetin Taner, S. Calibrating UAV Thermal Sensors Using Machine Learning Methods for Improved Accuracy in Agricultural Applications. Infrared Phys. Technol. 2023, 133, 104804. [Google Scholar] [CrossRef]
  54. de Oca, A.M.; Flores, G. A UAS Equipped with a Thermal Imaging System with Temperature Calibration for Crop Water Stress Index Computation. In Proceedings of the 2021 International Conference on Unmanned Aircraft Systems (ICUAS), Athens, Greece, 15–18 June 2021; pp. 714–720. [Google Scholar]
  55. Kelly, J.; Kljun, N.; Olsson, P.O.; Mihai, L.; Liljeblad, B.; Weslien, P.; Klemedtsson, L.; Eklundh, L. Challenges and Best Practices for Deriving Temperature Data from an Uncalibrated UAV Thermal Infrared Camera. Remote Sens. 2019, 11, 567. [Google Scholar] [CrossRef]
  56. Han, Y.; Tarakey, B.A.; Hong, S.J.; Kim, S.Y.; Kim, E.; Lee, C.H.; Kim, G. Calibration and Image Processing of Aerial Thermal Image for UAV Application in Crop Water Stress Estimation. J. Sens. 2021, 2021, 5537795. [Google Scholar] [CrossRef]
  57. Matese, A.; Di Gennaro, S.F. Practical Applications of a Multisensor UAV Platform Based on Multispectral, Thermal and RGB High Resolution Images in Precision Viticulture. Agriculture 2018, 8, 116. [Google Scholar] [CrossRef]
  58. Rocchini, D.; Di Rita, A. Relief Effects on Aerial Photos Geometric Correction. Appl. Geogr. 2005, 25, 159–168. [Google Scholar] [CrossRef]
  59. Santana, L.S.; Ferraz, G.A.E.S.; Marin, D.B.; Barbosa, B.; Dos Santos, L.M.; Ferraz, P.F.P.; Conti, L.; Camiciottoli, S.; Rossi, G. Influence of Flight Altitude and Control Points in the Georeferencing of Images Obtained by Unmanned Aerial Vehicle. Eur. J. Remote Sens. 2021, 54, 59–71. [Google Scholar] [CrossRef]
  60. Zhang, K.; Okazawa, H.; Hayashi, K.; Hayashi, T.; Fiwa, L.; Maskey, S. Optimization of Ground Control Point Distribution for Unmanned Aerial Vehicle Photogrammetry for Inaccessible Fields. Sustainability 2022, 14, 9505. [Google Scholar] [CrossRef]
  61. Dai, W.; Zheng, G.; Antoniazza, G.; Zhao, F.; Chen, K.; Lu, W.; Lane, S.N. Improving UAV-SfM Photogrammetry for Modelling High-Relief Terrain: Image Collection Strategies and Ground Control Quantity. Earth Surf. Process. Landf. 2023, 48, 2884–2899. [Google Scholar] [CrossRef]
  62. Villanueva, J.K.S.; Blanco, A.C. Optimization of Ground Control Point (GCP) Configuration for Unmanned Aerial Vehicle (UAV) Survey Using Structure from Motion (SFM). Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, 42, 167–174. [Google Scholar] [CrossRef]
  63. Gomes Pessoa, G.; Caceres Carrilho, A.; Takahashi Miyoshi, G.; Amorim, A.; Galo, M. Assessment of UAV-Based Digital Surface Model and the Effects of Quantity and Distribution of Ground Control Points. Int. J. Remote Sens. 2021, 42, 65–83. [Google Scholar] [CrossRef]
  64. Liu, X.; Lian, X.; Yang, W.; Wang, F.; Han, Y.; Zhang, Y. Accuracy Assessment of a UAV Direct Georeferencing Method and Impact of the Configuration of Ground Control Points. Drones 2022, 6, 30. [Google Scholar] [CrossRef]
  65. Martínez-Carricondo, P.; Agüera-Vega, F.; Carvajal-Ramírez, F.; Mesas-Carrascosa, F.J.; García-Ferrer, A.; Pérez-Porras, F.J. Assessment of UAV-Photogrammetric Mapping Accuracy Based on Variation of Ground Control Points. Int. J. Appl. Earth Obs. Geoinf. 2018, 72, 1–10. [Google Scholar] [CrossRef]
  66. James, M.R.; Antoniazza, G.; Robson, S.; Lane, S.N. Mitigating Systematic Error in Topographic Models for Geomorphic Change Detection: Accuracy, Precision and Considerations beyond off-Nadir Imagery. Earth Surf. Process. Landf. 2020, 45, 2251–2271. [Google Scholar] [CrossRef]
  67. Ulvi, A. The Effect of the Distribution and Numbers of Ground Control Points on the Precision of Producing Orthophoto Maps with an Unmanned Aerial Vehicle. J. Asian Archit. Build. Eng. 2021, 20, 806–817. [Google Scholar] [CrossRef]
  68. Stott, E.; Williams, R.D.; Hoey, T.B. Ground Control Point Distribution for Accurate Kilometre-Scale Topographic Mapping Using an RTK-GNSS Unmanned Aerial Vehicle and SfM Photogrammetry. Drones 2020, 4, 55. [Google Scholar] [CrossRef]
  69. Tomaštík, J.; Mokroš, M.; Surový, P.; Grznárová, A.; Merganič, J. UAV RTK/PPK Method—An Optimal Solution for Mapping Inaccessible Forested Areas? Remote Sens. 2019, 11, 721. [Google Scholar] [CrossRef]
  70. Cho, J.M.; Lee, B.K. GCP and PPK Utilization Plan to Deal with RTK Signal Interruption in RTK-UAV Photogrammetry. Drones 2023, 7, 265. [Google Scholar] [CrossRef]
  71. Famiglietti, N.A.; Cecere, G.; Grasso, C.; Memmolo, A.; Vicari, A. A Test on the Potential of a Low Cost Unmanned Aerial Vehicle RTK/PPK Solution for Precision Positioning. Sensors 2021, 21, 3882. [Google Scholar] [CrossRef]
  72. Jain, A.; Mahajan, M.; Saraf, R. Standardization of the Shape of Ground Control Point (GCP) and the Methodology for Its Detection in Images for UAV-Based Mapping Applications. In Proceedings of the Advances in Computer Vision; Arai, K., Kapoor, S., Eds.; Springer International Publishing: Cham, Switzerland, 2020; pp. 459–476. [Google Scholar]
  73. Santos, W.M.d.; Costa, C.d.J.P.; Medeiros, M.L.d.S.; Jardim, A.M.d.R.F.; Cunha, M.V.d.; Dubeux Junior, J.C.B.; Jaramillo, D.M.; Bezerra, A.C.; Souza, E.J.O.d. Can Unmanned Aerial Vehicle Images Be Used to Estimate Forage Production Parameters in Agroforestry Systems in the Caatinga? Appl. Sci. 2024, 14, 4896. [Google Scholar] [CrossRef]
  74. Carneiro, F.M.; Angeli Furlani, C.E.; Zerbato, C.; Candida de Menezes, P.; da Silva Gírio, L.A.; Freire de Oliveira, M. Comparison between Vegetation Indices for Detecting Spatial and Temporal Variabilities in Soybean Crop Using Canopy Sensors. Precis. Agric. 2020, 21, 979–1007. [Google Scholar] [CrossRef]
  75. Pereira, J.A.; Vélez, S.; Martínez-Peña, R.; Castrillo, D. Beyond Vegetation: A Review Unveiling Additional Insights into Agriculture and Forestry through the Application of Vegetation Indices. J 2023, 6, 421–436. [Google Scholar] [CrossRef]
  76. El-Hendawy, S.E.; Al-Suhaibani, N.A.; Elsayed, S.; Hassan, W.M.; Dewir, Y.H.; Refay, Y.; Abdella, K.A. Potential of the Existing and Novel Spectral Reflectance Indices for Estimating the Leaf Water Status and Grain Yield of Spring Wheat Exposed to Different Irrigation Rates. Agric. Water Manag. 2019, 217, 356–373. [Google Scholar] [CrossRef]
  77. Ali, A.; Martelli, R.; Lupia, F.; Barbanti, L. Assessing Multiple Years’ Spatial Variability of Crop Yields Using Satellite Vegetation Indices. Remote Sens. 2019, 11, 2384. [Google Scholar] [CrossRef]
  78. Amaral, L.R.; Oldoni, H.; Baptista, G.M.M.; Ferreira, G.H.S.; Freitas, R.G.; Martins, C.L.; Cunha, I.A.; Santos, A.F. Remote Sensing Imagery to Predict Soybean Yield: A Case Study of Vegetation Indices Contribution. Precis. Agric. 2024, 25, 2375–2393. [Google Scholar] [CrossRef]
  79. Wang, R.; Tuerxun, N.; Zheng, J. Improved Estimation of SPAD Values in Walnut Leaves by Combining Spectral, Texture, and Structural Information from UAV-Based Multispectral Image. Sci. Hortic. 2024, 328, 112940. [Google Scholar] [CrossRef]
  80. Liu, H.Q.; Huete, A. A Feedback Based Modification of the NDVI to Minimize Canopy Background and Atmospheric Noise. IEEE Trans. Geosci. Remote Sens. 1995, 33, 457–465. [Google Scholar] [CrossRef]
  81. Huete, A.R. A Soil-Adjusted Vegetation Index (SAVI). Remote Sens. Environ. 1988, 25, 295–309. [Google Scholar] [CrossRef]
  82. Brenner, C.; Zeeman, M.; Bernhardt, M.; Schulz, K. Estimation of Evapotranspiration of Temperate Grassland Based on High-Resolution Thermal and Visible Range Imagery from Unmanned Aerial Systems. Int. J. Remote Sens. 2018, 39, 5141–5174. [Google Scholar] [CrossRef]
  83. Gée, C.; Denimal, E. RGB Image-Derived Indicators for Spatial Assessment of the Impact of Broadleaf Weeds on Wheat Biomass. Remote Sens. 2020, 12, 2982. [Google Scholar] [CrossRef]
  84. Liang, H.; Lee, S.C.; Bae, W.; Kim, J.; Seo, S. Towards UAVs in Construction: Advancements, Challenges, and Future Directions for Monitoring and Inspection. Drones 2023, 7, 202. [Google Scholar] [CrossRef]
  85. Cottrell, B.; Kalacska, M.; Arroyo-Mora, J.P.; Lucanus, O.; Inamdar, D.; Løke, T.; Soffer, R.J. Limitations of a Multispectral UAV Sensor for Satellite Validation and Mapping Complex Vegetation. Remote Sens. 2024, 16, 2463. [Google Scholar] [CrossRef]
  86. Shamaoma, H.; Chirwa, P.W.; Ramoelo, A.; Hudak, A.T.; Syampungani, S. The Application of UASs in Forest Management and Monitoring: Challenges and Opportunities for Use in the Miombo Woodland. Forests 2022, 13, 1812. [Google Scholar] [CrossRef]
  87. Marques, P.; Pádua, L.; Sousa, J.J.; Fernandes-Silva, A. Advancements in Remote Sensing Imagery Applications for Precision Management in Olive Growing: A Systematic Review. Remote Sens. 2024, 16, 1324. [Google Scholar] [CrossRef]
  88. Stuart, M.B.; McGonigle, A.J.S.; Willmott, J.R. Hyperspectral Imaging in Environmental Monitoring: A Review of Recent Developments and Technological Advances in Compact Field Deployable Systems. Sensors 2019, 19, 3071. [Google Scholar] [CrossRef]
  89. Akbar, S.; Abdolmaleki, M.; Ghadernejad, S.; Esmaeili, K. Applying Knowledge-Based and Data-Driven Methods to Improve Ore Grade Control of Blast Hole Drill Cuttings Using Hyperspectral Imaging. Remote Sens. 2024, 16, 2823. [Google Scholar] [CrossRef]
  90. Woebbecke, D.M.; Meyer, G.E.; Von Bargen, K.; Mortensen, D.A. Color Indices for Weed Identification Under Various Soil, Residue, and Lighting Conditions. Trans. ASAE 1995, 38, 259–269. [Google Scholar] [CrossRef]
  91. Tucker, C.J. Red and Photographic Infrared Linear Combinations for Monitoring Vegetation. Remote Sens. Environ. 1979, 8, 127–150. [Google Scholar] [CrossRef]
  92. Gitelson, A.A.; Kaufman, Y.J.; Stark, R.; Rundquist, D. Novel Algorithms for Remote Estimation of Vegetation Fraction. Remote Sens. Environ. 2002, 80, 76–87. [Google Scholar] [CrossRef]
  93. Rouse, J.W.; Haas, R.H.; Schell, J.A.; Deering, D.W. Monitoring Vegetation Systems in the Great Plains with ERTS. NASA Spec. Publ. 1973, 351, 309. [Google Scholar]
  94. Gitelson, A.; Kaufman, Y.; Merzlyak, M.N. Use of a Green Channel in Remote Sensing of Global Vegetation from EOS-MODIS. Remote Sens. Environ. 1996, 58, 289–298. [Google Scholar] [CrossRef]
  95. Gitelson, A.; Merzlyak, M.N. Spectral Reflectance Changes Associated with Autumn Senescence of Aesculus hippocastanum L. and Acer platanoides L. Leaves. Spectral Features and Relation to Chlorophyll Estimation. J. Plant Physiol. 1994, 143, 286–292. [Google Scholar] [CrossRef]
  96. Jordan, C.F. Derivation of Leaf-Area Index from Quality of Light on the Forest Floor. Ecology 1969, 50, 663–666. [Google Scholar] [CrossRef]
  97. Broge, N.H.; Leblanc, E. Comparing Prediction Power and Stability of Broadband and Hyperspectral Vegetation Indices for Estimation of Green Leaf Area Index and Canopy Chlorophyll Density. Remote Sens. Environ. 2001, 76, 156–172. [Google Scholar] [CrossRef]
  98. Penuelas, J.; Filella, I.; Biel, C.; Serrano, L.; Save, R. The Reflectance at the 950–970 Nm Region as an Indicator of Plant Water Status. Int. J. Remote Sens. 1993, 14, 1887–1905. [Google Scholar] [CrossRef]
  99. Merzlyak, M.N.; Gitelson, A.A.; Chivkunova, O.B.; Rakitin, V.Y. Non-Destructive Optical Detection of Pigment Changes during Leaf Senescence and Fruit Ripening. Physiol. Plant. 1999, 106, 135–141. [Google Scholar] [CrossRef]
  100. Gamon, J.A.; Serrano, L.; Surfus, J.S. The Photochemical Reflectance Index: An Optical Indicator of Photosynthetic Radiation Use Efficiency across Species, Functional Types, and Nutrient Levels. Oecologia 1997, 112, 492–501. [Google Scholar] [CrossRef]
  101. Sellami, M.H.; Albrizio, R.; Čolović, M.; Hamze, M.; Cantore, V.; Todorovic, M.; Piscitelli, L.; Stellacci, A.M. Selection of Hyperspectral Vegetation Indices for Monitoring Yield and Physiological Response in Sweet Maize under Different Water and Nitrogen Availability. Agronomy 2022, 12, 489. [Google Scholar] [CrossRef]
  102. Lu, D.; Batistella, M. Exploring TM Image Texture and Its Relationships with Biomass Estimation in Rondônia, Brazilian Amazon. Acta Amaz. 2005, 35, 249–257. [Google Scholar] [CrossRef]
  103. Sun, G.; Zhang, Y.; Chen, H.; Wang, L.; Li, M.; Sun, X.; Fei, S.; Xiao, S.; Yan, L.; Li, Y.; et al. Improving Soybean Yield Prediction by Integrating UAV Nadir and Cross-Circling Oblique Imaging. Eur. J. Agron. 2024, 155, 127134. [Google Scholar] [CrossRef]
  104. Liu, T.; Zhu, S.; Yang, T.; Zhang, W.; Xu, Y.; Zhou, K.; Wu, W.; Zhao, Y.; Yao, Z.; Yang, G.; et al. Maize Height Estimation Using Combined Unmanned Aerial Vehicle Oblique Photography and LIDAR Canopy Dynamic Characteristics. Comput. Electron. Agric. 2024, 218, 108685. [Google Scholar] [CrossRef]
  105. Grüner, E.; Astor, T.; Wachendorf, M. Prediction of Biomass and N Fixation of Legume–Grass Mixtures Using Sensor Fusion. Front. Plant Sci. 2021, 11, 603921. [Google Scholar] [CrossRef]
  106. Zhu, X.; Bi, Y.; Du, J.; Gao, X.; Zhang, T.; Pi, W.; Zhang, Y.; Wang, Y.; Zhang, H. Research on Deep Learning Method Recognition and a Classification Model of Grassland Grass Species Based on Unmanned Aerial Vehicle Hyperspectral Remote Sensing. Grassl. Sci. 2023, 69, 3–11. [Google Scholar] [CrossRef]
  107. Benco, M.; Hudec, R.; Kamencay, P.; Zachariasova, M.; Matuskal, S. An Advanced Approach to Extraction of Colour Texture Features Based on GLCM. Int. J. Adv. Robot. Syst. 2014, 11, 104. [Google Scholar] [CrossRef]
  108. Wu, Y.; Ma, J.; Zhang, W.; Sun, L.; Liu, Y.; Liu, B.; Wang, B.; Chen, Z. Rapid Evaluation of Drought Tolerance of Winter Wheat Cultivars under Water-Deficit Conditions Using Multi-Criteria Comprehensive Evaluation Based on UAV Multispectral and Thermal Images and Automatic Noise Removal. Comput. Electron. Agric. 2024, 218, 108679. [Google Scholar] [CrossRef]
  109. Liu, Y.; Fan, Y.; Feng, H.; Chen, R.; Bian, M.; Ma, Y.; Yue, J.; Yang, G. Estimating Potato Above-Ground Biomass Based on Vegetation Indices and Texture Features Constructed from Sensitive Bands of UAV Hyperspectral Imagery. Comput. Electron. Agric. 2024, 220, 108918. [Google Scholar] [CrossRef]
  110. Vyas, R.; Kanumuri, T.; Sheoran, G.; Dubey, P. Co-Occurrence Features and Neural Network Classification Approach for Iris Recognition. In Proceedings of the 2017 Fourth International Conference on Image Information Processing (ICIIP), Shimla, India, 21–23 December 2017; Institute of Electrical and Electronics Engineers Inc.: Shimla, India, 2017; pp. 1–6. [Google Scholar]
  111. Wang, F.; Yi, Q.; Hu, J.; Xie, L.; Yao, X.; Xu, T.; Zheng, J. Combining Spectral and Textural Information in UAV Hyperspectral Images to Estimate Rice Grain Yield. Int. J. Appl. Earth Obs. Geoinf. 2021, 102, 102397. [Google Scholar] [CrossRef]
  112. Ganesan, P.; Sajiv, G. User Oriented Color Space for Satellite Image Segmentation Using Fuzzy Based Techniques. In Proceedings of the 2017 International Conference on Innovations in Information, Embedded and Communication Systems (ICIIECS), Coimbatore, India, 17–18 March 2017; Institute of Electrical and Electronics Engineers Inc.: Coimbatore, India, 2017; pp. 1–6. [Google Scholar]
  113. Ganesan, P.; Sathish, B.S.; Vasanth, K.; Sivakumar, V.G.; Vadivel, M.; Ravi, C.N. A Comprehensive Review of the Impact of Color Space on Image Segmentation. In Proceedings of the 2019 5th International Conference on Advanced Computing & Communication Systems (ICACCS), Coimbatore, India, 15–16 March 2019; Institute of Electrical and Electronics Engineers Inc.: Coimbatore, India, 2019; pp. 962–967. [Google Scholar]
  114. Gracia-Romero, A.; Kefauver, S.C.; Vergara-Díaz, O.; Zaman-Allah, M.A.; Prasanna, B.M.; Cairns, J.E.; Araus, J.L. Comparative Performance of Ground vs. Aerially Assessed Rgb and Multispectral Indices for Early-Growth Evaluation of Maize Performance under Phosphorus Fertilization. Front. Plant Sci. 2017, 8, 309121. [Google Scholar] [CrossRef]
  115. Weerasuriya, C.; Ng, S.H.; Woods, W.; Johnstone, T.; Vitta, P.; Hugrass, L.; Juodkazis, S. Feasibility of Magneto-Encephalography Scan under Color-Tailored Illumination. Appl. Sci. 2023, 13, 2988. [Google Scholar] [CrossRef]
  116. Niu, Y.; Han, W.; Zhang, H.; Zhang, L.; Chen, H. Estimating Maize Plant Height Using a Crop Surface Model Constructed from UAV RGB Images. Biosyst. Eng. 2024, 241, 56–67. [Google Scholar] [CrossRef]
  117. Liu, Y.; You, H.; Tang, X.; You, Q.; Huang, Y.; Chen, J. Study on Individual Tree Segmentation of Different Tree Species Using Different Segmentation Algorithms Based on 3D UAV Data. Forests 2023, 14, 1327. [Google Scholar] [CrossRef]
  118. Fei, S.; Xiao, S.; Li, Q.; Shu, M.; Zhai, W.; Xiao, Y.; Chen, Z.; Yu, H.; Ma, Y. Enhancing Leaf Area Index and Biomass Estimation in Maize with Feature Augmentation from Unmanned Aerial Vehicle-Based Nadir and Cross-Circling Oblique Photography. Comput. Electron. Agric. 2023, 215, 108462. [Google Scholar] [CrossRef]
  119. Zhao, X.; Su, Y.; Hu, T.; Cao, M.; Liu, X.; Yang, Q.; Guan, H.; Liu, L.; Guo, Q. Analysis of UAV Lidar Information Loss and Its Influence on the Estimation Accuracy of Structural and Functional Traits in a Meadow Steppe. Ecol. Indic. 2022, 135, 108515. [Google Scholar] [CrossRef]
  120. Taugourdeau, S.; Diedhiou, A.; Fassinou, C.; Bossoukpe, M.; Diatta, O.; N’Goran, A.; Auderbert, A.; Ndiaye, O.; Diouf, A.A.; Tagesson, T.; et al. Estimating Herbaceous Aboveground Biomass in Sahelian Rangelands Using Structure from Motion Data Collected on the Ground and by UAV. Ecol. Evol. 2022, 12, e8867. [Google Scholar] [CrossRef]
  121. Hasheminasab, S.M.; Zhou, T.; Habib, A. GNSS/INS-Assisted Structure from Motion Strategies for UAV-Based Imagery over Mechanized Agricultural Fields. Remote Sens. 2020, 12, 351. [Google Scholar] [CrossRef]
  122. Xiao, S.; Ye, Y.; Fei, S.; Chen, H.; Zhang, B.; Li, Q.; Cai, Z.; Che, Y.; Wang, Q.; Ghafoor, A.Z.; et al. High-Throughput Calculation of Organ-Scale Traits with Reconstructed Accurate 3D Canopy Structures Using a UAV RGB Camera with an Advanced Cross-Circling Oblique Route. ISPRS J. Photogramm. Remote Sens. 2023, 201, 104–122. [Google Scholar] [CrossRef]
  123. Wazid, M.; Das, A.K.; Chamola, V.; Park, Y. Uniting Cyber Security and Machine Learning: Advantages, Challenges and Future Research. ICT Express 2022, 8, 313–321. [Google Scholar] [CrossRef]
  124. Bulagang, A.F.; Weng, N.G.; Mountstephens, J.; Teo, J. A Review of Recent Approaches for Emotion Classification Using Electrocardiography and Electrodermography Signals. Inform. Med. Unlocked 2020, 20, 100363. [Google Scholar] [CrossRef]
  125. Guido, R.; Ferrisi, S.; Lofaro, D.; Conforti, D. An Overview on the Advancements of Support Vector Machine Models in Healthcare Applications: A Review. Information 2024, 15, 235. [Google Scholar] [CrossRef]
  126. Khan, F.; Albalawi, O. Analysis of Fat Big Data Using Factor Models and Penalization Techniques: A Monte Carlo Simulation and Application. Axioms 2024, 13, 418. [Google Scholar] [CrossRef]
  127. Tufail, S.; Riggs, H.; Tariq, M.; Sarwat, A.I. Advancements and Challenges in Machine Learning: A Comprehensive Review of Models, Libraries, Applications, and Algorithms. Electronics 2023, 12, 1789. [Google Scholar] [CrossRef]
  128. Drogkoula, M.; Kokkinos, K.; Samaras, N. A Comprehensive Survey of Machine Learning Methodologies with Emphasis in Water Resources Management. Appl. Sci. 2023, 13, 12147. [Google Scholar] [CrossRef]
  129. Zhao, Y.; Sun, Y.; Lu, X.; Zhao, X.; Yang, L.; Sun, Z.; Bai, Y. Hyperspectral Retrieval of Leaf Physiological Traits and Their Links to Ecosystem Productivity in Grassland Monocultures. Ecol. Indic. 2021, 122, 107267. [Google Scholar] [CrossRef]
  130. Giraldo, R.A.D.; De León, M.Á.; Castillo, Á.R.; López, O.P.; Rocha, E.C.; Asprilla, W.P. Estimation of Forage Availability and Parameters Associated to the Nutritional Quality of Urochloa humidicola Cv Llanero Based on Multispectral Images. Trop. Grasslands-Forrajes Trop. 2023, 11, 61–74. [Google Scholar] [CrossRef]
  131. De Rosa, D.; Basso, B.; Fasiolo, M.; Friedl, J.; Fulkerson, B.; Grace, P.R.; Rowlings, D.W. Predicting Pasture Biomass Using a Statistical Model and Machine Learning Algorithm Implemented with Remotely Sensed Imagery. Comput. Electron. Agric. 2021, 180, 105880. [Google Scholar] [CrossRef]
  132. Freitas, R.G.; Pereira, F.R.S.; Dos Reis, A.A.; Magalhães, P.S.G.; Figueiredo, G.K.D.A.; do Amaral, L.R. Estimating Pasture Aboveground Biomass under an Integrated Crop-Livestock System Based on Spectral and Texture Measures Derived from UAV Images. Comput. Electron. Agric. 2022, 198, 107122. [Google Scholar] [CrossRef]
  133. Singh, A.K.; Kumar, P.; Ali, R.; Al-Ansari, N.; Vishwakarma, D.K.; Kushwaha, K.S.; Panda, K.C.; Sagar, A.; Mirzania, E.; Elbeltagi, A.; et al. An Integrated Statistical-Machine Learning Approach for Runoff Prediction. Sustainability 2022, 14, 8209. [Google Scholar] [CrossRef]
  134. P Fernandes, A.C.; R Fonseca, A.; Pacheco, F.A.L.; Sanches Fernandes, L.F. Water Quality Predictions through Linear Regression—A Brute Force Algorithm Approach. MethodsX 2023, 10, 102153. [Google Scholar] [CrossRef]
  135. Eilbeigi, S.; Tavakkolizadeh, M.; Masoodi, A.R. Nonlinear Regression Prediction of Mechanical Properties for SMA-Confined Concrete Cylindrical Specimens. Buildings 2022, 13, 112. [Google Scholar] [CrossRef]
  136. Ranstam, J.; Cook, J.A. LASSO Regression. Br. J. Surg. 2018, 105, 1348. [Google Scholar] [CrossRef]
  137. Rocks, J.W.; Mehta, P. Bias-Variance Decomposition of Overparameterized Regression with Random Linear Features. Phys. Rev. E 2022, 106, 025304. [Google Scholar] [CrossRef] [PubMed]
  138. Speiser, J.L.; Miller, M.E.; Tooze, J.; Ip, E. A Comparison of Random Forest Variable Selection Methods for Classification Prediction Modeling. Expert Syst. Appl. 2019, 134, 93–101. [Google Scholar] [CrossRef] [PubMed]
  139. Cheng, L.; Chen, X.; De Vos, J.; Lai, X.; Witlox, F. Applying a Random Forest Method Approach to Model Travel Mode Choice Behavior. Travel Behav. Soc. 2019, 14, 1–10. [Google Scholar] [CrossRef]
  140. Sutradhar, A.; Akter, S.; Shamrat, F.M.J.M.; Ghosh, P.; Zhou, X.; Bin Idris, M.Y.I.; Ahmed, K.; Moni, M.A. Advancing Thyroid Care: An Accurate Trustworthy Diagnostics System with Interpretable AI and Hybrid Machine Learning Techniques. Heliyon 2024, 10, e36556. [Google Scholar] [CrossRef]
  141. Pereira, F.R.d.S.; de Lima, J.P.; Freitas, R.G.; Dos Reis, A.A.; Amaral, L.R.d.; Figueiredo, G.K.D.A.; Lamparelli, R.A.C.; Magalhães, P.S.G. Nitrogen Variability Assessment of Pasture Fields under an Integrated Crop-Livestock System Using UAV, PlanetScope, and Sentinel-2 Data. Comput. Electron. Agric. 2022, 193, 106645. [Google Scholar] [CrossRef]
  142. Akhiat, Y.; Manzali, Y.; Chahhou, M.; Zinedine, A. A New Noisy Random Forest Based Method for Feature Selection. Cybern. Inf. Technol. 2021, 21, 10–28. [Google Scholar] [CrossRef]
  143. Mentch, L.; Zhou, S. Randomization as Regularization: A Degrees of Freedom Explanation for Random Forest Success. J. Mach. Learn. Res. 2020, 21, 1–36. [Google Scholar]
  144. Probst, P.; Wright, M.N.; Boulesteix, A.L. Hyperparameters and Tuning Strategies for Random Forest. Wiley Interdiscip. Rev. Data Min. Knowl. Discov. 2019, 9, e1301. [Google Scholar] [CrossRef]
  145. Kim, H.; Ko, K. Improving Forecast Accuracy of Financial Vulnerability: PLS Factor Model Approach. Econ. Model. 2020, 88, 341–355. [Google Scholar] [CrossRef]
  146. Bratković, K.; Luković, K.; Perišić, V.; Savić, J.; Maksimović, J.; Adžić, S.; Rakonjac, A.; Matković Stojšin, M. Interpreting the Interaction of Genotype with Environmental Factors in Barley Using Partial Least Squares Regression Model. Agronomy 2024, 14, 194. [Google Scholar] [CrossRef]
  147. Al Marouni, Y.; Bentaleb, Y. State of Art of PLS Regression for Non Quantitative Data and in Big Data Context. In Proceedings of the 4th International Conference on Networking, Information Systems & Security, Kenitra, Morocco, 1–2 April 2021; Association for Computing Machinery: New York, NY, USA, 2021. [Google Scholar]
  148. Hou, Y.Y.; Li, J.; Chen, X.B.; Ye, C.Q. A Partial Least Squares Regression Model Based on Variational Quantum Algorithm. Laser Phys. Lett. 2022, 19, 095204. [Google Scholar] [CrossRef]
  149. Metz, M.; Abdelghafour, F.; Roger, J.M.; Lesnoff, M. A Novel Robust PLS Regression Method Inspired from Boosting Principles: RoBoost-PLSR. Anal. Chim. Acta 2021, 1179, 338823. [Google Scholar] [CrossRef] [PubMed]
  150. Alnaqbi, A.J.; Zeiada, W.; Al-Khateeb, G.; Abttan, A.; Abuzwidah, M. Predictive Models for Flexible Pavement Fatigue Cracking Based on Machine Learning. Transp. Eng. 2024, 16, 100243. [Google Scholar] [CrossRef]
  151. Feng, L.; Zhang, Z.; Ma, Y.; Du, Q.; Williams, P.; Drewry, J.; Luck, B. Alfalfa Yield Prediction Using UAV-Based Hyperspectral Imagery and Ensemble Learning. Remote Sens. 2020, 12, 2028. [Google Scholar] [CrossRef]
  152. Costa, L.S.; Sano, E.E.; Ferreira, M.E.; Munhoz, C.B.R.; Costa, J.V.S.; Rufino Alves Júnior, L.; de Mello, T.R.B.; da Cunha Bustamante, M.M. Woody Plant Encroachment in a Seasonal Tropical Savanna: Lessons about Classifiers and Accuracy from UAV Images. Remote Sens. 2023, 15, 2342. [Google Scholar] [CrossRef]
  153. Adugna, T.; Xu, W.; Fan, J. Comparison of Random Forest and Support Vector Machine Classifiers for Regional Land Cover Mapping Using Coarse Resolution FY-3C Images. Remote Sens. 2022, 14, 574. [Google Scholar] [CrossRef]
  154. Lin, X.; Chen, J.; Lou, P.; Yi, S.; Qin, Y.; You, H.; Han, X. Improving the Estimation of Alpine Grassland Fractional Vegetation Cover Using Optimized Algorithms and Multi-Dimensional Features. Plant Methods 2021, 17, 96. [Google Scholar] [CrossRef]
  155. Araya, S.N.; Fryjoff-Hung, A.; Anderson, A.; Viers, J.H.; Ghezzehei, T.A. Advances in Soil Moisture Retrieval from Multispectral Remote Sensing Using Unoccupied Aircraft Systems and Machine Learning Techniques. Hydrol. Earth Syst. Sci. 2021, 25, 2739–2758. [Google Scholar] [CrossRef]
  156. Vilar, P.; Morais, T.G.; Rodrigues, N.R.; Gama, I.; Monteiro, M.L.; Domingos, T.; Teixeira, R.F.M. Object-Based Classification Approaches for Multitemporal Identification and Monitoring of Pastures in Agroforestry Regions Using Multispectral Unmanned Aerial Vehicle Products. Remote Sens. 2020, 12, 814. [Google Scholar] [CrossRef]
  157. Taye, M.M. Understanding of Machine Learning with Deep Learning: Architectures, Workflow, Applications and Future Directions. Computers 2023, 12, 91. [Google Scholar] [CrossRef]
  158. Fan, F.L.; Xiong, J.; Li, M.; Wang, G. On Interpretability of Artificial Neural Networks: A Survey. IEEE Trans. Radiat. Plasma Med. Sci. 2021, 5, 741–760. [Google Scholar] [CrossRef] [PubMed]
  159. Taye, M.M. Theoretical Understanding of Convolutional Neural Network: Concepts, Architectures, Applications, Future Directions. Computation 2023, 11, 52. [Google Scholar] [CrossRef]
  160. Wang, Y.-H.; Su, W.-H.; Wang, Y.-H.; Su, W.-H. Convolutional Neural Networks in Computer Vision for Grain Crop Phenotyping: A Review. Agronomy 2022, 12, 2659. [Google Scholar] [CrossRef]
  161. Krichen, M. Convolutional Neural Networks: A Survey. Computers 2023, 12, 151. [Google Scholar] [CrossRef]
  162. Kuzudisli, C.; Bakir-Gungor, B.; Bulut, N.; Qaqish, B.; Yousef, M. Review of Feature Selection Approaches Based on Grouping of Features. PeerJ 2023, 11, e15666. [Google Scholar] [CrossRef]
  163. Khaire, U.M.; Dhanalakshmi, R. Stability of Feature Selection Algorithm: A Review. J. King Saud Univ.—Comput. Inf. Sci. 2022, 34, 1060–1073. [Google Scholar] [CrossRef]
  164. Bertolini, R.; Finch, S.J.; Nehm, R.H. Enhancing Data Pipelines for Forecasting Student Performance: Integrating Feature Selection with Cross-Validation. Int. J. Educ. Technol. High. Educ. 2021, 18, 44. [Google Scholar] [CrossRef]
  165. Lu, B.; He, Y.; Liu, H. Investigating Species Composition in a Temperate Grassland Using Unmanned Aerial Vehicle-Acquired Imagery. In Proceedings of the 2016 4th International Workshop on Earth Observation and Remote Sensing Applications (EORSA), Guangzhou, China, 4–6 July 2016; Institute of Electrical and Electronics Engineers Inc.: Piscataway Township, NJ, USA, 2016; pp. 107–111. [Google Scholar]
  166. Lin, X.; Chen, J.; Wu, T.; Yi, S.; Chen, J.; Han, X. Time-Series Simulation of Alpine Grassland Cover Using Transferable Stacking Deep Learning and Multisource Remote Sensing Data in the Google Earth Engine. Int. J. Appl. Earth Obs. Geoinf. 2024, 131, 103964. [Google Scholar] [CrossRef]
  167. Raiaan, M.A.K.; Sakib, S.; Fahad, N.M.; Al Mamun, A.; Rahman, M.A.; Shatabda, S.; Mukta, M.S.H. A Systematic Review of Hyperparameter Optimization Techniques in Convolutional Neural Networks. Decis. Anal. J. 2024, 11, 100470. [Google Scholar] [CrossRef]
  168. Elgeldawi, E.; Sayed, A.; Galal, A.R.; Zaki, A.M. Hyperparameter Tuning for Machine Learning Algorithms Used for Arabic Sentiment Analysis. Informatics 2021, 8, 79. [Google Scholar] [CrossRef]
  169. Setiadi, D.R.I.M.; Susanto, A.; Nugroho, K.; Muslikh, A.R.; Ojugo, A.A.; Gan, H.S. Rice Yield Forecasting Using Hybrid Quantum Deep Learning Model. Computers 2024, 13, 191. [Google Scholar] [CrossRef]
  170. Angelakis, D.; Ventouras, E.C.; Kostopoulos, S.; Asvestas, P. Comparative Analysis of Deep Learning Models for Optimal EEG-Based Real-Time Servo Motor Control. Eng 2024, 5, 1708–1736. [Google Scholar] [CrossRef]
  171. Kaliappan, J.; Bagepalli, A.R.; Almal, S.; Mishra, R.; Hu, Y.C.; Srinivasan, K. Impact of Cross-Validation on Machine Learning Models for Early Detection of Intrauterine Fetal Demise. Diagnostics 2023, 13, 1692. [Google Scholar] [CrossRef]
  172. Jan, M.S.; Hussain, S.; e Zahra, R.; Emad, M.Z.; Khan, N.M.; Rehman, Z.U.; Cao, K.; Alarifi, S.S.; Raza, S.; Sherin, S.; et al. Appraisal of Different Artificial Intelligence Techniques for the Prediction of Marble Strength. Sustainability 2023, 15, 8835. [Google Scholar] [CrossRef]
  173. Szeghalmy, S.; Fazekas, A. A Comparative Study of the Use of Stratified Cross-Validation and Distribution-Balanced Stratified Cross-Validation in Imbalanced Learning. Sensors 2023, 23, 2333. [Google Scholar] [CrossRef]
  174. Allgaier, J.; Pryss, R. Cross-Validation Visualized: A Narrative Guide to Advanced Methods. Mach. Learn. Knowl. Extr. 2024, 6, 1378–1388. [Google Scholar] [CrossRef]
  175. Wan, L.; Liu, Y.; He, Y.; Cen, H. Prior Knowledge and Active Learning Enable Hybrid Method for Estimating Leaf Chlorophyll Content from Multi-Scale Canopy Reflectance. Comput. Electron. Agric. 2023, 214, 108308. [Google Scholar] [CrossRef]
  176. Chang, Y.; Le Moan, S.; Bailey, D. RGB Imaging Based Estimation of Leaf Chlorophyll Content. In Proceedings of the 2019 International Conference on Image and Vision Computing New Zealand (IVCNZ), Dunedin, New Zealand, 2–4 December 2019; IEEE Computer Society: Dunedin, New Zealand, 2019; pp. 1–6. [Google Scholar]
  177. Zhang, Y.W.; Wang, T.; Guo, Y.; Skidmore, A.; Zhang, Z.; Tang, R.; Song, S.; Tang, Z. Estimating Community-Level Plant Functional Traits in a Species-Rich Alpine Meadow Using UAV Image Spectroscopy. Remote Sens. 2022, 14, 3399. [Google Scholar] [CrossRef]
  178. Cockson, P.; Landis, H.; Smith, T.; Hicks, K.; Whipker, B.E. Characterization of Nutrient Disorders of Cannabis sativa. Appl. Sci. 2019, 9, 4432. [Google Scholar] [CrossRef]
  179. Noulas, C.; Torabian, S.; Qin, R. Crop Nutrient Requirements and Advanced Fertilizer Management Strategies. Agronomy 2023, 13, 2017. [Google Scholar] [CrossRef]
  180. Casamitjana, M.; Torres-Madroñero, M.C.; Bernal-Riobo, J.; Varga, D. Soil Moisture Analysis by Means of Multispectral Images According to Land Use and Spatial Resolution on Andosols in the Colombian Andes. Appl. Sci. 2020, 10, 5540. [Google Scholar] [CrossRef]
  181. Lu, F.; Sun, Y.; Hou, F. Using UAV Visible Images to Estimate the Soil Moisture of Steppe. Water 2020, 12, 2334. [Google Scholar] [CrossRef]
  182. Sang, Y.; Yu, S.; Lu, F.; Sun, Y.; Wang, S.; Ade, L.; Hou, F. UAV Monitoring Topsoil Moisture in an Alpine Meadow on the Qinghai–Tibet Plateau. Agronomy 2023, 13, 2193. [Google Scholar] [CrossRef]
  183. Brenner, C.; Thiem, C.E.; Wizemann, H.D.; Bernhardt, M.; Schulz, K. Estimating Spatially Distributed Turbulent Heat Fluxes from High-Resolution Thermal Imagery Acquired with a UAV System. Int. J. Remote Sens. 2017, 38, 3003–3026. [Google Scholar] [CrossRef]
  184. Zhang, W.; Yi, S.; Qin, Y.; Sun, Y.; Shangguan, D.; Meng, B.; Li, M.; Zhang, J. Effects of Patchiness on Surface Soil Moisture of Alpine Meadow on the Northeastern Qinghai-Tibetan Plateau: Implications for Grassland Restoration. Remote Sens. 2020, 12, 4121. [Google Scholar] [CrossRef]
  185. Morgan, B.E.; Caylor, K.K. Estimating Fine-Scale Transpiration From UAV-Derived Thermal Imagery and Atmospheric Profiles. Water Resour. Res. 2023, 59, e2023WR035251. [Google Scholar] [CrossRef]
  186. Nobre, I.d.S.; Araújo, G.G.L.d.; Santos, E.M.; Carvalho, G.G.P.d.; de Albuquerque, I.R.R.; Oliveira, J.S.d.; Ribeiro, O.L.; Turco, S.H.N.; Gois, G.C.; Silva, T.G.F.d.; et al. Cactus Pear Silage to Mitigate the Effects of an Intermittent Water Supply for Feedlot Lambs: Intake, Digestibility, Water Balance and Growth Performance. Ruminants 2023, 3, 121–132. [Google Scholar] [CrossRef]
  187. Wijesingha, J.; Astor, T.; Schulze-Brüninghoff, D.; Wengert, M.; Wachendorf, M. Predicting Forage Quality of Grasslands Using UAV-Borne Imaging Spectroscopy. Remote Sens. 2020, 12, 126. [Google Scholar] [CrossRef]
  188. Xia, G.S.; Datcu, M.; Yang, W.; Bai, X. Information Processing for Unmanned Aerial Vehicles (UAVs) in Surveying, Mapping, and Navigation. Geo-Spat. Inf. Sci. 2018, 21, 1. [Google Scholar] [CrossRef]
  189. Wu, B.; Zhang, M.; Zeng, H.; Tian, F.; Potgieter, A.B.; Qin, X.; Yan, N.; Chang, S.; Zhao, Y.; Dong, Q.; et al. Challenges and Opportunities in Remote Sensing-Based Crop Monitoring: A Review. Natl. Sci. Rev. 2023, 10, nwac290. [Google Scholar] [CrossRef] [PubMed]
  190. Karmakar, P.; Teng, S.W.; Murshed, M.; Pang, S.; Li, Y.; Lin, H. Crop Monitoring by Multimodal Remote Sensing: A Review. Remote Sens. Appl. Soc. Environ. 2024, 33, 101093. [Google Scholar] [CrossRef]
  191. Lambertini, A.; Mandanici, E.; Tini, M.A.; Vittuari, L. Technical Challenges for Multi-Temporal and Multi-Sensor Image Processing Surveyed by UAV for Mapping and Monitoring in Precision Agriculture. Remote Sens. 2022, 14, 4954. [Google Scholar] [CrossRef]
  192. Sproles, E.A.; Mullen, A.; Hendrikx, J.; Gatebe, C.; Taylor, S. Autonomous Aerial Vehicles (AAVs) as a Tool for Improving the Spatial Resolution of Snow Albedo Measurements in Mountainous Regions. Hydrology 2020, 7, 41. [Google Scholar] [CrossRef]
  193. Puppala, H.; Peddinti, P.R.T.; Tamvada, J.P.; Ahuja, J.; Kim, B. Barriers to the Adoption of New Technologies in Rural Areas: The Case of Unmanned Aerial Vehicles for Precision Agriculture in India. Technol. Soc. 2023, 74, 102335. [Google Scholar] [CrossRef]
  194. Askerbekov, D.; Garza-Reyes, J.A.; Roy Ghatak, R.; Joshi, R.; Kandasamy, J.; Luiz de Mattos Nascimento, D. Embracing Drones and the Internet of Drones Systems in Manufacturing—An Exploration of Obstacles. Technol. Soc. 2024, 78, 102648. [Google Scholar] [CrossRef]
  195. Rakholia, R.; Tailor, J.; Prajapati, M.; Shah, M.; Saini, J.R. Emerging Technology Adoption for Sustainable Agriculture in India—A Pilot Study. J. Agric. Food Res. 2024, 17, 101238. [Google Scholar] [CrossRef]
  196. Bai, A.; Kovách, I.; Czibere, I.; Megyesi, B.; Balogh, P. Examining the Adoption of Drones and Categorisation of Precision Elements among Hungarian Precision Farmers Using a Trans-Theoretical Model. Drones 2022, 6, 200. [Google Scholar] [CrossRef]
  197. Parmaksiz, O.; Cinar, G. Technology Acceptance among Farmers: Examples of Agricultural Unmanned Aerial Vehicles. Agronomy 2023, 13, 2077. [Google Scholar] [CrossRef]
  198. Tsiamis, N.; Efthymiou, L.; Tsagarakis, K.P. A Comparative Analysis of the Legislation Evolution for Drone Use in OECD Countries. Drones 2019, 3, 75. [Google Scholar] [CrossRef]
  199. Merz, M.; Pedro, D.; Skliros, V.; Bergenhem, C.; Himanka, M.; Houge, T.; Matos-Carvalho, J.P.; Lundkvist, H.; Cürüklü, B.; Hamrén, R.; et al. Autonomous UAS-Based Agriculture Applications: General Overview and Relevant European Case Studies. Drones 2022, 6, 128. [Google Scholar] [CrossRef]
  200. Ayamga, M.; Tekinerdogan, B.; Kassahun, A. Exploring the Challenges Posed by Regulations for the Use of Drones in Agriculture in the African Context. Land 2021, 10, 164. [Google Scholar] [CrossRef]
Figure 1. (a) Annual numbers of publications (2014–2024); (b) number of citations per year.
Figure 1. (a) Annual numbers of publications (2014–2024); (b) number of citations per year.
Drones 08 00585 g001
Figure 2. Survey of forage species covered in the 100 most cited articles.
Figure 2. Survey of forage species covered in the 100 most cited articles.
Drones 08 00585 g002
Figure 3. Example of the gray level co-occurrence matrix (GLCM) construction process, with colored lines representing the directions in which the filters are applied [110].
Figure 3. Example of the gray level co-occurrence matrix (GLCM) construction process, with colored lines representing the directions in which the filters are applied [110].
Drones 08 00585 g003
Figure 4. (a) Side view of a CCO route. (b) Top view of a CCO route composed of four single circles. (c) Actual flight path of the CCO route [118].
Figure 4. (a) Side view of a CCO route. (b) Top view of a CCO route composed of four single circles. (c) Actual flight path of the CCO route [118].
Drones 08 00585 g004
Figure 5. Survey of algorithms covered in the 100 most cited articles. LR: linear regression, RF: random forest, PLSR: partial least squares regression, SVM: support vector machine, ANN: artificial neural network, RLM: multiple linear regression, CNN: convolutional neural network, NLR: nonlinear regression, MVREG: multivariate linear regression, GP: Gaussian process, MLE: maximum likelihood estimation, PCA: principal component analysis, REML: residual maximum likelihood, XGBoost: extreme gradient boosting, BRTs: boosted regression trees, CART: classification and regression tree, CB: Cubist, CL: clustering, DT: decision tree, EML: ensemble, FCM: fuzzy C-means, KNN: K-nearest neighbors, GAM: generalized additive model, L1: lasso regression, LMM: mixed-effect linear models, MaxEnt: maximum entropy, RK: regression kriging, RMA: reduced major axis regression, RVR: relevance vector regression, SLR: stepwise linear regression, SMR: stepwise multiple regression, VHGPR: variational heteroscedastic Gaussian process regression.
Figure 6. Distribution of articles addressing different applications of UAVs in pastures and forage crops.
Figure 7. Classic RGB and NIR responses of healthy and diseased plants, and the chlorophyll molecule.
Table 1. Descriptive analysis of the database.

Description | Results
Time period | 2014–2024 *
Articles | 238
Journals | 93
Authors | 1086
Author appearances | 1529
Authors of single-authored documents | 0
Number of citations | 4740
References | 12,250
Author keywords | 936
Annual growth rate | 31.50%
International co-authorship | 13%
Documents per author | 0.219
Authors per document | 6.43
Citations per document | 19.92
Articles per year | 21.64
* Data for 2024 cover the period up to 12 September 2024.
Table 2. Examples of RGB, multispectral, and hyperspectral vegetation indices.

Sensor | Vegetation Index | Calculation Formula | Reference
RGB | Excess Green Vegetation Index (EXG) | $2G - R - B$ | [90]
RGB | Red Chromatic Coordinate Index (RCC) | $R/(R + G + B)$ | [90]
RGB | Green Chromatic Coordinate Index (GCC) | $G/(R + G + B)$ | [90]
RGB | Blue Chromatic Coordinate Index (BCC) | $B/(R + G + B)$ | [90]
RGB | Normalized Green Red Difference Index (NGRDI) | $(G - R)/(G + R)$ | [91]
RGB | Visible Atmospherically Resistant Index (VARI) | $(G - R)/(G + R - B)$ | [92]
Multispectral | Normalized Difference Vegetation Index (NDVI) | $(NIR - R)/(NIR + R)$ | [93]
Multispectral | Green Normalized Difference Vegetation Index (GNDVI) | $(NIR - G)/(NIR + G)$ | [94]
Multispectral | Normalized Difference Red Edge Index (NDRE) | $(NIR - RE)/(NIR + RE)$ | [95]
Multispectral | Ratio Vegetation Index (RVI) | $NIR/R$ | [96]
Multispectral | Triangular Vegetation Index (TVI) | $0.5\,[120(NIR - G) - 200(R - G)]$ | [97]
Multispectral | Soil-Adjusted Vegetation Index (SAVI) | $\frac{NIR - R}{NIR + R + L}(1 + L)$ | [81]
Hyperspectral * | Normalized Difference Vegetation Index (NDVI) | $(R_{860} - R_{650})/(R_{860} + R_{650})$ | [93]
Hyperspectral * | Water Band Index (WBI) | $R_{970}/R_{900}$ | [98]
Hyperspectral * | Normalized Difference Red Edge Index (NDRE) | $(R_{790} - R_{720})/(R_{790} + R_{720})$ | [95]
Hyperspectral * | Soil-Adjusted Vegetation Index (SAVI) | $\frac{R_{860} - R_{650}}{R_{860} + R_{650} + L}(1 + L)$ | [81]
Hyperspectral * | Plant Senescence Reflectance Index (PSRI) | $(R_{680} - R_{500})/R_{750}$ | [99]
Hyperspectral * | Structure Insensitive Pigment Index (SIPI) | $(R_{800} - R_{445})/(R_{800} - R_{680})$ | [100]
$R_i$ represents the reflectance value at the specified wavelength i in nanometers; NIR: near-infrared, L: soil brightness correction factor, RE: red edge, R: red, G: green, B: blue. * Hyperspectral indices defined based on [101].
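To make the formulas in Table 2 concrete, the following is a minimal Python sketch of how such indices can be computed per pixel from co-registered reflectance bands. The NumPy arrays, band variable names, and the small epsilon guard against division by zero are illustrative assumptions, not part of the reviewed methods:

```python
import numpy as np

EPS = 1e-9  # guard against division by zero on dark or masked pixels (assumption)

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """NDVI = (NIR - R) / (NIR + R); see Table 2 [93]."""
    return (nir - red) / (nir + red + EPS)

def savi(nir: np.ndarray, red: np.ndarray, L: float = 0.5) -> np.ndarray:
    """SAVI = (NIR - R) / (NIR + R + L) * (1 + L); L is the soil brightness factor [81]."""
    return (nir - red) / (nir + red + L) * (1.0 + L)

def vari(red: np.ndarray, green: np.ndarray, blue: np.ndarray) -> np.ndarray:
    """VARI = (G - R) / (G + R - B), an RGB-only index [92]."""
    return (green - red) / (green + red - blue + EPS)
```

In practice, the resulting index maps are typically aggregated per plot (e.g., mean or median) before being used as predictors in the regression algorithms surveyed in Figure 5.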
Table 3. Texture metrics.

Metric | Formula
Normalized GLCM | $P_{i,j} = V_{i,j} \big/ \sum_{i,j=0}^{N-1} V_{i,j}$
Mean (ME) | $\sum_{i,j=0}^{N-1} i\,P_{i,j}$
Variance (VA) | $\sum_{i,j=0}^{N-1} P_{i,j}\,(i - ME)^2$
Homogeneity (HO) | $\sum_{i,j=0}^{N-1} P_{i,j} \big/ \left[1 + (i - j)^2\right]$
Contrast (CO) | $\sum_{i,j=0}^{N-1} P_{i,j}\,(i - j)^2$
Dissimilarity (DI) | $\sum_{i,j=0}^{N-1} P_{i,j}\,|i - j|$
Entropy (EN) | $\sum_{i,j=0}^{N-1} P_{i,j}\,(-\ln P_{i,j})$
Second Moment (SM) | $\sum_{i,j=0}^{N-1} P_{i,j}^2$
Correlation (CC) | $\sum_{i,j=0}^{N-1} P_{i,j} \left[\frac{(i - ME)(j - ME)}{\sqrt{VA_i \cdot VA_j}}\right]$
V_{i,j}: value in cells (i, j) of the image window; P_{i,j}: normalized matrix entry approximating the probability that values i and j occur in adjacent pixels within the defined window; i: value of the target pixel; j: value of the neighbor of pixel i; N: number of rows or columns.
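As an illustrative sketch (not the pipeline of any reviewed study), the GLCM and several of the metrics in Table 3 can be obtained with scikit-image. The graycomatrix and graycoprops functions exist in scikit-image (>= 0.19); the synthetic input patch is an assumption, and entropy is computed directly from the normalized matrix since not all releases expose it as a property:

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops  # scikit-image >= 0.19

# Hypothetical 8-bit single-band patch (e.g., the green band of a UAV orthomosaic)
patch = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)

# GLCM at a 1-pixel distance in the four standard directions (0°, 45°, 90°, 135°),
# matching the colored lines in Figure 3; symmetric + normed yields P_ij from Table 3
glcm = graycomatrix(patch, distances=[1],
                    angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                    levels=256, symmetric=True, normed=True)

# 'ASM' corresponds to the Second Moment (SM) in Table 3
for prop in ("contrast", "dissimilarity", "homogeneity", "ASM", "correlation"):
    print(prop, graycoprops(glcm, prop).mean())  # mean over the four directions

# Entropy (EN) computed directly from the normalized matrix
P = glcm[:, :, 0, :]                                   # (levels, levels, angles)
entropy = (-(P * np.log(P + 1e-12)).sum(axis=(0, 1))).mean()
print("entropy", entropy)
```

Averaging over the four directions is a common choice when texture features are used as rotation-insensitive predictors of canopy structure.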
Table 4. Description of color space components [113,115].

Color Space | Components
CIEXYZ | Y: luminance, Z: blue stimulation, and X: linear combination of cone response curves chosen to be non-negative
CIELab | L: luminance; a and b: chrominance
CIELuv | L: luminance; u and v: chrominance
CIELch | L: luminance, C: chrominance, and h: hue angle
CMY | C: cyan, M: magenta, and Y: yellow
HSV | H: hue, S: saturation, and V: value
HSL | H: hue, S: saturation, and L: luminance
HSI | H: hue, S: saturation, and I: intensity
I1I2I3 | I1: luminance; I2 and I3: chrominance
YIQ | Y: luminance; I and Q: chrominance
YUV | Y: luminance; U and V: chrominance
YCbCr | Y: luminance; Cb and Cr: chrominance
LMS | L: long, M: medium, and S: short light wavelengths
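For illustration only, several of the conversions in Table 4 are available in scikit-image's color module (rgb2hsv, rgb2lab, rgb2ycbcr, and rgb2xyz all exist there); the random patch and the per-channel mean features below are assumptions of this sketch, not a method from the reviewed articles:

```python
import numpy as np
from skimage import color  # conversions for several Table 4 color spaces

rgb = np.random.rand(64, 64, 3)  # hypothetical float RGB patch, values in [0, 1]

hsv = color.rgb2hsv(rgb)      # H: hue, S: saturation, V: value
lab = color.rgb2lab(rgb)      # L: luminance; a, b: chrominance (CIELab)
ycbcr = color.rgb2ycbcr(rgb)  # Y: luminance; Cb, Cr: chrominance
xyz = color.rgb2xyz(rgb)      # CIEXYZ tristimulus values

# Per-channel means as simple candidate features for forage classification
features = {name: arr.mean(axis=(0, 1))
            for name, arr in [("hsv", hsv), ("lab", lab),
                              ("ycbcr", ycbcr), ("xyz", xyz)]}
print(features)
```

Decorrelating luminance from chrominance in this way is what makes such spaces attractive for segmenting vegetation under variable illumination.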