Article

Hybrid Methodological Evaluation Using UAV/Satellite Information for the Monitoring of Super-Intensive Olive Groves

by Esther Alfonso 1, Serafín López-Cuervo 2,*, Julián Aguirre 2, Enrique Pérez-Martín 3 and Iñigo Molina 2

1 GESyP Research Group, Universidad Politécnica de Madrid, 28031 Madrid, Spain
2 Department of Surveying and Cartography, Universidad Politécnica de Madrid, 28031 Madrid, Spain
3 Department of Agroforestry Engineering, Universidad Politécnica de Madrid, 28240 Madrid, Spain
* Author to whom correspondence should be addressed.
Appl. Sci. 2025, 15(20), 11171; https://doi.org/10.3390/app152011171
Submission received: 29 August 2025 / Revised: 12 October 2025 / Accepted: 13 October 2025 / Published: 18 October 2025

Abstract

Advances in Earth observation technology using multispectral imagery from satellite Earth observation systems and sensors mounted on unmanned aerial vehicles (UAVs) are enabling more accurate crop monitoring. These images, once processed, facilitate the analysis of crop health by enabling the study of crop vigour, the calculation of biomass indices, and continuous temporal monitoring using vegetation indices (VIs). These indicators allow for the identification of diseases, pests, or water stress, among other conditions. This study compares images acquired with the Altum PT sensor (UAV) and Super Dove (satellite) to evaluate their ability to detect specific problems in super-intensive olive groves at two critical times: January, during pruning, and April, at the beginning of fruit development. Four different VIs were used, and multispectral maps were generated for each: the Normalized Difference Vegetation Index (NDVI), the Green Normalized Difference Vegetation Index (GNDVI), the Normalized Difference Red Edge Index (NDRE) and the Leaf Chlorophyll Index (LCI). Data for each plant (n = 11,104) were obtained for analysis across all dates and sensors. A combined methodology (Spearman's correlation coefficient, Student's t-test and decision trees) was used to validate the behaviour of the variables and propose predictive models. The results showed significant differences between the sensors, with a common trend in spatial patterns and a correlation range between 0.45 and 0.68. Integrating both technologies enables multiscale assessment, optimizing agronomic management and supporting more sustainable precision agriculture.

1. Introduction

The monitoring of woody crops, particularly olive groves, has made significant progress due to the use of remote sensors and the analysis of their multispectral images [1]. The use of remote sensing and multispectral image analysis provides an innovative and accurate approach to assessing crop condition and development in a non-destructive way, enabling real-time decision-making and optimisation of agricultural resources [2,3,4]. In recent years, the incorporation of satellite and unmanned aerial vehicle (UAV) imagery has become popular, providing complementary solutions: wide coverage of large crop areas with high temporal frequency (satellites) and higher spatial resolution for localised detection and detailed assessment at plant level (UAVs) [5,6].
The present study proposes a hybrid methodological assessment, understood as the integration of multi-source data, satellite and UAV, through statistical, spatial and predictive analyses, with the aim of ensuring the scalability and statistical validation of the results, so that the metrics obtained at tree level can be reliably extrapolated to the plot and farm scales [7]. This transferability constitutes an innovative contribution in comparison to previous studies that focused on herbaceous crops or single-source analysis [8,9].
The main challenge in integrating data from multiple sources is the variation in observation scales and spatial and spectral resolutions [10,11]. This complementarity is particularly relevant in super-intensive olive groves, where UAVs’ ability to capture intra-plot variability and critical plant level phenomena complements the satellites’ ability to monitor regional trends.
Multispectral imaging allows for the measurement of electromagnetic energy re-emitted by plants at different wavelengths across the spectrum. These measurements vary according to the physical properties of the vegetation, producing reflectance curves that are key to identifying plant health and vigour [12].
Healthy vegetation (Figure 1) shows a minimum of reflectance in the visible spectrum (400–700 nm), with a peak of reflected energy in the green band at 500–600 nm, known as the ‘green bulge’ of healthy vegetation. The Red Edge (700–730 nm) is a sensitive indicator of plant stress, as a stressed plant exhibits very low reflectance in this band. In the near-infrared (NIR) band (700–1400 nm), reflectance increases dramatically in healthy vegetation, and monitoring this is a way of determining how healthy a crop is, since stressed vegetation reflects significantly less in the infrared range. Like the red spectral band, the infrared range is widely used to calculate most vegetation indices in agriculture [13].
The use of sensors onboard UAVs enables a much higher spatial resolution than satellite imagery, allowing a detailed representation of each plant and facilitating the detection of local variations in the crop. However, the higher spatial resolution of UAVs must be weighed against their limited coverage: satellites image large areas at lower resolution but with high revisit frequency, making them useful for large-scale trend analysis.
This research focuses on olive crops grown under super-intensive management, whose linear and homogeneous structure responds well to remote image-based segmentation. Olives were chosen for their significant economic importance in the Mediterranean region, as well as their distinct phenology, which allows critical assessment periods to be identified using VIs.
Remote sensing offers specific advantages and contributions when applied to perennial woody crops such as olive trees, compared to annual crops. This is because it allows us to address the challenges posed by olives’ complex canopy structure and long life cycles, as well as the need for multi-year monitoring of their response to pruning and management practices [14,15]. In this context, the combination of UAV and satellite imagery not only improves the detection of variability within plots and across regions but also helps overcome two critical limitations of agricultural remote sensing: the scalability of analyses and the statistical validation of predictive models. This ensures that results can be transferred from the plant scale to the farm scale. The need for scalability and validation has been identified as an area of research that remains underexplored in the literature on woody crop applications.
This multiscale approach is particularly novel in super-intensive olive groves [7,9]. The periods selected for this study (January, when pruning defines the potential for sprouting and yield for the season, and April, when the foundations for production are laid through sprouting and fruit set) represent critical phenological phases in olive tree management. Accurate monitoring in these phases allows for the efficient adjustment of irrigation, fertilisation and pruning practices, as these determine annual productivity and crop sustainability [16,17].
While previous studies have explored the use of remote sensing for herbaceous crops such as maize and wheat, fewer studies have focused on woody crops, particularly super-intensive olive groves [18,19,20]. Some studies have shown that the Normalized Difference Red Edge Index (NDRE) obtained by UAVs can detect irrigation non-uniformity in olive groves [21], while others applied high-resolution UAV images for 3D canopy reconstruction [22]. However, few studies have proposed the integration of UAV/satellite data in high-density contexts. One such example is the study by Bollas et al. [23], which evaluated the NDVI obtained from both UAV imagery and Sentinel-2 satellite data in northern Greece, achieving correlations ranging from 83.5% to 98.3%. This demonstrates the value of UAVs in detecting intra-plot variability that is not visible in satellite images.
Therefore, this study proposes a hybrid methodological assessment that not only compares sensors but also proposes combined spatial and temporal analysis models. This hybrid approach integrates comparative statistical analysis, predictive modelling and geospatial segmentation for the generation of management zones. The aim is to overcome the limitations of single-source approaches and provide transferable tools for other perennial agricultural systems. The approach also addresses the lack of scalability and statistical validation in multi-source analysis, supporting more accurate and sustainable agronomic decision-making, with the goal of maximising productivity while reducing environmental impact [24,25,26].
While some recent studies have addressed the spatial segmentation of individual trees in olive cultivation using UAV imagery, such as the work of Safonova et al. [27], which focused on canopy volume estimation, no research has been found that explicitly combines UAV, satellite and predictive modelling to generate management zoning, which constitutes the novel contribution of this study [28,29].

2. Materials and Methods

The main technological advance in the monitoring of woody crops has been the adoption of multispectral imagery [30,31] obtained via drone and satellite flights. In this study, the data obtained from both types of flights and sensors were compared, considering that they have different technical characteristics and flight conditions.
Two specific sensors were used in this study: Planet Labs’ Super-Dove multispectral sensor (Planet Labs PBC, San Francisco, CA, USA) [32] for the satellite imagery and the Altum PT sensor (Micasense; AgEagle Aerial Systems, Wichita, KS, USA) onboard the DJI Matrice 300 RTK drone (https://enterprise.dji.com) (DJI; SZ DJI Technology Co., Ltd., Shenzhen, China) for the UAV imagery.
Satellite images (eight bands) and UAV images (Altum PT, 5 + 1 bands) were obtained using different spectral ranges (Figure 2). In addition to the comparative analysis of the aforementioned bands, the combination of the spectral bands was carried out by means of vegetation indices (VIs), which allowed us to quantify the variability in the crop plots and carry out a geospatial analysis for crop monitoring [33,34].

2.1. Location of the Research

This study was carried out in a super-intensive olive plantation (Cornicabra variety) in Osa de la Vega, Cuenca, Spain (39°40′2.79″ N, 2°42′44.02″ W). The farm has an area of 10 hectares containing 11,104 trees distributed in rows, with a distance of 3.5 m between rows and 1.25–1.5 m between trees (Figure 3). This layout is designed for intensive mechanisation and high productivity [35,36].
The plantation is located on practically flat terrain with a slope of less than 3%. The soil is a mixture of limestone and loam; although stony, it is of medium fertility and has good drainage. All the trees are the same age (eight years old at the time of this study) and are managed using controlled deficit drip irrigation and annual mechanical pruning. This ensures uniform vegetative development and improves the representativeness of the data.

2.2. Data Collection

The main resource for analysis in this study was multispectral imagery. For the satellite case, daily images are available for download from the satellite provider’s website. However, in the case of the UAV, monthly flights have been carried out since January 2023. From these, two key phenological dates were selected for the year 2023: 29 January and 15 April, corresponding to the rest/pruning and budbreak/flowering stages, respectively. These dates coincide with relevant phenological phases for detecting changes in olive tree vigour and photosynthetic activity [37]. From an agronomic point of view, these phases are critical because they determine key management practices. In January, pruning regulates canopy load and defines the cycle’s potential productivity. In April, budding and flowering determine the yield by influencing both fruit set and competition for resources. From a methodological perspective, these extreme stages of the cycle allow us to validate the ability of UAVs and satellites to detect well-defined physiological contrasts, such as minimal vegetative activity during dormancy, compared to maximum leaf expansion during flowering. This approach strengthens the reliability of the comparison between sensors, as their responses are evaluated in both low- and high-vigour situations. These conditions represent the limits of the spectral range in olive trees, ensuring that the results can be transferred to other contexts more easily.
The satellite images were obtained via direct download from Planet’s own website, Ortho Analytic 8B SR (www.planet.com). The images from Planet are provided with geometric and radiometric corrections already applied to minimise atmospheric distortion [38,39,40,41]. The images are acquired by Planet Labs’ SuperDove sensors (Planet Labs PBC, San Francisco, CA, USA), with a resolution of 3.7 m/pixel and 8 spectral bands: Coastal Blue, Blue, Green I, Green, Yellow, Red, Red Edge and NIR [32].
In contrast, the UAV images were acquired photogrammetrically using the Altum PT multispectral sensor onboard the DJI Matrice 300 RTK drone (Table 1 and Figure 4). This camera allows a spatial resolution of 4 cm when flying at an altitude of 120 m thanks to the pansharpening of the panchromatic band. For this work, the drone was flown at an altitude of 70 m, achieving a ground sample distance (GSD) of 2.3 cm/pixel in 5 + 1 bands: R, G, B, Red Edge and NIR, plus the panchromatic band. The drone was configured to fly with a 75% overlap on the lateral and frontal axes to ensure full coverage. This type of flight with multispectral cameras requires the acquisition of calibration panels to subsequently correct reflectance values and achieve uniformity when comparing flights on different dates and in different weather conditions.
The working area was georeferenced by measuring four points coinciding with the corners of the crop field using a GNSS Zenith 40 device (Geomax AG, Widnau, St. Gallen, Switzerland).

2.3. Methodology

The first phase of satellite data acquisition consisted of downloading Level III images from Planet’s web platform (www.planet.com). As previously indicated in Section 2, the drone flights recorded data on SD cards housed in the multispectral sensor.
In contrast to the satellite images, which are adjusted for reflectance in the download, the photogrammetric flight images had to be processed. This was carried out using Pix4D Mapper software (version 4.9.0), (www.pix4d.com).
Pix4D Mapper made it possible to insert the coordinates of the Ground Control Points (GCPs) obtained from the RTK observations (Table 2), to carry out the pansharpening process in order to obtain the specified resolution and to radiometrically correct the images using the DLS system [42,43,44,45]. The following products were obtained: point clouds, DSMs and orthophotos for each sensor band, from which the different vegetation indices were subsequently calculated. Orthophotos result from geometrically correcting the aerial photographs; they have a constant scale, and each point indicates an accurate geographical position.
In order to carry out a statistical study of the subject area, the difference between the images was evaluated by comparing the averages (calculated for each individual image) for each of the four VIs: the Normalized Difference Vegetation Index (NDVI), the Green Normalized Difference Vegetation Index (GNDVI), the Normalized Difference Red Edge Index (NDRE) and the Leaf Chlorophyll Index (LCI) [46]. For each of the 11,104 trees, data were obtained for the four VIs on each of the selected dates (29 January and 15 April 2023), i.e., 44,416 statistical values per sensor and date were obtained for analysis.
These indices were selected for their ability to characterise different physiological aspects of the crop, such as vigour, biomass, chlorophyll content and photosynthetic activity, at critical stages. Previous studies on olive groves have demonstrated the usefulness of the NDRE index for detecting irrigation heterogeneities [21], and the consistency of the NDVI index in UAV–satellite comparisons [23]. Meanwhile, research on other crops has shown that these indices are effective in segmenting and monitoring management areas [47].
From the processing of the orthophotos, the values of reflectance in the different bands of the electromagnetic spectrum were obtained. By combining these bands, the following vegetation indices were obtained [31,47,48,49,50,51]:
NDVI [52,53,54]: this index estimates the vegetation value by combining the NIR and red bands.
NDRE: this index is a useful indicator for analysing biomass during the ripening stage [21,55].
LCI: this index assesses chlorophyll [56,57,58]:
GNDVI: this index assesses photosynthetic activity at advanced stages of the plant cycle [59,60,61,62].
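The article names the four indices without printing their formulas. The sketch below uses the standard band-ratio definitions; note that the LCI form shown, (NIR − RedEdge)/(NIR + Red), is one common variant and is assumed rather than taken from the paper:

```python
import numpy as np

def vegetation_indices(nir, red, green, red_edge):
    """Standard band-ratio formulations of the four indices.

    Inputs are reflectance values (0-1) for the same scene; scalars
    or NumPy arrays of identical shape both work.
    """
    eps = 1e-9  # guards against division by zero over bare soil
    ndvi = (nir - red) / (nir + red + eps)            # vigour (NIR vs. red)
    gndvi = (nir - green) / (nir + green + eps)       # photosynthetic activity
    ndre = (nir - red_edge) / (nir + red_edge + eps)  # biomass / ripening
    lci = (nir - red_edge) / (nir + red + eps)        # leaf chlorophyll (assumed variant)
    return ndvi, gndvi, ndre, lci

# Example with plausible olive-canopy reflectances (illustrative values)
ndvi, gndvi, ndre, lci = vegetation_indices(0.5, 0.1, 0.2, 0.3)
```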
These indices were calculated from calibrated reflectances. In the case of UAVs, images were processed with Pix4D Mapper to generate georeferenced orthomosaics; their resolution was then adjusted to 3 m/pixel to allow direct comparison with Planet imagery. For the satellite images, processing was performed in QGIS Desktop 3.40.8 by selecting specific bands for each index.
The resampling used block averaging: each satellite pixel (3 × 3 m) acted as a reference polygon, and the average value of all the UAV pixels within it was calculated, equating each satellite pixel to a UAV-derived average polygon. This technique was chosen because it preserves the mean radiometric information, avoids spectral biases between resolutions and ensures spatial consistency.
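The block-averaging step can be sketched as a minimal NumPy routine, assuming the UAV raster is already aligned to the satellite grid and its dimensions are an integer multiple of the aggregation factor (in practice, rasterio/GDAL would handle geotransforms and edge cells):

```python
import numpy as np

def block_average(uav, factor):
    """Aggregate a high-resolution UAV raster to a coarser grid by
    averaging all fine pixels falling inside each coarse cell.

    uav    : 2-D reflectance/index array, aligned to the coarse grid
    factor : number of UAV pixels per satellite pixel along each axis
    """
    h, w = uav.shape
    assert h % factor == 0 and w % factor == 0, "raster must tile evenly"
    # Reshape into (coarse_rows, factor, coarse_cols, factor) blocks,
    # then average within each block.
    return uav.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

coarse = block_average(np.arange(16, dtype=float).reshape(4, 4), 2)
```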
Following this process, the values corresponding to each individual tree (n = 11,104) were extracted, yielding a total of 88,832 data points per date. These values formed the basis of subsequent statistical analyses, including Spearman’s correlation, Student’s t-test and predictive models.
At the individual tree analysis level, a vector grid was used to segment each of the 11,104 olive trees in the grove. This allowed the extraction of mean index values per individual tree for both dates and sensors. These values are summarized in Appendix A (Table A1), which presents the complete descriptive statistics used for the comparative analyses between sensors.
Once the statistics were obtained for each individual tree (88,832 per date), different study methods were used to assess the differences between the images, including correlation analyses, mean comparison tests and regression analyses using IBM SPSS Statistics 27 software.
The first step was to check whether the study variables followed a normal distribution. The Kolmogorov–Smirnov test was used for this purpose, and the non-parametric Spearman correlation method was employed in all cases where the p-value < 0.05.
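This normality-then-correlation sequence can be sketched with SciPy on synthetic stand-in data (the variable names and distributions below are illustrative, not the study's data; fitting the normal parameters from the sample, as here, is a simplification of the Kolmogorov–Smirnov procedure):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
uav_vi = rng.beta(5, 2, size=2000)                        # skewed stand-in for a UAV index
sat_vi = 0.8 * uav_vi + rng.normal(0, 0.05, size=2000)    # noisy satellite analogue

# Kolmogorov-Smirnov test against a normal fitted to the sample
ks = stats.kstest(uav_vi, 'norm', args=(uav_vi.mean(), uav_vi.std()))

# Where normality is rejected (p < 0.05), fall back to the
# non-parametric Spearman rank correlation, as in the study
rho, p = stats.spearmanr(uav_vi, sat_vi)
```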
To evaluate common patterns, the Spearman correlation coefficient between the variables (VIs) obtained from UAV and satellite imagery was analysed to determine joint variability, typify the data and identify significant relationships in the spatial distribution of vigour [8,31,63,64,65].
It was established that, at a confidence level of 95% and a significance level of 5%, the null hypothesis, H0, states that there is no linear relationship between the variables (r = 0).
The second method of analysis, the Student’s t-test for paired samples, was used to compare the images based on the means of the four VIs calculated for each of the 11,104 individuals on each date, i.e., for each pair of images [66,67]. This test allowed us to determine whether there was a significant difference between the dependent groups. As in the previous case, a confidence level of 95% and a significance level of 5% were used to formulate the null hypothesis, H0, which postulates that there are no significant differences between the VI means obtained for each individual tree, and the alternative hypothesis, H1, which postulates that there are significant differences between the VIs measured in the satellite images and in the images obtained via the drone. As in the other studies, the test was carried out for the two dates indicated.
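A minimal sketch of the paired comparison with SciPy, again on synthetic stand-in data (the systematic offset below is illustrative of the UAV-higher-than-satellite pattern reported later, not a measured value):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_trees = 11104
uav = rng.normal(0.45, 0.05, size=n_trees)              # per-tree UAV index values
sat = uav - 0.06 + rng.normal(0, 0.02, size=n_trees)    # systematically lower satellite values

# Student's t-test for paired (related) samples: one value per tree
# from each sensor on the same date
t, p = stats.ttest_rel(uav, sat)
# p < 0.05 -> reject H0 that the per-tree means coincide between sensors
```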
A multiple regression analysis was performed to calibrate the system, linking measurements obtained from the satellite imagery to those from the drone imagery [68]. The aim was to obtain a regression equation explaining the relationship between the two products and to analyse how the variables behave in a multivariate setting and whether all of them are significant in the model. The analysis was used to predict the NDRE values obtained from the drone imagery from combinations of satellite vegetation indices. Initially, a multiple linear regression model was fitted using ordinary least squares, with the UAV-derived NDRE as the dependent variable and the satellite indices NDVI, GNDVI and LCI as the predictors. However, multicollinearity was detected between the predictors (VIF > 5), and only the LCI index was statistically significant.
This reduced the model to a simple linear regression with a high coefficient of determination (R2 = 0.753), indicating a strong association between the two indices. However, it did not meet the assumption of residual independence (Durbin–Watson statistic = 0.974), indicating autocorrelation and limiting its validity as a robust predictive model. Given these limitations, a non-parametric approach based on decision trees was chosen [69,70,71]. These models offered superior performance, with reduced mean errors (RMSE ≈ 0.04) and stability between the training and validation sets, making them a more suitable alternative for agronomic applications in the context of high intra-plot variability. A 70/30 training/testing split was used to guard against overfitting.
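The decision-tree workflow with a 70/30 holdout can be sketched with scikit-learn; the synthetic satellite/UAV relationship and the tree depth below are assumptions for illustration, not the study's fitted model:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(3)
X = rng.uniform(0.2, 0.8, size=(5000, 3))               # satellite NDVI, GNDVI, LCI stand-ins
y = 0.8 * X[:, 2] - 0.06 + rng.normal(0, 0.03, 5000)    # UAV NDRE analogue, driven by LCI

# 70/30 training/testing split, as in the study
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
tree = DecisionTreeRegressor(max_depth=5, random_state=0).fit(X_tr, y_tr)

rmse_tr = mean_squared_error(y_tr, tree.predict(X_tr)) ** 0.5
rmse_te = mean_squared_error(y_te, tree.predict(X_te)) ** 0.5
# Comparable train/test RMSE indicates the tree generalises rather
# than overfitting the training sample
```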
To optimise olive orchard monitoring, cluster segmentation was performed in the final stage of the study. An unsupervised classification using hierarchical k-means was performed to segment the olive orchard into homogeneous management zones, with both 5 and 10 clusters. This methodology has previously been used in olive cultivation, with positive outcomes [72,73,74,75,76,77].
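The zoning step can be sketched with scikit-learn; plain k-means is shown here as a simplified stand-in for the two-stage hierarchical procedure the study describes, and the per-tree feature distributions are fabricated for illustration:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)
n_trees = 1000
# Per-tree feature matrix: the four vegetation indices for each tree
features = np.column_stack([
    rng.normal(0.55, 0.08, n_trees),  # NDVI
    rng.normal(0.50, 0.07, n_trees),  # GNDVI
    rng.normal(0.30, 0.06, n_trees),  # NDRE
    rng.normal(0.35, 0.06, n_trees),  # LCI
])

# Segment into 5 and 10 homogeneous management zones
zones5 = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(features)
zones10 = KMeans(n_clusters=10, n_init=10, random_state=0).fit_predict(features)
# Each tree receives a zone label that can be mapped back to its row
# and position to produce the zoning maps
```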
This entire methodology is outlined in Figure 5.

3. Results

3.1. VI Image Comparison

Thematic maps were generated from the VIs (NDVI, NDRE, LCI and GNDVI), visually representing crop conditions for each survey date (January and April). These maps facilitated the identification of areas of significant variation in crop vigour and biomass, as well as areas of stress or low vegetative development. Numerical data were also generated for the statistical evaluation of the images obtained from both sources.
The VI values were extracted for each of the 11,104 individual trees, resulting in a total of 88,832 data points. The comparison of images (satellite and drone) with the same resolution and using descriptive statistical analysis showed that the mean values obtained from the drone imagery were, in general, higher than those from the satellite images (Table 3). This indicates that the UAV sensor is more sensitive due to its higher spatial resolution and lower pixel blending effect. This finding is also documented by Caruso et al. [78] and Li et al. [79].
Figure 6 and Figure 7 illustrate the spatial distribution of the NDRE index for both sources. A common pattern emerges in the zoning, with higher vigour in the central areas of the crop and lower vigour observed at the edges. However, the UAV images provide a more accurate definition of tree lines and individual differences. This greater UAV accuracy enables farmers to detect intra-plot variability (i.e., differences between neighbouring trees), which satellites cannot distinguish. This is crucial for the early identification of water-, nutritional- or health-related stress problems. This pattern reflects the effects of microclimate and competition for resources: central trees typically have greater canopy density and water availability than peripheral trees, which are more exposed to environmental stress. In olive groves, where canopy variability can indicate soil limitation or management deficiencies, this finer resolution enables more precise adjustments to be made to irrigation or fertilization doses for specific areas, rather than applying uniform treatments across the entire farm.

3.2. Correlation Between Satellite Indices and UAVs

Spearman’s correlation was applied to the VIs to check whether both sources reflected the same spatial patterns.
The results show a significant positive correlation (p < 0.001) in all cases (Table 4 and Table 5), with correlation coefficients ranging from 0.45 to 0.68, depending on the VI and date. The LCI index presented the highest correlation (R = 0.625 in January and R = 0.553 in April), as shown in Figure 8 and Figure 9.
This indicates that, despite differences in resolution, both sensors detect similar patterns of vegetation vigour and can identify areas of overlap [71].
Although the absolute values differ, the existence of consistent correlations confirms that, while satellites are useful for general monitoring, UAVs offer added value for precise management, as they can distinguish minor canopy variations that are relevant to the final yield. The January stage (dormancy/pruning), which has lower vegetative activity and a more open canopy structure, showed higher correlations than April. This suggests that, during phases of low leaf cover, satellites can accurately represent general patterns. However, in April, when budding, flowering and leaf expansion occur, the structural complexity of the olive tree reduces satellites’ ability to capture actual variability. This finding is critical from an agronomic perspective, as April is a key stage in determining the future production load. This complementary evidence confirms the relevance of the chosen phases and demonstrates that the UAV–satellite correlation responds to critical agronomic changes in the olive tree cycle.

3.3. Statistical Comparison of Means

A Student’s t-test for related samples was performed to compare the mean values of each index between the two images for each individual (Table 6).
A p-value of less than 0.001 was obtained; therefore, there were significant differences between the two sources for all variables evaluated on both dates. This confirms that sensors, although reflecting similar patterns, do not provide the same absolute values, with the UAV data being systematically higher due to its higher spectral and spatial resolution (Table 7).
From a practical point of view, this difference means that UAVs can be used to calibrate satellite indices in olive groves with high precision. This enables local models to be scaled up to larger areas, thereby improving the transferability of results and facilitating the implementation of continuous monitoring programs.

3.4. Regression Models and Decision Trees

The possibility of predicting UAV sensor values from satellite data was explored. An attempt was made to fit a multiple linear regression model, using the NDRE values obtained from the UAV imagery as the dependent variable and the satellite values as the independent variables [80].
The independence of the satellite variables in relation to the drone’s dependent variable was analysed, along with the type of line that best fit each relationship. The example study for the dependent variable LCI is shown in Table 8 and Figure 10.
This study was carried out for all variables on both dates by creating the possible regression models and comparing the results. The null hypothesis (Ho) was that the variables could form a model because there was no significant relationship between them. Although all three options yielded a significance level of less than 0.05, indicating that a model could be formed with each (Table 9), according to the collinearity diagnostics (Table 10), only one of the models, NDRE_UAV = −0.063 + 0.859 × LCI_Planet (Table 11), showed acceptable collinearity, with R2 = 0.753. However, the Durbin–Watson statistic detected autocorrelation of residuals (DW = 0.974), so the linear model was discarded.
A non-parametric model based on decision trees (Figure 11) was then chosen, which produced better results.
This model was tested by splitting it into test/sample groups, with the results shown in Figure 12 and Figure 13.
The root mean square error (RMSE) was 0.039 for the training set and 0.04 for the test set in January, remaining similarly low in April. An accuracy of more than 80%, together with the closely matched training and test errors, indicates good generalisability without signs of overfitting, consistent with other multi-source studies applying non-parametric methods [71].
In decision trees, the satellite variables with the highest predictive power were NDRE and LCI. This is consistent with their sensitivity to photosynthetic activity and leaf chlorophyll content.
Compared to linear regression, decision trees achieved a 20–25% reduction in error. This demonstrates that this approach is more appropriate for perennial crops, such as olive trees, where the relationship between vigour, canopy structure and reflectance is nonlinear and influenced by multiple factors, such as canopy density, tree age and soil conditions.
This result shows farmers that UAVs do not always need to fly over the entire cropland: a satellite can provide baseline data and a decision tree can help identify areas where a UAV should be used to obtain more detailed information.

3.5. Crop Area Segmentation into Clusters

Finally, a two-stage hierarchical cluster classification was used to zone the production area into homogeneous management zones based on the individual VIs. Five- and ten-cluster classifications were compared for each date and sensor.
Classifying the results into 5 or 10 groups enabled a direct linkage to management practices. In 5-zone configurations, areas of high and low vigour can be identified to inform general fertilisation and irrigation strategies. In 10-zone configurations, greater discrimination enables more specific interventions to be designed, such as differential pruning or localised nutrient applications. This segmentation capability transforms the maps into operational tools for precision agriculture in olive groves.
For the April drone data, five zones were identified (Figure 14 and Table 12), and the corresponding zone map is shown in Figure 15.
For the April drone data, ten zones were identified (Figure 16 and Table 13), and the corresponding zone map is shown in Figure 17.
The same analyses were carried out on the Planet images on both dates, with consistent results. Classifying the images into 10 clusters showed better discrimination, preventing outliers from being overly influential and providing more detailed zoning.
The central zones of the crop, which exhibited higher vigour, could be clearly differentiated from the peripheral zones, which were more adversely affected by environmental conditions. This technique has been used successfully in similar studies on olive orchard intensification [76].
Cluster classification should be viewed as an agronomic zoning tool. In areas of greater vigour (central clusters), farmers can apply controlled deficit irrigation strategies to optimise water use without compromising production.
In peripheral areas, where vigour is low, this information enables informed decisions to be made regarding the provision of additional inputs (e.g., irrigation or fertiliser) or, in extreme cases, the implementation of management changes or replanting.
Thus, UAV and satellite-based zoning provide a direct route to precision agriculture in olive groves.
Overall, the results show that combining UAVs and satellites allows variability patterns to be captured and provides an objective basis for defining UAV activation thresholds. In other words, anomalies detected by the satellite that are significant enough to justify a drone campaign can be singled out for more detailed diagnostics.

4. Discussion

The comparative analysis of vegetation indices calculated from multispectral satellite and UAV images allowed us to evaluate the effectiveness of these technologies with respect to the monitoring of super-intensive olive orchards. This section discusses the study results in terms of data accuracy, correlation between images, and usefulness for decision making in agricultural management.
Although satellite images have a lower spatial resolution, this study has shown that they can capture general patterns of crop vigour, which is useful for obtaining an overview of crop condition at low operational cost. Drone imagery, however, can identify specific areas at the level of individual trees, which is essential for localised management applications such as differentiated fertilisation, disease detection or water stress damage assessment. These findings are consistent with those of Psomiadis et al. [71] and Bollas et al. [23], who reported correlation values above 90% between UAV and Sentinel-2 for the NDVI index. This reinforces our observation of a general correspondence between the two sources, albeit with higher definition in the UAV imagery.
Significant correlations between sensors (the Spearman correlation coefficients ranged between 0.45 and 0.68, p < 0.001) indicate that, despite differences in scale and resolution, both systems record coherent spatial patterns. This level of agreement suggests that integrating both types of data is a viable approach for developing multiscale monitoring systems.
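The rank-based agreement measure used here can be sketched as follows, on hypothetical paired per-tree values (the UAV signal plus sensor-specific noise, mimicking the moderate agreement reported above; all data are synthetic):

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(2)

# Hypothetical per-tree NDVI from both platforms: the satellite value is a
# damped, noisy version of the UAV value, as differences in resolution and
# calibration would produce.
ndvi_uav = rng.uniform(0.2, 0.8, 1000)
ndvi_sat = 0.6 * ndvi_uav + rng.normal(0, 0.12, 1000)

rho, p = spearmanr(ndvi_uav, ndvi_sat)
print(f"Spearman rho = {rho:.2f}, p = {p:.1e}")
```

Spearman's coefficient is used rather than Pearson's because it compares ranks, so a monotonic but nonlinear relationship between the two sensors (different dynamic ranges, saturation) still registers as agreement.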
The fitted multiple linear regression model showed a high coefficient of determination (R2 = 0.753) when the satellite-derived LCI was used to predict the UAV-based NDRE. However, due to autocorrelation in the residuals (DW = 0.974), this model was discarded in favour of a non-parametric solution. In this context, decision trees produced more accurate results, with a low quadratic error (RMS = 0.004), enabling robust prediction of UAV sensor values from satellite data without overfitting.
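The Durbin–Watson statistic that motivated discarding the linear model is simple to compute directly; a minimal sketch on synthetic residuals (the AR(1) coefficient below is illustrative, not taken from the study):

```python
import numpy as np

def durbin_watson(residuals):
    """DW = sum of squared successive residual differences / sum of squared
    residuals. Values near 2 indicate uncorrelated residuals; values well
    below 2 (like the 0.97 reported in the text) indicate positive
    autocorrelation."""
    e = np.asarray(residuals, dtype=float)
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

# Positively autocorrelated AR(1) residuals give DW well below 2,
# roughly 2 * (1 - 0.5) = 1.0 for this coefficient.
rng = np.random.default_rng(3)
e = np.zeros(2000)
for t in range(1, 2000):
    e[t] = 0.5 * e[t - 1] + rng.normal(0, 1)
print(round(durbin_watson(e), 2))
```

For spatially ordered residuals such as per-tree predictions, a low DW signals that neighbouring trees share errors, violating the independence assumption of ordinary least squares.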
This study further proposes using the 10th percentile of the satellite-derived vegetation index values as an activation threshold for identifying critical areas. The threshold flags trees or zones with diminished vigour, so that agronomic interventions (e.g., irrigation or differentiated fertilisation) are applied only where genuinely needed. This approach optimises resources and avoids including healthy areas, keeping interventions precise and efficient. It also allows UAV flights to be planned more precisely, targeting only the trees that require inspection and thereby optimising time, financial resources and other assets. In this way, the activation threshold provides a quantitative instrument for agronomic decision-making while enabling more efficient monitoring of intra-plot variability, stress and olive tree growth, in line with the objectives of precision agriculture.
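The percentile trigger described above reduces to a few lines; a sketch on hypothetical satellite NDVI values (the distribution parameters are assumptions, the tree count matches the plot's n = 11,104):

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical satellite NDVI per tree, one value for each of the
# 11,104 trees in the plot.
ndvi_sat = np.clip(rng.normal(0.55, 0.10, 11104), 0, 1)

# 10th-percentile activation threshold: trees below it are candidates
# for a targeted UAV inspection flight.
threshold = np.percentile(ndvi_sat, 10)
flagged = np.flatnonzero(ndvi_sat < threshold)

print(f"threshold = {threshold:.3f}, flagged {len(flagged)} of {ndvi_sat.size} trees")
```

By construction roughly 10% of trees are flagged on every date, so UAV effort stays bounded while always concentrating on the weakest decile of the crop.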
These results are consistent with those of other studies that have integrated UAV and satellite data to predict crop response to varying nitrogen doses [81], demonstrating that non-parametric models can overcome the limitations of classical methods. Multisensor integration has also been identified as an effective approach for enhancing monitoring accuracy without losing scalability [23,79].
Cluster segmentation was also a key component of the analysis, as it enabled the crop to be divided into homogeneous management zones according to the spectral behaviour of the trees. The 10-cluster classification was more efficient than the 5-cluster classification: it avoided the overlap of extreme values and better differentiated intermediate zones that a more general zoning could miss (Figure 18). The optimal number of clusters (k) was selected by jointly evaluating agronomic coherence and statistical consistency. Solutions with k ranging from 5 to 15 were compared, and the reduction in intragroup variance stabilised around k = 10, indicating that additional groups did not significantly improve internal differentiation. This behaviour, analogous to the elbow method, was used as the reference for the final classification structure. Moreover, this solution offered the best spatial correspondence between dates, avoiding overlaps of extreme values and enabling the agronomic interpretation of homogeneous management zones [28,29,82,83].
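The elbow-style comparison of intragroup variance across k can be sketched as follows (hypothetical standardised VI features; KMeans inertia stands in for the study's intragroup variance measure):

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(5)

# Hypothetical standardised VI features for 1000 trees (4 indices each).
x = rng.normal(size=(1000, 4))

# Within-cluster sum of squares (inertia) for k = 2..15; the text reports
# that the decrease stabilises around k = 10.
inertias = {k: KMeans(n_clusters=k, n_init=5, random_state=0).fit(x).inertia_
            for k in range(2, 16)}

# Marginal gain of adding one more cluster; the "elbow" is where these
# gains flatten out.
gains = {k: inertias[k - 1] - inertias[k] for k in range(3, 16)}
```

Plotting `inertias` against k and looking for the flattening point reproduces the selection logic described in the text; the final choice still has to pass the agronomic-coherence check, which no purely statistical criterion replaces.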
For the April Planet case study, the difference in cohesion measures between five and ten zones was very similar (0.4). However, with five clusters the largest group contained 7619 individuals, i.e., 68.6% of the cases, producing poor zoning because extreme values had to be accommodated within it. Zoning into ten clusters distributed the data better, with the largest group containing 4142 individuals (37.3%). The 10-cluster classification was therefore adopted.
For the January images, the olive tree canopies were relatively homogeneous, which was reflected by both sensors, clearly showing the zones segmented by clusters. However, in the April images, a greater variability in the condition of the olive trees was observed, coinciding with the phenological stage of the crop during which buds appear that will later develop into flowers or new shoots. In this case, although the areas identified were the same for both sensors, the drone showed a higher resolution, which made it possible to locate the problem areas with greater precision, as shown in Figure 19.
The ability to identify variations within plots at pivotal stages of the growth cycle is crucial for predicting variations in growth, vigour and productivity between neighbouring trees. This has direct implications for planning irrigation and fertilisation.
This zoning can be applied directly to precision agricultural management strategies, such as irrigation planning, fertilisation and sampling. In this way, maps derived from remote sensing serve as visual representations and decision-support tools, enabling the optimisation of inputs, reduction of costs and improvement of sustainability. Segmentation based on multispectral indices is a useful tool for data-driven agronomic planning, as highlighted in articles on technological innovation in olive growing [77].
Despite the large number of studies examining the integration of remote sensing in precision agriculture [84,85,86], few focus specifically on super-intensive olive orchards. This work is novel in that it compares UAV and satellite imagery in real conditions, employs detailed statistical and spatial analyses and proposes predictive models with practical implications. Specifically, two predictive approaches were evaluated: a multiple linear regression model (NDRE_UAV = −0.063 + 0.859 × LCI_Planet), which achieved a high coefficient of determination (R2 = 0.753) but was discarded due to autocorrelation in the residuals (DW = 0.97); and a non-parametric model based on decision trees, which offered better performance (RMSE = 0.04, accuracy > 80%) and higher generalisability. The fact that decision trees proved more appropriate is explained by their ability to handle nonlinear relationships and physiological response thresholds that are common in woody crops, where growth and production do not increase proportionately to inputs (water, nutrients), but rather present saturation points or differential responses depending on the phenological stage.
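The reported linear mapping can be written down directly from the coefficients in the text; this is purely illustrative, since the study ultimately preferred the decision-tree model:

```python
def ndre_uav_from_lci(lci_planet: float) -> float:
    """Linear mapping reported in the text (R2 = 0.753), later superseded
    by the decision-tree model because of residual autocorrelation."""
    return -0.063 + 0.859 * lci_planet

# Example: a satellite LCI of 0.40 maps to a predicted UAV NDRE of ~0.281.
print(round(ndre_uav_from_lci(0.40), 3))
```

The near-unity slope and small negative intercept quantify the damping between platforms: satellite LCI overstates low-vigour trees slightly and understates high-vigour ones relative to UAV NDRE.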
This work advances on previous studies by applying spatial clustering techniques at the individual (tree) scale instead of homogeneous blocks or grids. It also provides a methodological improvement through robust statistical validation, hierarchical geospatial segmentation and a focus on a highly specific woody crop, the super-intensive olive grove, which has received little attention in the literature. This approach differs from previous work in which instance segmentation with convolutional neural networks was applied to estimate individual canopy biometry in traditional olive orchards from geometric variables [27]. The present study instead combines segmentation scale with multispectral analysis, UAV/satellite correlation and predictive modelling to generate management zones.
Recent research in this field [87,88] likewise underlines the potential of machine learning models applied to agricultural remote sensing. The present study, however, focuses on adapting these models to woody crops and on segmenting those crops at the individual-tree level.
This study complements and extends the approaches developed for annual crops such as maize, wheat and onions [23,79,81] by transferring these methodologies to the context of woody crops, whose physiological responses and management present different particularities. Compared to other perennial crops, such as vines and stone fruit trees, olive trees respond more slowly to interventions. At the same time, they are highly sensitive during phases such as budding and flowering in April, when water or nutritional stress can have a decisive impact on production in the following season [14,89]. These findings improve the methodological basis for multiscale monitoring of olive orchards, proposing a scalable, validated and adaptable integration methodology that contributes to the development of intelligent decision support systems for the agricultural sector.
Finally, this study has several limitations that require further discussion. The spatial representativeness of UAVs may be restricted by the size of the plots and the partial coverage of the flights, limiting the ability to generalise results to regional scales.
The comparison between UAVs and satellites is affected by inherent differences in spatial and spectral resolution, which can amplify or reduce the detected differences. Furthermore, the temporal extrapolation of the results is limited to two phenological stages (winter and spring), meaning that critical stages of the olive tree, such as fruit enlargement and ripening, are not covered.
Furthermore, the availability of UAV data and its operational cost continue to hinder the scalability of these methods in larger production systems, where satellites are more accessible. As noted above, multiscale integration requires statistical tools, as well as calibration and cross-validation protocols, to ensure consistency between data sources.
Recognising these limitations does not undermine the findings; rather, it defines the scope of application and suggests future areas of research, such as incorporating longer time series, exploring the use of hyperspectral and thermal sensors, validating the results in other olive-growing regions and comparing the approach with other woody crops, such as grapes, almonds and citrus fruits. This will enable an evaluation of the robustness and transferability of the approach.

5. Conclusions

This study confirms the effectiveness of the combined use of multispectral images acquired by satellites and UAVs for detailed monitoring of olive crops. By comparing vegetation indices such as NDVI, NDRE, LCI and GNDVI, this work shows that UAV images provide a higher level of resolution and precision, allowing the detection of specific problems at the plant level. However, satellite images, despite their lower resolution, facilitate the analysis of spatial variability patterns at a larger scale and with more continuous temporal coverage.
The results of this study show that satellite images are particularly useful for providing an overview of the crop and identifying areas of interest for monitoring. On the other hand, UAV images allow detailed zoning, improving accuracy in the classification of specific management areas within the olive grove, optimising input management and facilitating a form of precision agriculture that is more adapted to the requirements of each crop. In the analyses carried out, a significant correlation was observed between the vegetation indices obtained with both technologies. Moreover, the use of statistical methods such as Spearman’s correlation coefficient and multiple regression showed that both types of images exhibited similar patterns of vigour and crop stress, with statistically significant results (p < 0.001), particularly evident in the LCI index.
Furthermore, the non-parametric models based on decision trees outperformed linear regressions in terms of error and generalisability, achieving an RMSE of 0.04 and an accuracy of over 80%. This reinforces the usefulness of flexible approaches in complex agricultural scenarios.
This study, however, has limitations. The need to match the resolution between satellite and UAV images using a 3 × 3 m mask may introduce errors due to the averaging of multi-pixel values in the UAV imagery. This suggests that, for future studies, multitemporal data should be integrated throughout the whole phenological cycle of the olive tree, along with field validation using physiological or yield variables. It would also be useful to incorporate methods based on object-oriented analysis to improve the discrimination between vegetation and soil.
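The resolution-matching step discussed above amounts to block-averaging the fine UAV grid down to satellite-sized cells; a minimal sketch (grid sizes and pixel scales are hypothetical):

```python
import numpy as np

def block_average(img: np.ndarray, factor: int) -> np.ndarray:
    """Average `factor` x `factor` pixel blocks, e.g. to bring a UAV index
    map down to a satellite-like grid (the 3 x 3 m mask in the text)."""
    h = (img.shape[0] // factor) * factor
    w = (img.shape[1] // factor) * factor
    trimmed = img[:h, :w]  # drop edge pixels that don't fill a block
    return trimmed.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

rng = np.random.default_rng(6)
ndvi_uav = rng.uniform(0.1, 0.9, size=(300, 300))  # hypothetical fine-grid NDVI
coarse = block_average(ndvi_uav, 60)               # e.g. 5 cm pixels -> 3 m cells
print(coarse.shape)
```

The averaging preserves the mean but discards within-cell variance, which is exactly the error source the paragraph warns about: a 3 m cell mixing canopy and bare soil can look identical to one of uniform moderate vigour.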
In light of the results obtained, a hybrid approach is recommended, combining the wide coverage of satellite images for the initial detection of variations, followed by the detailed analysis of UAVs in specific areas. This approach not only optimises costs by reducing the frequency of UAV flights but also improves the ability to respond to specific crop problems, such as water stress or pest detection. In addition, to improve monitoring efficiency and accuracy, future research should explore the use of advanced data fusion technologies and the integration of other sensors, such as hyperspectral or thermal sensors, to provide additional information on crop health and vigour.
This work adds to the limited but growing literature on precision olive growing, especially in super-intensive systems. This research provides results that are applicable to a permanent woody system.
In conclusion, this study demonstrates the potential of the combined use of remote sensing technologies in precision agriculture, providing a replicable and scalable tool for woody crops with uniform structure. The integration of multispectral imagery and advanced analysis techniques reinforces the importance of these approaches in optimising resource use, improving productivity and promoting more sustainable agricultural management.
This study demonstrates that the combination of UAV and satellite multispectral data is an effective strategy for agronomic monitoring of super-intensive olive orchards. Integrating these data sources exploits the strengths of each: the high spatial resolution of UAV imagery enables the identification of variability at the individual-tree level, while satellite images offer higher temporal frequency and wider coverage, facilitating more consistent and cost-effective operational monitoring.

Author Contributions

Conceptualization, E.A. and S.L.-C.; methodology, E.A., J.A. and I.M.; software, E.A., J.A. and E.P.-M.; validation, E.A., S.L.-C. and J.A.; formal analysis, E.A., S.L.-C., J.A., E.P.-M. and I.M.; investigation, E.A., S.L.-C. and J.A.; data curation, E.A., S.L.-C. and I.M.; writing—review and editing, E.A., S.L.-C., J.A., E.P.-M. and I.M.; supervision, S.L.-C. and I.M. All authors have read and agreed to the published version of the manuscript.

Funding

This work was carried out with the support of the project "Information fusion with hyperspectral and multispectral UAV sensors and their use in precision agriculture" (Project No. FPA1900001501), Premio Arce Foundation, E.T.S.I. Agrónomos, Universidad Politécnica de Madrid.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available in the article.

Acknowledgments

The authors sincerely thank the Acre Group company for their collaboration in the field tests.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A

Table A1. Mean values of the different VIs (NDVI, NDRE, LCI, GNDVI) measured for each of the 11,104 individual trees in the plot on 15 April 2023.
| ID | Centroid X | Centroid Y | NDVI (UAV) | LCI (UAV) | NDRE (UAV) | GNDVI (UAV) | NDVI (Planet) | LCI (Planet) | NDRE (Planet) | GNDVI (Planet) |
|----|-----------|-----------|-----------|-----------|-----------|-----------|-----------|-----------|-----------|-----------|
| 1 | 524,656.5 | 4,390,903.5 | 0.1947 | 0.1161 | 0.1074 | 0.4084 | 0.3652 | 0.2297 | 0.2023 | 0.4433 |
| 2 | 524,659.5 | 4,390,903.5 | 0.2953 | 0.1474 | 0.1234 | 0.4512 | 0.3696 | 0.2391 | 0.2115 | 0.4520 |
| 3 | 524,650.5 | 4,390,900.5 | 0.5145 | 0.3339 | 0.3397 | 0.5139 | 0.4763 | 0.3962 | 0.3668 | 0.5199 |
| 4 | 524,653.5 | 4,390,900.5 | 0.6217 | 0.4320 | 0.3612 | 0.5945 | 0.4834 | 0.3612 | 0.3219 | 0.5043 |
| 5 | 524,656.5 | 4,390,900.5 | 0.4634 | 0.3042 | 0.2596 | 0.5208 | 0.4936 | 0.3818 | 0.3419 | 0.5285 |
| 6 | 524,659.5 | 4,390,900.5 | 0.2646 | 0.1388 | 0.1221 | 0.4353 | 0.4394 | 0.2893 | 0.2516 | 0.4887 |
| 7 | 524,662.5 | 4,390,900.5 | 0.3526 | 0.1721 | 0.1403 | 0.4821 | 0.4105 | 0.2754 | 0.2426 | 0.4813 |
| 8 | 524,647.5 | 4,390,897.5 | 0.5912 | 0.4352 | 0.3761 | 0.5691 | 0.5207 | 0.4628 | 0.4375 | 0.5741 |
| 9 | 524,650.5 | 4,390,897.5 | 0.7153 | 0.5031 | 0.4136 | 0.6855 | 0.5567 | 0.4602 | 0.4197 | 0.5864 |
| 10 | 524,653.5 | 4,390,897.5 | 0.7308 | 0.5311 | 0.4423 | 0.7092 | 0.5984 | 0.4787 | 0.4275 | 0.6072 |
| 11 | 524,656.5 | 4,390,897.5 | 0.7483 | 0.5266 | 0.4312 | 0.7168 | 0.5906 | 0.4924 | 0.4484 | 0.6090 |
| 12 | 524,659.5 | 4,390,897.5 | 0.4316 | 0.3072 | 0.2602 | 0.5396 | 0.5100 | 0.3778 | 0.3337 | 0.5451 |
| 13 | 524,662.5 | 4,390,897.5 | 0.2447 | 0.1225 | 0.1078 | 0.4337 | 0.4502 | 0.3154 | 0.2779 | 0.5145 |
| 14 | 524,665.5 | 4,390,897.5 | 0.2735 | 0.1261 | 0.1067 | 0.4385 | 0.4028 | 0.2875 | 0.2578 | 0.5007 |
| 15 | 524,641.5 | 4,390,894.5 | 0.5866 | 0.4301 | 0.3722 | 0.5668 | 0.5359 | 0.4157 | 0.3711 | 0.6008 |
| 16 | 524,644.5 | 4,390,894.5 | 0.6541 | 0.4696 | 0.3959 | 0.6235 | 0.5937 | 0.4612 | 0.4073 | 0.6275 |
| 17 | 524,647.5 | 4,390,894.5 | 0.7547 | 0.5476 | 0.4537 | 0.7251 | 0.6576 | 0.5209 | 0.4583 | 0.6603 |
| 18 | 524,650.5 | 4,390,894.5 | 0.7351 | 0.5242 | 0.4330 | 0.7173 | 0.6390 | 0.5218 | 0.4670 | 0.6442 |
| 19 | 524,653.5 | 4,390,894.5 | 0.7340 | 0.5122 | 0.4186 | 0.7154 | 0.6370 | 0.5119 | 0.4550 | 0.6346 |
| 20 | 524,656.5 | 4,390,894.5 | 0.7515 | 0.5540 | 0.4627 | 0.7234 | 0.6296 | 0.5043 | 0.4481 | 0.6262 |
| 21 | 524,659.5 | 4,390,894.5 | 0.7433 | 0.5255 | 0.4316 | 0.7160 | 0.5794 | 0.4484 | 0.3964 | 0.5898 |
| 22 | 524,662.5 | 4,390,894.5 | 0.4043 | 0.2404 | 0.2054 | 0.5245 | 0.5225 | 0.3337 | 0.3369 | 0.5666 |
| 23 | 524,665.5 | 4,390,894.5 | 0.2066 | 0.0973 | 0.0871 | 0.4235 | 0.4575 | 0.3210 | 0.2825 | 0.5409 |
| 24 | 524,638.5 | 4,390,891.5 | 0.6226 | 0.4585 | 0.3937 | 0.6033 | 0.5956 | 0.4302 | 0.3691 | 0.6044 |
| 25 | 524,641.5 | 4,390,891.5 | 0.7236 | 0.5139 | 0.4243 | 0.6878 | 0.6354 | 0.5002 | 0.4406 | 0.6419 |

References

  1. Gomez-del-Campo, M.; Pérez, A.G.; García, J.M. Olive Oil Quality of Cultivars Cultivated in Super-High-Density Orchard under Cold Weather Conditions. Horticulturae 2023, 9, 824. [Google Scholar] [CrossRef]
  2. Lambers, H.; Chapin, F.S., III; Pons, T.L. Plant Physiological Ecology. Available online: https://books.google.es/books?hl=es&lr=&id=PXBq6jsT5SYC&oi=fnd&pg=PR2&dq=Lambers+et+al.,+2008&ots=zuGGm4NokF&sig=8mu-F6332AphaGNUfH4kCJrq7gs#v=onepage&q=Lambers%20et%20al.%2C%202008&f=false (accessed on 4 November 2024).
  3. Hatfield, J.L.; Gitelson, A.A.; Schepers, J.S.; Walthall, C.L. Application of Spectral Remote Sensing for Agronomic Decisions. Agron. J. 2008, 100, S-117–S-131. [Google Scholar] [CrossRef]
  4. Wang, Y.P.; Chang, K.W.; Chen, R.K.; Lo, J.C.; Shen, Y. Large-Area Rice Yield Forecasting Using Satellite Imageries. Int. J. Appl. Earth Obs. Geoinf. 2010, 12, 27–35. [Google Scholar] [CrossRef]
  5. Mulla, D.J. Twenty Five Years of Remote Sensing in Precision Agriculture: Key Advances and Remaining Knowledge Gaps. Biosyst. Eng. 2013, 114, 358–371. [Google Scholar] [CrossRef]
  6. Zhang, C.; Kovacs, J.M. The Application of Small Unmanned Aerial Systems for Precision Agriculture: A Review. Precis. Agric. 2012, 13, 693–712. [Google Scholar] [CrossRef]
  7. Gómez, C.; White, J.C.; Wulder, M.A. Optical Remotely Sensed Time Series Data for Land Cover Classification: A Review. ISPRS J. Photogramm. Remote Sens. 2016, 116, 55–72. [Google Scholar] [CrossRef]
  8. Zheng, H.; Cheng, T.; Zhou, M.; Li, D.; Yao, X.; Tian, Y.; Cao, W.; Zhu, Y. Improved Estimation of Rice Aboveground Biomass Combining Textural and Spectral Analysis of UAV Imagery. Precis. Agric. 2019, 20, 611–629. [Google Scholar] [CrossRef]
  9. Bendig, J.; Bolten, A.; Bareth, G. UAV-Based Imaging for Multi-Temporal, Very High Resolution Crop Surface Models to Monitor Crop Growth Variability. Photogramm.—Fernerkund.—Geoinf. 2013, 2013, 551–562. [Google Scholar] [CrossRef]
  10. Lu, D. The Potential and Challenge of Remote Sensing—Based Biomass Estimation. Int. J. Remote Sens. 2006, 27, 1297–1328. [Google Scholar] [CrossRef]
  11. Dindaroğlu, T.; Kılıç, M.; Günal, E.; Gündoğan, R.; Akay, A.E.; Seleiman, M. Multispectral UAV and Satellite Images for Digital Soil Modeling with Gradient Descent Boosting and Artificial Neural Network. Earth Sci. Inf. 2022, 15, 2239–2263. [Google Scholar] [CrossRef]
  12. Semanat, A.S.; Hung, L.R.; Piñol, D.C. Remote Detection of Stress in Cane Crops Through Multispectral Images. Available online: https://www.researchgate.net/publication/363567221_TELEDETECCION_DE_ESTRES_EN_CULTIVOS_DE_CANA_A_TRAVES_DE_IMAGENES_MULTIESPECTRALES_REMOTE_DETECTION_OF_STRESS_IN_CANE_CROPS_THROUGH_MULTISPECTRAL_IMAGES (accessed on 6 November 2024).
  13. Development and Characterization of Healthy Plant Tissue Simulators and with Water Stress for the Calibration of a Multispectral Camera. Available online: https://www.researchgate.net/publication/317415630_DEVELOPMENT_AND_CHARACTERIZATION_OF_HEALTHY_PLANT_TISSUE_SIMULATORS_AND_WITH_WATER_STRESS_FOR_THE_CALIBRATION_OF_A_MULTISPECTRAL_CAMERA (accessed on 6 November 2024).
  14. Zarco-Tejada, P.J.; González-Dugo, V.; Berni, J.A.J. Fluorescence, Temperature and Narrow-Band Indices Acquired from a UAV Platform for Water Stress Detection Using a Micro-Hyperspectral Imager and a Thermal Camera. Remote Sens. Environ. 2012, 117, 322–337. [Google Scholar] [CrossRef]
  15. Calderón, R.; Navas-Cortés, J.A.; Lucena, C.; Zarco-Tejada, P.J. High-Resolution Airborne Hyperspectral and Thermal Imagery for Early Detection of Verticillium Wilt of Olive Using Fluorescence, Temperature and Narrow-Band Spectral Indices. Remote Sens. Environ. 2013, 139, 231–245. [Google Scholar] [CrossRef]
  16. Fernández, J.E. Understanding Olive Adaptation to Abiotic Stresses as a Tool to Increase Crop Performance. Environ. Exp. Bot. 2014, 103, 158–179. [Google Scholar] [CrossRef]
  17. Prousalidis, K.; Bourou, S.; Velivassaki, T.H.; Voulkidis, A.; Zachariadi, A.; Zachariadis, V. Olive Tree Segmentation from UAV Imagery. Drones 2024, 8, 408. [Google Scholar] [CrossRef]
  18. Nolè, G. Remote Sensing Techniques in Olive-Growing: A Review. Curr. Investig. Agric. Curr. Res. 2018, 2, 205–208. [Google Scholar] [CrossRef]
  19. Messina, G.; Peña, J.M.; Vizzari, M.; Modica, G. A Comparison of UAV and Satellites Multispectral Imagery in Monitoring Onion Crop. An application in the ‘Cipolla Rossa di Tropea’ (Italy). Remote Sens. 2020, 12, 3424. [Google Scholar] [CrossRef]
  20. Manfreda, S.; McCabe, M.F.; Miller, P.E.; Lucas, R.; Madrigal, V.P.; Mallinis, G.; Ben Dor, E.; Helman, D.; Estes, L.; Ciraolo, G.; et al. On the Use of Unmanned Aerial Systems for Environmental Monitoring. Remote Sens. 2018, 10, 641. [Google Scholar] [CrossRef]
  21. Jorge, J.; Vallbé, M.; Soler, J.A. Detection of Irrigation Inhomogeneities in an Olive Grove Using the NDRE Vegetation Index Obtained from UAV Images. Eur. J. Remote Sens. 2019, 52, 169–177. [Google Scholar] [CrossRef]
  22. Díaz-Varela, R.A.; De La Rosa, R.; León, L.; Zarco-Tejada, P.J.; Lucieer, A.; Rascher, U.; Bareth, G.; Baghdadi, N.; Thenkabail, P.S. High-Resolution Airborne UAV Imagery to Assess Olive Tree Crown Parameters Using 3D Photo Reconstruction: Application in Breeding Trials. Remote Sens. 2015, 7, 4213–4232. [Google Scholar] [CrossRef]
  23. Bollas, N.; Kokinou, E.; Polychronos, V. Comparison of Sentinel-2 and UAV Multispectral Data for Use in Precision Agriculture: An Application from Northern Greece. Drones 2021, 5, 35. [Google Scholar] [CrossRef]
  24. Xing, Y.; Wang, F.; Xu, F. Above Ground Biomass Estimation By Multi-Source Data Based On Interpretable DNN Model. Int. Geosci. Remote Sens. Symp. 2023, 2023, 1894–1897. [Google Scholar] [CrossRef]
  25. Jurado, J.M.; Ortega, L.; Cubillas, J.J.; Feito, F.R. Multispectral Mapping on 3D Models and Multi-Temporal Monitoring for Individual Characterization of Olive Trees. Remote Sens. 2020, 12, 1106. [Google Scholar] [CrossRef]
  26. Houborg, R.; McCabe, M.F. A Cubesat Enabled Spatio-Temporal Enhancement Method (CESTEM) Utilizing Planet, Landsat and MODIS Data. Remote Sens. Environ. 2018, 209, 211–226. [Google Scholar] [CrossRef]
  27. Safonova, A.; Guirado, E.; Maglinets, Y.; Alcaraz-Segura, D.; Tabik, S. Olive Tree Biovolume from UAV Multi-Resolution Image Segmentation with Mask R-CNN. Sensors 2021, 21, 1617. [Google Scholar] [CrossRef]
  28. Weiss, M.; Jacob, F.; Duveiller, G. Remote Sensing for Agricultural Applications: A Meta-Review. Remote Sens. Environ. 2020, 236, 111402. [Google Scholar] [CrossRef]
  29. Khaliq, A.; Comba, L.; Biglia, A.; Ricauda Aimonino, D.; Chiaberge, M.; Gay, P. Comparison of Satellite and UAV-Based Multispectral Imagery for Vineyard Variability Assessment. Remote Sens. 2019, 11, 436. [Google Scholar] [CrossRef]
  30. Vicent, J. Métodos de Documentación, Análisis y Conservación No Invasivos Para El Arte Rupestre Postpaleolítico: Radiometría de Campo e Imágenes Multiespectrales. Ensayos En La Cueva Del Tío Garroso (Alacón, Teruel). Available online: https://www.academia.edu/14617570/M%C3%A9todos_de_documentaci%C3%B3n_an%C3%A1lisis_y_conservaci%C3%B3n_no_invasivos_para_el_arte_rupestre_postpaleol%C3%ADtico_radiometr%C3%ADa_de_campo_e_im%C3%A1genes_multiespectrales_Ensayos_en_la_cueva_del_t%C3%ADo_Garroso_Alac%C3%B3n_Teruel_ (accessed on 12 November 2024).
  31. Zhou, X.; Zheng, H.B.; Xu, X.Q.; He, J.Y.; Ge, X.K.; Yao, X.; Cheng, T.; Zhu, Y.; Cao, W.X.; Tian, Y.C. Predicting Grain Yield in Rice Using Multi-Temporal Vegetation Indices from UAV-Based Multispectral and Digital Imagery. ISPRS J. Photogramm. Remote Sens. 2017, 130, 246–255. [Google Scholar] [CrossRef]
  32. PlanetScope—Earth Online. Available online: https://earth.esa.int/eogateway/missions/planetscope (accessed on 6 November 2024).
  33. Zhou, Y.; Zhang, L.; Xiao, J.; Chen, S.; Kato, T.; Zhou, G. A Comparison of Satellite-Derived Vegetation Indices for Approximating Gross Primary Productivity of Grasslands. Rangel. Ecol. Manag. 2014, 67, 9–18. [Google Scholar] [CrossRef]
  34. Bannari, A.; Morin, D.; Bonn, F.; Huete, A.R. A Review of Vegetation Indices. Remote Sens. Rev. 1995, 13, 95–120. [Google Scholar] [CrossRef]
  35. Connor, D.J.; Gómez-del-Campo, M. Simulation of Oil Productivity and Quality of N–S Oriented Olive Hedgerow Orchards in Response to Structure and Interception of Radiation. Sci. Hortic. 2013, 150, 92–99. [Google Scholar] [CrossRef]
  36. Rallo, L.; Díez, C.M.; Morales-Sillero, A.; Miho, H.; Priego-Capote, F.; Rallo, P. Quality of Olives: A Focus on Agricultural Preharvest Factors. Sci. Hortic. 2018, 233, 491–509. [Google Scholar] [CrossRef]
  37. Medina-Alonso, M.G.; Cabezas, J.M.; Ríos-Mesa, D.; Lorite, I.J.; León, L.; de la Rosa, R. Flowering Phenology of Olive Cultivars in Two Climate Zones with Contrasting Temperatures (Subtropical and Mediterranean). Agriculture 2023, 13, 1312. [Google Scholar] [CrossRef]
  38. Kelcey, J.; Lucieer, A. Sensor Correction of a 6-Band Multispectral Imaging Sensor for UAV Remote Sensing. Remote Sens. 2012, 4, 1462–1493. [Google Scholar] [CrossRef]
  39. Cooley, T.; Anderson, G.P.; Felde, G.W.; Hoke, M.L.; Ratkowski, A.J.; Chetwynd, J.H.; Gardner, J.A.; Adler-Golden, S.M.; Matthew, M.W.; Berk, A.; et al. FLAASH, a MODTRAN4-Based Atmospheric Correction Algorithm, Its Applications and Validation. Int. Geosci. Remote Sens. Symp. 2002, 3, 1414–1418. [Google Scholar] [CrossRef]
  40. Mahiny, A.S.; Turner, B.J. A Comparison of Four Common Atmospheric Correction Methods. Photogramm. Eng. Remote Sens. 2007, 73, 361–368. [Google Scholar] [CrossRef]
  41. Zhu, W.; Xia, W. Effects of Atmospheric Correction on Remote Sensing Statistical Inference in an Aquatic Environment. Remote Sens. 2023, 15, 1907. [Google Scholar] [CrossRef]
  42. Daniels, L.; Eeckhout, E.; Wieme, J.; Dejaegher, Y.; Audenaert, K.; Maes, W.H. Identifying the Optimal Radiometric Calibration Method for UAV-Based Multispectral Imaging. Remote Sens. 2023, 15, 2909.
  43. Xue, B.; Ming, B.; Xin, J.; Yang, H.; Gao, S.; Guo, H.; Feng, D.; Nie, C.; Wang, K.; Li, S. Radiometric Correction of Multispectral Field Images Captured under Changing Ambient Light Conditions and Applications in Crop Monitoring. Drones 2023, 7, 223.
  44. Zarzar, C.M.; Dash, P.; Dyer, J.L.; Moorhead, R.; Hathcock, L. Development of a Simplified Radiometric Calibration Framework for Water-Based and Rapid Deployment Unmanned Aerial System (UAS) Operations. Drones 2020, 4, 17.
  45. Mamaghani, B.; Salvaggio, C. Multispectral Sensor Calibration and Characterization for SUAS Remote Sensing. Sensors 2019, 19, 4453.
  46. Bendig, J.; Yu, K.; Aasen, H.; Bolten, A.; Bennertz, S.; Broscheit, J.; Gnyp, M.L.; Bareth, G. Combining UAV-Based Plant Height from Crop Surface Models, Visible, and near Infrared Vegetation Indices for Biomass Monitoring in Barley. Int. J. Appl. Earth Obs. Geoinf. 2015, 39, 79–87.
  47. Revelo Luna, D.A.; Mejía Manzano, J.; Montoya Bonilla, B.; Hoyos García, J. Analysis of the Vegetation Indices NDVI, GNDVI, and NDRE for the Characterization of Coffee Crops (Coffea arabica). Ing. Desarro. 2021, 38, 298–312.
  48. Šiljeg, A.; Marinović, R.; Domazetović, F.; Jurišić, M.; Marić, I.; Panđa, L.; Radočaj, D.; Milošević, R. GEOBIA and Vegetation Indices in Extracting Olive Tree Canopies Based on Very High-Resolution UAV Multispectral Imagery. Appl. Sci. 2023, 13, 739.
  49. Albornoz, V.M.; Ñanco, L.J.; Sáez, J.L. Delineating Robust Rectangular Management Zones Based on Column Generation Algorithm. Comput. Electron. Agric. 2019, 161, 194–201.
  50. de Oliveira Maia, F.C.; Bufon, V.B.; Leão, T.P. Vegetation Indices as a Tool for Mapping Sugarcane Management Zones. Precis. Agric. 2023, 24, 213–234.
  51. Naqvi, S.M.Z.A.; Tahir, M.N.; Shah, G.A.; Sattar, R.S.; Awais, M. Remote Estimation of Wheat Yield Based on Vegetation Indices Derived from Time Series Data of Landsat 8 Imagery. Appl. Ecol. Environ. Res. 2019, 17, 3909–3925.
  52. Pettorelli, N.; Vik, J.O.; Mysterud, A.; Gaillard, J.M.; Tucker, C.J.; Stenseth, N.C. Using the Satellite-Derived NDVI to Assess Ecological Responses to Environmental Change. Trends Ecol. Evol. 2005, 20, 503–510.
  53. Yuniasih, B.; Adji, A.R.P.; Budi, B. Evaluation of Pre-Replanting Oil Palm Plant Health Using the NDVI Index from Landsat 8 Satellite Imagery. J. Tek. Pertan. Lampung J. Agric. Eng. 2022, 11, 304–313.
  54. Priya, M.V.; Kalpana, R.; Pazhanivelan, S.; Kumaraperumal, R.; Ragunath, K.P.; Vanitha, G.; Nihar, A.; Prajesh, P.J.; Vasumathi, V. Monitoring Vegetation Dynamics Using Multi-Temporal Normalized Difference Vegetation Index (NDVI) and Enhanced Vegetation Index (EVI) Images of Tamil Nadu. J. Appl. Nat. Sci. 2023, 15, 1170–1177.
  55. Boiarskii, B. Comparison of NDVI and NDRE Indices to Detect Differences in Vegetation and Chlorophyll Content. J. Mech. Contin. Math. Sci. 2019, 4, 20–29.
  56. Barbosa, J.d.A.; de Faria, R.T.; Coelho, A.P.; Dalri, A.B.; Palaretti, L.F. Nitrogen Fertilization Management in White Oat Using Spectral Indices. Pesqui. Agropecu. Trop. 2020, 50, e64924.
  57. Veneros, J.; Chavez, S.; Oliva, M.; Arellanos, E.; Maicelo, J.L.; García, L. Comparing Six Vegetation Indexes between Aquatic Ecosystems Using a Multispectral Camera and a Parrot Disco-Pro Ag Drone, the ArcGIS, and the Family Error Rate: A Case Study of the Peruvian Jalca. Water 2023, 15, 3103.
  58. Zebarth, B.J.; Younie, M.; Paul, J.W.; Bittman, S. Evaluation of Leaf Chlorophyll Index for Making Fertilizer Nitrogen Recommendations for Silage Corn in a High Fertility Environment. Commun. Soil. Sci. Plant Anal. 2002, 33, 665–684.
  59. Macedo, F.L.; Nóbrega, H.; de Freitas, J.G.R.; Ragonezi, C.; Pinto, L.; Rosa, J.; Pinheiro de Carvalho, M.A.A. Estimation of Productivity and Above-Ground Biomass for Corn (Zea mays) via Vegetation Indices in Madeira Island. Agriculture 2023, 13, 1115.
  60. Elhag, M.; Gitas, I.; Othman, A.; Bahrawi, J.; Gikas, P. Assessment of Water Quality Parameters Using Temporal Remote Sensing Spectral Reflectance in Arid Environments, Saudi Arabia. Water 2019, 11, 556.
  61. Rahman, M.M.; Robson, A. Integrating Landsat-8 and Sentinel-2 Time Series Data for Yield Prediction of Sugarcane Crops at the Block Level. Remote Sens. 2020, 12, 1313.
  62. Zhibo, L.; Yaqin, L.; Huanmin, Z.; Junta, W.; Yiyuan, G.; Xinwei, Q. Study on the Decision-Making Scheme of Soil-Rice Canopy Variable Nitrogen Application Based on GNDVI Index. J. Chin. Agric. Mech. 2022, 43, 160.
  63. Zhou, X.; Hu, K.; Xiao, H.; Yang, Y.; Chen, J.; Cheng, Y. Effects of Vegetation on the Spatiotemporal Distribution of Soil Water Content in Re-Vegetated Slopes Using Temporal Stability Analysis. Catena 2024, 234, 107570.
  64. Bodor-Pesti, P.; Taranyi, D.; Nyitrainé Sárdy, D.Á.; Le Phuong Nguyen, L.; Baranyai, L. Correlation of the Grapevine (Vitis vinifera L.) Leaf Chlorophyll Concentration with RGB Color Indices. Horticulturae 2023, 9, 899.
  65. Zhong, F.; Cheng, Q.; Ge, Y. Relationships between Spatial and Temporal Variations in Precipitation, Climatic Indices, and the Normalized Differential Vegetation Index in the Upper and Middle Reaches of the Heihe River Basin, Northwest China. Water 2019, 11, 1394.
  66. Pertille, C.T.; Nicoletti, M.F. Discrimination of Forest Species Using Medium Spatial Resolution Images. Adv. For. Sci. 2021, 8, 1575–1581.
  67. Telesca, L.; Lanorte, A.; Lasaponara, R. Investigating Dynamical Trends in Burned and Unburned Vegetation Covers Using SPOT-VGT NDVI Data. J. Geophys. Eng. 2007, 4, 128–138.
  68. Saavedra Mora, D.; Cubillos Ortiz, A.; Machado Cuellar, L.; Murcia Torrejano, V.; Méndez Pastrana, D.A. Análisis de Índices de Vegetación en el Cultivo de Arroz en la Finca la Tebaida del Municipio de Campoalegre. Rev. Agropecu. Agroind. Angostura 2018, 5.
  69. Loh, W.Y. Classification and Regression Trees. Wiley Interdiscip. Rev. Data Min. Knowl. Discov. 2011, 1, 14–23.
  70. Breiman, L.; Friedman, J.H.; Olshen, R.A.; Stone, C.J. Classification and Regression Trees; Chapman and Hall/CRC: Boca Raton, FL, USA, 2017; pp. 1–358.
  71. Psomiadis, E.; Dercas, N.; Dalezios, N.R.; Spiropoulos, N.V. Evaluation and Cross-Comparison of Vegetation Indices for Crop Monitoring from Sentinel-2 and Worldview-2 Images. Remote Sens. Agric. Ecosyst. Hydrol. XIX 2017, 79, 104211B.
  72. Marino, S.; Alvino, A. Detection of Homogeneous Wheat Areas Using Multi-Temporal UAS Images and Ground Truth Data Analyzed by Cluster Analysis. Eur. J. Remote Sens. 2018, 51, 266–275.
  73. Marino, S.; Alvino, A. Agronomic Traits Analysis of Ten Winter Wheat Cultivars Clustered by UAV-Derived Vegetation Indices. Remote Sens. 2020, 12, 249.
  74. Mimenbayeva, A.; Artykbayev, S.; Suleimenova, R.; Abdygalikova, G.; Naizagarayeva, A.; Ismailova, A. Determination of the Number of Clusters of Normalized Vegetation Indices Using the k-Means Algorithm. East.-Eur. J. Enterp. Technol. 2023, 5, 42–55.
  75. Blachowski, J.; Dynowski, A.; Buczyńska, A.; Ellefmo, S.L.; Walerysiak, N. Integrated Spatiotemporal Analysis of Vegetation Condition in a Complex Post-Mining Area: Lignite Mine Case Study. Remote Sens. 2023, 15, 3067.
  76. Navarro, R.; Wirkus, L.; Dubovyk, O. Spatio-Temporal Assessment of Olive Orchard Intensification in the Saïss Plain (Morocco) Using k-Means and High-Resolution Satellite Data. Remote Sens. 2022, 15, 50.
  77. Roma, E.; Laudicina, V.A.; Vallone, M.; Catania, P. Application of Precision Agriculture for the Sustainable Management of Fertilization in Olive Groves. Agronomy 2023, 13, 324.
  78. Caruso, G.; Palai, G.; Marra, F.P.; Caruso, T. High-Resolution UAV Imagery for Field Olive (Olea europaea L.) Phenotyping. Horticulturae 2021, 7, 258.
  79. Li, M.; Shamshiri, R.R.; Weltzien, C.; Schirrmann, M. Crop Monitoring Using Sentinel-2 and UAV Multispectral Imagery: A Comparison Case Study in Northeastern Germany. Remote Sens. 2022, 14, 4426.
  80. Hao, G.; Dong, Z.; Hu, L.; Ouyang, Q.; Pan, J.; Liu, X.; Yang, G.; Sun, C. Biomass Inversion of Highway Slope Based on Unmanned Aerial Vehicle Remote Sensing and Deep Learning. Forests 2024, 15, 1564.
  81. Messina, G.; Praticò, S.; Badagliacca, G.; Di Fazio, S.; Monti, M.; Modica, G. Monitoring Onion Crop “Cipolla Rossa Di Tropea Calabria IGP” Growth and Yield Response to Varying Nitrogen Fertilizer Application Rates Using UAV Imagery. Drones 2021, 5, 61.
  82. Matese, A.; Toscano, P.; Di Gennaro, S.F.; Genesio, L.; Vaccari, F.P.; Primicerio, J.; Belli, C.; Zaldei, A.; Bianconi, R.; Gioli, B. Intercomparison of UAV, Aircraft and Satellite Remote Sensing Platforms for Precision Viticulture. Remote Sens. 2015, 7, 2971–2990.
  83. Pádua, L.; Marques, P.; Hruška, J.; Adão, T.; Peres, E.; Morais, R.; Sousa, J.J. Multi-Temporal Vineyard Monitoring through UAV-Based RGB Imagery. Remote Sens. 2018, 10, 1907.
  84. Deng, L.; Mao, Z.; Li, X.; Hu, Z.; Duan, F.; Yan, Y. UAV-Based Multispectral Remote Sensing for Precision Agriculture: A Comparison between Different Cameras. ISPRS J. Photogramm. Remote Sens. 2018, 146, 124–136.
  85. Samadzadegan, F.; Toosi, A.; Dadrass Javan, F. A Critical Review on Multi-Sensor and Multi-Platform Remote Sensing Data Fusion Approaches: Current Status and Prospects. Int. J. Remote Sens. 2025, 46, 1327–1402.
  86. Wang, J.; Zhang, S.; Lizaga, I.; Zhang, Y.; Ge, X.; Zhang, Z.; Zhang, W.; Huang, Q.; Hu, Z. UAS-Based Remote Sensing for Agricultural Monitoring: Current Status and Perspectives. Comput. Electron. Agric. 2024, 227, 109501.
  87. Zhu, H.; Lin, C.; Liu, G.; Wang, D.; Qin, S.; Li, A.; Xu, J.L.; He, Y. Intelligent Agriculture: Deep Learning in UAV-Based Remote Sensing Imagery for Crop Diseases and Pests Detection. Front. Plant Sci. 2024, 15, 1435016.
  88. Zhu, H.; Lin, C.; Dong, Z.; Xu, J.L.; He, Y. Early Yield Prediction of Oilseed Rape Using UAV-Based Hyperspectral Imaging Combined with Machine Learning Algorithms. Agriculture 2025, 15, 1100.
  89. Farooq, M.; Hussain, M.; Ul-Allah, S.; Siddique, K.H.M. Physiological and Agronomic Approaches for Improving Water-Use Efficiency in Crop Plants. Agric. Water Manag. 2019, 219, 95–108.
Figure 1. Typical spectral curves showing the difference in reflectance in the visible, red-edge and NIR bands between healthy and stressed vegetation and soil (https://physicsopenlab.org/2017/01/30/ndvi-index, accessed on 30 July 2023).
Figure 2. Spectral band ranges of the two sensors. Top: Planet’s SuperDove satellite (https://www.planet.com/products/, accessed on 30 July 2023); bottom: Altum PT sensor (https://ageagle.com/drone-sensors/altum-pt-camera, accessed on 30 July 2023), highlighting their differences and complementarities.
Figure 3. (a) Geographical location of the super-intensive olive grove under study in Osa de la Vega, Cuenca (Spain). (b) Distribution and linear structure of the trees in the super-intensive olive grove, designed for mechanization and high productivity.
Figure 4. (a) DJI Matrice 300 RTK UAV platform equipped with an Altum PT sensor. (b) Reflectance calibration panel used to correct multispectral images.
Figure 5. Methodological approach adopted in this study, from the acquisition of satellite and UAV images to the calculation of indices and statistical analysis.
Figure 6. Map of the NDRE index obtained from satellite images (Planet) in April 2023, showing the spatial distribution of plant vigour.
Figure 7. Map of the NDRE index derived from UAV images in April 2023, with higher spatial resolution and detail at the individual level.
Figure 8. Spearman correlation graph between LCI values obtained from satellite and UAV imagery in April 2023, showing a significant positive relationship.
Figure 9. Scatter plot of LCI values between UAV and satellite imagery in April 2023, showing correspondence in vigour patterns.
Figure 10. Curvilinear regression fit between NDRE_UAV and satellite LCI (January 2023), exemplifying predictive index analysis.
Figure 11. Non-parametric decision model for predicting NDRE UAV values from satellite indices (January 2023).
Figure 12. Results of decision tree model training using satellite data (January 2023).
Figure 13. Validation of the decision tree model with test data, showing good predictive power without signs of overfitting.
Figure 14. Results of the five-cluster classification of the plot (April 2023, satellite images), with homogeneous management areas.
Figure 15. Zoning map of the plot classified into five clusters (April 2023), based on spectral indices from UAV images.
Figure 16. Results of ten-cluster classification (April 2023, satellite images), with better discrimination of intermediate zones.
Figure 17. Zoning map of the plot classified into ten clusters (April 2023), derived from UAV images.
Figure 18. Comparison between five- and ten-cluster zoning, showing greater accuracy and robustness of classification with ten groups.
Figure 19. Identification of problem areas in olive groves during April 2023 using clusters, highlighting areas of intra-plot variability.
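Figures 11–13 describe a decision-tree model that predicts UAV-derived NDRE from satellite indices. As an illustrative sketch of the underlying mechanics (not the authors' implementation), a regression tree grows by repeatedly choosing, for a candidate predictor, the split threshold that most reduces the squared error of the target; a minimal single-split ("stump") version in Python:

```python
def best_split(x, y):
    """One-node regression tree (a 'stump'): find the threshold on a
    satellite index x that most reduces the squared error of the UAV
    index y when the data are divided into two leaves."""
    def sse(values):
        # Sum of squared errors around the leaf mean.
        if not values:
            return 0.0
        m = sum(values) / len(values)
        return sum((v - m) ** 2 for v in values)

    pairs = sorted(zip(x, y))
    best_t, best_cost = None, float("inf")
    for i in range(1, len(pairs)):
        # Candidate threshold halfway between consecutive x values.
        t = (pairs[i - 1][0] + pairs[i][0]) / 2
        left = [b for _, b in pairs[:i]]
        right = [b for _, b in pairs[i:]]
        cost = sse(left) + sse(right)
        if cost < best_cost:
            best_t, best_cost = t, cost
    return best_t
```

A full CART tree (refs. 69, 70) simply applies this search recursively to each resulting leaf, over all candidate predictors.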
Table 1. Basic specifications of the DJI Matrice 300 RTK UAV.
Weight: approx. 6.3 kg (with one gimbal)
Max. transmitting distance: 8 km
Max. flight time: 40 min
Dimensions: 810 × 670 × 430 mm
Max. payload: 2.7 kg
Max. speed: 70 km/h
GNSS: GPS + GLONASS + BeiDou + Galileo
RTK positioning accuracy: 1 cm + 1 ppm (horizontal); 1.5 cm + 1 ppm (vertical)
Table 2. GCP accuracy in meters (m) using 4 support points.
                            X        Y        Z
Mean (m)                    0.006    0.007    0.022
Standard deviation (m)      0.007    0.009    0.087
RMS error (m)               0.010    0.011    0.090
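The three accuracy measures in Table 2 are not independent: for a set of residuals, the RMS error satisfies RMS² = mean² + σ² when σ is the population standard deviation, which the X and Y columns reproduce up to rounding. A small sketch with hypothetical residual values:

```python
import math

def accuracy_stats(residuals):
    """Mean, population standard deviation, and RMS error of a list of
    GCP residuals (in metres). Illustrative helper, not the paper's code."""
    n = len(residuals)
    mean = sum(residuals) / n
    sd = math.sqrt(sum((r - mean) ** 2 for r in residuals) / n)
    rms = math.sqrt(sum(r ** 2 for r in residuals) / n)
    return mean, sd, rms
```

For any input, the identity rms² = mean² + sd² holds exactly with the population form of the standard deviation.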
Table 3. Descriptive statistics for each of the indices and data analysed.

29 January 2023             NDVI               NDRE               LCI                GNDVI
                            Planet    UAV      Planet    UAV      Planet    UAV      Planet    UAV
Mean                        0.6951    0.7295   0.5230    0.4367   0.5820    0.5270   0.7036    0.7246
Standard error              0.0009    0.0012   0.0008    0.0009   0.0009    0.0010   0.0006    0.0008
Median                      0.7291    0.7656   0.5491    0.4598   0.6136    0.5559   0.7255    0.7484
Standard deviation          0.0919    0.1309   0.0829    0.0902   0.0910    0.1098   0.0622    0.0807
Variance                    0.0085    0.0171   0.0069    0.0081   0.0083    0.0121   0.0039    0.0065
Asymmetry (skewness)        −2.179    −3.1246  −1.9342   −2.7228  −2.0689   −2.8216  −2.1231   −2.8967
Kurtosis                    3.9581    9.2161   3.2190    7.1368   3.5579    7.5451   3.9454    8.1179
Std. error of kurtosis      0.0465    0.0465   0.0465    0.0465   0.0465    0.0465   0.0465    0.0465
Range                       0.5416    0.7589   0.5068    0.5137   0.5337    0.6189   0.3625    0.5174
Minimum                     0.2866    0.1141   0.1591    0.0447   0.1717    0.0486   0.4257    0.3010
Maximum                     0.8283    0.8730   0.6659    0.5584   0.7104    0.6675   0.7882    0.8183

15 April 2023               NDVI               NDRE               LCI                GNDVI
                            Planet    UAV      Planet    UAV      Planet    UAV      Planet    UAV
Mean                        0.5638    0.5743   0.3219    0.3983   0.3869    0.4454   0.6481    0.6029
Standard error              0.0006    0.0010   0.0007    0.0005   0.0008    0.0006   0.0006    0.0004
Median                      0.5741    0.5908   0.3306    0.4076   0.3972    0.4558   0.6602    0.6126
Standard deviation          0.0639    0.1101   0.0685    0.0542   0.0857    0.0605   0.0637    0.0446
Variance                    0.0041    0.0121   0.0047    0.0029   0.0073    0.0037   0.0041    0.0020
Asymmetry (skewness)        −1.1865   −1.4860  −1.3130   −1.1420  −1.2830   −1.2102  −1.7775   −1.5064
Kurtosis                    1.4911    3.0998   2.4376    1.7592   2.3471    1.7690   4.0823    2.6100
Std. error of kurtosis      0.0465    0.0465   0.0465    0.0465   0.0465    0.0465   0.0465    0.0465
Range                       0.4133    0.6802   0.4604    0.4218   0.5546    0.4312   0.4422    0.3317
Minimum                     0.2811    0.1253   0.0481    0.1452   0.0529    0.1624   0.3421    0.3735
Maximum                     0.6944    0.8055   0.5084    0.5670   0.6075    0.5937   0.7843    0.7052
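The statistics in Table 3 are standard descriptive moments computed per index, sensor, and date. As a hedged sketch (the names `describe` and its keys are illustrative, and population moments are used here, whereas statistics packages such as SPSS apply small-sample adjustments to skewness and kurtosis), the quantities can be computed with the standard library alone:

```python
import statistics as st

def describe(x):
    """Descriptive statistics for a list of index values, mirroring the
    rows of Table 3 (population skewness and excess kurtosis)."""
    n = len(x)
    mean = st.mean(x)
    sd = st.stdev(x)                       # sample standard deviation
    moment = lambda k: sum((v - mean) ** k for v in x) / n
    skew = moment(3) / moment(2) ** 1.5    # asymmetry
    kurt = moment(4) / moment(2) ** 2 - 3  # excess kurtosis
    return {"mean": mean, "median": st.median(x), "sd": sd,
            "variance": sd ** 2, "skewness": skew, "kurtosis": kurt,
            "range": max(x) - min(x), "se": sd / n ** 0.5}
```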
Table 4. Spearman correlations, 290123.
29 January 2023     NDVI Planet   NDRE Planet   LCI Planet   GNDVI Planet
NDVI UAV            0.586         0.524         0.544        0.523
NDRE UAV            0.613         0.617         0.639        0.608
LCI UAV             0.604         0.608         0.625        0.597
GNDVI UAV           0             0.519         0.588        0.554
Table 5. Spearman correlations, 150423.
15 April 2023       NDVI Planet   NDRE Planet   LCI Planet   GNDVI Planet
NDVI UAV            0.543         0.976         0.998        0.939
NDRE UAV            0.89          0.538         0.529        0.479
LCI UAV             0.943         0.563         0.553        0.5
GNDVI UAV           0.903         0.547         0.536        0.544
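Tables 4 and 5 report Spearman rank correlations between UAV- and satellite-derived indices. As an illustrative sketch (not the authors' code), the coefficient is simply the Pearson correlation of the rank-transformed values, with tied observations assigned their average rank:

```python
def spearman_rho(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks,
    using average ranks for tie groups."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        i = 0
        while i < len(order):
            j = i
            while j + 1 < len(order) and v[order[j + 1]] == v[order[i]]:
                j += 1                      # extend over a group of ties
            avg = (i + j) / 2 + 1           # average rank of the group
            for k in range(i, j + 1):
                r[order[k]] = avg
            i = j + 1
        return r

    rx, ry = ranks(x), ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

Because it operates on ranks, the coefficient is insensitive to the different radiometric scales of the two sensors, which is why it is preferred over Pearson's r for this comparison.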
Table 6. The results of the paired-samples Student’s t-test for image 290123.

Paired-Samples Test, 29/01/23 (95% confidence interval of the difference)
Pair                                 Mean      Std. Dev.  Std. Error Mean  Lower     Upper     t         df       Sig. (2-Tailed)
Pair 1: NDVI Planet − NDVI UAV       −0.0343   0.0714     0.0007           −0.0357   −0.0330   −50.692   11,103   0.000
Pair 2: NDRE Planet − NDRE UAV       −0.0384   0.0649     0.0006           −0.0396   −0.0372   −62.249   11,103   0.000
Pair 3: LCI Planet − LCI UAV          0.0550   0.0550     0.0005            0.0540    0.0560   105.273   11,103   0.000
Pair 4: GNDVI Planet − GNDVI UAV     −0.0210   0.0412     0.0004           −0.0218   −0.0202   −53.661   11,103   0.000
Table 7. The results of the paired-samples Student’s t-test for image 150423.

Paired-Samples t-Test, 15/04/23 (95% confidence interval of the difference)
Pair                                 Mean      Std. Dev.  Std. Error Mean  Lower     Upper     t         df       Sig. (2-Tailed)
Pair 1: NDVI Planet − NDVI UAV       −0.0105   0.0816     0.0007           −0.0120   −0.0090   −13.56    11,103   <0.001
Pair 2: NDRE Planet − NDRE UAV       −0.0764   0.0501     0.0005           −0.0773   −0.0754   −160.5    11,103   0.000
Pair 3: LCI Planet − LCI UAV          0.0586   0.0615     0.0006            0.0574    0.0597   100.4     11,103   0.000
Pair 4: GNDVI Planet − GNDVI UAV      0.0452   0.0443     0.0004            0.0444    0.0460   107.39    11,103   0.000
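The paired tests in Tables 6 and 7 compare each tree's index value from the two sensors. A minimal sketch of the computation (illustrative only; with n = 11,104 pairs the t distribution is effectively normal, so the 1.96 critical value is used here instead of an exact t quantile):

```python
import math
import statistics as st

def paired_t(a, b, z95=1.96):
    """Paired-samples t statistic, degrees of freedom, and approximate
    95% confidence interval for the mean of the pairwise differences."""
    d = [x - y for x, y in zip(a, b)]     # per-plant differences
    n = len(d)
    mean = st.mean(d)
    se = st.stdev(d) / math.sqrt(n)       # standard error of the mean
    t = mean / se
    return t, n - 1, (mean - z95 * se, mean + z95 * se)
```

Note that t is just the mean difference divided by its standard error, which is how the corrected means in Tables 6 and 7 can be cross-checked against the printed t values.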
Table 8. Curvilinear estimation, NDRE/LCI 290123.

Model Summary and Parameter Estimates, 29/01/23. Dependent variable: NDRE UAV.
Equation       R-Squared  F           df1  df2     Sig.   Constant  b1      b2     b3
Linear         0.753      33,823.186  1    11,102  0.000  −0.063    0.859
Logarithmic    0.788      41,331.599  1    11,102  0.000   0.664    0.407
Quadratic      0.805      22,909.451  2    11,101  0.000  −0.549    3.003   −2.2
Cubic          0.805      22,909.451  2    11,101  0.000  −0.549    3.003   −2.2   0.000
Power          0.759      34,888.143  1    11,102  0.000   0.997    1.556
Exponential    0.701      25,980.686  1    11,102  0.000   0.064    3.23
The independent variable is LCI Planet.
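Each row of Table 8 is an ordinary least-squares fit of NDRE UAV against a transformed LCI Planet. As a sketch (illustrative, not the authors' code), the logarithmic model y = b0 + b1·ln(x) reduces to simple linear regression after transforming the predictor:

```python
import math

def fit_logarithmic(x, y):
    """Least-squares fit of y = b0 + b1*ln(x), one of the curvilinear
    models of Table 8. Returns (b0, b1, R^2)."""
    lx = [math.log(v) for v in x]          # transform the predictor
    n = len(x)
    mx, my = sum(lx) / n, sum(y) / n
    b1 = (sum((a - mx) * (b - my) for a, b in zip(lx, y))
          / sum((a - mx) ** 2 for a in lx))
    b0 = my - b1 * mx
    yhat = [b0 + b1 * a for a in lx]
    ss_res = sum((b - h) ** 2 for b, h in zip(y, yhat))
    ss_tot = sum((b - my) ** 2 for b in y)
    return b0, b1, 1 - ss_res / ss_tot
```

The quadratic, cubic, power, and exponential rows follow the same pattern with different transformations of x (and, for power and exponential, of y).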
Table 9. Regression models for Planet image 290123.

Coefficients, 29/01/23 a
Model  Predictor       B        Std. Error  Beta    t         Sig.     Zero-Order  Partial  Part    Tolerance  VIF
1      (Constant)      −0.063   0.003               −23.021   <0.001
       LCI Planet       0.859   0.005       0.868   183.911   <0.001   0.868       0.868    0.868   1.000      1.000
2      (Constant)      −0.194   0.009               −20.485   <0.001
       LCI Planet       0.585   0.020       0.591   29.935    <0.001   0.868       0.273    0.140   0.056      17.85
       GNDVI Planet     0.412   0.029       0.285   16.814    <0.001   0.859       0.135    0.067   0.056      17.85
3      (Constant)      −0.186   0.009               −19.566   <0.001
       LCI Planet       0.439   0.026       0.444   16.814    <0.001   0.868       0.158    0.078   0.031      32.075
       GNDVI Planet     0.304   0.031       0.210   9.72      <0.001   0.859       0.092    0.045   0.047      21.488
       NDVI Planet      0.22    0.026       0.224   8.389     <0.001   0.864       0.079    0.039   0.030      32.843
a Dependent variable: MEDIA290123_NDRE_DRON (NDRE UAV).
Table 10. Collinearity diagnostics.

Collinearity Diagnostics, 29/01/23 a (variance proportions)
Model  Dimension  Eigenvalue  Condition Index  (Constant)  LCI Planet  GNDVI Planet  NDVI Planet
1      1          1.988       1.000            0.01        0.01
       2          0.012       12.863           0.99        0.99
2      1          2.988       1.000            0.00        0.00        0.00
       2          0.012       15.740           0.09        0.05        0.00
       3          0.000       101.808          0.91        0.95        1.00
3      1          3.985       1.000            0.00        0.00        0.00          0.00
       2          0.014       16.937           0.09        0.01        0.00          0.00
       3          0.000       104.328          0.14        0.91        0.01          0.74
       4          0.000       121.915          0.77        0.08        0.99          0.26
a Dependent variable: MEDIA290123_NDRE_DRON (NDRE UAV).
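The tolerance and VIF columns of Table 9, which Table 10 diagnoses further, are two views of the same quantity: regressing predictor j on the remaining predictors gives an R²_j, tolerance is 1 − R²_j, and VIF is its reciprocal. A one-line sketch (illustrative helper name):

```python
def vif(r_squared_j):
    """Variance inflation factor and tolerance for predictor j, given the
    R^2 of regressing that predictor on the remaining predictors."""
    tolerance = 1 - r_squared_j
    return 1 / tolerance, tolerance
```

For example, the tolerance of 0.056 reported for LCI Planet in model 2 of Table 9 corresponds to R²_j ≈ 0.944 and hence a VIF near 17.85, flagging the strong collinearity among the vegetation indices.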
Table 11. Summary of model 1.

Model Summary, 29/01/23 b
Model  R        R-Squared  Adjusted R-Squared  Std. Error of the Estimate  Durbin–Watson
1      0.896 a  0.803      0.803               0.0399706                   1.001
a Predictors: (Constant), LCI Planet; b dependent variable: NDRE UAV.
Table 12. Summary of the index values in each cluster in Planet image 150423 for five clusters.

Planet   Cluster 1   Cluster 2    Cluster 3     Cluster 4   Cluster 5
Size     0.5% (61)   6.1% (680)   15.7% (1743)  9% (1001)   68.6% (7619)
GNDVI    0.38        0.47         0.61          0.72        0.67
LCI      0.09        0.16         0.33          0.51        0.41
NDRE     0.08        0.14         0.27          0.42        0.34
NDVI     0.36        0.42         0.51          0.6         0.59
Table 13. Summary of index values in each cluster for Planet image 150423 for ten clusters.

Planet   Cluster:  1          2           3           4           5           6           7           8           9           10
Size               0.2% (27)  3.7% (412)  1.3% (142)  2.4% (270)  7.2% (800)  2.3% (254)  36% (3994)  1.9% (214)  7.6% (849)  37.3% (4142)
GNDVI              0.37       0.44        0.49        0.53        0.6         0.74        0.69        0.73        0.68        0.64
LCI                0.09       0.12        0.2         0.22        0.32        0.53        0.45        0.52        0.42        0.37
NDRE               0.08       0.11        0.17        0.18        0.27        0.43        0.37        0.42        0.35        0.31
NDVI               0.34       0.4         0.52        0.41        0.49        0.64        0.61        0.55        0.51        0.57
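The cluster summaries in Tables 12 and 13 group each plant's four-index vector into homogeneous management zones, consistent with the k-means approach cited in refs. 74 and 76. A minimal sketch of the algorithm (illustrative, not the authors' implementation; points are tuples such as (GNDVI, LCI, NDRE, NDVI) per tree):

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Minimal k-means for zoning index vectors: assign each point to the
    nearest centroid, then move each centroid to its group's mean."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)          # random initial centroids
    groups = [[] for _ in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centroids[c])))
            groups[j].append(p)
        new = [tuple(sum(col) / len(g) for col in zip(*g)) if g
               else centroids[i] for i, g in enumerate(groups)]
        if new == centroids:                   # converged
            break
        centroids = new
    return centroids, groups
```

The per-cluster rows of Tables 12 and 13 then correspond to the centroid coordinates (mean index values) and the group sizes.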
Alfonso, E.; López-Cuervo, S.; Aguirre, J.; Pérez-Martín, E.; Molina, I. Hybrid Methodological Evaluation Using UAV/Satellite Information for the Monitoring of Super-Intensive Olive Groves. Appl. Sci. 2025, 15, 11171. https://doi.org/10.3390/app152011171