Article

Using Virtual Drones to Mitigate the Bias Introduced by Sensor Wavelength Approximations in Crop Monitoring with Drones

by
Pierre Bancal
1,*,
Marie-Odile Bancal
1,
Mélanie Heers
1,2,† and
Jean Charles Deswartes
2
1
Université Paris-Saclay, INRAE, AgroParisTech, UMR EcoSys, 91120 Palaiseau, France
2
Arvalis Institut du Végétal—Route du Châteaufort, 91190 Villiers-le-Bâcle, France
*
Author to whom correspondence should be addressed.
Present Address: Union Française des Semenciers, 17 rue du Louvre, 75001 Paris, France.
Agronomy 2025, 15(11), 2665; https://doi.org/10.3390/agronomy15112665
Submission received: 10 October 2025 / Revised: 17 November 2025 / Accepted: 18 November 2025 / Published: 20 November 2025

Abstract

Remote sensing based on the reflectance of light at certain wavelengths enables the calculation of various vegetation indices (VIs) as proxies for agronomic variables. However, drone-mounted sensors have a limited number of bands, so the wavelengths defining VIs often have to be modified in line with sensor characteristics. This article addresses the problem of such wavelength shifts based on experimental agronomic measurements and on reflectances acquired by both multispectral spectrophotometers and drone-mounted sensors. We demonstrate that wavelength shifts can significantly affect VIs, particularly those using the red-edge band, compared to a multispectral reference. In the worst cases, the drone’s VI was not even correlated with its multispectral target. We therefore propose a calibration method using a “virtual drone” simulated from a complete dataset obtained by multispectral measurements, making it possible to use sensors with a limited number of bands. Virtual drones can guide the choice of drone sensors, depending on the features to be estimated, or facilitate the intercalibration of sensors for comparisons of results across published studies. This study aims to provide the agronomist community with a method for intercomparing VIs acquired by drones.

1. Introduction

Agricultural interest in drones, or unmanned aerial vehicles (UAVs), has increased over the last 12 years [1]. By allowing the capture of real-time data without disturbing the plot, drones can improve decision-making [2]. Many applications have been proposed, the most realistic of which are the estimation of yield and biomass by remote sensing [3]. The detection of biotic (diseases and weeds) or abiotic (deficiencies of fertilizer or water) stresses renders rapid remediation possible [4]. In some cases, the drones could themselves be used for local interventions. However, the sensors used must be light enough to be carried by a small UAV and affordable for agricultural use. A good compromise between weight and functionality can be achieved by using multispectral cameras designed to capture a few spectral bands (3 to 10) at visible and near-infrared wavelengths [5]. Such sensors make use of published data from observation satellites. The canopy reflectances measured with broadband and fine-band sensors provide information about the combined effects of chlorophyll concentration [6] and plant architecture [7,8]. Other canopy features, such as nitrogen, cellulose and water contents, can also be assessed at wavelengths >950 nm, which are not widely available for small multispectral cameras [9,10,11,12]. The development of a panel of spectral vegetation indices (VIs) combining reflectances at several wavelengths has made it possible to reduce the adverse effects of atmospheric conditions or soil moisture on the signal.
Alternatives avoiding VIs have also been developed through the widespread use of multispectral or hyperspectral spectrophotometers, which read reflectance at hundreds of wavelengths. Agronomic traits could be adjusted without the use of VIs, by machine learning [13] trained on continuous reflectance spectra, or in combination with data from other sensors, such as LIDAR [14]. However, machine learning requires complete sets of input variables, which are beyond the reach of the UAV-borne sensors that only read a few wavelengths. Consequently, VIs are needed to process the data acquired by drones. Unfortunately, the fewer wavelengths detected, the fewer VIs are available, unless one accepts the bias associated with the approximation in the VI definition. However, an analysis of published studies reveals that the definition of VIs is not strictly respected, with the wavelengths actually used varying from one article to another. Some authors justify their choice of wavelength on the basis of the available satellite data or, in some more recent cases, the technical specifications of UAVs, but the distortions of VIs induced by wavelength substitutions are rarely discussed.
Some VIs, such as NDVI [15], are reputed to be robust to wavelength substitution, being little affected by the precise definition of the wavelength read. However, perhaps precisely because of this robustness, NDVI is largely limited to qualitative comparisons or ranking. Indeed, the choice of the best genotype for plant breeding does not always require precise quantification; it only relies on there being a strong correlation between the trait for selection and a corresponding proxy. However, in comparisons of trials or studies using different devices, the use of VIs with the same name but measuring different things may be a source of confusion leading to bias, even in a qualitative comparison, particularly in meta-analyses. Quantitative characterization is increasingly required, both in meta-analyses and for the use of canopy monitoring as an aid to short-term decision-making, for chemical applications, for example. The current trend in studies involving multispectral ground-based spectrophotometers is to provide the definition of the VIs used, together with justifying references [16,17]. Given that monitoring devices such as UAV-borne sensors intrinsically limit the number of wavelengths, a guide is needed to either restrict analysis to the few VIs that the sensor can calculate, or measure the impact of wavelength substitution in order to determine whether or not a larger number of VIs can be used.
Consequently, the aim of this study was to characterize the bias resulting from the replacement of a full-spectrum terrestrial spectrometer with two five-band sensors reading slightly different wavelengths. We focused on the consequences of choosing close but different bands for both the reflectance levels measured and the set of VIs calculated, regardless of differences in the optical quality of the sensors. Three wheat multi-crop trials were monitored with ground-based spectrometers throughout the season. Based on the full spectra obtained, two virtual drones were simulated from the specifications of the first and second UAV sensors by extracting target reflectances. The results of the virtual drones were then compared with each other and with the results of the ground-based spectrometer. Basic Pearson correlations indicate whether substitution affected qualitative ranking, and root mean square errors were calculated to address the need for quantitative characterization. We then propose a procedure for calibrating virtual drones either against a ground-based spectrometer or against another UAV-mounted sensor. The calibration is then applied to the prediction of an agronomic trait, in this case the green leaf area index (GLAI) during grain filling. Finally, the conditions for using a virtual drone to calibrate a real UAV-mounted device are discussed.

2. Materials and Methods

2.1. Winter Wheat Crops

Three trials were performed in which wheat crops were monitored with ground-based spectrometers. The first trial (145482) took place in 2018–2019 on the PhénoField® phenotyping platform at Ouzouer-le-Marché, France (47.92° N; 1.43° E). PhénoField® allows the cultivation of microplots (5.2 × 1.2 m) in semi-controlled environments through the use of rolling glasshouses to prevent rain from reaching the canopies and tight control over water availability [18]. Seven different nitrogen and/or water stress treatments were applied at different times in the crop cycle to eight bread wheat cultivars with a wide range of varietal precocity. The second (146672) and third (149157) trials were conducted in 2019–2020 and 2020–2021, respectively, close to the Arvalis station in Aubigny-aux-Kaisnes, France (49.77° N; 3.12° E). In these two trials, 16 and 20 cultivars, respectively, were subjected to three treatments combining nitrogen and fungicide application in a classic split-plot design with three replicates, as in typical breeding trials. Further details about these trials, which together comprised 530 crop microplots, are provided as Supplementary Data (Tables S1 and S2).
The first three trials were conducted by scientists at a research station. In order to ensure the relevance of the concepts and methods used for private breeders, a fourth trial (BH-PHENO-OR) was conducted in 2020–2021 at the AgriObtention station of Orsonville (48.48° N; 1.83° E), with the same cultivars and treatments as in trial 149157. As in the other trials, heading date was recorded for each microplot. No other crop monitoring was carried out as part of this trial, with the exception of aerial surveys conducted by an external operator (Phytodrone—UAV Services Company in Agriculture and Environment industries, Paris, France) specializing in remote sensing by UAVs and data processing.

2.2. GLAI Measurement by Agronomic Sampling

Green leaf area index (GLAI) was measured on one third of the microplots in trials 145482 and 149157, using the method described below to limit sample removal. (1) At heading, a single sample of approximately 50 consecutive culms was dissected into its different leaf layers, which were then color-scanned. The surface areas of the different leaf layers of an average culm were determined by analyzing the images obtained [19]. The other GLAI components were then acquired non-destructively to prevent an impact on remote sensing results. (2) Ear density was determined during grain filling, and the value obtained was multiplied by the mean leaf area per culm to determine leaf area index (LAI) and GLAI at heading. (3) The senescence of each leaf layer was estimated weekly on ten tagged culms per microplot using a customized scale designed for visual assessment of the green fraction from 0 (fully yellow) to 1 (fully green) [20]. With trained observers, the accuracy of the method (RMSE of 0.1 against plot sampling) is sufficient for monitoring canopy senescence [21]. GLAI was then obtained by multiplying the estimated green fraction by LAI. GLAI was thus determined weekly on 116 microplots, resulting in 812 data points collected from heading to late senescence.
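Under our reading of the protocol, the three steps above combine multiplicatively. A minimal sketch with made-up illustrative values (the variable names and numbers are ours, not trial data):

```python
# Illustrative reconstruction of the three-step GLAI calculation.
# All numerical values below are invented for the example.

leaf_area_per_culm_m2 = 0.0080   # step 1: scanned leaf layers of an average culm
ear_density_per_m2 = 550.0       # step 2: ears (culms) counted per square metre

lai = ear_density_per_m2 * leaf_area_per_culm_m2  # LAI at heading

green_fraction = 0.85            # step 3: visual score, 0 (fully yellow) to 1 (fully green)
glai = green_fraction * lai      # GLAI = green fraction x LAI

print(f"LAI = {lai:.2f} m2/m2, GLAI = {glai:.2f} m2/m2")
```

The same green-fraction score is re-estimated weekly, so GLAI declines over time while LAI at heading stays fixed.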

2.3. Remote Sensing

Spectral detections were performed weekly in the first three trials and covered the main growth stages from leaf development in late March to soft dough ripeness in July. On each of the days on which spectral detection was performed, an external standard measurement was also obtained with a neutral gray carpet. In trial 145482, the PhénoField® mobile sensor platform equipped with a multispectral active point sensor spectrometer (MMS1, ZEISS, Jena, Germany) was used to measure five spots in each microplot. The spots were 60 × 60 cm squares at the top of the canopy. In trials 146672 and 149157, two on-board multispectral sensors on the left and right sides of a tractor were used [22]. The tractor was driven at approximately 0.45 m·s−1, making it possible to obtain measurements for four spots in each microplot. These spots were 60 × 60 cm squares at the top of the canopy. Approximately one sixth of the microplots underwent a second round of measurement to check the equivalence of the sensors on the left and right sides of the tractor. The sensors in these two trials were bidirectional passive point spectrometers (AgroSpec, TEC5, Steinbach, Germany). The passive spectrophotometers used natural sunlight instead of a flash, and incident light was scanned repeatedly before each measurement of the light reflected by the canopy. The spot data were averaged to obtain a single reflectance value per microplot. Both MMS1 and AgroSpec sensors read 256 wavelengths between 300 and 1150 nm, but the end tails of the spectra were not usable. The central part of the spectra (350–1000 nm) was interpolated using a cubic spline over a five-point moving window to obtain reflectances at 1 nm resolution.
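The resampling to 1 nm resolution can be sketched with SciPy's CubicSpline. This is a simplified stand-in: the moving-window detail of the original procedure is omitted, and the spectrum here is a synthetic sigmoid standing in for a real wheat reflectance curve.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Synthetic raw spectrum at the instrument's coarse native sampling:
# 256 readings between 350 and 1000 nm, with a sigmoid "red edge" near 715 nm.
raw_wl = np.linspace(350, 1000, 256)
raw_refl = 0.05 + 0.45 / (1.0 + np.exp(-(raw_wl - 715) / 15.0))

# Fit a cubic spline and resample the central part of the spectrum at 1 nm.
spline = CubicSpline(raw_wl, raw_refl)
wl_1nm = np.arange(350, 1001)
refl_1nm = spline(wl_1nm)

print(refl_1nm.shape)  # (651,)
```

The 1 nm grid is what the virtual-sensor extraction in Section 2.4 operates on.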
A UAV-mounted sensor was used for monitoring in the fourth trial. After three training flights in early spring, a quadcopter (Phantom 4 Professional; DJI Enterprise) carrying a multispectral sensor (RedEdge-MX; Micasense, Wichita, KS, USA) was flown over the plots six times between late May and early July. The Micasense sensor scans plots in five wavelength bands (blue (B): 475 ± 10 nm; green (G): 560 ± 10 nm; red (R): 668 ± 5 nm; red edge (E): 717 ± 5 nm; near-infrared (N): 840 ± 20 nm). The characteristics of this “first sensor” were then compared with those of the DJI P4 multispectral camera (B: 450 ± 16 nm; G: 560 ± 16 nm; R: 650 ± 16 nm; E: 730 ± 16 nm; N: 840 ± 26 nm). This “second sensor” is often fitted by default on the Phantom drone. The actual optical quality of the two sensors was not compared; the study focused instead on the consequences of the choice of BGREN band specifications.

2.4. Data Processing

In each of the first three trials, ground-based multispectral photometers read 256 wavelengths between 300 and 1150 nm. Using a cubic spline over a moving four-point window, continuous spectra were interpolated with a resolution of 1 nm. The active point-sensor spectrometer used in trial 145482 provided data unaffected by the variability of sunlight. Therefore, the spot data from this trial were directly averaged to obtain a single reflectance value per microplot. By contrast, the passive point-sensor spectrometers used in trials 146672 and 149157 provided data affected by rapid fluctuations of sunlight levels on cloudy days, resulting in differences between spots within microplots. Therefore, VIs were calculated separately for each spot, and then averaged for each microplot.
Data from the ground-based multispectral spectrometers were used to simulate the first and second UAV-mounted sensors, extracting reflectances at the wavelengths characterizing their respective BGREN bands. Virtual sensors can be simulated in two ways: by extracting reflectances exclusively for the central wavelengths of the bands, or by averaging reflectances for all wavelengths within the bandwidth. These two types of calculation gave very similar results. We therefore present results only for the second type of calculation here.
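A minimal sketch of the second type of calculation, i.e., averaging a continuous 1 nm spectrum over each BGREN band. The band table transcribes the specifications given in Section 2.3; the function name and the synthetic spectrum are ours.

```python
import numpy as np

# BGREN specifications (center wavelength, half-width) in nm, from Section 2.3.
SENSOR_1 = {"B": (475, 10), "G": (560, 10), "R": (668, 5), "E": (717, 5), "N": (840, 20)}
SENSOR_2 = {"B": (450, 16), "G": (560, 16), "R": (650, 16), "E": (730, 16), "N": (840, 26)}

def virtual_sensor(wavelengths, reflectance, bands):
    """Average a 1 nm continuous spectrum over each BGREN band."""
    out = {}
    for name, (center, half) in bands.items():
        mask = (wavelengths >= center - half) & (wavelengths <= center + half)
        out[name] = float(reflectance[mask].mean())
    return out

# Example with a synthetic wheat-like spectrum at 1 nm resolution (350-1000 nm):
# low in the visible, high in the NIR, with a sigmoid red edge near 715 nm.
wl = np.arange(350, 1001)
refl = 0.05 + 0.45 / (1.0 + np.exp(-(wl - 715) / 15.0))

print(virtual_sensor(wl, refl, SENSOR_1))
print(virtual_sensor(wl, refl, SENSOR_2))
```

With this synthetic spectrum, the two virtual sensors agree closely in G and N but diverge in E, mirroring the behavior reported in Section 3.1.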
UAV imaging in the trial BH-PHENO-OR was performed by the PhytoDrone company (Paris, France), which released the reflectance data for the BGREN bands.
Finally, basic Pearson correlations indicated whether substitution affected qualitative ranking, with significant differences reported at the level p = 0.01. Root mean square errors (RMSE) or normalized RMSE (NRMSE) were calculated to address the need for quantitative characterization and model comparison.
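As a sketch of these two metrics, assuming NRMSE is the RMSE divided by the spread of the reference data (the text normalizes by the "standard error of data"; we use the standard deviation here, which should be checked against the authors' exact definition):

```python
import numpy as np

def pearson_r2(x, y):
    """Squared Pearson correlation, used for the qualitative-ranking check."""
    r = np.corrcoef(x, y)[0, 1]
    return r * r

def nrmse(reference, estimate):
    """RMSE normalized by the standard deviation of the reference data, in percent.

    Assumption: the paper's 'standard error of data' is taken as the standard
    deviation; a different normalizer would rescale the result."""
    rmse = np.sqrt(np.mean((estimate - reference) ** 2))
    return 100.0 * rmse / np.std(reference)

ref = np.array([1.0, 2.0, 3.0, 4.0])
est = np.array([1.1, 2.1, 3.1, 4.1])
print(pearson_r2(ref, est), nrmse(ref, est))
```

Note how a constant offset leaves R2 at 1 (perfect ranking) while NRMSE stays nonzero, which is exactly the qualitative/quantitative distinction drawn in the Results.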

3. Results

3.1. Sensor Description

Figure 1 shows a typical reflectance curve (gray line), as recorded by a ground-based spectrometer, for a wheat crop. Reflectance is low in the visible part of the spectrum, with a small peak in the green. By contrast, it is high in the near-infrared part of the spectrum, after a marked step in the red edge. UAV-mounted sensors typically scan only a few color bands, 10 to 50 nm wide, around the nominal wavelength. The technical data from the two five-band sensors, referred to here as the first and second sensors, are shown as solid and dotted lines, respectively. Several differences between the sensors were identified: (1) the green (G) and near-infrared (N) bands were centered on the same wavelength for both sensors; they differed only slightly in the width of the windows. (2) The blue (B) and red (R) bands display an offset between the second and first sensors; fortunately, these bands are located in a region of the spectrum in which reflectance varies slowly. (3) The red-edge (E) band is offset in a region of the spectrum in which reflectance varies rapidly.
Real sensors may also differ in terms of optical quality, but this issue is not addressed by the use of virtual drones. Indeed, we focus here exclusively on the consequences of differences in band definition for the accuracy of vegetation indices. Figure 2 therefore shows data from virtual sensors, i.e., data extracted from complete spectra obtained by ground-based multispectral spectrometers for the wavelengths corresponding to the bands of the first and second UAV-mounted sensors. The G-band (Figure 2A) and the N-band (Figure A1m–o) were centered on the same wavelength for both sensors; moreover, these wavelengths are located in parts of the spectrum in which reflectance varies slowly. The data were highly reproducible between virtual sensors, with coefficients of determination greater than 0.999 in all three trials for N and G. The dots aligned with the 1:1 line (first bisector) of the figure, and the NRMSE (RMSE normalized by the standard error of the data) between the virtual sensors was less than 3%, well below biological variability. The B-band (Figure 2B) and the R-band (Figure A1d–f) displayed an offset between sensors, but the corresponding peaks were located in a part of the spectrum in which reflectance varies only slowly. There was some dispersion and moderate trial effects, but the coefficients of determination were greater than 0.95 in all cases. The dots for the R-band were close to the first bisector (Figure A1d–f), resulting in a moderate NRMSE of 10% between virtual sensors in each trial. However, reflectance varies more with wavelength in the blue than in the red part of the spectrum. As a result, the dots for the B-band (Figure 2B) indicated a slope below 0.8, giving a high NRMSE of 30 to 45%, depending on the trial. Finally, the E-band (Figure 2C) was offset between the two sensors and located in a part of the spectrum in which reflectance varies rapidly.
No alignment of the dots was obtained because the red-edge part of the spectrum fluctuated with the aging of the crop. The coefficients of determination were, therefore, highly variable, ranging from 0.35 to 0.88, resulting in a high NRMSE of up to 150%, depending largely on the trial. Both sensors do take measurements in the red-edge part of the spectrum, but it was not possible to compare their data.

3.2. Differences in Vegetation Indices Between Sensors

Table A1 lists 30 VIs that UAV sensors could potentially estimate, along with the wavelengths of their multispectral definition taken into account by the BGREN bands. The distortion of VIs is inevitable, given the limited number of bands available for sensors. For example, the BRI uses reflectance at 690 nm whereas the CCCi uses reflectance at 670 nm, but the drone will offer data for the same R-band in both cases. A second source of distortion lies in the variability of sensor specifications. Two different sensors will not translate the multispectral formula of a VI in the same way.
Figure 3 illustrates the impact of varying band definitions on the estimates of certain VIs. The gNDVI is a VI computed from the G and N bands, which differ little between the first and second sensors (Figure 3A). The dots aligned with the first bisector, the R2 for the relationship between sensors was greater than 0.999, and the NRMSE between sensors was below 2%. The gNDVI estimates of the two sensors studied can therefore be considered almost identical. However, the bands of the two sensors present an offset relative to the multispectral definition of the gNDVI. Nevertheless, the effect of this offset remains moderate, as indicated by the NRMSE of 5–8% between UAV estimates and their multispectral target.
Conversely, CI red-edge (Figure 3B) is a VI calculated from reflectance in the E and N bands, and the E band used differed considerably between sensors. CI red-edge reached a maximum of 9, according to its multispectral definition, versus 6 for the first sensor and 2 for the second. A comparison of the estimates from the two sensors revealed that the coefficient of determination remained high (0.90–0.99, depending on the trial), but the points were so far from the first bisector (slope at 0.3) that the NRMSE reached 140% (Figure 3B). Clearly, the data obtained for this VI were not identical for the two sensors. If only a qualitative ranking between crops is required, either sensor can be used to obtain a CI red-edge proxy, but data from different sensors should not be mixed. If a quantitative estimate is required, the sensors should be calibrated to obtain accurate assessments.
Finally, CCCi is a VI calculated from reflectance in the E, R and N bands, which differed considerably between sensors (Figure 3C). The estimates were of similar magnitudes, but the correlation varied between detection dates, as a function of crop aging. Under no circumstances can the information obtained from one sensor be used as an estimate for the other. In this example, the CCCi estimate provided by the first sensor was correlated with its multispectral target (R2 of about 0.95; Table A2) and the CCCi estimate from the first sensor can therefore be considered a proxy, as for CI red-edge. Conversely, the CCCi estimate provided by the second sensor did not correlate with its multispectral target (R2 < 0.6; Table A2). The second sensor can provide interesting information, but it cannot provide an accurate approximation of the multispectral CCCi.
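For reference, the three indices discussed above are commonly defined as follows in the literature. These formulas are our reconstruction and should be checked against Table A1; in particular, CCCi is taken here as NDRE divided by NDVI, its most common formulation.

```python
def gndvi(G, N):
    """Green NDVI: (N - G) / (N + G)."""
    return (N - G) / (N + G)

def ci_red_edge(E, N):
    """Chlorophyll index red edge: N / E - 1."""
    return N / E - 1.0

def ccci(R, E, N):
    """Canopy chlorophyll content index, commonly NDRE / NDVI (assumed here)."""
    ndre = (N - E) / (N + E)
    ndvi = (N - R) / (N + R)
    return ndre / ndvi

# Hypothetical band reflectances for a green wheat canopy (illustrative only)
G, R, E, N = 0.08, 0.05, 0.25, 0.45
print(gndvi(G, N), ci_red_edge(E, N), ccci(R, E, N))
```

Because ci_red_edge and ccci both divide by E-band terms, a shift of the E band propagates multiplicatively into the index, which is consistent with the large between-sensor discrepancies reported for these two VIs.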
An extensive comparison between the two sensors for estimates of the 30 VIs listed in Table A1 and their multispectral targets is reported in Table A2. Eleven of the 30 VIs (shown in bold in Table A2) were little affected by the choice of sensor, with an R2 value systematically greater than 0.98 and an NRMSE value systematically below 20%; none of these VIs used the E band. Seven other VIs not based on data for the E band had a greater sensor effect (bold italics in Table A2); all of these less robust VIs used the B band, which was offset between the sensors. Finally, for VIs using the E band, the coefficient of determination was clearly affected by the shift relative to the red-edge wavelength that should be read according to the multispectral definition of the VI (Figure 4). The closer this wavelength is to the E-band of the UAV sensor, the higher the R2 value obtained for the correlation between the multispectral VI and its UAV estimate. As the E-band was offset between the first and second sensors, R2 also differed between sensors. The highest R2 values for the first sensor were obtained for VIs for which the red-edge wavelength that should be read was 720 nm, because the E-band of this sensor was centered on 717 nm (Figure 4A). Conversely, for the second sensor, R2 values were highest for VIs for which the red-edge wavelength that should be read was 730 nm, corresponding to the center of the E-band of this sensor (Figure 4B). In other words, UAV sensors can provide robust estimates of multispectral VIs, provided that these VIs do not use the B and E bands. Otherwise, the estimate should be calibrated, which is only possible if the wavelength required for calculation of the VI corresponds closely to the E-band of the sensor carried by the UAV. In any case, Figure 4 indicates that an approximation of 10 nm in wavelength can be enough to bring R2 below 0.9, at least in certain unpredictable years.
Consequences such as that shown in Figure 3B highlight the need for intercalibration.

3.3. UAV Calibration with Virtual Sensors

When the VI estimates produced by UAV sensors are strongly correlated with their multispectral target, UAV-mounted sensors can replace ground-based spectrometers for the qualitative ranking of scanned crops, even if their NRMSE is high, as observed when the slope between sensors is far from one (Figure 3B). However, when a quantitative measurement is required, the UAV sensor must first be calibrated. Calibration for agronomic traits requires extensive experimentation, and it is therefore advisable to make use of data from previous studies. Indeed, virtual drones can be simulated with data from trials monitored with ground-based spectrometers. The calibration of virtual drones for agronomic traits is the first step towards trait prediction from the reflectance data acquired by real UAVs. We show below an example based on data from trials 145482 and 149157. In these trials, the GLAI of wheat crops was measured throughout grain filling, together with multispectral VIs. A linear combination of BRI and LCI fitted the GLAI data well. The two UAV sensors were suitable for evaluating both BRI and LCI, although the correlation with their multispectral target was only moderate (Table A2). Further adjustments were then made, comparing the GLAI measurements with the BRI and LCI estimates computed by the virtual drones for data from these trials, and Equation (1) was obtained with data from the first sensor:
GLAI = −2.65 + 6.93·BRI + 2.57·LCI; R2 = 0.87; RMSE = 0.58 m2·m−2
This equation was then applied to the reflectance data recovered by a real UAV, using data from the first sensor for the fourth trial (BH-PHENO-OR), to predict GLAI throughout grain filling in this trial (Figure 5A). Unfortunately, no actual GLAI measurements were made in the BH-PHENO-OR trial for definitive validation of the predicted GLAI values, but the estimated GLAI followed a pattern consistent with typical observations for wheat crops. Predicted GLAI was between 3.5 and 5.0 m2·m−2 at heading on all microplots, falling to zero five to six weeks later. The effect of treatments was also consistent: relative to the control (black circles), an early fertilizer shortage led to a lower maximum GLAI and earlier senescence (white triangles), whereas an absence of fungicidal protection against leaf diseases did not alter maximum GLAI but was associated with earlier senescence (white squares).
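Equation (1) is a plain linear combination and can be transcribed as a small prediction function. The BRI and LCI values below are hypothetical, for illustration only:

```python
def glai_eq1(bri, lci):
    """GLAI prediction from Equation (1), calibrated for the first sensor only."""
    return -2.65 + 6.93 * bri + 2.57 * lci

# Hypothetical BRI and LCI estimates for one microplot at heading
print(round(glai_eq1(0.80, 0.60), 2))  # 4.44
```

The result falls inside the 3.5 to 5.0 m2·m−2 range reported at heading; applying the same function to second-sensor estimates would reproduce the overestimation described below for Equation (2).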
GLAI could also be obtained from BRI and LCI estimates by a virtual drone using the second sensor, which led to Equation (2):
GLAI = −2.25 + 9.06·BRI + 2.89·LCI; R2 = 0.86; RMSE = 0.61 m2·m−2
Equations (1) and (2) are different because the BRI and LCI estimates of the two sensors, despite bearing the same name, are actually different due to the differences in the wavelength definition of the BGREN bands. Thus, Equation (2), built for the second sensor, should not be applied to reflectance data recovered with the first sensor. In the BH-PHENO-OR trial (Figure 5B), the GLAI values obtained in this way were systematically overestimated. The predicted GLAI was not biologically realistic at heading (between 5.5 and 7.0 m2·m−2) and did not fall to zero at maturity. The RMSE between the predictions obtained with Equations (1) and (2) reached 1.87 m2·m−2. However, confusion might occur because the general shape of the curves was conserved. It is essential to bear in mind that the first and second sensors have different specifications. The calibration of one sensor should therefore not be applied to the other.
Finally, the combination of VIs with UAV estimates can be useful even when their multispectral targets are unsuitable. For example, CCCi and mNDI705 do not fit GLAI data well if their multispectral definition is used. Moreover, CCCi and mNDI705 are poorly estimated by the virtual sensors (Table A2). Nevertheless, GLAI can be adjusted for CCCi and mNDI705 estimates by a virtual drone using the first sensor (Equation (3)):
GLAI = −5.63 + 7.50·CCCi + 5.57·mNDI705; R2 = 0.87; RMSE = 0.60 m2·m−2
When Equation (3) was applied to the reflectance data obtained in the BH-PHENO-OR trial with the first sensor, the resulting predicted value of GLAI was very close to that obtained with the BRI and LCI estimates using Equation (1), resulting in a low RMSE between predictions (0.08 m2·m−2).

4. Discussion

Remote sensing provides interesting information about crop development, particularly through measurements of the reflectance of light at certain wavelengths. Vegetation indices (VIs) are based on reflectance at several wavelengths, to decrease the undesirable effects of soil and atmosphere, and to target agronomic traits more specifically. The recent development of UAVs or drones has facilitated the repeated acquisition of data for real-time crop monitoring without the need to disturb the crop [23]. However, UAV-mounted sensors can read reflectances for only a few bands, whereas ground-based spectrometers can obtain data over the entire spectrum. It is not possible to calculate some VIs from the restricted data available from sensors. This is the case for the Vogelmann VOG2 index [24], which combines data obtained at four wavelengths, all in the red-edge band (700–750 nm). The calculation of other VIs is also frequently distorted by replacement of the canonical wavelength with a close wavelength for which data are available from the sensor. As a result, UAV-mounted sensors actually provide proxies of their multispectral target. There may be discrepancies between UAV estimates and their multispectral target, and biases may be introduced by mixing data from different sensors reading reflectances at close but different wavelengths. This study investigated the consequences of such band substitution to provide warnings and recommendations for the intercomparison of VIs from different studies.
Two five-band sensors reading different wavelengths were simulated from data extracted from continuous spectra. These spectra were acquired in three trials monitoring wheat crops throughout the growing season. Each virtual drone was then compared with the other and with the ground-based spectrometers. We investigated the consequences of band definitions for both reflectance levels and a set of candidate VIs using wavelengths close, to various degrees, to those available for the virtual drones. The basic Pearson coefficient for the correlation between multispectral VIs and their drone estimates ranged from 0 to 1, and high R2 values could be obtained even with slopes far from 1, these estimates being strongly affected by the characteristics of the drone in terms of the wavelengths actually read. Only one third of the candidate VIs were unaffected by the choice of sensor and could be considered robust to wavelength choice. However, none of the candidates using the red-edge band were among these robust VIs.
Many VIs of agricultural interest are based on data for the red-edge band, which corresponds to a region of the spectrum in which canopy reflectance varies greatly between the near-infrared, where it is high, and the visible, where it is low due to absorption by photosynthetic pigments. Measurements in the red-edge band are ultimately much more sensitive to small changes in the condition of the vegetation than measurements in the visible part of the spectrum, for which the signal is rapidly saturated [25]. Conversely, an approximation of the red-edge wavelength at which readings are taken would cast doubt on the inferred prediction of vegetation indices. Unsurprisingly, we found that VIs based on red-edge band data were more sensitive to sensor characteristics than VIs based exclusively on data for the green, red and near-infrared bands. The R2 value for the relationship between the multispectral VI and its UAV estimate decreased rapidly with increasing distance of the UAV sensor specification in the red-edge band away from the canonical wavelength for the VI.
The consequences of wavelength variations between cameras on VIs have never been studied directly, although some information can be gleaned from meta-analyses. For example, the CI-red-edge is a fairly simple VI based on the ratio of the reflectances for the red-edge and near-infrared (NIR) bands of the spectrum. Gitelson et al. [26] obtained readings in the red-edge band between 720 and 730 nm and in the NIR band between 840 and 870 nm, and suggested that the VI they measured corresponded to chlorophyll content. However, in a second paper [27], the same authors obtained red-edge readings at wavelengths between 704 and 714 nm and NIR readings for two bands (750–758 nm and 771–786 nm); they found that this version of the CI-red-edge corresponded more closely to GLAI. In other words, despite having the same name, these two versions of the CI-red-edge were not interchangeable. Furthermore, the exact position of the red edge varies with the extent of the vegetation [28]. As vegetation is alive, the consequences of approximating the wavelength at which the red edge is read are themselves variable, depending on the trial and the day on which the data are acquired. Our study focuses on wheat, but canopy reflectance curves show convergent spectral characteristics [29]; notably, the red-edge point shifts over the season [30] and between Gramineae species [31]. If sensors restricted to a few reflectance wavelengths are used, our approach could easily be generalized to these cases. In our study, no consistent correlation between VI estimates by virtual drones and their multispectral targets was observed when the red edge was used. However, a correlation was observed in most other cases and appeared to be repeatable between trials. This study provides a first diagnosis of the impacts of wavelength substitution.
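A minimal numerical illustration of this sensitivity can be built with synthetic sigmoid spectra whose red-edge position shifts, mimicking seasonal movement of the red-edge point. All wavelengths, spectrum shapes, and reflectance values below are hypothetical, not measurements from this study:

```python
import numpy as np

wl = np.arange(400, 901)  # wavelengths, 1 nm steps

def spectrum(edge_nm):
    """Synthetic canopy spectrum: sigmoid red edge centered at edge_nm."""
    return 0.05 + 0.45 / (1 + np.exp(-(wl - edge_nm) / 12))

def ci_red_edge(rho, e_nm, n_nm=785):
    """CI red edge = N/E - 1, with the red-edge band E read at e_nm."""
    return rho[wl == n_nm][0] / rho[wl == e_nm][0] - 1

# Same index name, two different red-edge readings (710 vs. 730 nm),
# evaluated on spectra whose red-edge point has shifted over the season
for edge in (715, 725):
    rho = spectrum(edge)
    print(f"red edge at {edge} nm:",
          "E=710 ->", round(ci_red_edge(rho, 710), 2),
          "E=730 ->", round(ci_red_edge(rho, 730), 2))
```

The printed values diverge strongly between the two choices of E, and the size of the divergence itself depends on where the red-edge point sits, which is the behavior described for the two published versions of the CI-red-edge.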
Nevertheless, the relationships observed were often far from identity, and drone calibration was therefore necessary. Furthermore, switching to another drone would require at least a new calibration. Without such calibration, it remains difficult to extrapolate without bias the results from sensors detecting reflectance at only a few wavelengths. The use of virtual drones facilitates calibration through the extraction of point reflectances from datasets of continuous spectra. Virtual drones can be used before purchasing or equipping a UAV, to determine which specifications are the most appropriate. In addition to their utility for calibrating reflectances or VIs, virtual drones can also be applied to datasets from previous studies linking agronomic traits to reflectances acquired by ground-based spectrometers. For the first time, this study proposes a preliminary calibration of drones based on virtual drones simulated from continuous multispectral data.
We finally propose here a three-step method for calibrating a UAV sensor against an agronomic trait, which should lead to better sharing of UAV remote-sensing measurements. The example chosen was GLAI, although its assessment by remote sensing remains a matter of considerable debate that goes well beyond the scope of this study [32]. (i) A virtual drone is simulated from continuous multispectral assessments in training trials in which GLAI was also measured by standard agronomic methods. (ii) Data from this virtual drone are fitted to GLAI. (iii) The calibrated equations are applied, during a validation trial, to data obtained from a real UAV-mounted sensor with the same specifications as the virtual drone, making it possible to estimate GLAI.
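Under the simplifying assumption of a linear transfer function between the virtual-drone VI and GLAI (the paper leaves the form of the fitted equations open, and all values below are synthetic), the three steps can be sketched as:

```python
import numpy as np

# Step (i): VI values extracted by the virtual drone from continuous
# spectra in training trials, paired with ground-measured GLAI
# (both arrays are synthetic placeholders, not data from this study)
vi_train = np.array([0.2, 0.5, 0.9, 1.4, 1.8, 2.3])
glai_train = np.array([0.4, 1.1, 2.0, 3.2, 4.1, 5.3])

# Step (ii): fit the virtual-drone VI to GLAI; a degree-1 polynomial
# (linear) fit is used here, though the true relationship may be nonlinear
slope, intercept = np.polyfit(vi_train, glai_train, 1)

# Step (iii): apply the calibrated equation to VI values acquired by a
# real UAV sensor sharing the virtual drone's band specifications
vi_uav = np.array([0.3, 1.0, 2.0])
glai_pred = slope * vi_uav + intercept
print("calibrated GLAI estimates:", np.round(glai_pred, 2))
```

The essential point of the design is that the fit in step (ii) is made against the virtual drone, not the spectrometer, so the equation already absorbs the bias introduced by the sensor's band definitions before it is applied to real flights.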
In this study, the use of a private operator renowned for its technical reliability, but not part of the scientific team, ensured the applicability of our methods in real life. Indeed, like any device, the UAV must be operated correctly. The precautions to be taken when using devices of this type have been listed elsewhere [33] and are continually updated. However, even high-quality data obtained without calibration could be useless. The recommendations presented here should therefore not be seen as a source of undue concern: calibration using virtual drones ensures the validity of crop trait values acquired by remote sensing with sensors mounted on UAVs.

5. Conclusions

The small number of bands actually read by UAV-mounted sensors generally necessitates the calculation of VIs with substitute wavelengths matching the characteristics of the sensor. We show here that this approach can be risky, leading to biased estimates of VIs, especially when the red-edge band is used in their calculation. For the extrapolation of VI results without bias, we propose a three-step methodology based first on the calibration of the sensors against their corresponding virtual drones simulated from multispectral data. In the worst-case scenario, which proved rare in the example given, there may be no correlation between the drone's estimate of the VI and its multispectral target. However, when calibration is possible, UAV-mounted sensors can be used to predict agronomic traits quantitatively with a sensitivity similar to that obtained with ground-based spectrometers. The use of virtual drones can greatly facilitate this calibration and allow for inter-comparison of independent data based on different UAV-mounted sensors.

Supplementary Materials

The following supporting information can be downloaded at https://www.mdpi.com/article/10.3390/agronomy15112665/s1, Table S1: Agronomic characterization of main assays; Table S2: Genotype characterization in main assays.

Author Contributions

Conceptualization, P.B., M.-O.B. and J.C.D.; Methodology, P.B., M.-O.B. and M.H.; Software, P.B. and M.H.; Validation, P.B., M.-O.B. and M.H.; Formal Analysis, P.B. and M.H.; Investigation, P.B., M.-O.B. and M.H.; Resources, M.H.; Data Curation, P.B. and M.H.; Writing—Original Draft Preparation, P.B.; Writing—Review and Editing, M.-O.B. and M.H.; Visualization, P.B.; Supervision, J.C.D.; Project Administration, P.B., M.-O.B. and J.C.D.; Funding Acquisition, J.C.D. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the PHENOTOL project (2018), funded (grant number 11001305) by the FSOV, the French fund to support plant breeding.

Data Availability Statement

Restrictions due to privacy policy apply to the availability of these data. Data were obtained from Arvalis and are available from Jean Charles Deswartes at jc.deswarte@arvalis.fr with the permission of Arvalis.

Acknowledgments

We wish to thank all the technical staff at Arvalis and INRAE-ECOSYS for their investment in data collection, and Katia Beauchêne, Céline Huet and Fabrice Gierczak, in particular, for their skillful technical assistance throughout the study. Julie Sappa’s guidance regarding English is also acknowledged.

Conflicts of Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Abbreviations

The following abbreviations are used in this manuscript:
VIs — Vegetation indices
UAVs — Unmanned aerial vehicles
LIDAR — Laser imaging detection and ranging
GLAI — Green leaf area index
BGREN bands — Blue, green, red, red-edge, and near-infrared detection bands
NRMSE — Root mean square error normalized by the standard deviation of the data

Appendix A

Figure A1. Reflectances measured by various bands, according to the wavelength windows of either the first (X-axis) or the second (Y-axis) sensor. Bands are reported in line: blue (ac), green (df), red (gi), red-edge (jl) and near-infrared (mo). Trials are reported in column with symbols and colors as in Figure 2. Blue diamonds: trial 145482 (3528 data); orange squares: trial 146672 (3100 data); gray triangles: trial 149157 (1968 data).
Table A1. Spectral vegetation indices explored. The formulae presented enable the calculation of vegetation indices from UAV-mounted sensors reading five bands: blue (B), green (G), red (R), red edge (E) and near-infrared (N). The canonical wavelengths (nm) that should be used when full spectra are available are also indicated.

| Full Name | Short Name | Drone Definition | B | G | R | E | N |
|---|---|---|---|---|---|---|---|
| Blue Green Pigment Index | BGI | B/G | 450 | 550 | – | – | – |
| Blue Red Pigment Index | BRI | B/R | 450 | – | 690 | – | – |
| Canopy Chlorophyll Content Index | CCCI | ((N − E)·(N + R))/((N + E)·(N − R)) | – | – | 670 | 720 | 790 |
| Chlorophyll Index—Green | CI green | N/G − 1 | – | 550 | – | – | 855 |
| Chlorophyll Index—Red Edge | CI red edge | N/E − 1 | – | – | – | 710 | 785 |
| Enhanced Vegetation Index | EVI | 2.5·(N − R)/(1 + N + 6·R − 7.5·B) | 420 | – | 670 | – | 864 |
| Enhanced Vegetation Index n° 2 | EVI2 | 2.5·(N − R)/(1 + N + 2.4·R) | – | – | 670 | – | 864 |
| Green Normalized Difference Vegetation Index | gNDVI | (N − G)/(N + G) | – | 550 | – | – | 780 |
| Leaf Chlorophyll Content Index | LCCI | (G − B)/(N − B) | 450 | 560 | – | – | 540 |
| Leaf Chlorophyll Index | LCI | (N − E)/(N + R) | – | – | 680 | 710 | 850 |
| Maccioni Index | Maccioni | (G − B)/(E − B) | 450 | 550 | – | 750 | – |
| Modified Chlorophyll Absorption Reflectance Index n° 2 | MCARI2 | (3.75·(N − R) − 1.95·(N − G))/√((2·N + 1)² − (6·N − 5·√R) − 0.5) | – | 550 | 670 | – | 800 |
| MCARI (Hiphen version) | MCARIH | ((N − E) − 0.2·(N − G))·(N/E) | – | 570 | – | 710 | 850 |
| Modified Normalized Difference Blue—570 | mNDb570 | (G − B)/(N + B) | 450 | 570 | – | – | 850 |
| Modified Normalized Difference Blue—675 | mNDb675 | (R − B)/(N + B) | 450 | – | 675 | – | 850 |
| Modified Normalized Difference Blue—730 | mNDb730 | (E − B)/(N + B) | 450 | – | – | 730 | 850 |
| Modified Red-Edge Normalized Difference Index | mNDI705 | (E − R)/(E + R − 2·B) | 445 | – | 705 | 750 | – |
| Modified Soil-Adjusted VI | MSAVI | (2·N + 1 − √((2·N + 1)² − 8·(N − R)))/2 | – | – | 670 | – | 800 |
| Modified Simple Ratio 670 | mSR670 | (N − R)/(√(R·N) + R) | – | – | 670 | – | 800 |
| Modified Simple Ratio 705/445 | mSR705-445 | (E − B)/(R − B) | 445 | – | 705 | 750 | – |
| Modified Simple Ratio 780 | mSR780 | (N − E)/(N − R) | – | – | 680 | 710 | 780 |
| Meris Terrestrial Chlorophyll Index | MTCI | (N − E)/(E − R) | – | – | 675 | 710 | 785 |
| Modified Triangular VI n° 2 | MTVI2 | (1.8·(N − G) − 3.75·(R − G))/√((2·N + 1)² − (6·N − 5·√R) − 0.5) | – | 550 | 670 | – | 800 |
| Normalized Difference NIR—Red Edge | NDRE | (N − E)/(N + E) | – | – | – | 720 | 790 |
| Normalized Difference Vegetation Index | NDVI | (N − R)/(N + R) | – | – | 670 | – | 864 |
| Optimized Soil-Adjusted VI | OSAVI | 1.16·(N − R)/(N + R + 0.16) | – | – | 670 | – | 800 |
| Pigment-Specific Simple Ratio (chlorophyll B) | PSSRb | N/R | – | – | 650 | – | 800 |
| Structural Independent Pigment Index | SIPI | (N − B)/(N − R) | 445 | – | 680 | – | 800 |
| Simple Ratio | SR 730/R670 | E/R | – | – | 670 | 730 | – |
| Visible Atmospherically Resistant Index | VARI | (G − R)/(G + R − B) | 470 | 550 | 670 | – | – |
Table A2. Comparison between multispectral VIs and their estimates by virtual drones in trials 145482 (T1), 146672 (T2) and 149157 (T3). Qualitative ranking with a UAV-mounted sensor requires a high Pearson correlation (R²) between the sensor's estimate and the multispectral VI; quantitative trait estimation requires a low RMSE. For comparisons between VIs, RMSEs are divided by the standard deviation to obtain dimensionless values (NRMSE). The NRMSE values obtained for mSR705-445 were extremely high and have been replaced by the infinity symbol ∞.
| VI Short Name | First sensor R² (T1/T2/T3) | First sensor NRMSE (T1/T2/T3) | Second sensor R² (T1/T2/T3) | Second sensor NRMSE (T1/T2/T3) |
|---|---|---|---|---|
| BGI | 0.86 / 0.95 / 0.94 | 100% / 60% / 78% | 0.95 / 0.91 / 0.88 | 35% / 26% / 36% |
| BRI | 0.96 / 0.88 / 0.87 | 101% / 178% / 225% | 0.99 / 0.89 / 0.88 | 27% / 122% / 161% |
| CCCI | 0.94 / 0.97 / 0.94 | 56% / 45% / 47% | 0.51 / 0.58 / 0.60 | 88% / 83% / 83% |
| CI green | 0.99 / 1.00 / 1.00 | 5% / 6% / 8% | 0.99 / 1.00 / 1.00 | 5% / 6% / 8% |
| CI red edge | 0.99 / 0.99 / 0.98 | 27% / 39% / 39% | 0.94 / 0.92 / 0.91 | 68% / 87% / 85% |
| EVI | 0.99 / 1.00 / 1.00 | 13% / 7% / 6% | 0.99 / 1.00 / 1.00 | 9% / 5% / 5% |
| EVI2 | 0.98 / 1.00 / 1.00 | 10% / 6% / 5% | 0.98 / 1.00 / 1.00 | 10% / 10% / 7% |
| gNDVI | 1.00 / 1.00 / 0.99 | 6% / 13% / 14% | 1.00 / 1.00 / 0.99 | 6% / 13% / 14% |
| LCCI | 0.99 / 0.99 / 0.98 | 11% / 19% / 26% | 1.00 / 1.00 / 1.00 | 3% / 4% / 8% |
| LCI | 0.99 / 0.99 / 0.99 | 10% / 32% / 30% | 0.95 / 0.88 / 0.87 | 43% / 114% / 102% |
| Maccioni | 0.77 / 0.84 / 0.93 | 113% / 145% / 127% | 0.96 / 0.99 / 0.99 | 54% / 52% / 46% |
| MCARI2 | 1.00 / 1.00 / 1.00 | 9% / 8% / 6% | 0.99 / 1.00 / 0.99 | 8% / 9% / 8% |
| MCARIH | 0.99 / 0.99 / 0.98 | 44% / 46% / 68% | 0.94 / 0.92 / 0.88 | 85% / 93% / 113% |
| mNDb570 | 0.96 / 0.93 / 0.91 | 27% / 31% / 37% | 0.99 / 0.99 / 0.99 | 14% / 17% / 19% |
| mNDb675 | 0.99 / 1.00 / 1.00 | 18% / 34% / 46% | 0.99 / 0.99 / 0.99 | 9% / 23% / 24% |
| mNDb730 | 0.82 / 0.74 / 0.66 | 148% / 170% / 147% | 0.95 / 1.00 / 0.99 | 23% / 4% / 6% |
| mNDI705 | 0.94 / 0.94 / 0.92 | 19% / 107% / 104% | 0.98 / 0.93 / 0.93 | 26% / 104% / 102% |
| MSAVI | 1.00 / 1.00 / 1.00 | 10% / 10% / 7% | 0.99 / 1.00 / 1.00 | 9% / 9% / 7% |
| mSR670 | 1.00 / 1.00 / 1.00 | 4% / 3% / 4% | 1.00 / 1.00 / 0.99 | 5% / 6% / 9% |
| mSR705-445 | 0.69 / 0.00 / 0.00 | 79% / ∞ / ∞ | 0.88 / 0.00 / 0.00 | 109% / ∞ / ∞ |
| mSR780 | 0.85 / 0.86 / 0.76 | 39% / 48% / 50% | 0.42 / 0.45 / 0.36 | 153% / 175% / 174% |
| MTCI | 0.85 / 0.96 / 0.84 | 51% / 48% / 50% | 0.42 / 0.63 / 0.42 | 104% / 96% / 99% |
| MTVI2 | 1.00 / 1.00 / 1.00 | 11% / 11% / 8% | 0.99 / 1.00 / 0.99 | 10% / 11% / 9% |
| NDRE | 1.00 / 1.00 / 0.99 | 19% / 27% / 26% | 0.96 / 0.92 / 0.90 | 34% / 57% / 54% |
| NDVI | 1.00 / 1.00 / 1.00 | 4% / 6% / 7% | 0.99 / 1.00 / 1.00 | 5% / 11% / 11% |
| OSAVI | 1.00 / 1.00 / 1.00 | 7% / 12% / 10% | 0.99 / 1.00 / 1.00 | 7% / 14% / 11% |
| PSSRb | 1.00 / 1.00 / 1.00 | 17% / 15% / 15% | 1.00 / 1.00 / 1.00 | 5% / 3% / 5% |
| SIPI | 0.99 / 1.00 / 0.98 | 61% / 241% / 305% | 0.97 / 0.98 / 0.98 | 56% / 242% / 301% |
| SR 730/R670 | 0.98 / 0.98 / 0.98 | 50% / 58% / 59% | 0.99 / 1.00 / 0.99 | 12% / 13% / 14% |
| VARI | 1.00 / 1.00 / 1.00 | 8% / 8% / 9% | 0.99 / 0.99 / 0.99 | 14% / 32% / 30% |

References

  1. Rejeb, A.; Abdollahi, A.; Rejeb, K.; Treiblmaier, H. Drones in agriculture: A review and bibliometric analysis. Comput. Electr. Agric. 2022, 198, 107017. [Google Scholar] [CrossRef]
  2. Olson, D.; Anderson, J. Review on unmanned aerial vehicles, remote sensors, imagery processing, and their applications in agriculture. Agron. J. 2021, 113, 971–992. [Google Scholar] [CrossRef]
  3. del Cerro, J.; Cruz Ulloa, C.; Barrientos, A.; de León Rivas, J. Unmanned Aerial Vehicles in Agriculture: A Survey. Agronomy 2021, 11, 203. [Google Scholar] [CrossRef]
  4. Maddikunta, M.K.R.; Hakak, S.; Alazab, M.; Bhattacharya, S.; Gadekallu, T.R.; Khan, W.Z.; Pham, Q.V. Unmanned Aerial Vehicles in Smart Agriculture: Applications, Requirements and Challenges. IEEE Sens. J. 2021, 21, 17608–17619. [Google Scholar] [CrossRef]
  5. Hagen, N.; Kudenov, M.W. Review of snapshot spectral imaging technologies. Opt. Eng. 2013, 52, 090901. [Google Scholar] [CrossRef]
  6. Gitelson, A.A.; Merzlyak, M.N. Remote sensing of chlorophyll concentration in higher plant leaves. Adv. Space Res. 1998, 22, 689–692. [Google Scholar] [CrossRef]
  7. Ustin, S.L.; Gamon, J.A. Remote sensing of plant functional types. New Phytol. 2010, 186, 795–816. [Google Scholar] [CrossRef]
  8. Madec, S.; Baret, F.; de Solan, B.; Thomas, S.; Dutartre, D.; Jezequel, S.; Hemmerlé, M.; Colombeau, G.; Comar, A. High-Throughput Phenotyping of Plant Height: Comparing Unmanned Aerial Vehicles and Ground LiDAR Estimates. Front. Plant Sci. 2017, 8, 2002. [Google Scholar] [CrossRef]
  9. Peñuelas, J.; Filella, I.; Biel, C.; Serrano, I.; Savé, R. The Reflectance at the 950-970 Region as an Indicator of Plant Water Status. Int. J. Remote Sens. 1993, 14, 1887–1905. [Google Scholar] [CrossRef]
  10. Gao, B. NDWI—A normalized difference water index for remote sensing of vegetation liquid water from space. Remote Sens. Environ. 1996, 58, 257–266. [Google Scholar] [CrossRef]
  11. Daughtry, C.S.T. Discriminating Crop Residues from Soil by Short-Wave Infrared Reflectance. Agron. J. 2001, 93, 125–131. [Google Scholar] [CrossRef]
  12. Serrano, L.; Peñuelas, J.; Ustin, S. Remote Sensing of Nitrogen and Lignin in Mediterranean Vegetation from AVIRIS Data: Decomposing Biochemical from Structural Signals. Remote Sens. Environ. 2002, 81, 355–364. [Google Scholar] [CrossRef]
  13. Arya, S.; Sandhu, K.S.; Singh, J.; Kumar, S. Deep learning: As the new frontier in high-throughput plant phenotyping. Euphytica 2022, 218, 47–68. [Google Scholar] [CrossRef]
  14. Luo, S.; Wang, C.; Xi, X.; Nie, S.; Fan, X.; Chen, H.; Yang, X.; Peng, D.; Lin, Y.; Zhou, G. Combining hyperspectral imagery and LiDAR pseudo-waveform for predicting crop LAI, canopy height and above-ground biomass. Ecol. Indic. 2019, 102, 801–812. [Google Scholar] [CrossRef]
  15. Perry, E.M.; Fitzgerald, G.J.; Poole, N.; Craig, S.; Whitlock, A. NDVI from active optical sensors as a measure of canopy cover and biomass. In Proceedings of the XXII International Society for Photogrammetry and Remote Sensing (ISPRS) Congress, Melbourne, Australia, 25 August–1 September 2012. [Google Scholar] [CrossRef]
  16. Prey, L.; Hu, Y.; Schmidhalter, U. High-throughput field phenotyping traits of grain yield formation and Nitrogen Use Efficiency: Optimizing the selection of vegetation indices and growth stages. Front. Plant Sci. 2020, 10, 1672. [Google Scholar] [CrossRef]
  17. Ma, J.; Wang, L.; Chen, P. Comparing different methods for wheat LAI inversion based on hyperspectral data. Agriculture 2022, 12, 1353. [Google Scholar] [CrossRef]
  18. Beauchêne, K.; Leroy, F.; Fournier, A.; Huet, C.; Bonnefoy, M.; Lorgeou, J.; de Solan, B.; Piquemal, B.; Thomas, S.; Cohan, J.-P. Management and characterization of abiotic stress via PhénoField®, a high-throughput field phenotyping platform. Front. Plant Sci. 2019, 10, 904. [Google Scholar] [CrossRef]
  19. Bancal, P.; Bancal, M.O.; Collin, F.; Gouache, D. Identifying traits leading to tolerance of wheat to Septoria tritici blotch. Field Crops Res. 2015, 180, 176–185. [Google Scholar] [CrossRef]
  20. Chapman, E.A.; Orford, S.; Lage, J.; Griffiths, S. Capturing and selecting senescence variation in wheat. Front. Plant Sci. 2021, 12, 638738. [Google Scholar] [CrossRef]
  21. Bancal, M.O.; Collin, F.; Gate, P.; Gouache, D.; Bancal, P. Towards a global characterization of winter wheat cultivars behavior in response to stressful environments during grain-filling. Eur. J. Agron. 2022, 133, 126421. [Google Scholar] [CrossRef]
  22. Liu, S.; Baret, F.; Abichou, M.; Boudon, F.; Thomas, S.; Zhao, K.; Fournier, C.; Andrieu, B.; Irfan, K.; Hemmerlé, M.; et al. Estimating wheat green area index from ground-based LiDAR measurement using a 3D canopy structure model. Agric. For. Meteorol. 2017, 247, 12–20. [Google Scholar] [CrossRef]
  23. Zhang, Z.; Zhu, L. A review on Unmanned Aerial Vehicle remote sensing: Platforms, sensors, data processing methods, and applications. Drones 2023, 7, 398. [Google Scholar] [CrossRef]
  24. Vogelmann, J.E.; Rock, B.N.; Moss, D.M. Red Edge Spectral Measurements from Sugar Maple Leaves. Int. J. Remote Sens. 1993, 14, 1563–1572. [Google Scholar] [CrossRef]
  25. Filella, I.; Peñuelas, J. The red edge position and shape as indicators of plant chlorophyll content, biomass and hydric status. Int. J. Remote Sens. 1994, 15, 1459–1470. [Google Scholar] [CrossRef]
  26. Gitelson, A.A.; Viña, A.; Ciganda, V.; Rundquist, D.C.; Arkebauer, T.J. Remote estimation of canopy chlorophyll content in crops. Geophys. Res. Lett. 2005, 32, L08403. [Google Scholar] [CrossRef]
  27. Viña, A.; Gitelson, A.A.; Nguy-Robertson, A.L.; Peng, Y. Comparison of different vegetation indices for the remote assessment of green leaf area index of crops. Remote Sens. Environ. 2011, 115, 3468–3478. [Google Scholar] [CrossRef]
  28. Cho, M.A.; Skidmore, A.K. A new technique for extracting the red edge position from hyperspectral data: The linear extrapolation method. Remote Sens. Environ. 2006, 101, 181–193. [Google Scholar] [CrossRef]
  29. Ollinger, S.V. Sources of variability in canopy reflectance and the convergent properties of plants. New Phytol. 2011, 189, 375–394. [Google Scholar] [CrossRef] [PubMed]
  30. Liu, W.; Mõttus, M.; Gastellu-Etchegorry, J.-P.; Fang, H.; Atherton, J. Seasonal and vertical variation in canopy structure and leaf spectral properties determine the canopy reflectance of a rice field. Agric. For. Meteorol. 2024, 355, 110132. [Google Scholar] [CrossRef]
  31. Wang, X.; Xu, H.; Zhou, J.; Fang, X.; Shuai, S.; Yang, X. Analysis of vegetation canopy spectral features and species discrim-ination in reclamation mining area using In situ hyperspectral data. Remote Sens. 2024, 16, 2372. [Google Scholar] [CrossRef]
  32. Fang, H.; Baret, F.; Plummer, S.; Schaepman-Strub, G. An overview of global Leaf Area Index (LAI): Methods, products, validation, and applications. Rev. Geophys. 2019, 57, 739–799. [Google Scholar] [CrossRef]
  33. Xie, C.; Yang, C. A review on plant high-throughput phenotyping traits using UAV-based sensors. Comput. Electr. Agric. 2020, 178, 105731. [Google Scholar] [CrossRef]
Figure 1. Wavelength windows of the five bands detected by two UAV-mounted sensors in the blue (B), green (G), red (R), red edge (E) and near-infrared (N) parts of the spectrum. The peaks for the first sensor (RedEdge-MX; Micasense) are shown as solid lines, whereas those for the second sensor (Phantom 4 Professional; DJI Enterprise) are shown as dotted lines. The gray line shows a typical reflectance spectrum for wheat crops.
Figure 2. Reflectances measured by the green (A), blue (B) and red-edge (C) bands, according to the wavelength windows of either the first (X-axis) or the second (Y-axis) sensor. The color code refers to the three trials in which data were obtained. Blue diamonds: trial 145482 (3528 data); orange squares: trial 146672 (3100 data); gray triangles: trial 149157 (1968 data). These dots overlap considerably, which hinders the clarity of the information. For this reason, the trials are presented separately in the Appendix A (Figure A1). All trials involved weekly remote sensing from February (end of tillering) to July (maturity) on different wheat cultivars grown in plots subjected to different crop management methods in order to achieve a wide variation in the growth and senescence kinetics of the canopies.
Figure 3. Impact of band definition on VI estimates, according to the wavelength windows of either the first (X-axis) or the second (Y-axis) sensor: gNDVI (A); CI red-edge (B); CCCi (C). Dot colors are as in Figure 2.
Figure 4. Coefficients of determination between multispectral VIs and their estimates by the first (A) and second (B) UAV sensors. The x-axis indicates the wavelengths in the red-edge band used by the multispectral VI definition. The dashed lines indicate the window of wavelengths detected in the red-edge band. Dot colors indicate the experimental trials, as in Figure 2: where the colors differ, the R² obtained varies with time and trial.
Figure 5. Predictions of GLAI during grain filling in the BH-PHENO-OR trial. The points correspond to 120 wheat plots over which a drone using the first UAV-sensor flew six times during grain filling. GLAI was calculated with calibration either of the first sensor (A) or of the second sensor (B). The dot code refers to the treatments applied to the plots: control (black circles), or early fertilizer shortage (white triangles), or absence of fungicidal protection against leaf diseases (white squares).
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Bancal, P.; Bancal, M.-O.; Heers, M.; Deswartes, J.C. Using Virtual Drones to Mitigate the Bias Introduced by Sensor Wavelength Approximations in Crop Monitoring with Drones. Agronomy 2025, 15, 2665. https://doi.org/10.3390/agronomy15112665

