Article

High-Throughput Phenotyping for Agronomic Traits in Cassava Using Aerial Imaging

by José Henrique Bernardino Nascimento 1, Diego Fernando Marmolejo Cortes 1, Luciano Rogerio Braatz de Andrade 2, Rodrigo Bezerra de Araújo Gallis 3, Ricardo Luis Barbosa 3 and Eder Jorge de Oliveira 2,*

1 Centro de Ciências Agrárias, Ambientais e Biológicas, Universidade Federal do Recôncavo da Bahia, Cruz das Almas 44380-000, Bahia, Brazil
2 Embrapa Mandioca e Fruticultura, Nugene, Cruz das Almas 44380-000, Bahia, Brazil
3 Instituto de Geografia, Universidade Federal de Uberlândia, Av. João Naves de Ávila, 2121—Bairro Santa Mônica, Uberlândia 38408-902, Minas Gerais, Brazil
* Author to whom correspondence should be addressed.
Plants 2025, 14(1), 32; https://doi.org/10.3390/plants14010032
Submission received: 9 September 2024 / Revised: 16 December 2024 / Accepted: 23 December 2024 / Published: 25 December 2024
(This article belongs to the Special Issue Genetic Improvement of Cassava)

Abstract:
Large-scale phenotyping using unmanned aerial vehicles (UAVs) has been considered an important tool for plant selection. This study aimed to estimate the correlations between agronomic data and vegetation indices (VIs) obtained at different flight heights and to select prediction models to evaluate the potential use of aerial imaging in cassava breeding programs. Various VIs were obtained and analyzed using mixed models to derive the best linear unbiased predictors, heritability parameters, and correlations with various agronomic traits. The VIs were also used to build prediction models for agronomic traits. Aerial imaging showed high potential for estimating plant height, regardless of flight height (r = 0.99), although lower-altitude flights (20 m) resulted in less biased estimates of this trait. Multispectral sensors showed higher correlations compared to RGB, especially for vigor, shoot yield, and fresh root yield (−0.40 ≤ r ≤ 0.50). The heritability of VIs at different flight heights ranged from moderate to high (0.51 ≤ H²Cullis ≤ 0.94), regardless of the sensor used. The best prediction models were observed for the traits of plant vigor and dry matter content, using the Generalized Linear Model with Stepwise Feature Selection (GLMSS) and the K-Nearest Neighbor (KNN) model. The predictive ability for dry matter content increased with flight height for the GLMSS model (R² = 0.26 at 20 m and R² = 0.44 at 60 m), while plant vigor ranged from R² = 0.50 at 20 m to R² = 0.47 at 40 m in the KNN model. Our results indicate the practical potential of implementing high-throughput phenotyping via aerial imaging for rapid and efficient selection in breeding programs.

1. Introduction

Cassava (Manihot esculenta Crantz) is a species cultivated for its tuberous roots and is of significant global importance, ranking as the fourth most important crop after wheat, rice, and corn [1]. It is a major source of starch for both human and animal consumption, as well as various industrial applications [1,2], and provides a key source of calories in developing countries, particularly in Africa and Asia [3]. Nigeria leads the world in cassava root production, followed by the Democratic Republic of Congo, Thailand, Ghana, and Brazil, with production figures of 60.00, 41.01, 28.99, 21.81, and 18.20 million tons, respectively [4].
The development of new cassava cultivars is essential for improving root yield, quality, and disease resistance [5]. However, cultivar selection in breeding programs is often hindered by conventional phenotypic evaluations, which are typically costly and have low throughput. This limitation makes it difficult to assess a large number of individuals, potentially reducing experimental precision and hindering the selection of superior genotypes that could maximize genetic gain. Therefore, adopting new approaches to overcome these challenges could improve data collection, enable larger breeding populations, and optimize the use of both financial and human resources [6].
Large-scale phenotyping using unmanned aerial vehicles (UAVs) is increasingly recognized as a key tool for improving genetic gains in breeding programs. UAVs provide valuable data for plant selection across various species [7,8]. They are effective in monitoring key growth parameters, such as biomass [9], leaf area index [10,11], chlorophyll content [12], vigor, and yield [13], all of which can be estimated using vegetation indices [14].
Vegetation indices are simple yet effective mathematical combinations that enable both quantitative and qualitative assessments of vegetation characteristics, such as ground cover, vigor, and growth rate [15]. These indices primarily rely on reflectance values from three wavelength ranges (visible, red edge, and near-infrared). These indices help reduce the volume of data to be analyzed, facilitating the estimation of structural and physiological biophysical variables of vegetation [16]. By combining reflectance values, vegetation indices link data to the physiological traits of plants [17].
Among the many vegetation indices, the Normalized Difference Vegetation Index (NDVI) is the most widely used [18]. It uses spectral reflectance from the red and near-infrared bands and is widely applied due to its high sensitivity to changes in vegetation vigor [19]. NDVI is also useful for detecting and mapping the spatial distribution and temporal changes of vegetation [20]. Other indices, such as the Normalized Green–Red Difference Index (NGRDI), are valuable for estimating vegetation fraction, green biomass, leaf chlorophyll content, and plant phenology [21]. The Green Leaf Index (GLI), which differentiates live plants from dead ones and exposed soil, is effective in detecting leaf chlorophyll variations and assessing vegetation degradation [22].
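As an illustration, the three indices above reduce to simple per-pixel arithmetic on band reflectance. The sketch below uses scalar reflectance values for clarity; a real pipeline would apply the same formulas to calibrated orthomosaic rasters, and the example values are hypothetical.

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index (near-infrared and red bands)."""
    return (nir - red) / (nir + red)

def ngrdi(green: float, red: float) -> float:
    """Normalized Green-Red Difference Index (visible bands only)."""
    return (green - red) / (green + red)

def gli(red: float, green: float, blue: float) -> float:
    """Green Leaf Index, separating live vegetation from soil and dead matter."""
    return (2 * green - red - blue) / (2 * green + red + blue)

# Dense canopy reflects strongly in NIR and absorbs red light,
# so NDVI rises with vigor (illustrative reflectance values):
print(ndvi(nir=0.60, red=0.08))  # vigorous vegetation -> value near 0.76
print(ndvi(nir=0.25, red=0.20))  # sparse cover / soil -> value near 0.11
```

Because NGRDI and GLI need only visible bands, they can be computed from RGB imagery, whereas NDVI requires a sensor with a near-infrared band.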
In a recent study on cassava, Selvaraj et al. [23] found promising results in predicting fresh root yield using vegetation indices like the Green Normalized Difference Vegetation Index (GNDVI) and the Normalized Difference Red Edge Index (NDRE). Maresma et al. [22] tested several vegetation indices and discovered that the Wide Dynamic Range Vegetation Index (WDRVI) was the most accurate for predicting grain yield in corn (Zea mays L.). In wheat, Kyratzis et al. [24] analyzed vegetation indices across twenty varieties under water deficit conditions, reporting that GNDVI was the most effective predictor of grain yield when measured during the early reproductive stages.
Prediction models are crucial for identifying agronomically important traits and prioritizing those to be collected in the field, thereby reducing phenotyping costs in breeding programs [25]. In cassava cultivation, prediction models should focus on indices that reflect the genotypes’ ability for above-ground biomass production and, most importantly, root yield.
Several studies have shown that traits such as above-ground biomass, root diameter, and branching density can predict water and nutrient efficiency, making them valuable for selecting cassava genotypes during the early growth phase [26]. However, implementing prediction models in agriculture presents challenges, including the lack of extensive databases, the high cost of sensors and technology, and the need for specialized training to develop and maintain these systems [27]. Despite these obstacles, as precision farming practices become more widespread and more agricultural data are collected and stored, the benefits of using prediction models will become more apparent. While prediction models for key traits are still in the early stages, initial results are promising [28].
One critical factor that impacts predictions is flight height, which directly affects the spatial resolution of the images [29]. Mesas-Carrascosa et al. [30] observed that spatial resolution is directly related to flight height: lower flights capture more images with finer ground detail, so flight height can be chosen to optimize the level of image detail required.
Despite the growing use of UAVs in phenotyping, studies on agronomic traits and vegetation indices in cassava at various flight altitudes remain limited. For example, Selvaraj et al. [23] used flight altitudes between 30 and 40 m to assess four cassava genotypes, but they did not provide comparative results for these altitudes. Similarly, Rattanasopa et al. [31] used a flight altitude of 73 m to capture imagery of the “Kasetsart 50” cultivar. This study evaluates four distinct flight altitudes in genotypes from Embrapa’s cassava breeding program, providing critical insights for high-throughput phenotyping and emphasizing the importance of selecting optimal flight altitudes to improve data accuracy and quality in UAV-based phenotyping.
Given the increasing demand for efficient high-throughput phenotyping methods to enhance selection processes in cassava breeding, this study aims to (I) assess the correlation between agronomic data and vegetation indices obtained at various flight altitudes, including the accuracy of plant height measurements from digital elevation models; (II) identify the optimal flight altitude for evaluating cassava genotypes in terms of root yield and quality; and (III) develop predictive models for agronomic traits based on vegetation indices derived from aerial imagery at different flight altitudes.

2. Results

2.1. Heritability and Correlation Between Agronomic Traits and Vegetation Indices

The correlations and Bland–Altman agreement between plant heights measured conventionally and those derived from aerial images at four different flight heights are shown in Figure 1. The results reveal very high correlations between the two measurement methods (r = 0.99), and the differences between the two methods fell almost entirely within the 95% limits of agreement, indicating strong concordance across all flight heights. However, measurement quality varied across the four flight heights. As flight height increased, a greater bias was observed in the estimated plant heights compared to ground measurements. Specifically, the difference was approximately 0.25 m at a flight height of 20 m and around 0.83 m at 60 m. This suggests that higher flight altitudes tend to overestimate plant height relative to ground-based measurements.
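The Bland–Altman statistics used here reduce to the mean difference (bias) between paired measurements and its 95% limits of agreement (bias ± 1.96 SD of the differences). A minimal sketch, using hypothetical paired heights rather than the study's data:

```python
import statistics

def bland_altman_limits(ground, uav):
    """Mean bias and 95% limits of agreement between two measurement methods.

    Differences are taken as uav - ground, so a positive bias means the
    aerial estimate overestimates plant height relative to the ground method.
    """
    diffs = [u - g for g, u in zip(ground, uav)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired plant heights (m): ground tape vs. UAV elevation model
ground = [2.10, 1.95, 2.40, 2.20, 1.80]
uav = [2.35, 2.18, 2.66, 2.44, 2.07]
bias, (lo, hi) = bland_altman_limits(ground, uav)
print(f"bias = {bias:.2f} m, limits of agreement = [{lo:.2f}, {hi:.2f}] m")
```

With these illustrative numbers the bias is 0.25 m, mirroring the systematic overestimation reported at the 20 m flight height; strong concordance corresponds to nearly all differences lying inside the limits.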
Flight altitude is directly related to mission duration and image processing time (Table 1). At a flight height of 60 m, fewer images were captured (50), and the flight duration was shorter (4 min) compared to 20 m, where nearly five times as many images were taken (222) and the mission duration was more than double (10 min). Shorter flights consume less battery, allowing for an increased number of missions and, consequently, more trials to be evaluated. However, in this study, a greater bias was observed in estimating plant heights at higher flight heights, which should be taken into account during field evaluations.
Moderate correlations (−0.40 ≤ r ≤ 0.50) were observed between vegetation indices from multispectral sensors and agronomic data for cassava during the 2022 crop season at a flight height of 60 m (Figure 2). Specifically, the vegetation indices DVI (r = 0.50), NDVI (r = 0.40), GNDVI (r = 0.40), RVI (r = 0.40), and CIG (r = 0.40) at 60 m showed moderate correlations with plant vigor. Similarly, the SCI and SI indices were moderately correlated with above-ground biomass yield (r = 0.40), while the NGRDI and VARI indices showed a similar correlation with fresh root yield (r = 0.40). In contrast, the indices BGI (r = −0.40 to −0.50), GLI (r = 0.40), and PSRI (r = −0.40 to −0.50) were associated with plant vigor at flight heights of 40 m and 60 m.
The correlation estimates between vegetation indices obtained from images captured by the RGB camera showed moderate correlations for traits such as plant vigor, leaf retention, dry matter content, and fresh root yield (Figure 3). Plant vigor exhibited a moderate correlation with the vegetation indices BI (r = −0.40), BGI (r = −0.40), and PSRI (r = −0.51). Similarly, leaf retention was correlated with the indices BI (r = −0.40) and CIRE (r = 0.50), while dry matter content (r = −0.40) and fresh root yield (r = 0.40) displayed moderate correlations with the HUE index. These correlations were observed in images captured at an altitude of 60 m. Additionally, the HUE vegetation index showed a moderate correlation (r = −0.40) at an altitude of 30 m with both leaf retention and dry matter content.
The heritability estimates (H²Cullis) for the eighteen indices obtained from the multispectral camera at four different flight heights (20 m, 30 m, 40 m, and 60 m) ranged from moderate (H²Cullis = 0.30 to 0.60) to high (H²Cullis ≥ 0.60) (Figure 4). Among these vegetation indices, CIRE (H²Cullis = 0.82) and HI (H²Cullis = 0.92) showed the highest heritability values at 20 m and 60 m, respectively. Additionally, indices such as GNDVI, NDVI, VARI, CIG, GLI, and CIRE exhibited high heritability (0.60 ≤ H²Cullis ≤ 0.82) when measured at 20 m. At other flight heights, high heritability values were observed for the indices SCI, SI, RVI, and DVI at 30 m (0.57 ≤ H²Cullis ≤ 0.76); BGI (H²Cullis = 0.61) and BI (H²Cullis = 0.56) at 40 m; and HUE, NGRDI, PSRI, CVI, NDRE, and HI (0.59 ≤ H²Cullis ≤ 0.92) at 60 m.
The heritability estimates for the indices extracted from the RGB camera were similar to those obtained from the multispectral camera, ranging from moderate to high (Figure 5). The PSRI (H²Cullis = 0.94) and HUE (H²Cullis = 0.97) indices exhibited particularly high heritability. Other indices with high heritability included NDVI and GNDVI (H²Cullis = 0.83 and H²Cullis = 0.88, respectively) when measured at 20 m. At 30 m, the NDRE, SI, SCI, HI, and GLI indices displayed high heritability values (0.88 ≤ H²Cullis ≤ 0.94). At 40 m, high heritability was observed only for the CIRE index (H²Cullis = 0.83), while the remaining indices showed high heritability values (0.80 ≤ H²Cullis ≤ 0.93) when measured at 60 m. Additionally, high heritability was observed for all agronomic traits (H²Cullis > 0.90) (Table 2).
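For reference, the Cullis estimator reported throughout is H²Cullis = 1 − v̄ΔBLUP / (2σ²g), where v̄ΔBLUP is the mean variance of a difference of two genotype BLUPs from the fitted mixed model and σ²g is the genotypic variance. A one-function sketch with hypothetical inputs (not values extracted from this study's models):

```python
def cullis_h2(mean_vdiff_blup: float, var_genotypic: float) -> float:
    """Cullis broad-sense heritability on an entry-mean basis.

    mean_vdiff_blup: mean variance of a difference of two genotype BLUPs,
    taken from the fitted mixed model; var_genotypic: genotypic variance.
    The inputs below are hypothetical, for illustration only.
    """
    return 1.0 - mean_vdiff_blup / (2.0 * var_genotypic)

# Small prediction-error variance relative to genotypic variance -> high H2
print(f"{cullis_h2(mean_vdiff_blup=0.04, var_genotypic=0.25):.2f}")  # prints 0.92
```

Unlike the classical ratio of variance components, this estimator remains valid for unbalanced trials analyzed with mixed models, which is why it is the standard choice for plot-based breeding data.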

2.2. Prediction of Agronomic Traits Based on Vegetation Indices at Different Flight Heights

The performance of prediction models for agronomic traits in cassava, using multispectral and RGB images captured by UAVs, was evaluated through cross-validation and variable selection based on the importance of the indices for prediction. The R² values for different flight heights indicated low prediction accuracy for the agronomic variables, ranging from 0.05 to 0.34 (Table 3).
Overall, the models performed better in predicting plant vigor and dry matter content, especially when using vegetation indices captured at a height of 20 m. The PLS and KNN models demonstrated higher predictive ability for plant vigor (R² = 0.30), while the PLS model yielded an R² of 0.34 for dry matter content. For leaf retention, the GLMSS model showed the best predictive performance (R² = 0.29), also utilizing vegetation indices captured at 20 m.
For leaf spot resistance, as well as fresh root yield and above-ground biomass yield, the best predictions were obtained using vegetation indices from flight heights of 60 m and 30 m. In particular, the SVM model achieved the highest prediction accuracy for fresh root yield and above-ground biomass yield (R² = 0.22), while the GLMSS model (R² = 0.16) produced comparable results at 60 m. For leaf spot resistance, the PLS model provided the best predictive performance (R² = 0.20) using images at 30 m.
The average relationship between observed and predicted values is presented in the Supplementary Material (Figures S1–S6). The best fits were observed for plant vigor, with R² = 0.50 using the KNN model and images collected at 20 m, and R² = 0.47 for images taken at 40 m. For dry matter content, predictive accuracy improved with flight height, from R² = 0.26 at 20 m to R² = 0.44 at 60 m. While the models showed varying degrees of fit, most regressions were statistically significant (p ≤ 0.01), indicating a clear relationship between observed and predicted values.
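The cross-validation scheme used to score these models can be sketched as follows: hold out each fold in turn, predict it with a model fitted on the remaining folds, and score the pooled held-out predictions with R². The toy example below pairs this procedure with a hand-rolled KNN regressor on simulated vegetation-index features; it illustrates the evaluation logic only and is not the study's pipeline.

```python
import math
import random

def knn_predict(x_train, y_train, x_new, k=3):
    """Predict as the mean response of the k nearest training points
    (Euclidean distance over the feature vectors)."""
    dists = sorted((math.dist(xi, x_new), yi) for xi, yi in zip(x_train, y_train))
    return sum(yi for _, yi in dists[:k]) / k

def r_squared(observed, predicted):
    """Coefficient of determination between observed and predicted values."""
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

def cross_val_r2(x, y, k_folds=5, k_neighbors=3, seed=42):
    """K-fold cross-validation: each fold is held out, predicted by a model
    fitted on the rest, and all held-out predictions are scored together."""
    idx = list(range(len(x)))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k_folds] for i in range(k_folds)]
    preds = [0.0] * len(x)
    for fold in folds:
        train = [i for i in idx if i not in fold]
        xt, yt = [x[i] for i in train], [y[i] for i in train]
        for i in fold:
            preds[i] = knn_predict(xt, yt, x[i], k=k_neighbors)
    return r_squared(y, preds)

# Simulated data: two index-like features per plot, one trait value
random.seed(1)
x = [[random.random(), random.random()] for _ in range(40)]
y = [2 * xi[0] + xi[1] + random.gauss(0, 0.1) for xi in x]
print(f"cross-validated R^2 = {cross_val_r2(x, y):.2f}")
```

Because every prediction is made on plots the model never saw during fitting, the resulting R² is a fairer estimate of out-of-sample predictive ability than an in-sample fit.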

3. Discussion

3.1. Efficiency of Measuring Cassava Plant Height Using Digital Elevation Models

This study showed a high correlation (r = 0.99) between plant height estimates from the UAV-based digital elevation model and ground-based measurements. Similar results were found in cassava studies, with a strong correlation (r = 0.83) between UAV measurements at 30 m and 40 m and ground-based measurements [23]. While both point cloud and manual methods were consistent, lower flight heights showed less bias compared to higher ones. From a practical standpoint, a lower flight height results in more images being collected, which enhances the reliability of the results [32]. Based on our findings regarding cassava plant height, we recommend conducting UAV flights at an altitude of 20 m for the most accurate measurement of plant height.

3.2. Correlation Between Vegetation Indices and Agronomic Data in Cassava

The use of UAVs in high-throughput phenotyping has gained significant traction, particularly in breeding programs where large numbers of individuals must be assessed quickly for multiple traits. Despite the significant potential of UAVs, the correlation between vegetation indices and agronomic traits ranged from low to moderate (−0.1 ≤ r ≤ 0.5) at different flight heights. Generally, vegetation indices derived from multispectral sensors showed higher correlation magnitudes compared to those derived from RGB sensors, although the correlation remained low, especially at lower flight heights (20 m and 30 m).
Multispectral images are particularly advantageous for precision agriculture due to their ability to capture additional spectral information compared to RGB images, which typically offer lower resolution and color quality [32]. For example, in rice, Zhou et al. [14] explored the relationship between leaf area index and vegetation indices at different phenological stages and found that several indices derived from multispectral images had higher correlations (ranging from 0.63 to 0.79) compared to those from RGB images (ranging from 0.36 to 0.38).
In this study, the highest correlations (ranging from −0.4 to 0.5) were observed for traits such as plant vigor, above-ground biomass yield, fresh root yield, and dry matter content, particularly with indices derived from multispectral images at flight altitudes of 40 m and 60 m. The vegetation indices BGI, CIRE, DVI, GLI, GNDVI, NDVI, NGRDI, PSRI, SCI, SI, and VARI provided the strongest correlations for these traits.
Phenotyping methods differ across plant species but generally focus on traits related to plant vigor, growth, and productivity, such as grains, fruits, and roots. For example, studies have shown moderate correlations between the GNDVI index and vegetation cover (0.64 ≤ r ≤ 0.66) and leaf area (0.40 ≤ r ≤ 0.59) in trials of common beans (Phaseolus vulgaris) grown under various irrigation treatments [33]. Other research indicates correlations of the NDVI (0.36 ≤ r ≤ 0.53) and GNDVI (0.42 ≤ r ≤ 0.56) with grain yield in durum wheat (Triticum turgidum subsp. durum) [24]. In cassava, indices like NDRE, NDVI, and GNDVI have also demonstrated strong correlations with traits such as above-ground yield, vegetation cover, and fresh root yield [23].
Vegetation indices have also shown strong correlations with agronomic traits in other crops, depending on flight height. For instance, a study by Avtar et al. [34] evaluated NDVI and NDRE indices at different flight heights (20 m, 60 m, and 80 m). They found that flights at 60 m correlated well with traits such as canopy diameter and plant height of oil palm (Elaeis guineensis), while indices from images captured at 60 m and 80 m showed a high correlation with plant vigor.
Rattanasopa et al. [31] used UAVs to estimate agronomic traits in cassava over three different growth stages (5, 6, and 7 months after planting) using multispectral images. Their results revealed a high correlation (r = 0.95, 0.91, and 0.96) between the NDVI, RVI, and CIRE vegetation indices and agronomic traits such as leaf area, canopy height, and total fresh weight. While some authors argue that these indices are essential for assessing or classifying soil cover, detecting climate changes, and monitoring soil deficits [31,35], our study found significant correlations between the NDVI, RVI, and CIRE indices and traits such as vigor and leaf retention in cassava trials, although the correlation magnitudes were generally low.

3.3. Influence of Flight Height on Vegetation Indices

Different flight heights can affect the spatial resolution of images and, consequently, the quality of orthomosaics. Lower flight heights reduce the area mapped per image, leading to a higher number of images and longer data processing times for orthomosaic generation [36]. Therefore, optimizing flight height is crucial, especially in areas where environmental factors, such as sunlight glare and persistent cloud cover, may hinder image quality.
Our results indicate that vegetation index values can vary with flight height, regardless of the sensor type used. Some studies have reported that increased flight height can impair the accuracy of information derived from images due to loss of detail [37], while lower flight heights (between 15 and 30 m) tend to provide sharper and more detailed images [38]. For example, Mesas-Carrascosa et al. [30] found no significant differences in vegetation index values, such as NDVI, when comparing images captured at higher flight heights.
Overall, our study showed greater variation in vegetation indices between different flight heights using RGB sensors, compared to indices obtained from multispectral images. This may be because cassava vegetation appears relatively homogeneous in RGB images, and the spatial variation from different flight heights can affect reflectance properties and image quality in the visible spectrum. In contrast, multispectral images undergo radiometric calibration before orthomosaic processing [39,40], resulting in less variation in vegetation indices derived from these images. Factors such as solar angle, time of day, bidirectional reflectance, and crop type can all influence vegetation index values. However, research on these factors remains limited, and more comprehensive studies are needed to fully understand the observed results.
The choice of flight height is typically determined by the study’s objectives and the target traits for inference. Current research is investigating the impact of flight height on vegetation index calculations and their correlation with traits in various crops. Some studies, such as those by Perroy et al. [41] and Quirós and Khot [42], found that higher flight height resulted in lower spatial resolution and poor canopy detection in Miconia calvescens and apple trees (Malus domestica Borkh), respectively. In contrast, our study observed increased vegetation index values for cassava when analyzing images from higher flight heights. Overall, a flight height of 60 m showed stronger correlations with most agronomic traits evaluated, such as dry matter content, fresh root yield, above-ground biomass yield, and plant vigor, indicating its high potential for future aerial imaging applications.

3.4. Heritability of Vegetation Indices and Agronomic Traits

Regardless of the sensor type (RGB or multispectral), the broad-sense heritability estimates for vegetation indices obtained at different flight heights ranged from moderate to high (0.51 ≤ H²Cullis ≤ 0.94). The PSRI (H²Cullis = 0.94) and HUE (H²Cullis = 0.97) indices, derived from RGB sensors, exhibited the highest heritability estimates when measured at 30 m. Meanwhile, the multispectral indices CIRE (H²Cullis = 0.82) and HI (H²Cullis = 0.92) showed the highest heritability values for images captured at 20 m and 60 m flight heights, respectively. Silva et al. [43] reported high average heritability values (0.87) when using UAV-based phenotyping for wheat genotypes at crop maturation using multispectral indices.
Although high heritability values suggest that vegetation indices can be useful for the indirect selection of desirable attributes, such as yield traits in breeding programs, some studies report limited success in achieving high heritability for these indices. For example, Tao et al. [44] studied variations in growth and vegetation indices of slash pine (Pinus elliottii) at two locations, finding that heritability estimates for indices like the Landsat Soil Adjusted Vegetation Index (SAVI), GNDVI, and NDVI were very low (~0.11 at the first location and 0.23–0.27 at the second location) using RGB and multispectral cameras. This variability in heritability suggests that for cassava breeding, it is important to assess which vegetation indices are most stable and reliable across environments.
In elephant grass (Cenchrus purpureus (Schumach.) Morrone), heritability for vegetation indices and total dry biomass (TDB) was assessed using both basic linear mixed models and spatial linear models. Heritability values ranged from 0.22 (for MSAVI: Modified Soil-Adjusted Vegetation Index) to 0.55 (for GLI: Green Leaf Index). Generally, broad-sense heritability for individual indices was higher than for TDB, suggesting that certain vegetation indices could be used as secondary traits to support the indirect selection of superior genotypes. The NDRE index was particularly effective for indirect selection of TDB, with heritability 2.7 times greater than the trait itself, though genetic gains from indirect selection were lower than from direct selection. The efficiency of indirect selection depends on the heritability of the traits and their correlations [45].
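Under equal selection intensity, the efficiency criterion referred to above is commonly expressed as the ratio of correlated to direct response: RE = (h_index × r_g) / h_target, where each h is the square root of the corresponding heritability and r_g is the genetic correlation between the secondary trait (a vegetation index) and the target trait. A sketch with hypothetical values, not estimates from this study:

```python
import math

def indirect_selection_efficiency(h2_index: float, h2_target: float,
                                  r_genetic: float) -> float:
    """Relative efficiency of selecting on a secondary trait versus selecting
    directly on the target trait, assuming equal selection intensity:
    RE = (h_index * r_g) / h_target, with h = sqrt(h^2)."""
    return math.sqrt(h2_index) * r_genetic / math.sqrt(h2_target)

# Hypothetical case: a highly heritable index, a lower-heritability target,
# and a moderate genetic correlation between them
print(f"{indirect_selection_efficiency(h2_index=0.90, h2_target=0.40, r_genetic=0.50):.2f}")  # prints 0.75
```

The sketch makes the trade-off concrete: even an index 2.25 times more heritable than the target only matches direct selection (RE = 1) when the genetic correlation is strong enough, which is consistent with the elephant grass result that indirect gains via NDRE stayed below direct gains.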
Heritability is a population characteristic, not an individual one [46], so estimating heritability for vegetation indices at different stages of a breeding program is essential to define effective selection strategies. In uniform yield trials, where replications, locations, and years are considered, more accurate heritability estimates are obtained compared to preliminary trials, which rely on averages of replications [47]. Understanding these correlations is crucial for applying selection based on vegetation indices estimated through high-throughput phenotyping via UAVs, allowing breeding programs to fully exploit the potential of this approach for breeding. By selecting indices with high heritability that are correlated with desirable traits, cassava breeders can use high-throughput phenotyping techniques to identify superior genotypes more efficiently. This can significantly improve the speed of breeding cycles, reduce costs, and help develop cassava varieties that are higher-yielding, more disease-resistant, and better adapted to changing environmental conditions.
Similar to the vegetation indices, agronomic traits showed high heritability estimates (0.93 ≤ H²Cullis ≤ 0.99), likely due to the analysis of trials in the later stages of agronomic validation (UYT), where plots contained more plants, leading to greater uniformity across different replications. Similar findings were reported by Sampaio Filho et al. [48], who presented H²Cullis values of 0.94, 0.94, and 0.93 for above-ground biomass yield, fresh root yield, and dry matter content, respectively. Additionally, Conceição et al. [49] reported H²Cullis values of 0.84 for leaf retention and 0.72 for plant vigor.

3.5. Performance of Vegetation Index-Based Models for Predicting Agronomic Traits in Cassava

Prediction models based on vegetation indices are valuable tools for identifying relationships between remote sensing data and agronomic parameters, such as yield, leaf area index, and biomass [50,51]. In this study, machine learning models demonstrated low to moderate predictive abilities for various cassava traits (0.01 ≤ R² ≤ 0.50).
The highest predictive performance was observed for plant vigor and dry matter content using the GLMSS and KNN models. For plant vigor, the KNN model achieved R² values of 0.50 and 0.47 at flight altitudes of 20 m and 40 m, respectively. For dry matter content, the GLMSS model showed improved predictive ability with R² = 0.26 at 20 m and R² = 0.44 at 60 m. Conversely, the PLS model consistently yielded the lowest R² values for all agronomic traits. These results contrast with previous studies, such as Wang et al. [50], who reported better performance of the PLS model for several variables. Similarly, while Li et al. [52] demonstrated the high predictive ability of the SVM model for wheat yield during the grain-filling stage (R² = 0.73 and RMSE = 0.87 t/ha) using hyperspectral imagery, our study found lower SVM performance for cassava traits (0.01 ≤ R² ≤ 0.26) using spectral images.
The predictive accuracy of these models is influenced by flight altitude, which affects imaging parameters like spatial resolution and field of view. Lower altitudes (e.g., 20 m) provide higher spatial resolution, crucial for capturing fine details such as leaf texture and spectral variations. Studies by Guo et al. [53] and Ye et al. [54] highlight the importance of high-resolution imagery for detecting biological symptoms like yellow rust in wheat or Fusarium wilt in bananas. Conversely, higher altitudes (e.g., 60 m) offer broader coverage but compromise spatial resolution, limiting the ability to detect fine-scale variability. At 20 m, the KNN model achieved its best performance for plant vigor (R² = 0.50). This aligns with findings from Wang et al. [55], who noted that KNN performs well with high-resolution images for traits influenced by texture and color differences captured through vegetation indices. However, while KNN performed well for above-ground traits, its predictive ability for below-ground traits, such as root yield, remained low. This reflects the inherent challenges of using aerial imaging to capture traits not directly observable through vegetation indices.
Similarly, PLS achieved moderate performance for plant vigor at 20 m (R² = 0.30), showing better use of spectral variations, as demonstrated in studies by Lizuka et al. [56], where detailed spectral data enhanced predictions of vegetation cover fractions using regression models. As flight altitude increases to 30–40 m, the balance between spatial resolution and field of view begins to shift. At these altitudes, aggregation effects can reduce variability, improving correlations for certain traits, such as canopy cover, but at the expense of losing fine-scale details. For instance, the PLS model performed best at 30 m for predicting leaf spot resistance (R² = 0.20), effectively capturing spectral differences linked to leaf health. Similarly, the SVM model showed the highest predictive ability for fresh root yield and above-ground biomass yield at 30 m (R² = 0.22), benefiting from the moderate spatial detail suitable for biomass-related traits.
At the highest flight altitude of 60 m, reduced spatial resolution negatively impacted models that rely on detailed textures, such as PLS, whose performance for predicting dry matter content declined to R² = 0.10. However, models like GLMSS showed improved performance at this altitude for traits dependent on broader spatial patterns, achieving an R² of 0.16 for fresh root yield.
The development of rapid phenotyping methods has become a priority in cassava breeding programs, aiming to shorten selection cycles, evaluate more genotypes, and reduce costs associated with measuring challenging traits. While significant progress has been made, few studies have focused on validating predictive models for root yield and dry matter content using image analysis in cassava. In this context, the predictive ability of R2 = 0.44 for dry matter content achieved with the GLMSS model at a flight height of 60 m represents a promising starting point for early plant selection in breeding programs. Although this level of accuracy is moderate, it offers practical insights into traits that are otherwise difficult to measure directly.
These results highlight that while predictive accuracies remain moderate, aerial imaging shows considerable potential as a complementary tool to advanced methodologies like genomic selection, which has achieved similar levels of accuracy in certain cases [57,58]. Integrating aerial imaging with genomic selection could significantly enhance cassava breeding efforts by offering a cost-effective and scalable approach to evaluating complex traits. For instance, aerial imaging facilitates rapid assessments that align well with breeding objectives, particularly for traits such as root yield and dry matter content. However, the limited success in predicting root yield underscores the need for further methodological advancements.

3.6. Applications of Aerial Imaging in Cassava Breeding

The large volume of data generated by UAVs, particularly when equipped with sensors capturing extensive datasets, presents challenges in processing and delivering results quickly [59]. Many recent studies focus on validating spectral vegetation indices in field-scale plant phenotyping, with a significant portion of measurements taken proximally [60], which ties into the aforementioned challenges. Nevertheless, UAV-based studies have made notable progress in overcoming current limitations and maximizing the potential of this technology in agriculture. Despite obstacles like adverse weather conditions, sensor calibration issues, and platform stability—factors directly affecting data reliability and accuracy [59,60]—UAVs are expected to become a valuable tool for supporting breeding programs and aiding farmers in obtaining reliable precision agriculture data for informed decision making.
Breeding programs are actively developing technologies and sensor algorithms to improve data capture accuracy and precision. These advancements aim to enhance the detection and quantification of plant growth, chlorophyll content, plant height, and vigor, along with the use of state-of-the-art analytical techniques. In the context of the cassava breeding program, UAV-based phenotyping for estimating plant height has already proven feasible. However, the prediction of agronomic traits through aerial imaging phenotyping across various trials in a breeding program can still be enhanced to more effectively select individuals in large populations.
Correlations between agronomic traits and vegetation indices derived from RGB and multispectral sensors ranged from low to moderate at the different flight heights, contributing to modest predictive accuracies for most of the agronomic traits evaluated. To improve the predictive models, incorporating spatial features could help standardize performance across different camera systems. Similar findings were reported by Herzig et al. [61], who compared RGB and multispectral camera systems for predicting barley grain yield and suggested that, given their lower cost and simpler image processing, RGB cameras might be preferred over multispectral systems.
Several factors should be considered when integrating aerial imaging into cassava breeding selection routines. One key strategy is to capture images throughout the entire crop cycle, which would allow observation of the environmental variations that influence productive traits, support the development of more robust vegetation indices, help standardize the study area, and ultimately improve the quality of the images captured. Combining aerial imagery with ground-based sensors and proximal sensing technologies could further enhance prediction accuracy. Moreover, incorporating data from multiple sources, such as spectral indices and environmental variables collected at key phenological stages, has the potential to significantly improve model performance for complex traits like root yield. Future research should focus on developing robust models that leverage these integrated data streams to refine predictive capabilities.
Despite existing challenges, advancements in UAV technologies and methodologies offer substantial promise for cassava breeding. By enabling rapid and cost-effective phenotyping, UAVs can play a pivotal role in evaluating complex traits and accelerating the selection process in large breeding populations.

4. Conclusions

Aerial imaging has demonstrated high accuracy for estimating traits such as plant height (r = 0.99), irrespective of flight altitude. Moderate agreement between aerial imaging and ground measurements was most notable at a flight height of 20 m. However, the predictive capability for below-ground traits, including root yield and dry matter content, remains limited.
Models like GLMSS showed moderate predictive ability for dry matter content at 60 m (R2 = 0.44), but prediction accuracy for root yield was consistently low across all models and flight heights. These findings highlight the challenges of using aerial imaging to estimate below-ground traits, emphasizing the need for further methodological advancements or the integration of complementary data sources.
A flight height of 60 m provides practical benefits, such as reduced battery consumption, shorter flight durations, and greater field coverage, making it a feasible choice for phenotypic assessments in cassava breeding programs despite limitations in trait prediction accuracy.
In conclusion, aerial imaging is a promising tool for field phenotyping, particularly for above-ground traits like plant vigor. However, its effectiveness for predicting below-ground traits is constrained. Continued research is essential to improve predictive models and integrate additional data sources, thereby enhancing the utility of aerial imaging for cassava breeding programs.

5. Materials and Methods

5.1. Plant Material and Experimental Design

The experiment was conducted in June 2021 at Fazenda Botelho (latitude 11°48′7.24″ S, longitude 38°22′27.53″ W, with an average altitude of 224 m) in the rural area of Inhambupe, Bahia, Brazil. This region has a hot, subhumid tropical climate, with an average annual temperature of 34 °C, a mean annual relative humidity of 82%, and an average annual rainfall of 1100 mm, concentrated between March and August, followed by a warmer period from September to February. A total of 36 cassava clones from the uniform yield trial (UYT) were evaluated, representing the final stage in the process of evaluation and selection. A list of clones and their genealogy is provided in Table S1.
UYTs are a critical phase in cassava breeding programs, where promising clones are assessed through multiple repetitions across diverse environments and over several years. This approach allows for higher experimental precision, particularly for traits with low heritability. These trials are designed to maximize data accuracy and reliability, providing a robust foundation for the selection of superior, well-adapted clones. The experimental design was a randomized complete block (RCBD) with three replicates, with plots consisting of three rows of seven plants each.
Soil preparation included plowing followed by two harrowings, after which planting furrows approximately 15 cm deep were created. Planting was performed manually during the rainy season in May 2021, using 16 cm seed stem cuttings with a spacing of 0.90 m between rows and 0.80 m between plants. Crop management practices followed recommended guidelines for cassava cultivation [62].

5.2. Evaluation of Agronomic Traits

In addition to the vegetation indices at different flight altitudes, the following agronomic traits were evaluated 12 months after planting: (I) plant height (m), measured manually on 15 plants per plot using a tape measure from ground level to the apical meristem (shoot tip); (II) fresh root yield (FRY), obtained by weighing the roots produced per plot on a hydrostatic scale (t.ha−1); (III) shoot yield (SY), determined by weighing all above-ground biomass (stems, leaves, and petioles) from the plants in the plot (t.ha−1); (IV) leaf spot resistance (LSR), visually assessed using a severity scale: 0 = no symptoms, 1 = 25% of leaves affected in the lower third of the plant, 2 = >50% of leaves affected in the lower third of the plant, 3 = leaves affected in the middle and lower thirds, 4 = mild incidence distributed throughout the plant, 5 = moderate incidence distributed throughout the plant, along with yellowing and/or defoliation of the lower third, 6 = complete defoliation of the plant; (V) leaf retention (LR), scored on a scale representing leaf cover at the apical meristem: 1 = less than 5% leaf retention, 2 = 6–15%, 3 = 16–30%, 4 = 31–50%, 5 = more than 50%; (VI) dry matter content is covered below; plant vigor (PV), assessed on a scale where 1 = low vigor, 3 = intermediate vigor, and 5 = high vigor; (VII) dry matter content (DMC) of the roots (%), determined by weighing 3–5 kg of roots per plot in air and in water, following the gravimetric method proposed by Kawano et al. [63].

5.3. Image Acquisition

The acquisition of RGB and multispectral images was carried out using a DJI Phantom 4 Pro V2 UAV (DJI, Shenzhen, China). The RGB camera has a focal length of 8.8 mm, a 5.5-inch display, and a 1-inch CMOS sensor with a resolution of 20 megapixels. The MicaSense RedEdge-M multispectral camera (MicaSense Inc., Seattle, WA, USA) has a fixed focal length of 5.4 mm, sensors with a resolution of 1280 × 960 pixels, and five spectral bands (Table S2).
The flights were conducted 12 months after planting, on the same day as the ground measurements of plant height. UAV image collection took place under favorable weather conditions, with a uniform sky, between 10 a.m. and 2 p.m., the period of maximum direct solar irradiation on the trial. The DJI Pilot 2 app (DJI, Shenzhen, China) was used to control the UAV's flight path and speed during phenotyping, ensuring 90% overlap between images.
Images were captured at different flight heights (20, 30, 40, and 60 m) to estimate the optimal flight height for predicting phenotypic data. To improve the georeferencing accuracy of the experimental plots, ground control points (GCPs) were established prior to each flight by positioning targets around and within the area. GCPs correct the spatial alignment of the captured images, ensuring that the data correspond accurately to real-world coordinates. The Emlid Reach RS+ GNSS system (Emlid Ltd., Budapest, Hungary), which offers 2 cm accuracy using real-time kinematic (RTK) positioning, was used to measure these points. This precision is critical for reducing distortions caused by variations in flight height, such as those experienced when capturing images at 20, 30, 40, and 60 m. The 19 GCPs were evenly distributed across the plantation, including its edges and interior, to ensure comprehensive coverage throughout the experiment.
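The resolution trade-off across the four flight heights follows directly from the ground sampling distance (GSD) relation: GSD scales linearly with flight height for a fixed sensor. A sketch, assuming a nominal 3.75 µm pixel pitch for the multispectral sensor (a value not stated in the text) together with its 5.4 mm focal length:

```python
def ground_sampling_distance(height_m, pixel_pitch_um, focal_length_mm):
    """Ground sampling distance in cm/pixel: flight height times physical
    pixel size, divided by focal length."""
    return 100 * height_m * (pixel_pitch_um * 1e-6) / (focal_length_mm * 1e-3)

# Assumed 3.75 um pixel pitch with a 5.4 mm focal length
for h in (20, 30, 40, 60):
    print(h, "m ->", round(ground_sampling_distance(h, 3.75, 5.4), 2), "cm/px")
```

Under these assumptions the ground footprint of a single pixel grows from roughly 1.4 cm at 20 m to roughly 4.2 cm at 60 m, which is the resolution loss discussed for the higher flights.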

5.4. Orthomosaic Construction, Processing, and Radiometric Calibration

The images were processed using Agisoft Metashape software version 1.5.5 (Agisoft LLC, St. Petersburg, Russia), following the workflow for image processing. This included alignment, generation of sparse point clouds from photo alignment, preparation of dense clouds, creation of a mesh for 3D visualization, image sharpening, orthorectification, and export of the final orthomosaics from the field [64].
Radiometric calibration was performed so that the multispectral images provide information consistent with a known radiometric reference. The preprocessing of images from the MicaSense RedEdge sensor used the radiometric calibration model provided by MicaSense (https://support.micasense.com/hc/en-us/articles/115000351194-RedEdge-Camera-Radiometric-Calibration-Model (accessed on 5 January 2024)), which converts digital number (DN) values into absolute spectral radiance values according to the formula: L = V(x, y) × (a1/g) × (p − pBL)/(te + a2y − a3tey), where L is the spectral radiance; V(x, y) is the vignette polynomial at pixel (x, y); a1, a2, and a3 are the radiometric calibration coefficients; g is the sensor gain setting; p is the normalized DN value; pBL is the black level offset; te is the image exposure time; and y is the pixel row number. All parameters required to calculate spectral radiance were obtained from the photo metadata. By accounting for varying climatic and lighting conditions, radiometric calibration produces more accurate and reliable data, enabling time series analysis and comparison of results over time.
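The published MicaSense DN-to-radiance model can be expressed as a small function. The coefficient values in the example below are hypothetical; in practice all of them are read from each image's metadata:

```python
def dn_to_radiance(p, y_row, t_e, g, a1, a2, a3, p_bl, vignette=1.0):
    """MicaSense DN-to-radiance model:
    L = V(x, y) * (a1 / g) * (p - pBL) / (te + a2*y - a3*te*y).
    All coefficients come from the image metadata."""
    return vignette * (a1 / g) * (p - p_bl) / (t_e + a2 * y_row - a3 * t_e * y_row)

# Hypothetical coefficients, for illustration only
L = dn_to_radiance(p=0.5, y_row=0, t_e=0.001, g=1.0,
                   a1=2.0, a2=0.0, a3=0.0, p_bl=0.1)
print(L)
```

With a2 = a3 = 0 the row-dependent terms vanish and the model reduces to a gain-and-offset correction scaled by exposure time, which makes the role of each coefficient easy to see.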
MicaSense provides albedo values for the calibration panel, along with a CSV file containing the panel's reflectance values, which are essential for radiometric calibration. These values were used in Agisoft Metashape during image calibration to correct for variations in reflectance caused by environmental factors such as lighting, sensor conditions, and atmospheric influences. Because the albedo values represent the known, inherent reflectivity of the calibration panel, they allow the accurate conversion of DN values from the sensor into absolute spectral radiance values. By incorporating these values into the calibration process, Agisoft Metashape adjusts the raw imagery to reflect the true spectral characteristics of the captured scene more reliably, ensuring consistency and accuracy in the final processed data. The multispectral images captured with the RedEdge camera were radiometrically calibrated in Agisoft Metashape, which uses the irradiance data stored in the EXIF metadata of each image to convert digital numbers into radiance, and subsequently into reflectance, before generating the orthophoto.

5.5. Estimation of Plant Height from Aerial Images

Plant heights were measured through digital image processing using QGIS software (version 3.32.3) and the Terrain Profile plugin. Data processed in Agisoft Metashape (version 1.5.5) were exported to generate the digital elevation model (DEM) and to crop the orthophoto. The orthophoto was overlaid with 50% transparency onto the DEM for each experiment, allowing for a clear visual comparison. A profile line was then drawn for each plant identified in the RGB image and verified against the field map for accuracy. This line was positioned to intersect both the lowest point on the DEM, representing ground level, and the highest point on the plant, enabling height estimation.
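Once the profile line is drawn, the height estimate reduces to a difference between the extremes of the DEM samples along that line. A sketch with hypothetical elevations (the study performed this step interactively in QGIS with the Terrain Profile plugin):

```python
def plant_height(profile_elevations):
    """Height from a DEM profile line crossing one plant: highest sampled
    point (canopy apex) minus lowest sampled point (ground level)."""
    return max(profile_elevations) - min(profile_elevations)

# Elevations (m) sampled along a hypothetical profile at a ~224 m site altitude
profile = [224.1, 224.1, 226.4, 226.9, 226.6, 224.2]
print(round(plant_height(profile), 2))  # -> 2.8
```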

5.6. Acquisition of Vegetation Indices

Vegetation indices are mathematical formulations derived from spectral data collected via remote sensing, primarily in the red (R) and near-infrared (NIR) bands. In this study, we obtained 18 commonly used vegetation indices that are valuable for assessing plant vigor, growth, and predictive analysis (Table 4). These indices were calculated using the FIELDimageR package [64] in R software [65], which facilitated soil and weed removal through image segmentation with the fieldMask function.
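As an illustration of this family of indices, the widely used NDVI of Rouse et al. [18] (presumably among the 18 indices in Table 4, which is not reproduced here) combines the two bands mentioned above:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index (Rouse et al. [18]):
    (NIR - R) / (NIR + R), bounded in [-1, 1]."""
    return (nir - red) / (nir + red)

# Hypothetical plot-mean reflectances after soil/weed masking
print(round(ndvi(nir=0.60, red=0.10), 3))  # -> 0.714
```

FIELDimageR computes such indices per plot after the fieldMask segmentation, so each plot contributes one value per index to the downstream models.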

5.7. Data Analysis

Agronomic traits and vegetation indices obtained at the various flight heights (20, 30, 40, and 60 m) were evaluated using mixed models of the form y = Xβ + Zg + ε, where y is the vector of phenotypic observations, β is the vector of fixed effects (blocks plus the overall mean), g is the vector of random genotype effects, and ε is the vector of random residuals. The incidence matrices X and Z relate the fixed and random effects, respectively, to the response vector y.
Model effects were estimated using the lme4 package [79] in R software [65], with variance components calculated via restricted maximum likelihood (REML), and the best linear unbiased predictors (BLUPs) obtained for the random genotype effects. The significance of the model effects was tested using deviance analysis based on the likelihood ratio test (LRT) with a χ 2 distribution at a 5% significance level.
Broad-sense heritability was estimated following the formula proposed by Cullis et al. [80] as an alternative expression for unbalanced data and mixed models: H2Cullis = 1 − V̄ΔBLUP/(2σ2g), where σ2g is the genetic variance and V̄ΔBLUP is the mean variance of a difference between two genotypic BLUPs.
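Once the variance components are extracted from the fitted model, the Cullis estimator is a one-line computation. The values below are hypothetical, chosen only to show the arithmetic:

```python
def cullis_h2(mean_var_diff_blup, sigma2_g):
    """Cullis broad-sense heritability: 1 minus the mean variance of a
    difference of two genotypic BLUPs, over twice the genetic variance."""
    return 1 - mean_var_diff_blup / (2 * sigma2_g)

# Hypothetical variance components from a fitted mixed model
print(cullis_h2(0.12, 0.50))  # -> 0.88
```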
Pearson correlations were calculated to assess relationships between the vegetation indices obtained via UAV phenotyping and the agronomic traits, with significance determined using a t-test with n − 2 degrees of freedom, via the corrplot package in R [65]. To assess the agreement between ground-measured plant heights and point cloud heights at the different flight heights (20, 30, 40, and 60 m), we used the Bland–Altman [81] method. This method analyzes the agreement between two measurement techniques assessing the same variable in the same units, allowing one to determine whether the differences between the two methods are acceptable and the methods equivalent. The Bland–Altman plot depicts the difference between each pair of observations (ground plant height − point cloud height) on the vertical axis and the mean of each pair [(ground plant height + point cloud height)/2] on the horizontal axis. The 95% limits of agreement were established as the bias ± 1.96 times the standard deviation (SD) of the differences. The Bland–Altman plot was created using the dplyr [82] and ggpubr [83] packages in R.
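The bias and 95% limits of agreement behind such a plot can be sketched as follows (hypothetical paired heights; the study built the actual plot with dplyr and ggpubr in R):

```python
from statistics import mean, stdev

def bland_altman(ground, cloud):
    """Bias and 95% limits of agreement between paired ground-measured
    and point-cloud plant heights (same units)."""
    diffs = [g - c for g, c in zip(ground, cloud)]
    bias = mean(diffs)
    half = 1.96 * stdev(diffs)   # 1.96 x sample SD of the differences
    return bias, bias - half, bias + half

# Hypothetical paired heights (m) for one flight altitude
ground = [2.10, 2.45, 1.90, 2.80, 2.30]
cloud = [2.00, 2.40, 1.95, 2.70, 2.20]
bias, lower, upper = bland_altman(ground, cloud)
print(bias, lower, upper)
```

A positive bias would indicate that the point cloud systematically underestimates ground-measured height, with the limits bounding the expected disagreement for 95% of plants.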
Vegetation indices were utilized to assess the predictive ability of agronomic traits using four prediction models: Generalized Linear Model with Stepwise Feature Selection (GLMSS), Partial Least Squares (PLS), Support Vector Machine (SVM), and K-Nearest Neighbor (KNN). The models were cross-validated using a scheme with 5 repetitions and 6 folds per repetition. A total of 60% of the samples were allocated for training, with the remaining 40% used for validation. All predictive models were implemented in the caret package [84] in R version 4.3 [65].
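The repeated k-fold scheme can be sketched as an index generator (pure Python; the study used caret's built-in resampling, and the generator below is only an illustration of the 5 × 6-fold structure):

```python
import random

def repeated_kfold(n, k=6, repeats=5, seed=42):
    """Yield (train, validation) index lists for a repeated k-fold scheme,
    mirroring 5 repetitions with 6 folds per repetition."""
    rng = random.Random(seed)
    for _ in range(repeats):
        idx = list(range(n))
        rng.shuffle(idx)
        for fold in range(k):
            held_out = idx[fold::k]          # every k-th shuffled index
            keep_out = set(held_out)
            train = [i for i in idx if i not in keep_out]
            yield train, held_out

splits = list(repeated_kfold(36))            # hypothetical n of 36 samples
print(len(splits))  # -> 30
```

With 36 samples, each of the 30 splits holds out 6 samples and trains on the remaining 30; across one repetition every sample is validated exactly once.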
Model performance was evaluated based on the root mean square error (RMSE) and the coefficient of determination (R2) for each cross-validation fold. RMSE measures the average magnitude of the estimation errors; the closer its value is to 0, the better the quality of the estimates. It was computed as RMSE = √[(1/n) Σi=1..n (yi − ŷi)2], where yi and ŷi are the observed and predicted values, respectively, and n is the number of observations. The R2 statistic represents the proportion of the total variation in the dependent variable explained by the model, estimated as R2 = Σi=1..n (ŷi − ȳ)2 / Σi=1..n (yi − ȳ)2, where the numerator and denominator correspond to the explained and total variance, respectively, and ȳ is the mean of the observed values yi.
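Both metrics are straightforward to compute per fold; a sketch with hypothetical observed and predicted values:

```python
import math

def rmse(obs, pred):
    """Root mean square error over one cross-validation fold."""
    return math.sqrt(sum((y - yh) ** 2 for y, yh in zip(obs, pred)) / len(obs))

def r_squared(obs, pred):
    """Explained variance of the predictions over total variance of obs."""
    y_bar = sum(obs) / len(obs)
    explained = sum((yh - y_bar) ** 2 for yh in pred)
    total = sum((y - y_bar) ** 2 for y in obs)
    return explained / total

obs = [10.0, 12.0, 9.0, 15.0]
pred = [11.0, 11.0, 10.0, 14.0]
print(rmse(obs, pred))  # -> 1.0
```

In the repeated cross-validation, these fold-level values are averaged to give the reported RMSE and R2 per trait, model, and flight height.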

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/plants14010032/s1.

Author Contributions

Conceptualization: D.F.M.C. and E.J.d.O.; methodology: D.F.M.C., L.R.B.d.A., R.B.d.A.G. and R.L.B.; formal analysis: J.H.B.N., D.F.M.C., R.B.d.A.G. and R.L.B.; investigation: J.H.B.N.; resources: E.J.d.O.; data curation: J.H.B.N., D.F.M.C. and R.B.d.A.G.; writing—original draft preparation: J.H.B.N.; writing—review and editing: D.F.M.C., L.R.B.d.A., R.B.d.A.G., R.L.B. and E.J.d.O.; supervision: D.F.M.C., L.R.B.d.A. and E.J.d.O.; project administration: E.J.d.O.; funding acquisition: E.J.d.O. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by CAPES (Coordenação de Aperfeiçoamento de Pessoal de Nível Superior) grant number [88882.424467/2019-01—J.H.B.N.]; Funarbe (Fundação Arthur Bernardes) grant number [4390—D.F.M.C.]; Empresa Brasileira de Pesquisa Agropecuária grant number [20.18.01.012.00.00—L.R.B.A.]; CNPq (Conselho Nacional de Desenvolvimento Científico e Tecnológico) grant number [409229/2018-0, 442050/2019-4 and 303912/2018-9—E.J.O.]; FAPESB (Fundação de Amparo à Pesquisa do Estado da Bahia) grant number [Pronem 15/2014—E.J.O.]. This work was partially funded by the UK’s Foreign, Commonwealth & Development Office (FCDO) and the Bill & Melinda Gates Foundation. Grant INV-007637. Under the grant conditions of the Foundation, a Creative Commons Attribution 4.0 Generic License has already been assigned to the Author Accepted Manuscript version that might arise from this submission. The funder provided support in the form of fellowship and funds for the research, but did not have any additional role in the study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Data Availability Statement

The original contributions presented in the study are included in the article and the Supplementary Materials; further inquiries can be directed to the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Hormhuan, P.; Viboonjun, U.; Sojikul, P.; Narangajavana, J. Enhancing of anthracnose disease resistance indicates a potential role of antimicrobial peptide genes in cassava. Genetica 2020, 148, 135–148. [Google Scholar] [CrossRef] [PubMed]
  2. Jansson, C.; Westerbergh, A.; Zhang, J.; Hu, X.; Sun, C. Cassava, a potential biofuel crop in (the) People’s Republic of China. Appl. Energy 2009, 86, S95–S99. [Google Scholar] [CrossRef]
  3. Tomlinson, K.R.; Bailey, A.M.; Alicai, T.; Seal, S.; Foster, G.D. Cassava brown streak disease: Historical timeline, current knowledge and future prospects. Mol. Plant Pathol. 2018, 19, 1282–1294. [Google Scholar] [CrossRef] [PubMed]
  4. FAO, Food and Agriculture Organization of the United Nations. FAOSTAT Database. Available online: https://www.fao.org/faostat/en/#home (accessed on 7 March 2022).
  5. de Andrade, L.R.B.; e Sousa, M.B.; Oliveira, E.J.; de Resende, M.D.V.; Azevedo, C.F. Cassava yield traits predicted by genomic selection methods. PLoS ONE 2019, 14, e0224920. [Google Scholar] [CrossRef]
  6. Chawade, A.; van Ham, J.; Blomquist, H.; Bagge, O.; Alexandersson, E.; Ortiz, R. High-throughput field-phenotyping tools for plant breeding and precision agriculture. Agronomy 2019, 9, 258. [Google Scholar] [CrossRef]
  7. Quirós Vargas, J.J.; Zhang, C.; Smitchger, J.A.; McGee, R.J.; Sankaran, S. Phenotyping of plant biomass and performance traits using remote sensing techniques in pea (Pisum Sativum L.). Sensors 2019, 19, 2031. [Google Scholar] [CrossRef]
  8. Zhao, C.; Zhang, Y.; Du, J.; Guo, X.; Wen, W.; Gu, S.; Wang, J.; Fan, J. Crop phenomics: Current status and perspectives. Front. Plant Sci. 2019, 10, 714. [Google Scholar] [CrossRef]
  9. Fu, Y.; Yang, G.; Wang, J.; Song, X.; Feng, H. Winter wheat biomass estimation based on spectral indices, band depth analysis and partial least squares regression using hyperspectral measurements. Comput. Electron. Agric. 2014, 100, 51–59. [Google Scholar] [CrossRef]
  10. Verger, A.; Vigneau, N.; Chéron, C.; Gilliot, J.-M.; Comar, A.; Baret, F. Green area index from an unmanned aerial system over wheat and rapeseed crops. Remote Sens. Environ. 2014, 152, 654–664. [Google Scholar] [CrossRef]
  11. Liang, L.; Di, L.; Zhang, L.; Deng, M.; Qin, Z.; Zhao, S.; Lin, H. Estimation of crop LAI using hyperspectral vegetation indices and a hybrid inversion method. Remote Sens. Environ. 2015, 165, 123–134. [Google Scholar] [CrossRef]
  12. Haboudane, D.; Miller, J.R.; Pattey, E.; Zarco-Tejada, P.J.; Strachan, I.B. Hyperspectral vegetation indices and novel algorithms for predicting green LAI of crop canopies: Modeling and validation in the context of precision agriculture. Remote Sens. Environ. 2004, 90, 337–352. [Google Scholar] [CrossRef]
  13. Harmse, C.J.; Gerber, H.; van Niekerk, A. Evaluating Several vegetation indices derived from sentinel-2 imagery for quantifying localized overgrazing in a semi-arid region of South Africa. Remote Sens. 2022, 14, 1720. [Google Scholar] [CrossRef]
  14. Zhou, X.; Zheng, H.B.; Xu, X.Q.; He, J.Y.; Ge, X.K.; Yao, X.; Cheng, T.; Zhu, Y.; Cao, W.X.; Tian, Y.C. Predicting Grain yield in rice using multi-temporal vegetation indices from UAV-based multispectral and digital imagery. ISPRS J. Photogramm. Remote Sens. 2017, 130, 246–255. [Google Scholar] [CrossRef]
  15. Xue, J.; Su, B. Significant remote sensing vegetation indices: A review of developments and applications. J. Sens. 2017, 2017, 1353691. [Google Scholar] [CrossRef]
  16. Qian, C.; Shao, L.; Hou, X.; Zhang, B.; Chen, W.; Xia, X. Detection and attribution of vegetation greening trend across distinct local landscapes under China’s Grain to Green Program: A Case Study in Shaanxi Province. CATENA 2019, 183, 104182. [Google Scholar] [CrossRef]
  17. Pezzopane, J.R.M.; Bernardi, A.C.d.C.; Bosi, C.; Crippa, P.H.; Santos, P.M.; Nardachione, E.C. Assessment of Piatã palisadegrass forage mass in integrated livestock production systems using a proximal canopy reflectance sensor. Eur. J. Agron. 2019, 103, 130–139. [Google Scholar] [CrossRef]
  18. Rouse, J.W., Jr.; Haas, R.H.; Schell, J.A.; Deering, D.W. Monitoring vegetation systems in the Great Plains with ERTS. NASA Spec. Publ. 1974, 351, 309. [Google Scholar]
  19. Bernardi, A.C.d.C.; Rabello, L.M.; Inamasu, R.Y.; Grego, C.R.; Andrade, R.G. Variabilidade espacial de parâmetros físico-químicas do solo e biofísicos de superfície em cultivo do sorgo. Rev. Bras. Eng. Agríc. Ambient. 2014, 18, 623–630. [Google Scholar] [CrossRef]
  20. Hunt, E.R.; Cavigelli, M.; Daughtry, C.S.T.; Mcmurtrey, J.E.; Walthall, C.L. Evaluation of digital photography from model aircraft for remote sensing of crop biomass and nitrogen status. Precis. Agric. 2005, 6, 359–378. [Google Scholar] [CrossRef]
  21. Ballesteros, R.; Ortega, J.F.; Hernandez, D.; del Campo, A.; Moreno, M.A. Combined use of agro-climatic and very high-resolution remote sensing information for crop monitoring. Int. J. Appl. Earth Obs. Geoinf. 2018, 72, 66–75. [Google Scholar] [CrossRef]
  22. Maresma, Á.; Ariza, M.; Martínez, E.; Lloveras, J.; Martínez-Casasnovas, J.A. Analysis of vegetation indices to determine nitrogen application and yield prediction in maize (Zea mays L.) from a standard UAV service. Remote Sens. 2016, 8, 973. [Google Scholar] [CrossRef]
  23. Selvaraj, M.G.; Valderrama, M.; Guzman, D.; Valencia, M.; Ruiz, H.; Acharjee, A. Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 2020, 16, 87. [Google Scholar] [CrossRef] [PubMed]
  24. Kyratzis, A.C.; Skarlatos, D.P.; Menexes, G.C.; Vamvakousis, V.F.; Katsiotis, A. Assessment of vegetation indices derived by UAV Imagery for durum wheat phenotyping under a water limited and heat stressed Mediterranean environment. Front. Plant Sci. 2017, 8, 1114. [Google Scholar] [CrossRef]
  25. Santos Silva, P.P.; e Sousa, M.B.; de Oliveira, E.J. Prediction models and selection of agronomic and physiological traits for tolerance to water deficit in cassava. Euphytica 2019, 215, 73. [Google Scholar] [CrossRef]
  26. Adu, M.O.; Asare, P.A.; Asare-Bediako, E.; Amenorpe, G.; Ackah, F.K.; Afutu, E.; Amoah, M.N.; Yawson, D.O. Characterising shoot and root system trait variability and contribution to genotypic variability in juvenile cassava (Manihot esculenta Crantz) Plants. Heliyon 2018, 4, e00665. [Google Scholar] [CrossRef]
  27. Elbasi, E.; Zaki, C.; Topcu, A.E.; Abdelbaki, W.; Zreikat, A.I.; Cina, E.; Shdefat, A.; Saker, L. Crop prediction model using machine learning algorithms. Appl. Sci. 2023, 13, 9288. [Google Scholar] [CrossRef]
  28. Rashid, M.; Bari, B.S.; Yusup, Y.; Kamaruddin, M.A.; Khan, N. A Comprehensive review of crop yield prediction using machine learning approaches with special emphasis on palm oil yield prediction. IEEE Access 2021, 9, 63406–63439. [Google Scholar] [CrossRef]
  29. Hu, P.; Guo, W.; Chapman, S.C.; Guo, Y.; Zheng, B. Pixel size of aerial imagery constrains the applications of unmanned aerial vehicle in crop breeding. ISPRS J. Photogramm. Remote Sens. 2019, 154, 1–9. [Google Scholar] [CrossRef]
  30. Mesas-Carrascosa, F.-J.; Notario García, M.D.; Meroño de Larriva, J.E.; García-Ferrer, A. An analysis of the influence of flight parameters in the generation of unmanned aerial vehicle (UAV) orthomosaicks to survey archaeological areas. Sensors 2016, 16, 1838. [Google Scholar] [CrossRef]
  31. Rattanasopa, K.; Saengprachatanarug, K.; Wongpichet, S.; Posom, J.; Saikaew, K.; Ungsathittavorn, K.; Pilawut, S.; Chinapas, A.; Taira, E. UAV-based multispectral imagery for estimating cassava tuber yields. Eng. Agric. Environ. Food 2022, 15, 1–12. [Google Scholar] [CrossRef]
  32. Zhao, J.; Kumar, A.; Banoth, B.N.; Marathi, B.; Rajalakshmi, P.; Rewald, B.; Ninomiya, S.; Guo, W. Deep-learning-based multispectral image reconstruction from single natural color RGB image—Enhancing UAV-based phenotyping. Remote Sens. 2022, 14, 1272. [Google Scholar] [CrossRef]
  33. Lipovac, A.; Bezdan, A.; Moravčević, D.; Djurović, N.; Ćosić, M.; Benka, P.; Stričević, R. Correlation between ground measurements and UAV sensed vegetation indices for yield prediction of common bean grown under different irrigation treatments and sowing periods. Water 2022, 14, 3786. [Google Scholar] [CrossRef]
  34. Avtar, R.; Suab, S.A.; Syukur, M.S.; Korom, A.; Umarhadi, D.A.; Yunus, A.P. Assessing the influence of UAV altitude on extracted biophysical parameters of young oil palm. Remote Sens. 2020, 12, 3030. [Google Scholar] [CrossRef]
  35. Glenn, E.P.; Huete, A.R.; Nagler, P.L.; Nelson, S.G. Relationship between remotely-sensed vegetation indices, canopy attributes and plant physiological processes: What vegetation indices can and cannot tell us about the landscape. Sensors 2008, 8, 2136–2160. [Google Scholar] [CrossRef]
  36. Gómez-Candón, D.; De Castro, A.I.; López-Granados, F. Assessing the accuracy of mosaics from unmanned aerial vehicle (UAV) imagery for precision agriculture purposes in wheat. Precis. Agric. 2014, 15, 44–56. [Google Scholar] [CrossRef]
  37. Whitehead, K.; Hugenholtz, C.H.; Myshak, S.; Brown, O.; LeClair, A.; Tamminga, A.; Barchyn, T.E.; Moorman, B.; Eaton, B. Remote sensing of the environment with small unmanned aircraft systems (UASs), Part 2: Scientific and commercial applications. J. Unmanned Veh. Syst. 2014, 2, 86–102. [Google Scholar] [CrossRef]
  38. Seifert, E.; Seifert, S.; Vogt, H.; Drew, D.; van Aardt, J.; Kunneke, A.; Seifert, T. Influence of drone altitude, image overlap, and optical sensor resolution on multi-view reconstruction of forest images. Remote Sens. 2019, 11, 1252. [Google Scholar] [CrossRef]
  39. Assmann, J.J.; Kerby, J.T.; Cunliffe, A.M.; Myers-Smith, I.H. Vegetation monitoring using multispectral sensors—Best practices and lessons learned from high latitudes. J. Unmanned Veh. Syst. 2019, 7, 54–75. [Google Scholar] [CrossRef]
  40. Stow, D.; Nichol, C.J.; Wade, T.; Assmann, J.J.; Simpson, G.; Helfter, C. Illumination geometry and flying height influence surface reflectance and NDVI derived from multispectral UAS imagery. Drones 2019, 3, 55. [Google Scholar] [CrossRef]
  41. Perroy, R.L.; Sullivan, T.; Stephenson, N. Assessing the impacts of canopy openness and flight parameters on detecting a sub-canopy tropical invasive plant using a small unmanned aerial system. ISPRS J. Photogramm. Remote Sens. 2017, 125, 174–183. [Google Scholar] [CrossRef]
  42. Quirós, J.J.; Khot, L.R. Potential of low altitude multispectral imaging for in-field apple tree nursery inventory mapping. IFAC-PapersOnLine 2016, 49, 421–425. [Google Scholar] [CrossRef]
  43. e Silva, C.M.; Mezzomo, H.C.; Ribeiro, J.P.O.; Signorini, V.S.; Lima, G.W.; Vieira, E.F.T.; Portes, M.F.; Morota, G.; de Paula Corredo, L.; Nardino, M. Insights on multi-spectral vegetation indices derived from UAV-based high-throughput phenotyping for indirect selection in tropical wheat breeding. Euphytica 2024, 220, 35. [Google Scholar] [CrossRef]
  44. Tao, X.; Li, Y.; Yan, W.; Wang, M.; Tan, Z.; Jiang, J.; Luan, Q. Heritable variation in tree growth and needle vegetation indices of slash pine (Pinus elliottii) using unmanned aerial vehicles (UAVs). Ind. Crops Prod. 2021, 173, 114073. [Google Scholar] [CrossRef]
  45. Ferreira, F.M.; Leite, R.V.; Malikouski, R.G.; Peixoto, M.A.; Bernardeli, A.; Alves, R.S.; de Magalhães Júnior, W.C.P.; Andrade, R.G.; Bhering, L.L.; Machado, J.C. Bioenergy elephant grass genotype selection leveraged by spatial modeling of conventional and high-throughput phenotyping data. J. Clean. Prod. 2022, 363, 132286. [Google Scholar] [CrossRef]
  46. Falconer, D.S.; Mackay, T.F.C. Introduction to Quantitative Genetics, 4th ed.; Benjamin Cummings: Harlow, UK, 1996. [Google Scholar]
  47. Ceballos, H.; Pérez, J.C.; Joaqui Barandica, O.; Lenis, J.I.; Morante, N.; Calle, F.; Pino, L.; Hershey, C.H. Cassava breeding I: The value of breeding value. Front. Plant Sci. 2016, 7, 1227. [Google Scholar] [CrossRef]
  48. Sampaio Filho, J.S.; Olivoto, T.; Campos, M.d.S.; de Oliveira, E.J. Multi-trait selection in multi-environments for performance and stability in cassava genotypes. Front. Plant Sci. 2023, 14, 1282221. [Google Scholar] [CrossRef]
  49. da Conceicão, L.V.; Cortes, D.F.M.; Klauser, D.; Robinson, M.; de Oliveira, E.J. New Protocol for rapid cassava multiplication in field conditions: A perspective on speed breeding. Front. Plant Sci. 2023, 14, 1258101. [Google Scholar] [CrossRef]
  50. Wang, L.; Chang, Q.; Li, F.; Yan, L.; Huang, Y.; Wang, Q.; Luo, L. Effects of growth stage development on paddy rice leaf area index prediction models. Remote Sens. 2019, 11, 361. [Google Scholar] [CrossRef]
  51. Ganeva, D.; Roumenina, E.; Dimitrov, P.; Gikov, A.; Jelev, G.; Dyulgenova, B.; Valcheva, D.; Bozhanova, V. Remotely sensed phenotypic traits for heritability estimates and grain yield prediction of barley using multispectral imaging from UAVs. Sensors 2023, 23, 5008. [Google Scholar] [CrossRef]
  52. Li, Z.; Chen, Z.; Cheng, Q.; Duan, F.; Sui, R.; Huang, X.; Xu, H. UAV-based hyperspectral and ensemble machine learning for predicting yield in winter wheat. Agronomy 2022, 12, 202. [Google Scholar] [CrossRef]
  53. Guo, A.; Huang, W.; Dong, Y.; Ye, H.; Ma, H.; Liu, B.; Wu, W.; Ren, Y.; Ruan, C.; Geng, Y. Wheat Yellow Rust Detection Using UAV-Based Hyperspectral Technology. Remote Sens. 2021, 13, 123. [Google Scholar] [CrossRef]
  54. Ye, H.; Huang, W.; Huang, S.; Cui, B.; Dong, Y.; Guo, A.; Ren, Y.; Jin, Y. Recognition of Banana Fusarium Wilt Based on UAV Remote Sensing. Remote Sens. 2020, 12, 938. [Google Scholar] [CrossRef]
  55. Wang, X.; Reddy, C.K.; Xu, B. A systematic comparative study on morphological, crystallinity, pasting, thermal and functional characteristics of starches resources utilized in China. Food Chem. 2018, 259, 81–88. [Google Scholar] [CrossRef]
  56. Iizuka, K.; Kato, T.; Silsigia, S.; Soufiningrum, A.Y.; Kozan, O. Estimating and Examining the Sensitivity of Different Vegetation Indices to Fractions of Vegetation Cover at Different Scaling Grids for Early Stage Acacia Plantation Forests Using a Fixed-Wing UAS. Remote Sens. 2019, 11, 1816. [Google Scholar] [CrossRef]
  57. Wolfe, M.D.; Del Carpio, D.P.; Alabi, O.; Ezenwaka, L.C.; Ikeogu, U.N.; Kayondo, I.S.; Lozano, R.; Okeke, U.G.; Ozimati, A.A.; Williams, E.; et al. Prospects for genomic selection in cassava breeding. Plant Genome 2017, 10, plantgenome2017.03.0015. [Google Scholar] [CrossRef]
  58. Torres, L.G.; Vilela de Resende, M.D.; Azevedo, C.F.; Fonseca e Silva, F.; de Oliveira, E.J. Genomic selection for productive traits in biparental cassava breeding populations. PLoS ONE 2019, 14, e0220245. [Google Scholar] [CrossRef]
  59. Bongomin, O.; Lamo, J.; Guina, J.M.; Okello, C.; Ocen, G.G.; Obura, M.; Alibu, S.; Owino, C.A.; Akwero, A.; Ojok, S. UAV image acquisition and processing for high-throughput phenotyping in agricultural research and breeding programs. Plant Phenome J. 2024, 7, e20096. [Google Scholar] [CrossRef]
  60. Chivasa, W.; Mutanga, O.; Biradar, C. Phenology-Based Discrimination of Maize (Zea mays L.) Varieties using multitemporal hyperspectral data. J. Appl. Remote Sens. 2019, 13, 017504. [Google Scholar] [CrossRef]
  61. Herzig, P.; Borrmann, P.; Knauer, U.; Klück, H.-C.; Kilias, D.; Seiffert, U.; Pillen, K.; Maurer, A. Evaluation of RGB and multispectral unmanned aerial vehicle (UAV) imagery for high-throughput phenotyping and yield prediction in barley breeding. Remote Sens. 2021, 13, 2670. [Google Scholar] [CrossRef]
  62. da Silva Souza, L.; Farias, A.R.N.; de Mattos, P.L.P.; Fukuda, W.M.G. Livro Aspectos Socioeconomicos e Agronomicos da Mandioca; Embrapa Mandioca e Fruticultura: Cruz das Almas, Brazil, 2006; Volume 1. [Google Scholar]
  63. Kawano, K.; Fukuda, W.M.G.; Cenpukdee, U. Genetic and environmental effects on dry matter content of cassava root. Crop Sci. 1987, 27, 69–74. [Google Scholar] [CrossRef]
  64. Matias, F.I.; Caraza-Harter, M.V.; Endelman, J.B. FIELDimageR: An R package to analyze orthomosaic images from agricultural field trials. Plant Phenome J. 2020, 3, e20005. [Google Scholar] [CrossRef]
  65. R Core Team. R: A Language and Environment for Statistical Computing; R Foundation for Statistical Computing: Vienna, Austria, 2021; Available online: https://www.R-project.org/ (accessed on 11 March 2022).
  66. Zarco-Tejada, P.J.; Berjón, A.; López-Lozano, R.; Miller, J.R.; Martín, P.; Cachorro, V.; González, M.R.; de Frutos, A. Assessing vineyard condition with hyperspectral indices: Leaf and canopy reflectance simulation in a row-structured discontinuous canopy. Remote Sens. Environ. 2005, 99, 271–287. [Google Scholar] [CrossRef]
  67. Louhaichi, M.; Borman, M.M.; Johnson, D.E. Spatially located platform and aerial photography for documentation of grazing impacts on wheat. Geocarto Int. 2001, 16, 65–70. [Google Scholar] [CrossRef]
  68. Tucker, C.J. Red and photographic infrared linear combinations for monitoring vegetation. Remote Sens. Environ. 1979, 8, 127–150. [Google Scholar] [CrossRef]
  69. Gitelson, A.A.; Kaufman, Y.J.; Stark, R.; Rundquist, D. Novel algorithms for remote estimation of vegetation fraction. Remote Sens. Environ. 2002, 80, 76–87. [Google Scholar] [CrossRef]
  70. Merzlyak, M.N.; Gitelson, A.A.; Chivkunova, O.B.; Rakitin, V.Y. Non-destructive optical detection of pigment changes during leaf senescence and fruit ripening. Physiol. Plant. 1999, 106, 135–141. [Google Scholar] [CrossRef]
  71. Escadafal, R. Soil spectral properties and their relationships with environmental parameters—Examples from arid regions. In Imaging Spectrometry—A Tool for Environmental Observations; Hill, J., Mégier, J., Eds.; Springer: Dordrecht, The Netherlands, 1994; pp. 71–87. [Google Scholar] [CrossRef]
  72. Mathieu, R.; Pouget, M.; Cervelle, B.; Escadafal, R. Relationships between satellite-based radiometric indices simulated using laboratory reflectance data and typic soil color of an arid environment. Remote Sens. Environ. 1998, 66, 17–28. [Google Scholar] [CrossRef]
  73. Richardson, A.J.; Wiegand, C.L. Distinguishing vegetation from soil background information. Photogramm. Eng. Remote Sens. 1977, 43, 1541–1552. [Google Scholar]
  74. Gitelson, A.A.; Viña, A.; Arkebauer, T.J.; Rundquist, D.C.; Keydan, G.; Leavitt, B. Remote estimation of leaf area index and green leaf biomass in maize canopies. Geophys. Res. Lett. 2003, 30, 1248. [Google Scholar] [CrossRef]
  75. Gitelson, A.; Merzlyak, M.N. Quantitative estimation of chlorophyll-a using reflectance spectra: Experiments with autumn chestnut and maple leaves. J. Photochem. Photobiol. B Biol. 1994, 22, 247–252. [Google Scholar] [CrossRef]
  76. Jordan, C.F. Derivation of leaf-area index from quality of light on the forest floor. Ecology 1969, 50, 663–666. [Google Scholar] [CrossRef]
  77. Gitelson, A.A.; Kaufman, Y.J.; Merzlyak, M.N. Use of a green channel in remote sensing of global vegetation from EOS-MODIS. Remote Sens. Environ. 1996, 58, 289–298. [Google Scholar] [CrossRef]
  78. Pearson, R.L.; Miller, L.D. Remote Spectral Measurements as a Method for Determining Plant Cover; Technical Report No. 167 (U.S. National Committee for the International Biological Program); U.S. International Biological Program: Ft. Collins, CO, USA, 1972. [Google Scholar]
  79. Bates, D.; Mächler, M.; Bolker, B.; Walker, S. Fitting linear mixed-effects models using lme4. J. Stat. Softw. 2015, 67, 1–48. [Google Scholar] [CrossRef]
  80. Cullis, B.R.; Smith, A.B.; Coombes, N.E. On the design of early generation variety trials with correlated data. J. Agric. Biol. Environ. Stat. 2006, 11, 381–393. [Google Scholar] [CrossRef]
  81. Bland, J.M.; Altman, D.G. Measuring agreement in method comparison studies. Stat. Methods Med. Res. 1999, 8, 135–160. [Google Scholar] [CrossRef]
  82. Wickham, H.; François, R.; Henry, L.; Müller, K.; Vaughan, D. dplyr: A Grammar of Data Manipulation—Dplyr-Package, R Package Version 1.1.4; R Foundation for Statistical Computing: Vienna, Austria, 2024; Available online: https://dplyr.tidyverse.org/reference/dplyr-package.html (accessed on 7 September 2024).
  83. Kassambara, A. Ggpubr: ‘Ggplot2’ Based Publication Ready Plots. Available online: https://rpkgs.datanovia.com/ggpubr/authors.html (accessed on 7 September 2024).
  84. Kuhn, M. Building predictive models in R using the caret package. J. Stat. Softw. 2008, 28, 1–26. [Google Scholar] [CrossRef]
Figure 1. (A–D) Correlation graphs between ground plant heights and those obtained via point cloud from aerial images captured by the unmanned aerial vehicle (UAV) at different flight heights (20 m, 30 m, 40 m, and 60 m). (E–H) Bland–Altman agreement graphs comparing manually measured plant heights with those obtained from point clouds at different flight heights. The blue dashed line indicates the mean difference (bias) between the measurement methods, and the red (upper) and green (lower) lines represent the 95% limits of agreement (mean difference ± 1.96 × standard deviation of the differences).
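The Bland–Altman limits in Figure 1 are the mean of the paired differences ± 1.96 times their standard deviation. A minimal Python sketch of that computation; the function name and height values are illustrative, not data from the study:

```python
import statistics

def bland_altman(ground, aerial):
    """Bias and 95% limits of agreement between two measurement methods."""
    diffs = [a - g for g, a in zip(ground, aerial)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)  # sample standard deviation of the differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# hypothetical plant heights (m): manual measurement vs. UAV point cloud
ground = [1.80, 2.10, 1.55, 1.95, 2.30]
aerial = [1.75, 2.05, 1.50, 1.92, 2.20]
bias, lower, upper = bland_altman(ground, aerial)
```

A negative bias here would indicate that the point-cloud heights systematically underestimate the manual measurements, which matches the tendency the authors report for higher flight altitudes.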
Figure 2. Correlations between vegetation indices obtained from aerial images captured by multispectral cameras on an unmanned aerial vehicle (UAV) at flight heights of 20 m, 30 m, 40 m, and 60 m, alongside agronomic characteristics. PlVig: plant vigor; ShY: above-ground biomass yield; FRY: fresh root yield; DMC: dry matter content in roots; LeRet: leaf retention; LeDis: leaf spot resistance; BI: Brightness Index; BGI: Blue Green Pigment Index; GLI: Green Leaf Index; HI: Primary Colors Hue Index; HUE: Hue Index; NGRDI: Normalized Green–Red Difference Index; SCI: Soil Color Index; SI: Spectral Slope Saturation Index; VARI: Visible Atmospherically Resistant Index; PSRI: Plant Senescence Reflectance Index; NDRE: Normalized Difference Red Edge Index; CIRE: Red-edge chlorophyll index; DVI: Difference Vegetation Index; NDVI: Normalized Difference Vegetation Index; GNDVI: Green Normalized Difference Vegetation Index; RVI: Ratio Vegetation Index; CVI: Chlorophyll Vegetation Index; CIG: Chlorophyll Index—green. The symbol × in the values indicates that the correlation was not significant according to the t-test at a 5% significance level.
Figure 3. Correlations between vegetation indices obtained from aerial images captured by RGB (Red, Green, Blue) cameras mounted on an unmanned aerial vehicle (UAV) at flight heights of 20 m, 30 m, 40 m, and 60 m, alongside various agronomic traits. PlVig: plant vigor; ShY: above-ground biomass yield; FRY: fresh root yield; DMC: dry matter content in roots; LeRet: leaf retention; LeDis: leaf spot resistance. BI: Brightness Index; BGI: Blue Green Pigment Index; GLI: Green Leaf Index; HI: Primary Colors Hue Index; HUE: Hue Index; NGRDI: Normalized Green–Red Difference Index; SCI: Soil Color Index; SI: Spectral Slope Saturation Index; VARI: Visible Atmospherically Resistant Index; PSRI: Plant Senescence Reflectance Index; NDRE: Normalized Difference Red Edge Index; CIRE: Red-edge chlorophyll index; DVI: Difference Vegetation Index; NDVI: Normalized Difference Vegetation Index; GNDVI: Green Normalized Difference Vegetation Index. The symbol × indicates that the correlation was not significant according to the t-test at a 5% significance level.
Figure 4. Estimates of broad-sense heritability ( H C u l l i s 2 ) for 18 vegetation indices obtained from aerial images captured by multispectral cameras using an unmanned aerial vehicle (UAV) at different flight heights (20 m, 30 m, 40 m, and 60 m). (A) Brightness Index (BI); (B) Blue Green Pigment Index (BGI); (C) Green Leaf Index (GLI); (D) Primary Colors Hue Index (HI); (E) Hue Index (HUE); (F) Normalized Green–Red Difference Index (NGRDI); (G) Soil Color Index (SCI); (H) Spectral Slope Saturation Index (SI); (I) Visible Atmospherically Resistant Index (VARI); (J) Plant Senescence Reflectance Index (PSRI); (K) Normalized Difference Red Edge Index (NDRE); (L) Red-edge chlorophyll index (CIRE); (M) Difference Vegetation Index (DVI); (N) Normalized Difference Vegetation Index (NDVI); (O) Green Normalized Difference Vegetation Index (GNDVI); (P) Ratio Vegetation Index (RVI); (Q) Chlorophyll Vegetation Index (CVI); (R) Chlorophyll Index—green (CIG).
Figure 5. Estimates of broad-sense heritability ( H C u l l i s 2 ) for 15 vegetation indices obtained from aerial image analysis using RGB (Red, Green, Blue) cameras mounted on an unmanned aerial vehicle (UAV) at different flight heights (20 m, 30 m, 40 m, and 60 m). (A) Brightness Index (BI); (B) Blue Green Pigment Index (BGI); (C) Green Leaf Index (GLI), (D) Primary Colors Hue Index (HI); (E) Hue Index (HUE); (F) Normalized Green–Red Difference Index (NGRDI); (G) Soil Color Index (SCI); (H) Spectral Slope Saturation Index (SI); (I) Visible Atmospherically Resistant Index (VARI); (J) Plant Senescence Reflectance Index (PSRI); (K) Normalized Difference Red Edge Index (NDRE); (L) Red-edge chlorophyll index (CIRE); (M) Difference Vegetation Index (DVI); (N) Normalized Difference Vegetation Index (NDVI); (O) Green normalized difference vegetation index (GNDVI).
Table 1. Summary of flight parameters and information on image collection and processing.
| Flight Altitude (m) | Number of Images | Flight Duration | Flight Time (h:m:s) | Processing Time, RGB (h:m) | Processing Time, Multispectral (h:m) |
| --- | --- | --- | --- | --- | --- |
| 20 | 222 | 10 min | 10:17:46 | 02:52 | 02:12 |
| 30 | 129 | 7 min | 10:00:10 | 02:12 | 01:44 |
| 40 | 69 | 5 min | 09:46:56 | 01:32 | 01:29 |
| 60 | 50 | 4 min | 09:29:29 | 01:22 | 01:15 |
h: hours; m: minutes; s: seconds.
Table 2. Estimates of broad-sense heritability ( H C u l l i s 2 ) for the main agronomic traits in cassava.
|  | PlVig | ShY | FRY | DMC | LeRet | LeDis |
| --- | --- | --- | --- | --- | --- | --- |
| H²Cullis | 0.93 | 0.98 | 0.98 | 0.99 | 0.98 | 0.98 |
PlVig: plant vigor; ShY: above-ground biomass yield; FRY: fresh root yield; DMC: dry matter content in roots; LeRet: leaf retention; LeDis: leaf spot resistance.
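The Cullis heritability used throughout (Table 2, Figures 4 and 5) is computed from the mixed-model fit as one minus the mean variance of a difference between two genotype BLUPs, divided by twice the genotypic variance. A minimal sketch of that formula; the variance components below are hypothetical, not values from the study:

```python
def cullis_h2(vd_blup_mean, var_genotype):
    """Cullis broad-sense heritability:
    H^2 = 1 - v_BLUP / (2 * sigma^2_g),
    where v_BLUP is the mean variance of a difference of two genotype BLUPs
    and sigma^2_g is the genotypic variance from the mixed model."""
    return 1.0 - vd_blup_mean / (2.0 * var_genotype)

# hypothetical variance components from an lme4-style mixed-model fit
h2 = cullis_h2(vd_blup_mean=0.12, var_genotype=1.0)  # close to 0.94
```

Values near 1 indicate that genotype BLUPs are estimated with little error relative to the genetic variation, which is why the trait heritabilities in Table 2 (0.93–0.99) support selection on these phenotypes.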
Table 3. Accuracy and prediction error for certain agronomic traits in cassava using vegetation indices obtained at different flight heights from unmanned aerial vehicles.
| Trait | Model 1 | RMSE (20 m) | R² (20 m) | RMSE (30 m) | R² (30 m) | RMSE (40 m) | R² (40 m) | RMSE (60 m) | R² (60 m) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Plant vigor | GLMSS | 0.84 | 0.19 | 0.94 | 0.13 | 0.83 | 0.20 | 0.97 | 0.13 |
|  | KNN | 0.80 | 0.30 | 0.91 | 0.13 | 0.80 | 0.26 | 0.91 | 0.09 |
|  | PLS | 0.82 | 0.30 | 0.86 | 0.18 | 0.82 | 0.22 | 0.85 | 0.19 |
|  | SVM | 0.84 | 0.20 | 1.12 | 0.14 | 0.84 | 0.25 | 1.07 | 0.18 |
| Leaf retention | GLMSS | 0.77 | 0.29 | 0.69 | 0.27 | 0.84 | 0.18 | 0.92 | 0.03 |
|  | KNN | 0.80 | 0.15 | 0.74 | 0.21 | 0.83 | 0.09 | 0.80 | 0.13 |
|  | PLS | 0.79 | 0.13 | 0.69 | 0.27 | 0.76 | 0.05 | 0.77 | 0.11 |
|  | SVM | 0.95 | 0.13 | 0.71 | 0.21 | 0.86 | 0.14 | 0.94 | 0.14 |
| Above-ground biomass yield | GLMSS | 7.66 | 0.15 | 7.49 | 0.12 | 7.26 | 0.13 | 6.64 | 0.11 |
|  | KNN | 7.29 | 0.11 | 7.37 | 0.09 | 7.30 | 0.06 | 6.83 | 0.11 |
|  | PLS | 6.94 | 0.08 | 6.96 | 0.17 | 6.92 | 0.17 | 6.83 | 0.12 |
|  | SVM | 8.70 | 0.09 | 8.47 | 0.07 | 8.87 | 0.11 | 7.98 | 0.22 |
| Fresh root yield | GLMSS | 7.33 | 0.08 | 7.12 | 0.11 | 7.58 | 0.07 | 7.39 | 0.16 |
|  | KNN | 7.29 | 0.09 | 7.18 | 0.17 | 7.26 | 0.06 | 7.18 | 0.06 |
|  | PLS | 6.83 | 0.11 | 6.93 | 0.08 | 6.72 | 0.12 | 6.82 | 0.15 |
|  | SVM | 8.43 | 0.14 | 8.23 | 0.04 | 8.36 | 0.15 | 8.41 | 0.08 |
| Dry matter content in roots | GLMSS | 1.81 | 0.24 | 1.81 | 0.29 | 1.98 | 0.24 | 2.00 | 0.25 |
|  | KNN | 1.84 | 0.20 | 1.83 | 0.25 | 1.94 | 0.18 | 1.86 | 0.24 |
|  | PLS | 1.78 | 0.34 | 1.76 | 0.27 | 1.90 | 0.16 | 2.04 | 0.10 |
|  | SVM | 1.78 | 0.25 | 1.77 | 0.27 | 5.25 | 0.10 | 3.11 | 0.07 |
| Leaf spot resistance | GLMSS | 1.04 | 0.10 | 0.70 | 0.12 | 0.76 | 0.12 | 0.68 | 0.09 |
|  | KNN | 0.70 | 0.11 | 0.67 | 0.16 | 0.71 | 0.15 | 0.67 | 0.11 |
|  | PLS | 0.67 | 0.05 | 0.65 | 0.20 | 0.67 | 0.09 | 0.67 | 0.05 |
|  | SVM | 0.74 | 0.14 | 0.92 | 0.08 | 0.85 | 0.13 | 0.77 | 0.09 |
1 Generalized Linear Model with Stepwise Feature Selection (GLMSS), Partial Least Squares (PLS), Support Vector Machine (SVM), K-Nearest Neighbor (KNN). RMSE: root mean square error, R2: coefficient of determination.
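The models in Table 3 were fitted with the caret package in R. As a rough illustration of what the K-Nearest Neighbor model does with vegetation-index features, the sketch below implements KNN regression and the two reported metrics in plain Python; all feature and trait values are hypothetical:

```python
def knn_predict(train_X, train_y, x, k=3):
    """Predict a trait value as the mean of the k nearest training plots,
    ranked by squared Euclidean distance in vegetation-index space."""
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(row, x)), y)
        for row, y in zip(train_X, train_y)
    )
    return sum(y for _, y in dists[:k]) / k

def rmse_r2(observed, predicted):
    """Root mean square error and coefficient of determination (R^2)."""
    n = len(observed)
    mean_obs = sum(observed) / n
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return (ss_res / n) ** 0.5, 1.0 - ss_res / ss_tot

# hypothetical per-plot NDVI/GNDVI features and a trait score (e.g., plant vigor)
X = [[0.61, 0.55], [0.72, 0.64], [0.58, 0.51], [0.80, 0.70]]
y = [2.0, 3.5, 1.5, 4.0]
pred = [knn_predict(X, y, xi, k=2) for xi in X]
rmse, r2 = rmse_r2(y, pred)
```

In practice the paper's evaluation would also involve resampling (train/test splits or cross-validation) rather than predicting on the training plots as this toy example does.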
Table 4. Vegetation indices based on the RGB and multispectral cameras as described in the table.
| Description | Index 1 | Formula 2 | Related Traits | Reference |
| --- | --- | --- | --- | --- |
| Blue Green Pigment Index | BGI (rgb, M) | B/G | Chlorophyll and leaf area index | [66] |
| Green Leaf Index | GLI (rgb, M) | (2G − R − B)/(2G + R + B) | Chlorophyll | [67] |
| Normalized Green–Red Difference Index | NGRDI (rgb, M) | (G − R)/(G + R) | Chlorophyll, biomass, water content | [68] |
| Visible Atmospherically Resistant Index | VARI (rgb, M) | (G − R)/(G + R − B) | Canopy, biomass, chlorophyll | [69] |
| Plant Senescence Reflectance Index | PSRI (rgb, M) | (R − G)/RE | Chlorophyll, nitrogen, and maturation | [70] |
| Spectral Slope Saturation Index | SI (rgb, M) | (R − B)/(R + B) | Saturation index | [71] |
| Soil Color Index | SCI (rgb, M) | (R − G)/(R + G) | Soil color | [72] |
| Primary Colors Hue Index | HI (rgb, M) | (2R − G − B)/(G − B) | Hue index | [71] |
| Hue Index | HUE (rgb, M) | arctan[2(R − G − B)/(30.5(G − B))] | General hue index | [71] |
| Brightness Index | BI (rgb, M) | √((R² + G² + B²)/3) | Vegetation cover | [73] |
| Chlorophyll Index—green | CIG (M) | (NIR/G) − 1 | Chlorophyll content | [74] |
| Normalized Difference Red Edge Index | NDRE (rgb, M) | (NIR − RE)/(NIR + RE) | Chlorophyll content | [75] |
| Red-edge Chlorophyll Index | CIRE (rgb, M) | (NIR/RE) − 1 | Leaf chlorophyll content | [74] |
| Difference Vegetation Index | DVI (rgb, M) | NIR − R | Nitrogen and chlorophyll | [75] |
| Normalized Difference Vegetation Index | NDVI (rgb, M) | (NIR − R)/(NIR + R) | Chlorophyll, leaf area, biomass, and yield | [18] |
| Green Normalized Difference Vegetation Index | GNDVI (rgb, M) | (NIR − G)/(NIR + G) | Chlorophyll, leaf area, nitrogen, and proteins | [76] |
| Ratio Vegetation Index | RVI (M) | NIR/R | Biomass, water, and nitrogen | [77] |
| Chlorophyll Vegetation Index | CVI (M) | (NIR × R)/G² | Chlorophyll | [78] |
1 rgb: red–green–blue; M: multispectral; 2 R: red; G: green; B: blue; NIR: near-infrared; RE: red edge.
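The formulas in Table 4 reduce to simple band arithmetic on per-plot mean reflectances. A few of them, sketched in Python with illustrative band values (the reflectances below are hypothetical, not data from the study):

```python
def ndvi(nir, r):
    """Normalized Difference Vegetation Index: (NIR - R)/(NIR + R)."""
    return (nir - r) / (nir + r)

def gndvi(nir, g):
    """Green Normalized Difference Vegetation Index: (NIR - G)/(NIR + G)."""
    return (nir - g) / (nir + g)

def gli(r, g, b):
    """Green Leaf Index: (2G - R - B)/(2G + R + B)."""
    return (2 * g - r - b) / (2 * g + r + b)

def ngrdi(g, r):
    """Normalized Green-Red Difference Index: (G - R)/(G + R)."""
    return (g - r) / (g + r)

# hypothetical per-plot mean reflectances in the NIR, red, green, blue bands
nir, r, g, b = 0.45, 0.05, 0.10, 0.04
vi = ndvi(nir, r)  # (0.45 - 0.05) / (0.45 + 0.05), close to 0.8
```

Dense green canopies reflect strongly in the NIR band and absorb red light, so NDVI values near 1 indicate vigorous vegetation cover, which is the basis for the correlations with vigor and yield reported in Figures 2 and 3.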
