Article

Individualized Indicators and Estimation Methods for Tiger Nut (Cyperus esculentus L.) Tubers Yield Using Light Multispectral UAV and Lightweight CNN Structure

School of Soil and Water Conservation, Beijing Forestry University, Beijing 100083, China
*
Author to whom correspondence should be addressed.
Drones 2023, 7(7), 432; https://doi.org/10.3390/drones7070432
Submission received: 12 June 2023 / Accepted: 23 June 2023 / Published: 28 June 2023
(This article belongs to the Section Drones in Agriculture and Forestry)

Abstract

Tiger nuts are a non-genetically modified crop with high adaptability and economic value, and they are being widely promoted for cultivation in China. This study proposed a new yield-estimation method based on a lightweight convolutional neural network (CNN) named Squeeze Net to provide accurate production forecasts for tiger nut tubers. Multispectral unmanned aerial vehicle (UAV) images were used to establish phenotypic datasets of tiger nuts, comprising vegetation indices (VIs) and plant phenotypic indices. The Squeeze Net model with a lightweight CNN structure was constructed to fully explore the explanatory power of the UAV-derived spectral information and to compare the differences between parametric and nonparametric models applied to tiger nut yield prediction. Both Squeeze Net and stepwise multiple linear regression (SMLR) achieved good yield prediction performance. The highest accuracies were an R2 of 0.775 and a root-mean-square error (RMSE) of 688.356 kg/ha with SMLR, and an R2 of 0.780 and an RMSE of 716.625 kg/ha with Squeeze Net. This study demonstrated that Squeeze Net can efficiently process UAV multispectral images and improve the resolution and accuracy of yield prediction results. Our study demonstrated the enormous potential of artificial intelligence (AI) algorithms in the precise crop management of tiger nuts in the arid sandy lands of northwest China by exploring the interactions between intensive phenotypic traits and productivity.

1. Introduction

Abnormal climate change, geopolitics, trade friction, and other factors have led to severe issues involving the security and stability of food production and circulation [1,2]. A joint study by the World Bank and the United Nations proposed response strategies: to cope with the surge in food demand by 2050, we need to convert more land (600 million hectares) to agricultural use or increase agricultural production efficiency in a low-carbon and environmentally friendly way, depending on scientific and technological progress [3]. To save land resources, highly adaptable plants that do not occupy arable land are attracting increasing attention [4]. Among these, tiger nut (Cyperus esculentus L.) [5,6] shows great potential. The tubers of tiger nuts are sweet and edible with high oil content and are also commonly known as earth almond, yellow nutsedge, or chufa [7,8]. The species was introduced into nonarable lands in the arid regions of northwestern China as a potential substitute cultivar [9,10], aiming to alleviate import dependency on oil crops. Estimating the yields of tiger nuts is an essential part of research on their cultivation and promotion [11].
To date, remote sensing yield estimation based on crop physiological and ecological mechanisms has made outstanding progress. Many scholars have applied advanced hardware facilities and intelligent data-processing methods to many kinds of crop-yield estimation studies [12,13,14]. With the technological progress of sensors and flight platforms, remote sensing data now offer more diverse choices in terms of spatial, temporal, and spectral resolution [15,16,17,18]. These large and heterogeneous initial image datasets, with complex multi-sensor, multiresolution, multitemporal, and habitat-source characteristics [19,20], are used to retrieve plant morphological structures [21,22,23,24], component contents [25,26,27], and other indicators closely related to crop yields.
Artificial intelligence (AI) algorithms based on computer vision have also been used for initial UAV image processing and plant phenotype extraction [13,28]. AI algorithms not only greatly improve the utilization efficiency of images but also achieve increased accuracies [20,29]. Although classic machine learning (ML) algorithms have so far been the most widely applied across a variety of crops, many recent studies [22,28,29,30] found that deep learning (DL) algorithms have more potential for distinguishing multicategorical objects within the complex big data generated by remote sensing systems [13,31]. A typical example is yield estimation [32,33]. The “biological nature” of DL is something that other ML methods do not have, and it allows DL to accurately extract valuable crop traits without professional handcrafted labels [13]. In the field of computer science, many scholars have created disruptive innovations in deep learning network models with increased accuracies and shortened processing times [34,35]. Squeeze Net [36] is a refined, lightweight convolutional neural network (CNN) structure proposed by UC Berkeley and others in 2016. While maintaining accuracy comparable to the larger AlexNet, Squeeze Net contains far fewer parameters and yields a much smaller model.
Many studies, such as crop-yield estimations of rice [37,38], corn [39], and soybean [33], and fruit-yield predictions of apple [40] and mango [41] [42,43], have shown the superiority of DL in yield-estimation applications. However, there is a limitation: the results can only reflect the yield level in a specific region, and the accuracy of a universal model spanning different crops over a large spatial extent is hard to guarantee [44,45]. Despite the potential of tiger nuts to become a new oil crop through large-scale cultivation on Chinese barren land, there is relatively little research on the species, especially on its phenotype and yield using combined UAV and DL techniques. In this paper, we concentrate on developing a personalized yield prediction model for the first time, with the aim of assisting precise crop management and offering sustainable production guidelines for the advancement of tiger nut cultivation.
Our research aims to predict the yield of tiger nut tubers using a UAV platform combined with a lightweight CNN algorithm. We achieved the following objectives: (1) establish effective UAV image retrieval technologies to retrieve phenotype characteristics related to the yield of tiger nut tubers; (2) detect the effects of different growth stages, phenotypic parameters, and VIs on tuber biomass for choosing optimal variables of the yield prediction models; and (3) compare the predictive performance of linear regression models and Squeeze Net models to obtain optimized estimations with the highest possible efficiency and accuracy.

2. Materials and Methods

2.1. Experimental Site and Trial Conditions

The study area is located on the southern boundary of the Taklimakan Desert and belongs to Luopu County in the Hotan area of Xinjiang Uygur Autonomous Region, China (Figure 1). Luopu experiences an extremely dry continental climate characterized by dry, cold winters and hot summers with scarce precipitation and strong sunshine. The average annual evaporation in the study area exceeds 2226.2 mm, while the annual rainfall is only 35.2 mm; the annual sunshine duration is 2662 h, and the average annual temperature ranges from 7.8 to 12 °C. Strong winds (entraining sand and dust) and dry heat are the main climatic hazards. The surface soils are loose and covered by floating sands, consisting mainly of fine sands (125–250 μm in diameter) and very fine sands (62.5–125 μm), with low organic matter contents and high salt contents.
An experimental site was selected for the planting and monitoring of tiger nuts in the 2021 growing season (June–October), located at a research farm (37° 15.587′ N, 80° 3.746′ E) with elevations ranging from 1300 m to 1310 m. It is a newly reclaimed desert sand land owned by the Xinjiang Institute of Ecology and Geography of the Chinese Academy of Sciences.
On 15 May 2021, mechanical ditching was carried out along long-strip paths. The seed-sowing rate was 300 kg ha−1, and the row spacing was approximately 0.3–1.2 m. The total applied fertilization amounts were as follows: urea at 675 kg ha−1 (N ≥ 46%), diammonium phosphate at 300 kg ha−1 (P2O5 ≥ 48%, N ≥ 18%), potassium sulfate at 375 kg ha−1 (K2O ≥ 52%), and humic acid at 300 kg ha−1; the total irrigation volume was 5775 m3 ha−1. Specifically, 2025 m3 ha−1 of irrigation water was applied before sowing. After that, all water and fertilizer treatments were carried out using integrated drip technology. In the early growth stage (May–June), drip irrigation was applied 4 times, each application delivering 187.5 m3 ha−1 of water, 33.75 kg ha−1 of urea, and 15 kg ha−1 of potassium sulfate compound fertilizer. In the mid-growth period (June to mid-August), drip water was applied 12 times, and in the late growth period (mid-August to mid-September), drip water was applied 4 times. Manual weeding combined with herbicide application was used to control weeds, following standard agronomic, insect, and disease management practices.
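As a consistency check, the reported schedule adds up: assuming each of the 20 drip applications (4 early + 12 mid + 4 late) delivered the same 187.5 m3 ha−1 stated for the early stage (the text does not give per-application amounts for the mid and late stages), the pre-sowing water plus the drip applications reproduce the 5775 m3 ha−1 total. A minimal sketch:

```python
# Sanity check of the irrigation schedule. Assumption: the 187.5 m3/ha
# per application stated for the early stage also applies to the mid and
# late growth stages (not stated explicitly in the text).
PRE_SOWING = 2025.0          # m3/ha, applied before sowing
PER_APPLICATION = 187.5      # m3/ha per drip event
N_APPLICATIONS = 4 + 12 + 4  # early + mid + late stage drip events

total = PRE_SOWING + N_APPLICATIONS * PER_APPLICATION
print(total)  # 5775.0, matching the reported total irrigation volume
```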

2.2. Experimental Design and Data Collection

The leaves of Cyperus esculentus are basal and alternate, with expanded, greenish leaf blades. From the seedling stage to the maturation stage, plant height increases from approximately 15 to 60 cm; meanwhile, tubers begin to form 40–50 days after emergence and stabilize at around 120 days [46,47]. These previously reported growth rhythms of tiger nuts were used to determine the monitoring periods of the experiment. We therefore sampled at two key phenological growth stages: the early crop growth stage (before the end of June) and the late crop growth stage (before the end of September). The UAV flights took place on 26 June and 26 September 2021, and ground-truth measurements were conducted within two days after the UAV imagery was captured.

2.2.1. Field Data Acquisition

To obtain agronomic trait data corresponding to the final yield, a field with an area of 4680 m2 was divided into a total of 120 plots, each measuring 3.9 m × 10 m. Different planting row spacings and watering treatment levels were included, but we do not distinguish among these factors in this study. Within each sampling plot, the agronomic traits of tiger nuts were collected, including the above-ground biomass (AGB), below-ground biomass (BGB), and moisture content (MC). In early October 2021, the crops growing in each plot were harvested and weighed. The yield was expressed in kg ha−1 and normalized to a moisture content of 50.0%.
During the ground-sampling campaign, three small subplots, each with an area of 0.25 m × 0.25 m, were randomly selected from each plot, and all the plants were excavated from each of these subplots. The grasses, roots, and tubers were bagged separately to record the biomass characteristics. The underground components were washed with clean water, weighed fresh, and placed in an oven at 105 °C for 30 min before the temperature was adjusted to 70 °C until the specimens reached a constant weight. The dry weights of the AGB and BGB components were measured using a scale with an accuracy of 1 g.

2.2.2. Statistical Analysis

Descriptive statistics and analysis of variance (ANOVA) with Type III sums of squares were performed after yield data collection using the General Linear Model (GLM) procedure of IBM SPSS software, version 26.0. The main factor studied was the effect of the yield samples on all dependent variables. All statistical tests were performed at the 5% significance level. The harvest scene and statistical data of the tiger nut tuber yield are shown in Figure 2. To investigate the relationships among all of the variables, Pearson correlation coefficients were calculated using Origin 2018 software. A p value less than 0.05 was considered significant.
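The Pearson coefficient underlying this correlation analysis can be computed directly. The following is an illustrative sketch with hypothetical plot-scale numbers, not the authors' code (the paper used Origin 2018):

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two trait vectors."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xd, yd = x - x.mean(), y - y.mean()
    return float((xd @ yd) / np.sqrt((xd @ xd) * (yd @ yd)))

# Hypothetical plot-scale AGB and yield values, for illustration only
agb = [1.2, 2.4, 3.1, 4.0, 5.2]
yield_kg = [2100.0, 3900.0, 4700.0, 6200.0, 7900.0]
print(round(pearson_r(agb, yield_kg), 3))
```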

2.2.3. UAV Image Acquisition and Preprocessing

The field canopy images captured by the UAV contain abundant spectral information representing the leaves, stems, and underground biomass. Both flight campaigns were conducted within the local timeframe of 11:00–14:00. The flight-control app DJI Ground Station Pro (DJI, Shenzhen, China, https://www.dji.com/dk/ground-station-pro, accessed on 22 June 2023) was used to generate efficient flight paths after setting the flight area, image overlap, and flight altitude. The altitude selected for image acquisition was 40 m above ground level, so the ground sampling distances (GSDs) of the red-green-blue (RGB) and multispectral cameras were 2.7 mm/pixel and 3.4 mm/pixel, respectively. The lateral and frontal image overlaps were both 80%. To prevent the reflectivity accuracy of the aerial images from being disturbed by weather conditions, a radiation panel image was taken 5 m above the ground before each takeoff and landing for radiometric calibration. The official instructions (P4_Multispectral_Image_Processing_Guide_CHS) indicate that the light sensor on the top of the UAV can replace the radiation calibration gray plate, making the spectral radiance of images taken between phases and at different times comparable. In addition, due to the flat terrain in the monitoring field, the DJI real-time kinematic (RTK) system used in the aerial surveys was sufficient to meet the accuracy required for our spectral research (Table 1), so no ground-control points were set. To ensure the reflectance quality of the orthomosaic, we used DJI Terra (https://www.dji.com/cn, accessed on 22 June 2023) for automated aerial image processing following the processing guidelines provided by DJI.

2.2.4. Dataset Preparation

Inadequate adaptation to the spatial variation of crops in the field will lead to inaccurate model estimations and even wrong conclusions [11]. The field-measured information (biomass and water content) was retrieved to the centimeter-pixel scale using UAV-derived images. Multiple linear fitting operations were performed in SPSS software using the plot-wise results of correlation and mean statistical analysis. Fitting formulas that met the significance test were used to produce pixel-level agronomic maps using the Raster Calculator Tool in ArcGIS. The accumulated VIs in two growth stages were also calculated and utilized.
In the final step of preprocessing, each image was stored as a 32-bit TIFF with 2046 rows × 3532 columns and a 0.03 m pixel resolution after radiation calibration, unification of the projection and coordinate systems, clipping, and other pretreatment steps. Then, the phenotypic datasets of tiger nut, including 17 vegetation indices (VIs) and 8 plant phenotypic indices (PHs), were established for yield prediction. Finally, the dataset was randomly divided into a training set (2/3; 84 plots) and a testing set (1/3; 36 plots).
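The random plot split can be sketched as follows (illustrative only; the exact assignment and any seed used in the study are not published):

```python
import random

# Split the 120 plot IDs 2/3 : 1/3 into training and testing sets.
# The seed is arbitrary; the paper does not report one.
random.seed(0)
plots = list(range(120))
random.shuffle(plots)
train_plots, test_plots = plots[:84], plots[84:]
print(len(train_plots), len(test_plots))  # 84 36
```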

2.3. Yield Prediction and Validation

Based on the statistical analysis, variables that did not reach the 0.01 significance level (including E_BGB, E_MCa, E_MCb, L_MCa, L_MCb, and L_VDVI) were excluded. The remaining 19 independent variables were divided into three categories corresponding to different yield-estimation scenarios: (1) phenotypic data alone; (2) UAV spectral indices alone; and (3) all indicators.
According to the three classifications above, each category was stored as a multi-band TIFF image, with each band representing different predictor information.

2.3.1. SMLR Yield Predicting Model Construction

The linear model is used as the baseline method because it is one of the simplest forms of modeling to construct the relationship between crop yield and multiple agronomic characteristics [48]. Stepwise multiple linear regression (SMLR) models were established using the plot-wise field phenotypic trait and VI data as independent variables to evaluate whether the linear model had the ability to directly indicate the end-of-season yield of tiger nut tubers.
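SMLR can be approximated by a greedy forward-selection loop over ordinary least squares fits. The sketch below is a simplified stand-in for the SPSS procedure, not the authors' code: it uses an R2-gain threshold instead of the usual F-to-enter/F-to-remove tests.

```python
import numpy as np

def forward_stepwise(X, y, min_gain=0.01):
    """Greedy forward selection: repeatedly add the predictor column that
    most improves R^2, stopping when the gain falls below min_gain.
    Simplified stand-in for SPSS-style stepwise MLR."""
    def r2(cols):
        # OLS fit with intercept on the selected columns
        A = np.column_stack([np.ones(len(y))] + [X[:, c] for c in cols])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        return 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
    selected, best = [], 0.0
    while True:
        gains = [(r2(selected + [c]), c)
                 for c in range(X.shape[1]) if c not in selected]
        if not gains:
            break
        score, c = max(gains)
        if score - best < min_gain:
            break
        selected.append(c)
        best = score
    return selected, best
```

With noise-free synthetic data the loop recovers exactly the informative columns, which is the behavior the baseline relies on.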

2.3.2. CNN Yield Predicting Model Construction

Overfitting and multicollinearity are common problems in DL. Therefore, before the network construction, principal component analysis (PCA) was applied to improve the training efficiency of network models by reducing redundant information and reducing noise interference. PCA was implemented through RStudio and ArcGIS.
A total of 13,000 samples were obtained by sliding sampling (at a sample interval of 20 pixels and a sample size of 30 × 30 pixels), among which 10,000 were loaded into the training set to fit the data and 3000 were used for network validation and optimization. The test set was consistent with that of the SMLR models described above, using the remaining 36 plots.
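The sliding sampling can be sketched as below (illustrative, not the authors' code; window geometry as described in the text, demonstrated on a small mock array rather than the full 2046 × 3532 scene):

```python
import numpy as np

def sliding_samples(image, size=30, stride=20):
    """Extract square patches by sliding a window over an (H, W, bands)
    array, mirroring the paper's sampling (size 30x30, interval 20)."""
    h, w = image.shape[:2]
    patches = [image[r:r + size, c:c + size]
               for r in range(0, h - size + 1, stride)
               for c in range(0, w - size + 1, stride)]
    return np.stack(patches)

# Demonstration on a small 100 x 100 single-band mock image
demo = np.zeros((100, 100, 1))
print(sliding_samples(demo).shape)  # (16, 30, 30, 1)
```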
The structure of Squeeze Net (Figure 3) starts with a convolution layer for feature extraction, before the features are fed into the Fire modules (Layers 2–9). In the process, the image size is reduced by three successive max-pooling operations. The Fire module is the core of this network and is mainly composed of a squeeze layer and an expand layer, each of which uses the rectified linear unit (ReLU) as the activation function. The squeeze layer creatively replaces 3 × 3 convolution filters with 1 × 1 filters. The expand layer contains both 1 × 1 and 3 × 3 convolution filters. Through the Fire module, the size of the feature map does not change, but the number of model parameters is greatly reduced. For the output layers, we use a fully connected layer in conjunction with a regression layer to predict continuous pixel-wise yield values using the mean square error and cross-entropy losses.
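The parameter savings of the Fire module can be illustrated by counting weights. The channel widths below are the fire2 configuration from the original SqueezeNet paper (96 input channels, 16 squeeze filters, 64 + 64 expand filters), used as an assumed example; the present study does not report its exact channel widths.

```python
def fire_params(c_in, s1x1, e1x1, e3x3):
    """Weight count of a SqueezeNet Fire module (biases ignored):
    a 1x1 squeeze layer feeding parallel 1x1 and 3x3 expand layers."""
    squeeze = c_in * s1x1 * 1 * 1
    expand = s1x1 * e1x1 * 1 * 1 + s1x1 * e3x3 * 3 * 3
    return squeeze + expand

def plain_conv_params(c_in, c_out):
    """Weight count of an ordinary 3x3 convolution (biases ignored)."""
    return c_in * c_out * 3 * 3

# fire2 configuration vs. a plain 3x3 conv with the same 128 output channels
fire = fire_params(96, 16, 64, 64)
plain = plain_conv_params(96, 128)
print(fire, plain)  # 11776 110592
```

The Fire module here needs roughly 9× fewer weights than the plain convolution for the same output width, which is the mechanism behind the "fewer parameters, smaller model" claim.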
The data-processing environment was configured with an 11th Gen Intel® Core i7-11700F CPU @ 2.50 GHz (Intel, Santa Clara, CA, USA), 16 GB RAM, a 1 TB hard drive, and an 11 GB NVIDIA GeForce GTX 1080Ti GPU. Squeeze Net was implemented in MATLAB R2018b.

2.3.3. Assessment of the Model Quality

We evaluated the performance of the different models using the coefficient of determination (R2), root-mean-square error (RMSE), and normalized root-mean-square error (nRMSE). A higher R2 value and lower RMSE and nRMSE values indicate better estimation performance. These evaluation indicators were calculated as follows:
$$R^2 = 1 - \frac{\sum_{i=1}^{n} (y_i - \hat{y}_i)^2}{\sum_{i=1}^{n} (y_i - \bar{y})^2}$$
$$\mathrm{RMSE} = \sqrt{\frac{\sum_{i=1}^{n} (y_i - \hat{y}_i)^2}{n}}$$
$$\mathrm{nRMSE} = \frac{\mathrm{RMSE}}{\bar{y}}$$
where $y_i$ is the measured value, $\bar{y}$ is the mean of all measured values, $\hat{y}_i$ is the predicted value, and the index $i$ runs over all $n$ samples.
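The three indicators can be computed in a few lines; a minimal sketch (not the authors' code) with made-up values:

```python
import numpy as np

def evaluate(y_true, y_pred):
    """R^2, RMSE and nRMSE as defined in Section 2.3.3."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    r2 = 1 - ss_res / ss_tot
    rmse = np.sqrt(ss_res / len(y_true))
    nrmse = rmse / y_true.mean()
    return r2, rmse, nrmse

# Hypothetical measured vs. predicted yields (kg/ha)
r2, rmse, nrmse = evaluate([100, 200, 300], [110, 190, 310])
print(round(r2, 3), round(rmse, 3), round(nrmse, 4))  # 0.985 10.0 0.05
```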
In the CNN model construction, softmax cross entropy was used as the loss function, with an L2 regularization penalty term added. The hyperparameter $\lambda$ of L2-Softmax was adjusted repeatedly to drive the loss function toward 0 and optimize the objective function. The formula is as follows:
$$\mathrm{Loss}(W, b) = -\frac{1}{m} \sum_{i=1}^{m} y^{(i)} \log \hat{y}_i + \lambda \sum_{j=1}^{n} W_j^2 + \lambda \sum_{k=1}^{n} b_k^2$$
where $m$ is the number of training examples, $y^{(i)}$ is the true label of the $i$-th example, $\hat{y}_i$ is the predicted label of the $i$-th example, $n$ is the number of model parameters, $\lambda$ is the regularization parameter, and $W_j^2$ and $b_k^2$ are the squares of the $j$-th weight and the $k$-th bias, respectively.
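The regularized loss can be evaluated directly. A sketch following the Loss(W, b) formula above, with hypothetical labels, predicted probabilities, weights, and biases:

```python
import numpy as np

def l2_regularized_ce(y_true, y_pred, W, b, lam):
    """Cross-entropy term plus L2 penalties on weights and biases,
    following the Loss(W, b) formula in Section 2.3.2 (sketch only)."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    ce = -np.mean(y_true * np.log(y_pred))          # -(1/m) * sum y * log(y_hat)
    reg = lam * np.sum(np.asarray(W, float) ** 2) \
        + lam * np.sum(np.asarray(b, float) ** 2)   # L2 penalty terms
    return ce + reg

# Two training examples with hypothetical predicted probabilities
loss = l2_regularized_ce([1, 1], [0.5, 0.25], W=[1.0, 2.0], b=[0.5], lam=0.1)
print(round(loss, 4))  # 1.5647
```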

3. Results and Discussion

3.1. Variations in RGB Images, VIs and Phenotypic Information with Crop Growth

An important consideration in designing yield prediction models from UAV data is how to account for temporal changes in the spectral characteristics of growing crops [49]. The statistical values of the VIs and phenotype data at the plot scale in the early stage (ES) and late stage (LS) of the growing season are listed in Table 2. All vegetation index values increased to different degrees, while the coefficient of variation of each VI remained small due to its normalized nature. Moreover, the phenotype data show that as the plants matured, biomass continued to accumulate, and the plants became significantly drier, as indicated by the above-ground moisture content (MC_A) and below-ground moisture content (MC_B).
From the initial growing stage (26 June) to the maturing stage (26 September), the RGB images intuitively show that canopy coverage increased substantially. The plant AGB and BGB showed significant increasing tendencies similar to the change in canopy greenness in the RGB images, while the MC maintained a decreasing trend. These patterns directly reflect the shifting nutritional strategies during tiger nut growth: leaf and root moisture levels were kept high to meet the water demands of healthy development; as the plants matured and adapted to cold weather, the dry matter accumulation rate increased and the water content decreased significantly [9,10]. With these regular, dynamic trait changes occurring throughout the growing season, the leaves gradually aged and yellowed, and the spectral reflectance properties of the canopy changed accordingly. The considerable spatial heterogeneity among the different plots within the same growth stage is shown in Figure 4; it mainly resulted from management differences (water, nutrients, weeds, pest stress, etc.) and row spacing differences, as revealed by analyzing the plant biomass, water content, and VI conditions at the plot scale.
The AGB, BGB, MC_A and MC_B changed as the growth period progressed; the corresponding changes can also be seen in each VI map. VI monitoring is an important means of reflecting crop growth characteristics [49,50,51]. The spectral images obtained by UAVs can reflect changes not only in canopy information but also in the above-ground and underground correlations of plants. The measurement of spectral bands can further characterize the growth of belowground plant biomass and serve as a reliable scientific basis for agricultural management. Therefore, the VIs obtained by multispectral UAV can be used to extract and predict other phenotypic indicators.

3.2. Correlations between Aerial Imaging Features and Plant Field Traits at Different Growth Stages

It is necessary to consider the main factors and key growth periods that affect crop yield in the establishment of a crop remote sensing yield estimation model [52]. Many studies have shown that there is a significant linear or nonlinear relationship between crop yield and different reflection spectral values or their combination forms, which is the theoretical basis for crop yield estimation based on spectral reflection characteristics.
As shown in Figure 5, the phenotypic features describing crop biomass are highly correlated with the tuber yield across the growth stages, meaning they have a greater influence on productivity predictions than spectral features. The highest significant positive correlation was found between the yield and AGB (r = 0.824), followed by that between the yield and GNDVI (r = 0.590) in the early growth stage. In addition, the correlations between the different VI variables and the crop yield were very similar (Figure 5), but the GNDVI correlation was more prominent than the others. High autocorrelations also existed among several VI variables, such as the GNDVI, NDVI, NDRE, LCI, OSAVI, and VDVI. This was the case regardless of which growth stage was assessed, and for both single-time-point traits and cumulative trait values. Additionally, we speculate that the accumulated values of some VIs, such as the NDVI and NDRE, across growing seasons are also promising indicators for the end-of-season yield estimations conducted in this field experiment. Although the water content changed during the vegetation growth process, the relationship between the water content and yield was not significant, which was in line with our expectations.
However, the correlation of each variable with the crop yield varied among the different growth stages [30,53]. Combining the UAV images and field experimental features, good correlations with yield were found for the AGB, GNDVI, OSAVI, NDVI, LCI, and NDRE in the early stage, with r values of 0.824, 0.590, 0.588, 0.586, 0.542, and 0.540, respectively. After maturity, all variables derived from the UAV images had relatively low correlations with the crop yield, indicating a limited ability to estimate yield at this time. In past crop biomass prediction research, this phenomenon has been attributed to the saturation of spectral indices at high vegetation densities. Due to the lack of relevant studies on the relationship between spectral characteristics and biomass of tiger nuts, our findings cannot yet be verified on the same crop. However, studies of other crops, such as soybean [11] and sorghum [30], reached similar conclusions, pointing out that agronomic features are more important early in the season than late in the season when constructing yield models.
BGB sampling is typically a very laborious and time-consuming multistep process that is prone to errors. However, we found that the aboveground biomass before canopy closure (AGB_E) was a key factor affecting tiger nut tuber yield predictions, providing better prediction information than both the full-growth-process BGB and the end-of-season AGB. Therefore, tiger nut canopy phenotype data can contribute to the development of tuber yield prediction models and could help breeders in their selection procedures by providing early hints regarding the performance of novel lines.

3.3. Yield Estimations

3.3.1. SMLR Model Estimations

The calculation results are shown in Table 3, and three linear models were obtained, in which only one variable was selected for models 1 and 2, while three variables were selected for model 3.
In the yield prediction results calculated by the different independent-variable parameters, the linear regression algorithm model 3 showed the best performance when using the AGB and GNDVI in the early growth stage and the cumulative GNDVI values in the early and late stages as independent variables, with the lowest RMSE of 688.356 kg/ha (nRMSE = 0.129) and the highest proportion of variance explained (R2 = 0.775). In model 3, the contributions of the three considered independent variables to the yield prediction results can be ranked as follows: E_AGB > E_GNDVI > E-L_GNDVI (with standardized coefficients (beta values) of 0.623, 0.277 and 0.100, respectively).
The canopy of healthy plants has a very high reflectance in the near-infrared spectrum. As expected, the VDVI from RGB images did not perform well in estimating yields, reflecting that the NIR band plays a key role in the remote sensing inversion of plant phenotypic traits. Other related studies using UAV sensor platforms to estimate production have also revealed similar associations [33,54]. Surprisingly, among the four types of spectral indices, the GNDVI was the most frequently selected by the models constructed above, and it shows great importance beyond that of the NDVI. Previous studies have confirmed that the GNDVI has a wider dynamic range and is at least five times more sensitive to chlorophyll than the NDVI [50].
Moreover, Model 2, based on the early-stage VI indicators alone, showed poor prediction capability, as indicated by the error test results (Table 3). The precision-limiting factors could be attributed to the small number of observations (n = 120) and the limitations of linear regression algorithms, suggesting that the relationships between the tiger nut yield and the considered variables may not conform to simple linear fits. Nevertheless, the current work achieved moderate accuracy (R2 = 0.701, RMSE = 821.342 kg/ha) in using linear regression to make simple and effective tiger nut tuber yield predictions under the condition of limited data.

3.3.2. DL Regression Model Estimations

Before the modeling step, the third variable category, containing all indicators, was reconstructed using the optimal predictors selected by principal component analysis (PCA) [28]. The number of factors to retain was evaluated using the scree plot (Figure 6). The PCA results were obtained by comprehensively considering principal components with eigenvalues greater than one and the top 85% of the cumulative variance contribution. DL models were then built using the first six components (explaining 85.613% of the total variance, Figure 6b) provided by the PCA to predict yields from combined layers of multitemporal VIs and canopy metrics. The VIs show high contributions (Figure 6c) and correlations (Figure 6a).
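The component-selection rule can be sketched as follows. This is one plausible reading of "eigenvalues greater than one plus 85% cumulative variance" and is illustrative only; the authors performed PCA in RStudio and ArcGIS, and their exact rule may differ.

```python
import numpy as np

def select_components(X, var_threshold=0.85):
    """Count principal components to retain: Kaiser criterion
    (eigenvalue > 1 on the correlation matrix) combined with a
    cumulative-variance threshold."""
    Xs = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)      # standardize
    eigvals = np.linalg.eigvalsh(np.cov(Xs, rowvar=False))[::-1]  # descending
    cum_ratio = np.cumsum(eigvals) / eigvals.sum()
    k_kaiser = int(np.sum(eigvals > 1.0))                  # eigenvalue > 1
    k_var = int(np.searchsorted(cum_ratio, var_threshold) + 1)  # >= 85% variance
    return max(k_kaiser, k_var), eigvals, cum_ratio
```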
In this case, separate DL regression models were constructed based on the selected VIs, original phenotype data, and phenotype data derived from the two flight surveys. The DL prediction models were trained using Squeeze Net with default parameters (a filter size of 9, a filter number of 32, and an encoding and decoding depth of 5). Sample data were sent to the net for model training, and an overall test error improvement was achieved by adjusting the regularization and the hyperparameters of the training algorithm. The final optimal parameter values were determined with an L2 regularization value of 0.0001 and an initial learning rate of 0.00001 by observing the changes in the model output that occurred when the model parameters were changed.
We selected the best-performing models to report, listing the R2, RMSE, and nRMSE values corresponding to the estimation results obtained on the testing set (Table 3 and Figure 7). The DL method effectively utilizes the spectral data and thereby improves on the estimation accuracy of linear regression: compared to the traditional SMLR method applied to each corresponding variable set, the DL method performs better. Model 3 had the second-lowest errors (RMSE = 716.625 kg/ha; nRMSE = 0.134) and the greatest proportion of variance explained (R2 = 0.780) among the considered models.
Moreover, we found that models constructed from VIs alone could discriminate only between low and high values rather than providing accurate tiger nut yield predictions without the support of plant phenotype data. The main reason for this result is that BGB characteristics cannot be directly reflected in the canopy spectral information captured by UAVs, which is also why VI inversion of relevant HTP indicators is required to obtain accurate yield estimations. Nonetheless, VI information still exhibits great potential to improve yield prediction accuracy, as further revealed in our study. For example, model 6, which combined VIs with HTPs inverted from the VIs, exhibited a 4.74% increase in R2 and an 8.19% decrease in RMSE compared to model 5, which used VIs combined with the original HTPs. More obviously, in the linear regression models, after VIs were added as independent variables in model 3, the prediction accuracy was greatly improved relative to model 1 (R2 increased by 9.55%; RMSE decreased by 16.19%). Furthermore, the CNN improved the spatial resolution of the prediction results from the plot scale to the centimeter-level pixel scale, an ability that could provide technical support for more detailed field precision management (Figure 8).
CNNs can process complex UAV multispectral image data directly, without requiring prelabeling or hand-crafted image features; the feature extraction is performed by the convolutional layers of the network, which also increases portability. With respect to crop yield prediction, several studies that trained DL models on multispectral drone images have achieved low-error modeling, mostly for grain and oil crops [33,55,56,57] (Table 4). However, because of this structure, such models also require a large amount of training data to converge. The massive, high-resolution yield statistics that must be matched when fitting DL algorithms are not easy to collect, and the hardware required for the computation is relatively expensive, which greatly limits the application and promotion of DL in agriculture [55]. Therefore, the lightweight CNN Squeeze Net was used in our study to focus on the important details of the multisource image data while suppressing unimportant information as much as possible, which greatly improved the computational efficiency. Compared with other studies using lightweight CNNs, our model performed slightly better than that of Han et al. [57], with a higher R2 (0.780 vs. 0.646; Table 4). It should be noted that the addition of regularization terms could also have inflated the model performance.
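Squeeze Net's lightness comes from its "fire" modules, which squeeze the channel dimension with 1×1 convolutions before a mixed 1×1/3×3 expansion [36]. A quick parameter count (biases ignored; the layer sizes follow the first fire module in [36], not the network used in this study) shows why this is far lighter than a plain 3×3 convolution with the same output width:

```python
def conv_params(c_in, c_out, k):
    """Weight count of a k x k convolution (biases ignored)."""
    return c_in * c_out * k * k

def fire_params(c_in, squeeze, expand1x1, expand3x3):
    """Weight count of a fire module: a 1x1 squeeze layer followed by
    parallel 1x1 and 3x3 expand branches whose outputs are concatenated."""
    s = conv_params(c_in, squeeze, 1)
    e1 = conv_params(squeeze, expand1x1, 1)
    e3 = conv_params(squeeze, expand3x3, 3)
    return s + e1 + e3

# 96 input channels -> 128 output channels:
plain = conv_params(96, 128, 3)                               # 110,592 weights
fire = fire_params(96, squeeze=16, expand1x1=64, expand3x3=64)  # 11,776 weights
```

The fire module here needs roughly 9× fewer weights than the equivalent plain convolution, which is the efficiency argument the paragraph above relies on.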

3.4. Limitations and Future Work

The main objective of this work was to evaluate individualized indicators extracted from light UAV multispectral images for end-of-season yield estimation. Our study applied Squeeze Net as an initial exploration of the potential of deep learning-based approaches for yield estimation, which had not previously been used in this area.
However, this work compared only a limited set of approaches, and the design and comparison of further methods remain worth exploring. Moreover, the complexity and particularity of DL algorithms mean that they provide no explicit regression relationship, so we still lack the ability to interpret plant development mechanisms and laws directly from DL models. In addition, owing to the scarcity of planting areas in China and the travel restrictions in place during the growing season, on-site data were obtained for only two growth stages. In future work, UAV platforms could be combined with high-resolution satellite remote sensing products to reduce the constraints of field data acquisition.

4. Conclusions

This study demonstrated the practicality of applying multispectral imagery taken by light drones to predict the underground yield of tiger nut tubers. We conclude that the comprehensive utilization of multiple data sources (phenological traits and VIs) can significantly improve yield prediction accuracy; in particular, we found that multispectral information plays a more prominent role than other factors in crop yield prediction. Achieving early prediction is a goal that cultivators have long pursued. AGB and GNDVI at 40 days after sowing, together with the GNDVI accumulated over the early and late growing seasons, were screened as the most important prediction factors. This combination of factors implies a high probability of achieving an accurate tiger nut yield forecast early in the growing season. Compared with the linear model, Squeeze Net showed higher accuracy and a strong, stable feature extraction capability. More importantly, Squeeze Net demonstrated its effectiveness by greatly improving the spatiotemporal accuracy of underground agronomic trait estimation through high-throughput image-based phenotypic data processing. With our method, crop yield estimation is expected to develop into a more efficient and comprehensive strategy that can realize high-throughput crop phenotype investigation, inversion, and prediction, thereby assisting precise crop management and providing sustainable production guidelines for specific locations.

Author Contributions

Conceptualization, D.L. and X.W.; methodology, D.L. and X.W.; validation, D.L.; formal analysis D.L.; investigation D.L. and X.W.; data curation, D.L.; writing—original draft preparation, D.L.; writing—review and editing, X.W.; project administration, X.W.; funding acquisition, X.W. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the National Key Research and Development Program of China (No. 2019YFC0507600/2019YFC0507601).

Data Availability Statement

The raw/processed data required to reproduce these findings cannot be shared at this time, as the data also form part of an ongoing study.

Acknowledgments

The authors sincerely appreciate the support and cooperation of the Xinjiang Institute of Ecology and Geography, Chinese Academy of Sciences. The authors would also like to thank Yinglin Liu, a farm operator from Hainan Yinglin Agricultural Technology Co., Ltd., for help and support during yield data acquisition.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Food Security Information Network. Global Report on Food Crises—2022. 2022. Available online: https://www.wfp.org/publications/global-report-food-crises-2022 (accessed on 4 May 2022).
  2. International Food Policy Research Institute. 2021 Global Food Policy Report: Transforming Food Systems after COVID-19; International Food Policy Research Institute: Washington, DC, USA, 2021.
  3. Searchinger, T.; Waite, R.; Hanson, C.; Ranganathan, J.; Matthews, E. Creating a Sustainable Food Future: A Menu of Solutions to Feed Nearly 10 Billion People by 2050. 2019. Available online: https://www.wri.org/research/creating-sustainable-food-future (accessed on 19 July 2019).
  4. Bailey-Serres, J.; Parker, J.E.; Ainsworth, E.A.; Oldroyd, G.E.D.; Schroeder, J.I. Genetic strategies for improving crop yields. Nature 2019, 575, 109–118.
  5. Nwosu, L.C.; Edo, G.I.; Ozgor, E. The phytochemical, proximate, pharmacological, GC-MS analysis of Cyperus esculentus (tiger nut): A fully validated approach in health, food and nutrition. Food Biosci. 2022, 46, 10.
  6. De Vries, F.T. Chufa (Cyperus esculentus, Cyperaceae): A weedy cultivar or a cultivated weed? Econ. Bot. 1991, 45, 27–37.
  7. Clemente-Villalba, J.; Cano-Lamadrid, M.; Issa-Issa, H.; Hurtado, P.; Hernandez, F.; Carbonell-Barrachina, A.A.; Lopez-Lluch, D. Comparison on sensory profile, volatile composition and consumer’s acceptance for PDO or non-PDO tigernut (Cyperus esculentus L.) milk. LWT-Food Sci. Technol. 2021, 140, 110606.
  8. Djikeng, F.T.; Djikeng, C.F.T.; Womeni, H.M.; Ndefo, D.K.K.; Pougoué, A.A.N.; Tambo, S.T.; Esatbeyoglu, T. Effect of different processing methods on the chemical composition, antioxidant activity and lipid quality of tiger nuts (Cyperus esculentus). Appl. Food Res. 2022, 2, 100124.
  9. Yang, M.; Tian, L.; Xue, L. Quality and production potential of different chufa varieties in arid climate region of Xinjiang. Chin. J. Oil Crop Sci. 2013, 35, 451–454.
  10. Yang, X.; Niu, L.; Zhang, Y.; Ren, W.; Yang, C.; Yang, J.; Xing, G.; Zhong, X.; Zhang, J.; Slaski, J.; et al. Morpho-agronomic and biochemical characterization of accessions of tiger nut (Cyperus esculentus) grown in the north temperate zone of China. Plants 2022, 11, 923.
  11. Vogel, J.T.; Liu, W.; Olhoft, P.; Crafts-Brandner, S.J.; Pennycooke, J.C.; Christiansen, N. Soybean yield formation physiology—A foundation for precision breeding based improvement. Front. Plant Sci. 2021, 12, 719706.
  12. Jin, X.; Zarco-Tejada, P.J.; Schmidhalter, U.; Reynolds, M.P.; Hawkesford, M.J.; Varshney, R.K.; Yang, T.; Nie, C.; Li, Z.; Ming, B.; et al. High-throughput estimation of crop traits: A review of ground and aerial phenotyping platforms. IEEE Geosci. Remote Sens. Mag. 2021, 9, 200–231.
  13. Jiang, Y.; Li, C. Convolutional neural networks for image-based high-throughput plant phenotyping: A review. Plant Phenomics 2020, 2020, 4152816.
  14. Fei, S.; Hassan, M.A.; Xiao, Y.; Su, X.; Chen, Z.; Cheng, Q.; Duan, F.; Chen, R.; Ma, Y. UAV-based multi-sensor data fusion and machine learning algorithm for yield prediction in wheat. Precis. Agric. 2023, 24, 187–212.
  15. Yang, G.; Liu, J.; Zhao, C.; Li, Z.; Huang, Y.; Yu, H.; Xu, B.; Yang, X.; Zhu, D.; Zhang, X.; et al. Unmanned aerial vehicle remote sensing for field-based crop phenotyping: Current status and perspectives. Front. Plant Sci. 2017, 8, 1111.
  16. Wang, N.; Guo, Y.; Wei, X.; Zhou, M.; Wang, H.; Bai, Y. UAV-based remote sensing using visible and multispectral indices for the estimation of vegetation cover in an oasis of a desert. Ecol. Indic. 2022, 141, 109155.
  17. Saric, R.; Nguyen, V.D.; Burge, T.; Berkowitz, O.; Trtilek, M.; Whelan, J.; Lewsey, M.G.; Custovic, E. Applications of hyperspectral imaging in plant phenotyping. Trends Plant Sci. 2022, 27, 301–315.
  18. Li, B.; Xu, X.M.; Zhang, L.; Han, J.W.; Bian, C.S.; Li, G.C.; Liu, J.G.; Jin, L.P. Above-ground biomass estimation and yield prediction in potato by using UAV-based RGB and hyperspectral imaging. ISPRS J. Photogramm. Remote Sens. 2020, 162, 161–172.
  19. Cen, H.; Zhu, Y.; Sun, D.; Zhai, L.; Wan, L.; Ma, Z.; Liu, Z.; He, Y. Current status and future perspective of the application of deep learning in plant phenotype research. Trans. Chin. Soc. Agric. Eng. 2020, 36, 1–16.
  20. Song, P.; Wang, J.L.; Guo, X.Y.; Yang, W.N.; Zhao, C.J. High-throughput phenotyping: Breaking through the bottleneck in future crop breeding. Crop J. 2021, 9, 633–645.
  21. Xie, T.; Li, J.; Yang, C.; Jiang, Z.; Chen, Y.; Guo, L.; Zhang, J. Crop height estimation based on UAV images: Methods, errors, and strategies. Comput. Electron. Agric. 2021, 185, 106155.
  22. Chen, Q.; Zheng, B.; Chenu, K.; Hu, P.; Chapman, S.C. Unsupervised plot-scale LAI phenotyping via UAV-based imaging, modelling, and machine learning. Plant Phenomics 2022, 2022, 9768253.
  23. Liu, M.L.; Liu, X.N.; Zhang, B.Y.; Ding, C. Regional heavy metal pollution in crops by integrating physiological function variability with spatio-temporal stability using multi-temporal thermal remote sensing. Int. J. Appl. Earth Obs. Geoinf. 2016, 51, 91–102.
  24. Zhao, Y.; Sun, Y.; Lu, X.; Zhao, X.; Yang, L.; Sun, Z.; Bai, Y. Hyperspectral retrieval of leaf physiological traits and their links to ecosystem productivity in grassland monocultures. Ecol. Indic. 2021, 122, 107267.
  25. Niu, Y.X.; Han, W.T.; Zhang, H.H.; Zhang, L.Y.; Chen, H.P. Estimating fractional vegetation cover of maize under water stress from UAV multispectral imagery using machine learning algorithms. Comput. Electron. Agric. 2021, 189, 106414.
  26. Ball, K.R.; Liu, H.; Brien, C.; Berger, B.; Power, S.A.; Pendall, E. Hyperspectral imaging predicts yield and nitrogen content in grass–legume polycultures. Precis. Agric. 2022, 23, 2270–2288.
  27. Zhu, W.X.; Rezaei, E.E.; Nouri, H.; Sun, Z.G.; Li, J.; Yu, D.Y.; Siebert, S. UAV-based indicators of crop growth are robust for distinct water and nutrient management but vary between crop development phases. Field Crop. Res. 2022, 284, 108582.
  28. Selvaraj, M.G.; Valderrama, M.; Guzman, D.; Valencia, M.; Ruiz, H.; Acharjee, A. Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 2020, 16, 87.
  29. Yang, H.; Yin, H.; Li, F.; Hu, Y.; Yu, K. Machine learning models fed with optimized spectral indices to advance crop nitrogen monitoring. Field Crop. Res. 2023, 293, 108844.
  30. Varela, S.; Pederson, T.; Bernacchi, C.J.; Leakey, A.D.B. Understanding growth dynamics and yield prediction of sorghum using high temporal resolution UAV imagery time series and machine learning. Remote Sens. 2021, 13, 1763.
  31. Kyratzis, A.C.; Skarlatos, D.P.; Menexes, G.C.; Vamvakousis, V.F.; Katsiotis, A. Assessment of vegetation indices derived by UAV imagery for durum wheat phenotyping under a water limited and heat stressed Mediterranean environment. Front. Plant Sci. 2017, 8, 1114.
  32. Nevavuori, P.; Narra, N.G.; Lipping, T. Crop yield prediction with deep convolutional neural networks. Comput. Electron. Agric. 2019, 163, 104859.
  33. Maitiniyazi, M.; Vasit, S.; Paheding, S.; Sean, H.; Flavio, E.; Felix, B.F. Soybean yield prediction from UAV using multimodal data fusion and deep learning. Remote Sens. Environ. 2020, 237, 111599.
  34. Deng, L.; Yu, D. Deep learning: Methods and applications. Found. Trends Signal Process. 2014, 7, 197–387.
  35. Han, S.; Liu, X.; Mao, H.; Pu, J.; Pedram, A.; Horowitz, M.; Dally, W. EIE: Efficient inference engine on compressed deep neural network. ACM SIGARCH Comput. Archit. News 2016, 44, 243–254.
  36. Iandola, F.N.; Moskewicz, M.W.; Ashraf, K.; Han, S.; Dally, W.J.; Keutzer, K. SqueezeNet: AlexNet-level accuracy with 50× fewer parameters and <1 MB model size. arXiv 2016, arXiv:1602.07360.
  37. Chu, Z.; Yu, J. An end-to-end model for rice yield prediction using deep learning fusion. Comput. Electron. Agric. 2020, 174, 105471.
  38. Yang, Q.; Shi, L.; Han, J.; Zha, Y.; Zhu, P. Deep convolutional neural networks for rice grain yield estimation at the ripening stage using UAV-based remotely sensed images. Field Crop. Res. 2019, 235, 142–153.
  39. Khaki, S.; Pham, H.; Han, Y.; Kuhl, A.; Kent, W.; Wang, L. DeepCorn: A semi-supervised deep learning method for high-throughput image-based corn kernel counting and yield estimation. Knowl.-Based Syst. 2021, 218, 106874.
  40. Chen, R.Q.; Zhang, C.J.; Xu, B.; Zhu, Y.H.; Zhao, F.; Han, S.Y.; Yang, G.J.; Yang, H. Predicting individual apple tree yield using UAV multi-source remote sensing data and ensemble learning. Comput. Electron. Agric. 2022, 201, 107275.
  41. Rahman, M.M.; Robson, A.; Bristow, M. Exploring the potential of high resolution WorldView-3 imagery for estimating yield of mango. Remote Sens. 2018, 10, 1866.
  42. Koirala, A.; Walsh, K.B.; Wang, Z.L.; McCarthy, C. Deep learning—Method overview and review of use for fruit detection and yield estimation. Comput. Electron. Agric. 2019, 162, 219–234.
  43. James, K.M.F.; Sargent, D.J.; Whitehouse, A.; Cielniak, G. High-throughput phenotyping for breeding targets—Current status and future directions of strawberry trait automation. Plants People Planet 2022, 4, 432–443.
  44. Tripathi, A.; Tiwari, R.K.; Tiwari, S.P. A deep learning multi-layer perceptron and remote sensing approach for soil health based crop yield estimation. Int. J. Appl. Earth Obs. Geoinf. 2022, 113, 102959.
  45. Zhuo, W.; Huang, J.; Xiao, X.; Huang, H.; Bajgain, R.; Wu, X.; Gao, X.; Wang, J.; Li, X.; Wagle, P. Assimilating remote sensing-based VPM GPP into the WOFOST model for improving regional winter wheat yield estimation. Eur. J. Agron. 2022, 139, 126556.
  46. Bezerra, J.J.L.; Feitosa, B.F.; Souto, P.C.; Pinheiro, A.A.V. Cyperus esculentus L. (Cyperaceae): Agronomic aspects, food applications, ethnomedicinal uses, biological activities, phytochemistry and toxicity. Biocatal. Agric. Biotechnol. 2023, 47, 102606.
  47. Henry, G.M.; Elmore, M.T.; Gannon, T.W. Chapter 8—Cyperus esculentus and Cyperus rotundus. In Biology and Management of Problematic Crop Weed Species; Chauhan, B.S., Ed.; Academic Press: Cambridge, MA, USA, 2021; pp. 151–172.
  48. Leukel, J.; Zimpel, T.; Stumpe, C. Machine learning technology for early prediction of grain yield at the field scale: A systematic review. Comput. Electron. Agric. 2023, 207, 107721.
  49. Zou, X.; Mõttus, M. Sensitivity of common vegetation indices to the canopy structure of field crops. Remote Sens. 2017, 9, 994.
  50. Gitelson, A.A.; Kaufman, Y.J.; Merzlyak, M.N. Use of a green channel in remote sensing of global vegetation from EOS-MODIS. Remote Sens. Environ. 1996, 58, 289–298.
  51. Zeng, Y.L.; Hao, D.L.; Huete, A.; Dechant, B.; Berry, J.; Chen, J.M.; Joiner, J.; Frankenberg, C.; Bond-Lamberty, B.; Ryu, Y.; et al. Optical vegetation indices for monitoring terrestrial ecosystems globally. Nat. Rev. Earth Environ. 2022, 3, 477–493.
  52. Liu, F.; Hu, P.; Zheng, B.; Duan, T.; Zhu, B.; Guo, Y. A field-based high-throughput method for acquiring canopy architecture using unmanned aerial vehicle images. Agric. For. Meteorol. 2021, 296, 108231.
  53. Li, Y.; Zeng, H.; Zhang, M.; Wu, B.; Zhao, Y.; Yao, X.; Cheng, T.; Qin, X.; Wu, F. A county-level soybean yield prediction framework coupled with XGBoost and multidimensional feature engineering. Int. J. Appl. Earth Obs. Geoinf. 2023, 118, 103269.
  54. Bellis, E.S.; Hashem, A.A.; Causey, J.L.; Runkle, B.R.K.; Moreno-García, B.; Burns, B.W.; Green, V.S.; Burcham, T.N.; Reba, M.L.; Huang, X. Detecting intra-field variation in rice yield with unmanned aerial vehicle imagery and deep learning. Front. Plant Sci. 2022, 13, 716506.
  55. Das Choudhury, S.; Samal, A.; Awada, T. Leveraging image analysis for high-throughput plant phenotyping. Front. Plant Sci. 2019, 10, 508.
  56. Moghimi, A.; Yang, C.; Anderson, J.A. Aerial hyperspectral imagery and deep neural networks for high-throughput yield phenotyping in wheat. Comput. Electron. Agric. 2020, 172, 105299.
  57. Han, J.; Shi, L.; Yang, Q.; Chen, Z.; Yu, J.; Zha, Y. Rice yield estimation using a CNN-based image-driven data assimilation framework. Field Crop. Res. 2022, 288, 108693.
Figure 1. Location of the study area in relation to sandy land areas of Tarim Basin.
Figure 2. (a) Tiger nut tubers and (b) harvest operation at the end of the season. (c) Summary of the statistical yield data.
Figure 3. A workflow diagram of data processing modeling.
Figure 4. Spatial and temporal variations in the RGB, AGB, BGB (kg/ha), MC_A and MC_B of tiger nuts and in the derived VIs (NDVI, NDRE, GNDVI, LCI, and OSAVI).
Figure 5. Pearson correlation analysis results obtained between UAV image-derived features and the phenological traits of tiger nuts during two growth stages.
Figure 6. (a) PCA biplot, (b) scree plot of the percentage of variance explained by each component, and (c) total variance explained by all selected components.
Figure 7. Plot-based comparisons of the sampled and predicted yields of the test dataset; the SMLR models are shown in the upper panels, and the DL models are shown in the lower panels.
Figure 8. Yield prediction maps created with SMLR and the CNN, as well as the actual yield map.
Table 1. Spectra and vegetation indices used for the phenotypic trait inversion.
| Spectrum / Vegetation Index | Range / Expression |
| --- | --- |
| B (Blue) | 450 ± 16 nm |
| G (Green) | 560 ± 16 nm |
| R (Red) | 650 ± 16 nm |
| RE (Red Edge) | 730 ± 16 nm |
| NIR (Near Infrared) | 840 ± 26 nm |
| NDVI (Normalized Difference Vegetation Index) | NDVI = (NIR − Red)/(NIR + Red) |
| GNDVI (Green Normalized Difference Vegetation Index) | GNDVI = (NIR − Green)/(NIR + Green) |
| OSAVI (Optimized Soil-Adjusted Vegetation Index) | OSAVI = (NIR − Red)/(NIR + Red + 0.16) |
| NDRE (Normalized Difference Red Edge) | NDRE = (NIR − RedEdge)/(NIR + RedEdge) |
| LCI (Leaf Chlorophyll Index) | LCI = (NIR − RedEdge)/(NIR + Red) |
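The index expressions in Table 1 can be applied per pixel to the calibrated band reflectances. A sketch (the function and the reflectance values in the usage line are illustrative; inputs are assumed to be surface reflectances in [0, 1]):

```python
import numpy as np

def vegetation_indices(nir, red, green, red_edge):
    """Compute the five VIs of Table 1 from per-pixel band reflectances.
    Inputs may be scalars or numpy arrays of the same shape."""
    return {
        "NDVI":  (nir - red) / (nir + red),
        "GNDVI": (nir - green) / (nir + green),
        "OSAVI": (nir - red) / (nir + red + 0.16),
        "NDRE":  (nir - red_edge) / (nir + red_edge),
        "LCI":   (nir - red_edge) / (nir + red),
    }

# Usage with made-up reflectances for one pixel:
vis = vegetation_indices(nir=0.5, red=0.1, green=0.2, red_edge=0.3)
```

Because the expressions are elementwise, the same function maps whole orthomosaic bands to VI rasters when given numpy arrays.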
Table 2. Basic statistics of the plotwise information derived from UAV images and field sampling. (The table includes VIs derived from RGB and multispectral images, with a range of 0–1, as well as the biophysical phenotypic traits AGB and BGB (kg/ha) and MC_A and MC_B (%). ES and LS denote the early-season and late-season flight surveys, respectively.)

| Group | Index | Min (ES) | Min (LS) | Max (ES) | Max (LS) | Mean (ES) | Mean (LS) | SD (ES) | SD (LS) | CV (ES) | CV (LS) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| RGB VIs | VDVI | 0.005 | 0.036 | 0.018 | 0.070 | 0.009 | 0.055 | 0.002 | 0.005 | 22.30% | 9.87% |
| VIs | NDVI | 0.071 | 0.494 | 0.162 | 0.627 | 0.109 | 0.582 | 0.020 | 0.027 | 18.13% | 4.69% |
| | NDRE | 0.019 | 0.150 | 0.049 | 0.194 | 0.032 | 0.176 | 0.006 | 0.008 | 19.46% | 4.83% |
| | LCI | 0.022 | 0.200 | 0.055 | 0.264 | 0.036 | 0.237 | 0.007 | 0.013 | 20.02% | 5.43% |
| | GNDVI | 0.134 | 0.497 | 0.217 | 0.598 | 0.173 | 0.562 | 0.017 | 0.020 | 10.07% | 3.48% |
| | OSAVI | 0.052 | 0.348 | 0.118 | 0.435 | 0.080 | 0.402 | 0.014 | 0.020 | 17.84% | 5.02% |
| PT | AGB | 115.200 | 2080 | 1464 | 14,400 | 656.556 | 7684.733 | 316.784 | 2344.706 | 48.25% | 30.51% |
| | BGB | 49.610 | 3824 | 1472 | 26,176 | 486.28 | 11,339.653 | 291.556 | 4398.502 | 59.96% | 38.79% |
| | MC_A | 0.603 | 0.469 | 0.822 | 0.74806 | 0.766 | 0.607 | 0.037 | 0.056 | 4.79% | 9.21% |
| | MC_B | 0.642 | 0.440 | 0.804 | 0.64529 | 0.734 | 0.540 | 0.038 | 0.033 | 5.18% | 6.17% |
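The summary statistics of Table 2 follow directly from the plotwise samples; the coefficient of variation (CV) is the standard deviation expressed as a percentage of the mean. A sketch (the sample standard deviation, ddof=1, is an assumption, and the usage values are made up):

```python
import numpy as np

def plot_stats(values):
    """Min/Max/Mean/SD/CV summary as reported in Table 2.
    SD is assumed to be the sample standard deviation (ddof=1);
    CV = 100 * SD / mean (percent)."""
    v = np.asarray(values, dtype=float)
    sd = v.std(ddof=1)
    return {"min": v.min(), "max": v.max(), "mean": v.mean(),
            "sd": sd, "cv": 100.0 * sd / v.mean()}

# Usage with made-up plotwise AGB values (kg/ha):
stats = plot_stats([400.0, 650.0, 900.0])
```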
Table 3. Accuracy comparison among the yield estimation models.
| Algorithm | No. | Variables | Model | Validation Set R2 | Test Set R2 | RMSE (kg/ha) | nRMSE |
| --- | --- | --- | --- | --- | --- | --- | --- |
| SMLR | 1 | HTP | Y = 3.295 × E_AGB + 2997.611 | 0.679 | 0.701 | 821.342 | 0.154 |
| SMLR | 2 | VIs | Y = 43,369.522 × E_GDVI − 2212.726 | 0.375 | 0.313 | 1209.694 | 0.226 |
| SMLR | 3 | HTP & VIs | Y = 2.696 × E_AGB + 19,600.812 × E_GNDVI + 3866.837 × E-L_GNDVI − 2767.840 | 0.761 | 0.775 | 688.356 | 0.129 |
| DL | 4 | VIs | Squeeze Net | 0.627 | 0.587 | 1047.692 | 0.200 |
| DL | 5 | Original HTP + VIs | Squeeze Net | 0.738 | 0.744 | 780.520 | 0.146 |
| DL | 6 | Inverted HTP + VIs | Squeeze Net | 0.796 | 0.780 | 716.625 | 0.134 |
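The fitted SMLR equation of model 3 in Table 3 can be applied directly to plot-level predictors; a sketch (the input values in the usage line are made up for illustration, with E_AGB in kg/ha and the GNDVI terms dimensionless):

```python
def smlr_model3(e_agb, e_gndvi, e_l_gndvi):
    """Yield estimate (kg/ha) from the fitted SMLR model 3 of Table 3:
    Y = 2.696*E_AGB + 19,600.812*E_GNDVI + 3866.837*E-L_GNDVI - 2767.840."""
    return (2.696 * e_agb
            + 19600.812 * e_gndvi
            + 3866.837 * e_l_gndvi
            - 2767.840)

# Usage with hypothetical plot values:
y_hat = smlr_model3(e_agb=1000.0, e_gndvi=0.5, e_l_gndvi=0.5)
```

Being a closed-form linear model, this equation is trivially interpretable, which is the main advantage SMLR retains over the DL models discussed above.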
Table 4. Model performance for comparable studies using DL algorithms for crop yield prediction.
| Reference | Crop | Model | Performance |
| --- | --- | --- | --- |
| This work | Tiger nuts | CNN (Squeeze Net) | R2 = 0.78; RMSE = 716.625 kg/ha; nRMSE = 13.4% |
| Nevavuori et al. (2019) [32] | Wheat and barley | CNN | MAE = 484 kg/ha; MAPE = 8.8% |
| Maitiniyazi et al. (2020) [33] | Soybean | Deep Neural Network (DNN) | R2 = 0.72; RMSE = 478.9 kg/ha; nRMSE = 15.9% |
| Bellis et al. (2022) [54] | Rice | 2D-CNN | R2 = 0.22; RMSE = 720 kg/ha; nRMSE = 7.9% |
| Han et al. (2022) [57] | Rice | CNN-ResNet | R2 = 0.646; RMSE = 679 kg/ha |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Li, D.; Wu, X. Individualized Indicators and Estimation Methods for Tiger Nut (Cyperus esculentus L.) Tubers Yield Using Light Multispectral UAV and Lightweight CNN Structure. Drones 2023, 7, 432. https://doi.org/10.3390/drones7070432

