Article

Combining UAV-Based Vegetation Indices and Image Classification to Estimate Flower Number in Oilseed Rape

Liang Wan, Yijian Li, Haiyan Cen, Jiangpeng Zhu, Wenxin Yin, Weikang Wu, Hongyan Zhu, Dawei Sun, Weijun Zhou and Yong He
1 College of Biosystems Engineering and Food Science, Zhejiang University, Hangzhou 310058, China
2 Key Laboratory of Spectroscopy Sensing, Ministry of Agriculture and Rural Affairs, Hangzhou 310058, China
3 College of Agriculture and Biotechnology, Zhejiang University, Hangzhou 310058, China
* Author to whom correspondence should be addressed.
Remote Sens. 2018, 10(9), 1484; https://doi.org/10.3390/rs10091484
Submission received: 14 August 2018 / Revised: 8 September 2018 / Accepted: 14 September 2018 / Published: 17 September 2018

Abstract
Remote estimation of flower number in oilseed rape under different nitrogen (N) treatments is important in precision agriculture and field remote sensing, as it can help predict the yield of oilseed rape. In this study, an unmanned aerial vehicle (UAV) equipped with Red Green Blue (RGB) and multispectral cameras was used to acquire a series of field images at the flowering stage, and the flower number was manually counted as a reference. Images of the rape field were first classified using the K-means method in Commission Internationale de l'Éclairage (CIE) L*a*b* space, and the classified flower coverage area (FCA) was highly correlated with the flower number (r2 = 0.89). The relationships between flower number and ten commonly used vegetation indices (VIs) extracted from UAV-based RGB and multispectral images were then investigated; the Normalized Green Red Difference Index (NGRDI), Red Green Ratio Index (RGRI) and Modified Green Red Vegetation Index (MGRVI) exhibited the highest correlations with flower number, with an absolute correlation coefficient (r) of 0.91. A random forest (RF) model developed with all UAV variables predicted flower number well (r2 = 0.93 and RMSEP = 16.18), while an optimal subset regression (OSR) model proposed to simplify the RF model achieved a better result (r2 = 0.95 and RMSEP = 14.13) with the variable combination of RGRI, normalized difference spectral index (NDSI (944, 758)) and FCA. Our findings suggest that combining VIs and image classification from UAV-based RGB and multispectral images has the potential to estimate flower number in oilseed rape.


1. Introduction

Oilseed rape, which belongs to the Brassicaceae family, is one of the most important oil crops. It is grown around the world, with leading producers including the European Union, Canada, China, India and Australia [1,2]. The yield of oilseed rape largely depends on the number of flowers at the peak-flowering stage that can turn into pods, and is also affected by seed abortion [3,4]. From a breeding perspective, researchers are interested in breeding varieties not only with improved yield and health, but also with uniform flowering and ripening times [5,6]. Therefore, it is essential to measure flower number in oilseed rape under different nitrogen (N) treatments. Traditionally, flower number is assessed by manual counting in the field, which is time-consuming and labor-intensive for large-scale measurements. It is thus urgent to develop a fast, non-destructive, and reliable technique that can accurately count the flower number of oilseed rape in the field.
Remote sensing has become a popular technique for acquiring crop information due to its ability to collect multi-temporal images of crop growth in the field [7]. Three platforms are commonly used: satellite-based, ground-based and UAV-based. Ground-based platforms collect crop growth data with higher spatial resolution and accuracy, but they are limited to small plots [8]. In addition, ground platforms can damage plants, especially oilseed rape at the flowering stage. With satellite platforms, various studies have estimated crop yield [9,10], chlorophyll and N contents [11,12], leaf area index (LAI) [13] and vegetation fraction [14,15]. However, satellite platforms are limited by their spatial resolution, especially for applications that require detailed canopy structural information. Although recent satellites such as Landsat, SPOT5, and QuickBird have gradually improved image resolution to 30 m, 10 m, and 3 m, it remains difficult and expensive to frequently acquire growth information for small plots due to long revisit cycles and cloud cover [16]. Given these restrictions, a more operable, higher-resolution remote sensing platform is needed for crop growth monitoring.
The recent increase in the availability of unmanned aerial vehicles (UAVs) has eased the bottlenecks of satellite and ground-based platforms. UAVs can fly frequently, where and when needed, allowing observation of fine-scale spatial patterns and collection of multi-temporal images for crop monitoring [17]. Their low cost and high flexibility make them popular for field studies [18], and several studies have estimated crop growth parameters using UAV platforms carrying various imaging sensors. Yu et al. [19] utilized a UAV platform equipped with Red Green Blue (RGB) and near-infrared (NIR) sensors to improve soybean yield estimation and predict plant maturity with a correlation coefficient (r) of 0.82. A thermal sensor has also been used on a UAV platform to map plant water stress and its spatial variability, showing that the adaptive crop water stress index (CWSI) was correlated with both stem water potential and stomatal conductance (r2 of 0.72 and 0.82, respectively) [20]. Duan et al. [21] utilized a UAV-based hyperspectral sensor to estimate LAI for three crops with a root mean square error (RMSE) of 0.62 m2 m−2.
This brief review shows that various UAV applications have been developed to acquire growth information of field crops. In general, two main methods are used to estimate crop growth traits. A well-established method is image classification, which has been applied to obtain growth status such as plant density of wheat [22], vegetation coverage of weeds [18] and lodging identification of rice [23], and commonly relies on high-resolution RGB images. The other is to calculate vegetation indices (VIs) from UAV-based RGB and multispectral images to estimate growth status such as wheat yield [24], biomass of maize and barley [7,25] and maize height [7]. However, few studies have combined VIs and image classification to estimate crop growth status at the field scale. Maimaitijiang et al. [26] proposed fusing VIs and classified vegetation coverage to estimate dry biomass, which outperformed using a single multispectral or thermal camera. More recently, Liu et al. [27] demonstrated that combining spectral and texture features significantly increased rice lodging recognition accuracy. It is thus promising to fuse VIs and image classification to assess crop growth and improve estimation accuracy.
Furthermore, little attention has been devoted to estimating flower number in oilseed rape using a UAV dual-camera platform. Sulik and Long [28] found that a band ratio of green and blue light derived from UAV-based multispectral aerial images was strongly related to the number of yellow flowers (r2 = 0.87). Recently, Fang et al. [1] explored the potential of canopy reflectance and VIs extracted from multispectral images for remotely estimating flower coverage in oilseed rape, with RMSE lower than 6%. Moving from a single camera to a combination of RGB and multispectral cameras could acquire more growth information, which could help estimate yellow flower number in the rape field.
Therefore, this research aimed to explore combining UAV-based VIs and image classification for evaluating flower number of oilseed rape. The specific objectives were to: (1) compare image classification results for flower coverage area (FCA) using different methods; (2) analyze the relationships between VIs and flower number; and (3) establish models to estimate yellow flower number and compare the estimation performance of individual UAV variables using variable importance estimates.

2. Materials and Methods

2.1. Field Experimental Design

The data used in this study were obtained from two field experiments over two years involving different N treatments and cultivars, as described below.
Experiment 1 was conducted at the Agricultural Research Station (30°18′26′′N, 120°4′29′′E) of Zhejiang University in Hangzhou, China during the oilseed rape growing season in 2016–2017. The mean elevation is 6.4 m above sea level, and the mean annual temperature is 16 °C, with the coldest temperatures in January and the hottest in July. The test field included 43 lines, each 24.4 m × 1.4 m with 0.3 m spacing between subplots (Figure 1a). After irregular planting areas were laid out, 109 sampling plots remained. Four N fertilizer treatments, N0 to N3 (0, 75, 150 and 225 kg/ha), were applied among the plots, and all subplots received the same amounts of phosphorus (P, 60 kg/ha) and potassium (K, 150 kg/ha). N fertilizer was applied twice, 60% in mid-December and 40% in mid-February, while phosphate and potash fertilizers were applied as a one-time base fertilizer. The cultivar was ZD630 for most subplots; three other cultivars (GY605, ZS758 and ZD622) were allocated to zones receiving N1.
Experiment 2 was conducted at the Grain-production Functional Area of Anhua Town, Zhuji City, Zhejiang Province, China (29°31′5.35′′N, 120°6′6.12′′E), as shown in Figure 1b. The cultivar was ZD630, grown under different N, P and K treatments. The field included 100 subplots of 8.5 m × 4.5 m each, with 1 m spacing between neighboring subplots. Subplots received five levels of N fertilizer (0, 75, 150, 225 and 300 kg N/ha), applied as urea at rates of 50%, 20% and 30% in early November 2017, mid-December 2017, and early March 2018, respectively. In addition, three levels of P fertilizer (30, 60 and 90 kg/ha) and three levels of K fertilizer (75, 150 and 225 kg/ha) were applied before planting.

2.2. Data Collection

UAV remote sensing images were acquired by an octorotor UAV equipped with an RGB camera (NEX-7, Sony, Tokyo, Japan) with a resolution of 6000 × 4000 pixels and a 25-band multispectral camera (CMV2K; IMEC, Inc., Leuven, Belgium) with a resolution of 409 × 216 pixels covering the spectral region of 600–1000 nm. Flight campaigns were conducted from 2:00 p.m. to 4:00 p.m. on 21 March, 29 March and 12 April 2017 and on 28 March 2018, at a flight altitude of 25 m and a flight speed of 2.5 m/s. The weather was sunny with little wind, so weather-induced image distortion was negligible. To avoid abnormal images, the camera exposure time was adjusted according to the brightness measured with an illuminometer (MQ-200, Apogee Instruments, Logan, UT, USA). To achieve good image stitching, the forward and side overlaps were 75% and 60%, respectively. After image acquisition, the number of yellow flowers was manually counted plot by plot, excluding overlapped and occluded flowers. Finally, flower numbers were recorded for 109 subplots in 2017 and 100 subplots in 2018, giving a total dataset of 209.

2.3. Image Classification

Image classification is a critical step in remote sensing because remotely sensed images contain varied background information. The main method in this study was the unsupervised K-means classification, which involved a series of processing steps as shown in Figure 2. Image classification was implemented in Matlab 2011a (The MathWorks, Inc., Natick, MA, USA).

2.3.1. Image Preprocessing and Color Space Conversion

Image mosaicking was first conducted using Agisoft PhotoScan Professional software (Agisoft LLC, St. Petersburg, Russia). Geometric correction was then performed to remove image distortion using affine transformation and nearest-neighbor interpolation functions in Matlab.
After acquiring an image of each subplot, the key step was to convert the RGB space to the International Commission on Illumination (Commission Internationale de l'Éclairage, CIE) L*a*b* space. This color space was developed by the CIE based on human color perception, and it can be used to classify images captured by different devices without the negative effects of differing color representations [29]. In particular, converting from RGB to L*a*b* reduces the influence of unsuitable luminance information such as excessive brightness. In the CIE L*a*b* space, the L* component represents pixel brightness from pure black to pure white, the a* component spans red to green, and the b* component spans yellow to blue [30,31]. The RGB space can be converted to the L*a*b* space using the following equations:
$$\begin{bmatrix} X \\ Y \\ Z \end{bmatrix} = \begin{bmatrix} 0.412453 & 0.357580 & 0.180423 \\ 0.212671 & 0.715160 & 0.072169 \\ 0.019334 & 0.119193 & 0.950227 \end{bmatrix} \begin{bmatrix} R/255 \\ G/255 \\ B/255 \end{bmatrix},$$

$$L = \begin{cases} 116\,Y^{1/3} - 16 & \text{for } Y > 0.008856 \\ 903.3\,Y & \text{for } Y \le 0.008856 \end{cases},$$

$$a = 500\left(f(X) - f(Y)\right),$$

$$b = 200\left(f(Y) - f(Z)\right),$$

$$f(t) = \begin{cases} t^{1/3}, & t > 0.008856 \\ 7.787\,t + 16/116, & t \le 0.008856 \end{cases},$$

where L, a, and b denote the L*, a* and b* channels of the CIE L*a*b* space; X, Y, and Z denote the channels of the CIE XYZ space; and R, G and B denote the red, green and blue channels of the original RGB image. The argument t stands for X, Y, or Z.
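As a concrete illustration, the conversion above can be sketched in a few lines of NumPy. This is a minimal, hedged implementation of the equations as written (the 0.008856 threshold and matrix coefficients are taken directly from the text), not the authors' Matlab code; function names are illustrative.

```python
import numpy as np

# RGB-to-XYZ matrix, as given in the text.
M = np.array([[0.412453, 0.357580, 0.180423],
              [0.212671, 0.715160, 0.072169],
              [0.019334, 0.119193, 0.950227]])

def f(t):
    """Piecewise nonlinearity of the CIE L*a*b* transform."""
    return np.where(t > 0.008856, np.cbrt(t), 7.787 * t + 16.0 / 116.0)

def rgb_to_lab(img):
    """Convert an 8-bit RGB image (H x W x 3) to CIE L*a*b*."""
    rgb = img.astype(np.float64) / 255.0   # scale channels to [0, 1]
    xyz = rgb @ M.T                        # per-pixel linear transform
    X, Y, Z = xyz[..., 0], xyz[..., 1], xyz[..., 2]
    L = np.where(Y > 0.008856, 116.0 * np.cbrt(Y) - 16.0, 903.3 * Y)
    a = 500.0 * (f(X) - f(Y))
    b = 200.0 * (f(Y) - f(Z))
    return np.stack([L, a, b], axis=-1)
```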

2.3.2. K-Means Clustering and FCA Calculation

Image data in the L*a*b* space was then used to build a classifier with the K-means clustering method. The number of clusters K was determined from the object classes in the rape field, and the procedure included the following steps [32]: (1) choose K initial cluster centers (centroids); (2) compute the distance from every observation to each centroid using the Euclidean distance (ED); (3) assign each observation to the cluster with the closest centroid; (4) average the observations in each cluster to obtain K new centroid locations based on the sum of squared errors (SSE); and (5) repeat steps 2–4 until cluster assignments no longer change or the maximum number of iterations is reached. Based on visual observation, the images mainly contained flowers, leaves, soil and shadow, so K was set to 4, and the K-means output was a pseudo-color image with four labels (1, 2, 3 and 4). At the flowering stage of oilseed rape, flower pixels occupied most of the image, so the label with the largest pixel count was taken as the flower class. Finally, all pixels labeled as flowers in each subplot were counted as the FCA. The ED and SSE were calculated as:
$$\mathrm{ED} = \sqrt{\sum_{i=1}^{n} \left( C - x_i \right)^2},$$

$$\mathrm{SSE} = \sum_{k=1}^{K} \mathrm{ED}_k,$$

where C is the cluster center and $x_i$ is a data point in the cluster of C; K and n denote the number of cluster centers and the number of data points in the cluster of C, respectively.
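A minimal sketch of this clustering step is given below, assuming scikit-learn's KMeans and the rgb_to_lab() helper sketched earlier. Clustering on the a* and b* channels only (to suppress luminance) and taking the largest cluster as flowers are illustrative choices consistent with the assumption stated in the text.

```python
import numpy as np
from sklearn.cluster import KMeans

def flower_coverage_area(img_rgb, k=4):
    """Cluster pixels into k classes and return the flower pixel count (FCA)."""
    lab = rgb_to_lab(img_rgb)                 # conversion sketched above
    ab = lab[..., 1:3].reshape(-1, 2)         # cluster on a*, b* chromaticity
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(ab)
    counts = np.bincount(labels, minlength=k)
    # At peak bloom the text assumes flowers dominate the image, so the
    # largest cluster is taken as the flower class.
    flower_label = int(counts.argmax())
    return int(counts[flower_label])
```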

2.3.3. Accuracy Estimation

To our knowledge, different classification methods can lead to substantially different results, so it was crucial to compare the K-means result with other classification methods. In this study, six other methods, RGB-based thresholding, RGB-based back propagation neural network (BPNN), RGB-based support vector machine (SVM), RGB-based K-means, HSI-based K-means, and HSV-based K-means, were used to classify yellow flowers, further verifying the performance of K-means clustering in CIE L*a*b* space. The classified FCA was then correlated with the flower number and evaluated with the coefficient of determination (r2).

2.4. Vegetation Indices Calculation

The DN values of the images were first extracted within a maximum rectangle around each sampling subplot and then converted into reflectance values for VI calculation using an empirical regression equation. Reflectance correction used five targets with known reflectance of 5%, 15%, 31%, 40% and 46%, measured by a ground-based spectrometer (QE65000, Ocean Optics, Dunedin, FL, USA). Many VIs have been employed to estimate crop growth status; ten commonly used VIs were chosen in this study and calculated from UAV-based RGB and multispectral images using the equations in Table 1. Unlike the VIs extracted from RGB images, the simple ratio index (SRI) and normalized difference spectral index (NDSI) extracted from multispectral images require determining the optimal wavelength combination within 600–1000 nm using the contour maps shown in Figure 3.
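The band-pair optimization for NDSI can be sketched as an exhaustive grid search over r2 values, mirroring the contour maps of Figure 3. This is an illustrative sketch rather than the authors' code; refl (plots × bands) and flowers (plots,) are assumed inputs.

```python
import numpy as np

def ndsi_r2_map(refl, flowers):
    """r^2 between flower number and NDSI(l1, l2) for every band pair."""
    n_bands = refl.shape[1]
    r2 = np.full((n_bands, n_bands), np.nan)
    for i in range(n_bands):
        for j in range(n_bands):
            if i == j:
                continue
            ndsi = (refl[:, i] - refl[:, j]) / (refl[:, i] + refl[:, j])
            r2[i, j] = np.corrcoef(ndsi, flowers)[0, 1] ** 2
    return r2

# The best (l1, l2) pair is then, e.g.:
# best = np.unravel_index(np.nanargmax(r2_map), r2_map.shape)
```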
Visible-band Difference Vegetation Index (VDVI) is designed to extract green vegetation. Visible Atmospherically Resistant Index (VARI) and Normalized Green-Red Difference Index (NGRDI) are usually used to estimate vegetation fraction (VF); VARI was found to be less sensitive to atmospheric effects, allowing good estimation of VF [34]. NGRDI and Modified Green Red Vegetation Index (MGRVI) are considered phenology indicators and have potential for biomass estimation. Red-Green Ratio Index (RGRI) is useful for analyzing the angular sensitivity of vegetation indices and can deal with complex canopy structures. Excess Green Index (ExG), Color Index of Vegetation (CIVE) and Vegetative Index (VEG) are designed to identify green vegetation; they are sensitive to canopy color while robust to shaded and sunlit conditions [36,37,38]. SRI and NDSI are mainly related to crop physiological traits. Although previous studies have reported the capabilities of these VIs for different applications, selecting the optimal VIs remains challenging due to differing canopy structures and variable illumination conditions during UAV campaigns. Therefore, it is worthwhile to investigate the potential of these commonly used VIs for estimating the flower number of oilseed rape.
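For reference, the RGB indices of Table 1 can be computed directly from per-plot mean channel values. The helper below is a hedged sketch using the formulas listed in the table; R, G and B are assumed to be mean DN or reflectance values.

```python
import numpy as np

def rgb_indices(R, G, B):
    """Compute the Table 1 RGB indices from mean DN or reflectance values."""
    R, G, B = (np.asarray(c, dtype=float) for c in (R, G, B))
    a = 0.667  # exponent used by VEG
    return {
        "VDVI":  (2 * G - R - B) / (2 * G + R + B),
        "VARI":  (G - R) / (G + R - B),
        "NGRDI": (G - R) / (G + R),
        "RGRI":  R / G,
        "MGRVI": (G**2 - R**2) / (G**2 + R**2),
        "ExG":   2 * G - R - B,
        "CIVE":  0.441 * R - 0.811 * G + 0.385 * B + 18.787,
        "VEG":   G / (R**a * B**(1 - a)),
    }
```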

2.5. Model Selection and Validation

Before developing prediction models, correlation analysis between VIs and flower number was performed to pre-check the relationships among the variables. A random forest (RF) model, which can handle nonlinear relationships, resist overfitting and cope with high-dimensional datasets, was then developed to estimate flower number [41,42]. It contained a set of regression trees (500 in this study), each constructed from randomly selected samples via bootstrapping. The remaining (out-of-bag) data were used to estimate variable importance based on the out-of-bag error using the following equation:
$$\mathrm{importance}(x) = \frac{\sum_{i=1}^{n} \left( \mathrm{OOB}_{\mathrm{error2}} - \mathrm{OOB}_{\mathrm{error1}} \right)}{n},$$

where $\mathrm{OOB}_{\mathrm{error1}}$ and $\mathrm{OOB}_{\mathrm{error2}}$ denote the out-of-bag errors of a single regression tree before and after adding noise to variable x, respectively, and n is the number of regression trees. During training, the RF model randomly selected subsets of variables to construct each regression tree and calculated the out-of-bag error for each variable. Finally, all regression trees were merged to reduce the prediction error, with variable importance ranked by the out-of-bag error.
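A minimal sketch of this RF setup follows, using scikit-learn in place of the authors' Matlab implementation: 500 bootstrapped trees with out-of-bag scoring, and importance estimated by permuting (adding noise to) each variable, which parallels the OOB-error difference defined above. The arrays X and y are placeholders for the UAV variables and flower counts.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance

# Placeholder data: 209 plots, 11 UAV variables (10 VIs + FCA).
rng = np.random.default_rng(0)
X = rng.random((209, 11))
y = rng.random(209) * 300  # illustrative flower counts

# 500 bootstrapped regression trees with out-of-bag scoring, as in the text.
rf = RandomForestRegressor(n_estimators=500, oob_score=True, random_state=0)
rf.fit(X, y)
print("OOB r^2:", rf.oob_score_)

# Permuting each variable and measuring the error increase parallels the
# OOB-error difference in the importance equation above.
result = permutation_importance(rf, X, y, n_repeats=10, random_state=0)
print("importance:", result.importances_mean)
```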
In addition, a commonly used stepwise linear regression (SWL) model was applied to examine the linear relationships between variables, and optimal subset regression (OSR) was used to select the variables for the SWL model. Relative to RF, the OSR model fully explores the explanatory power of different combinations of UAV variables and can rank all possible models by r2 and the Bayesian information criterion (BIC) [43]. The BIC was calculated as:
$$\mathrm{BIC} = -2\ln(L) + k\ln(n),$$
where L, k and n are the maximum likelihood of the model, the number of variables, and the number of samples, respectively. The model with the highest r2 and the lowest BIC was considered optimal. In this study, we used two classes of features: the FCA from the image classification results and the spectral VIs calculated from UAV images; their combination represents the fusion of FCA and spectral VI features. For model development, the dataset was divided into a training set (2/3) and a test set (1/3) using the Kennard–Stone (KS) algorithm. The r2 value and the root mean square error of prediction (RMSEP) were used to quantify model performance, calculated as:
$$r^2 = 1 - \frac{\sum_{i=1}^{n} \left( y_i - \hat{y}_i \right)^2}{\sum_{i=1}^{n} \left( y_i - \bar{y} \right)^2},$$

$$\mathrm{RMSEP} = \sqrt{\frac{\sum_{i=1}^{n} \left( y_i - \hat{y}_i \right)^2}{n}},$$

where $y_i$, $\hat{y}_i$ and $\bar{y}$ denote the measured, predicted and mean measured flower number for sample i, and n is the number of samples. All data analysis was implemented in Matlab 2011a (The MathWorks, Inc., Natick, MA, USA).
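A hedged sketch of the OSR step is shown below: ordinary least squares is fitted on every variable subset, each subset is scored by BIC (written up to additive constants, assuming Gaussian residuals), and RMSEP evaluates held-out predictions. The function names are illustrative, not the authors' code.

```python
import itertools
import numpy as np
from sklearn.linear_model import LinearRegression

def bic(y, y_hat, n_params):
    """BIC up to additive constants, assuming Gaussian residuals."""
    n = len(y)
    rss = float(np.sum((y - y_hat) ** 2))
    return n * np.log(rss / n) + n_params * np.log(n)

def rmsep(y, y_hat):
    """Root mean square error of prediction."""
    return float(np.sqrt(np.mean((y - y_hat) ** 2)))

def best_subset(X, y, names):
    """Exhaustively search variable subsets and return the lowest-BIC one."""
    best_score, best_vars = np.inf, None
    for r in range(1, X.shape[1] + 1):
        for idx in itertools.combinations(range(X.shape[1]), r):
            cols = list(idx)
            model = LinearRegression().fit(X[:, cols], y)
            score = bic(y, model.predict(X[:, cols]), r + 1)  # +1: intercept
            if score < best_score:
                best_score, best_vars = score, [names[i] for i in cols]
    return best_vars, best_score
```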

3. Results

3.1. Image Classification

The image classification of flowers in the rape field was conducted by the K-means clustering algorithm in CIE L*a*b* space, and the classified FCA was then calculated. A high correlation between FCA and ground-counted flower number was achieved (r2 = 0.89), as presented in Figure 4, indicating that the classified FCA has a good linear relationship with the actual number of yellow flowers. The classified RGB images of rape fields on 28 March 2018 are shown in Figure 5, providing a straightforward visualization of the flower coverage in each subplot; variations across N levels were also observed.
For the six other classification methods, high correlations between the classified yellow-flower pixels and the measured flower number were observed (r2 = 0.70–0.82), indicating that FCA based on image classification has a good linear relationship with the measured flower number (Figure 6). The BPNN method achieved a better classification result than RGB-based thresholding. The SVM method, a widely used technique for image classification [1,44], also provided a good result (r2 = 0.72). Among these six methods, the highest correlations were obtained by K-means clustering in HSV and HSI space (r2 = 0.82), but their performance still did not exceed that of the K-means algorithm in CIE L*a*b* space.

3.2. Correlations for VIs and Flower Number

Correlation analysis (p < 0.05) showed that the absolute value of r varied between 0.61 and 0.91 (Figure 7), indicating that the choice of VI can lead to large differences in estimating flower number of oilseed rape. The optimal wavelength combination extracted from the multispectral images was (944, 758) nm for both NDSI and SRI. NGRDI, RGRI and MGRVI showed the highest correlations with flower number, followed by VARI, with |r| of 0.91, 0.91, 0.91 and 0.90, respectively. Unlike RGRI, NGRDI and MGRVI were negatively correlated with flower number. Comparing the two sensors, the best VIs derived from the multispectral images reached a relatively lower |r| of 0.85. In addition, high correlations among UAV variables such as NGRDI, RGRI and MGRVI suggested multicollinearity among these variables.

3.3. Model Development and Comparison

3.3.1. Model Development with Individual UAV Variables

To compare the estimation performance of individual UAV variables (VIs and FCA), RF models with individual variables were developed for estimating flower number, with results shown in Figure 8. Individual UAV variables achieved reasonable results, with r2 ranging from 0.65 to 0.88. Among all variables, FCA gave the best estimate of flower number (r2 = 0.88, RMSEP = 18.61). Comparing the two sensors, the VIs derived from RGB images obtained relatively good estimation results: VARI performed best (r2 = 0.88, RMSEP = 19.78), followed by RGRI and NGRDI.

3.3.2. Model Development and Comparison with All UAV Variables

To investigate the feasibility of fusing VIs and the image classification result (FCA) to estimate flower number, the performance of an RF model developed with the combination of all UAV variables was evaluated. Compared with the results shown in Figure 8, this model (Figure 9a) achieved better performance (r2 = 0.93, RMSEP = 16.18), indicating that fusing VIs and image classification can improve flower number estimation. The variable importance of the RF model is presented in Figure 9b: FCA had the highest importance, followed by RGRI and VARI, consistent with the individual-variable results in Figure 8. The VIs derived from the multispectral images were also valuable for improving model performance.
To further simplify the prediction model, the OSR model with forward and backward selection and a branch-and-bound algorithm was employed to select the optimal variable combination. As shown in Figure 10, subsets of UAV variables with different adjusted r2 and BIC values were obtained; the highest adjusted r2 and lowest BIC were 0.9 and −300, respectively. FCA and NDSI (944, 758) contributed most to the estimation model, followed by RGRI. The variable combinations with the highest r2 and lowest RMSEP were the group of VDVI, NGRDI, VEG, SRI (944, 758), NDSI (944, 758) and FCA, and the group of RGRI, NDSI (944, 758) and FCA. The model with fewer variables was chosen as the optimal model, and its estimation result is presented in Figure 10c. The OSR model with RGRI, NDSI (944, 758) and FCA outperformed the RF model (r2 = 0.95, RMSEP = 14.13), confirming that an OSR model with fewer variables achieves a comparable or better result; its RMSEP was 12.67% lower than that of the RF model.

4. Discussion

This study demonstrated the feasibility of using UAV-based RGB and multispectral image data to estimate flower number in oilseed rape grown in two different experimental fields. The potential of fusing VIs and image classification to improve the estimation of flower number was also confirmed.

4.1. Applicability of the Method

In agricultural remote sensing, UAVs have been widely employed to capture images for monitoring crop growth status using different data analysis methods, e.g., image classification and spectral VIs [18,22,45]. Although image classification can achieve reasonable estimation results, its accuracy is easily influenced by soil, weeds and other field backgrounds. Moreover, limited spatial resolution also hampers the extraction of detailed texture features, such as those needed for flower counting. Compared with image classification, spectral VIs are constructed from reflectance at different wavelengths and carry more information related to the soil background and crop growth status [46]. However, some NIR-based VIs saturate once leaf area index or biomass exceeds a certain value [47], which reduces assessment accuracy. In addition, the lower resolution of the multispectral images constrains the prediction of crop phenotypic features. Although some studies have shown that spectral VIs can estimate phenotypic features such as plant height, the reported relationships were statistically significant but not strong [7,25]. The applications of spectral VIs are also limited for cultivars with different canopy colors. For these reasons, studies have begun to combine phenotypic features and spectral VIs to evaluate various crop traits [26,27], indicating that such fusion can improve the estimation of growth status, which is confirmed by the results shown in Figure 9 and Figure 10.

4.2. Importance of Variable Rankings

Variable importance ranking is crucial for variable selection and model simplification. Figure 9b shows that FCA (0.89), RGRI (0.72), VARI (0.60) and NDSI (944, 758) (0.59) played dominant roles in estimating flower number in oilseed rape, indicating that the prediction was highly sensitive to FCA. It is thus worthwhile to employ an image classification method when measuring flower number. VIs such as RGRI, VARI and NDSI (944, 758) also yielded satisfactory results, consistent with Figure 8. To our knowledge, VIs calculated from RGB images mainly reflect changes in canopy greenness [48,49], while multispectral VIs are closely related to crop physiological characteristics [26,50]. Moreover, excessive VIs are prone to multicollinearity and overfitting, so it is necessary to select an optimal combination of VIs. In this study, two variable selection methods (RF and OSR) were introduced (Figure 9 and Figure 10) and gave similar results. Finally, RGRI, NDSI (944, 758) and FCA were determined as the optimal combination for evaluating flower number, reducing the estimation error (RMSEP) by 12.67%. This suggests that ranking variable importance can improve prediction accuracy and simplify the model.

4.3. The Implications and Limitations in This Study

A great advantage of this study is that it demonstrated the reliability of a commercial RGB camera carried on a UAV for estimating flower number in oilseed rape, which substantially reduces equipment cost compared with multispectral cameras [51]. Moreover, high-resolution RGB images give an intuitive view of crop growth dynamics in the field, as reported in previous studies [24,48,52]. Figure 11 clearly shows how the yellow flower number changed from the pre-flowering period to the full-flowering period (also called the early pod period), with variations across varieties and N levels. Based on the dynamic changes of flower number classified from UAV-based RGB images, the flowering period and the changes in flower coverage differed among cultivars and N treatments, so estimating flower number is beneficial for yield prediction. Overall, UAV-based RGB images are promising for field phenotyping research.
However, due to the limited wavebands of RGB images, few studies have tried to use a UAV-based RGB camera to estimate growth traits in oilseed rape, mainly because RGB images carry limited physiological information. In fact, a band ratio of green and blue light was strongly related to the number of yellow flowers per unit area [28], indicating that the floral contribution to reflectance is strongest in the green waveband. Moreover, yellow rape petal coloration is due to carotenoid absorption at ~450 nm [53], and reflectance at 550 nm was found best suited for flower coverage estimation, with r2 over 0.6 [1]. We can therefore conclude that UAV-based RGB images with visible wavebands are capable of assessing flower number in oilseed rape, consistent with the results shown in Figure 8. Compared with an RGB camera, a multispectral camera mounted on the UAV provides more diverse spectral characteristics and can perform better in estimating biochemical traits thanks to NIR spectral information [26]. However, the flower canopy shows stronger reflection and weaker absorption between 500 nm and 700 nm, with little impact on the red edge or NIR [54]. This is the key limitation on using multispectral VIs to estimate flower number, although they remain useful: Figure 8 shows that the VIs from multispectral images also performed well. The further combination of RGB and multispectral image data demonstrated that data fusion can improve flower number estimation (Figure 10c), and this approach could be extended to monitoring other crop growth traits in the field. More broadly, fusing data from multiple sensors is critical for UAV applications, as it significantly extends the range of information available from these systems.

5. Conclusions

We developed a UAV-based dual-camera platform to collect a series of high-resolution field images at the flowering stage of oilseed rape and compared flower number estimation models based on VIs and image classification. Classified FCA obtained with K-means clustering in CIE L*a*b* space was closely related to flower number (r2 = 0.89). The highest correlations with flower number achieved by the VIs from RGB and multispectral images were 0.91 and 0.85, respectively. This study also demonstrated that combining VIs and image classification from UAV-based RGB and multispectral images can improve the estimation of flower number. Future studies should evaluate this method on multi-year datasets, multiple experimental fields and multiple cultivars to improve the robustness and applicability of the predictive model. Combining UAV-based RGB and multispectral cameras will be a promising tool for estimating flower number and should provide new insights for high-throughput field phenotyping research.

Author Contributions

All authors made significant contributions to this manuscript. H.C., L.W. and W.Z. designed the experiment, L.W., Y.L., J.Z., W.Y., W.W. and H.Z. performed field data collection, L.W., Y.L., H.C. and J.Z. contributed to the data analysis, L.W., Y.L., H.C. and D.S. wrote the manuscript, and W.Z. and Y.H. provided suggestions on the experiment design and discussion sections.

Acknowledgments

This work was funded by the National Key R&D Program supported by the Ministry of Science and Technology of China (2016YFD0200600, 2016YFD0200603). We would like to thank Zhihong Ma, Xiaoran Li, Haixia Xu, Jinlin Jiang, and Qishuai Zheng, who participated in the experiments.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Fang, S.; Tang, W.; Peng, Y.; Gong, Y.; Dai, C.; Chai, R.; Liu, K. Remote estimation of vegetation fraction and flower fraction in oilseed rape with unmanned aerial vehicle data. Remote Sens. 2016, 8, 416.
2. Blackshaw, R.E.; Johnson, E.N.; Gan, Y.; May, W.E.; McAndrews, D.W.; Barthet, V.; McDonald, T.; Wispinski, D. Alternative oilseed crops for biodiesel feedstock on the Canadian prairies. Can. J. Plant Sci. 2011, 91, 889–896.
3. Faraji, A. Flower formation and pod/flower ratio in canola (Brassica napus L.) affected by assimilates supply around flowering. Int. J. Plant Prod. 2010, 4, 271–280.
4. Faraji, A.; Latifi, N.; Soltani, A.; Rad, A.H.S. Effect of high temperature stress and supplemental irrigation on flower and pod formation in two canola (Brassica napus L.) cultivars at Mediterranean climate. Asian J. Plant Sci. 2008, 7, 343–351.
5. Burton, W.A.; Flood, R.F.; Norton, R.M.; Field, B.; Potts, D.A.; Robertson, M.J.; Salisbury, P.A. Identification of variability in phenological responses in canola-quality Brassica juncea for utilisation in Australian breeding programs. Aust. J. Agric. Res. 2008, 59, 874–881.
6. Leflon, M.; Husken, A.; Njontie, C.; Kightley, S.; Pendergrast, D.; Pierre, J.; Renard, M.; Pinochet, X. Stability of the cleistogamous trait during the flowering period of oilseed rape. Plant Breed. 2010, 129, 13–18.
7. Li, W.; Niu, Z.; Chen, H.; Li, D.; Wu, M.; Zhao, W. Remote estimation of canopy height and aboveground biomass of maize using high-resolution stereo images from a low-cost unmanned aerial vehicle system. Ecol. Indic. 2016, 67, 637–648.
8. Atena, H.; Lorena, G.P.; Suchismita, M.; Daljit, S.; Dale, S.; Jessica, R.; Ivan, O.M.; Prakash, S.R.; Douglas, G.; Jesse, P. Application of unmanned aerial systems for high throughput phenotyping of large wheat breeding nurseries. Plant Methods 2016, 12, 1–15.
9. Ines, A.V.M.; Das, N.N.; Hansen, J.W.; Njoku, E.G. Assimilation of remotely sensed soil moisture and vegetation with a crop simulation model for maize yield prediction. Remote Sens. Environ. 2013, 138, 149–164.
10. Huang, J.; Sedano, F.; Huang, Y.; Ma, H.; Li, X.; Liang, S.; Tian, L.; Zhang, X.; Fan, J.; Wu, W. Assimilating a synthetic Kalman filter leaf area index series into the WOFOST model to improve regional winter wheat yield estimation. Agric. For. Meteorol. 2016, 216, 188–202.
11. Clevers, J.G.P.W.; Gitelson, A.A. Remote estimation of crop and grass chlorophyll and nitrogen content using red-edge bands on Sentinel-2 and -3. Int. J. Appl. Earth Obs. 2013, 23, 344–351.
12. Moharana, S.; Dutta, S. Spatial variability of chlorophyll and nitrogen content of rice from hyperspectral imagery. ISPRS J. Photogramm. 2016, 122, 17–29.
13. Zhu, Y.; Liu, K.; Liu, L.; Myint, S.; Wang, S.; Liu, H.; He, Z. Exploring the potential of WorldView-2 red-edge band-based vegetation indices for estimation of mangrove leaf area index with machine learning algorithms. Remote Sens. 2017, 9, 1060.
14. Castro, A.I.D.; López-Granados, F.; Jurado-Expósito, M. Broad-scale cruciferous weed patch classification in winter wheat using QuickBird imagery for in-season site-specific control. Precis. Agric. 2013, 14, 392–413.
15. Martín, M.P.; Barreto, L.; Fernándezquintanilla, C. Discrimination of sterile oat (Avena sterilis) in winter barley (Hordeum vulgare) using QuickBird satellite images. Crop Prot. 2011, 30, 1363–1369.
16. Zhou, X.; Zheng, H.B.; Xu, X.Q.; He, J.Y.; Ge, X.K.; Yao, X.; Cheng, T.; Zhu, Y.; Cao, W.X.; Tian, Y.C. Predicting grain yield in rice using multi-temporal vegetation indices from UAV-based multispectral and digital imagery. ISPRS J. Photogramm. 2017, 130, 246–255.
17. Verger, A.; Vigneau, N.; Chéron, C.; Gilliot, J.M.; Comar, A.; Baret, F. Green area index from an unmanned aerial system over wheat and rapeseed crops. Remote Sens. Environ. 2014, 152, 654–664.
18. Gao, J.; Liao, W.; Nuyttens, D.; Lootens, P.; Vangeyte, J.; Pižurica, A.; He, Y.; Pieters, J.G. Fusion of pixel and object-based features for weed mapping using unmanned aerial vehicle imagery. Int. J. Appl. Earth Obs. 2018, 67, 43–53.
19. Yu, N.; Li, L.; Schmitz, N.; Tiaz, L.F.; Greenberg, J.A.; Diers, B.W. Development of methods to improve soybean yield estimation and predict plant maturity with an unmanned aerial vehicle based platform. Remote Sens. Environ. 2016, 187, 91–101.
20. Park, S.; Ryu, D.; Fuentes, S.; Chung, H.; Hernández-Montes, E.; O'Connell, M. Adaptive estimation of crop water stress in nectarine and peach orchards using high-resolution imagery from an unmanned aerial vehicle (UAV). Remote Sens. 2017, 9, 828.
21. Duan, S.-B.; Li, Z.-L.; Wu, H.; Tang, B.-H.; Ma, L.; Zhao, E.; Li, C. Inversion of the PROSAIL model to estimate leaf area index of maize, potato, and sunflower fields from unmanned aerial vehicle hyperspectral data. Int. J. Appl. Earth Obs. 2014, 26, 12–20.
22. Jin, X.; Liu, S.; Baret, F.; Hemerle, M.; Comar, A. Estimates of plant density of wheat crops at emergence from very low altitude UAV imagery. Remote Sens. Environ. 2017, 198, 105–114.
23. Yang, M.-D.; Huang, K.-S.; Kuo, Y.-H.; Tsai, H.P.; Lin, L.-M. Spatial and spectral hybrid image classification for rice lodging assessment through UAV imagery. Remote Sens. 2017, 9, 583.
24. Du, M.M.; Noguchi, N. Monitoring of wheat growth status and mapping of wheat yield's within-field spatial variations using color images acquired from UAV-camera system. Remote Sens. 2017, 9, 289.
25. Bendig, J.; Yu, K.; Aasen, H.; Bolten, A.; Bennertz, S.; Broscheit, J.; Gnyp, M.L.; Bareth, G. Combining UAV-based plant height from crop surface models, visible, and near infrared vegetation indices for biomass monitoring in barley. Int. J. Appl. Earth Obs. 2015, 39, 79–87.
26. Maimaitijiang, M.; Ghulam, A.; Sidike, P.; Hartling, S.; Maimaitiyiming, M.; Peterson, K.; Shavers, E.; Fishman, J.; Peterson, J.; Kadam, S. Unmanned aerial system (UAS)-based phenotyping of soybean using multi-sensor data fusion and extreme learning machine. ISPRS J. Photogramm. 2017, 134, 43–58.
27. Liu, T.; Li, R.; Zhong, X.; Jiang, M.; Jin, X.; Zhou, P.; Liu, S.; Sun, C.; Guo, W. Estimates of rice lodging using indices derived from UAV visible and thermal infrared images. Agric. For. Meteorol. 2018, 252, 144–154.
28. Sulik, J.J.; Long, D.S. Spectral indices for yellow canola flowers. Int. J. Remote Sens. 2015, 36, 2751–2765.
29. Coy, A.; Rankine, D.; Taylor, M.; Nielsen, D.C.; Cohen, J. Increasing the accuracy and automation of fractional vegetation cover estimation from digital photographs. Remote Sens. 2016, 8, 474.
30. McLaren, K. Development of CIE 1976 (Lab) uniform color space and color-difference formula. J. Soc. Dyers Colour. 1976, 92, 338–341.
31. Lopez, F.; Valiente, J.M.; Baldrich, R.; Vanrell, M. Fast surface grading using color statistics in the CIE Lab space. Lect. Notes Comput. Sci. 2005, 3523, 666–673.
32. Lloyd, S. Least squares quantization in PCM. IEEE Trans. Inform. Theory 1982, 28, 129–137.
33. Wang, X.; Wang, M.; Wang, S.; Wu, Y. Extraction of vegetation information from visible unmanned aerial vehicle images. Trans. Chin. Soc. Agric. Eng. 2015, 31, 152–159.
34. Gitelson, A.A.; Kaufman, Y.J.; Stark, R.; Rundquist, D. Novel algorithms for remote estimation of vegetation fraction. Remote Sens. Environ. 2002, 80, 76–87.
35. Verrelst, J.; Schaepman, M.E.; Koetz, B.; Kneubuehler, M. Angular sensitivity analysis of vegetation indices derived from CHRIS/PROBA data. Remote Sens. Environ. 2008, 112, 2341–2353.
36. Woebbecke, D.M.; Meyer, G.E.; Vonbargen, K.; Mortensen, D.A. Color indexes for weed identification under various soil, residue, and lighting conditions. Trans. ASAE 1995, 38, 259–269.
37. Kataoka, T.; Kaneko, T.; Okamoto, H.; Hata, S. Crop growth estimation system using machine vision. In Proceedings of the 2003 IEEE/ASME International Conference on Advanced Intelligent Mechatronics, Kobe, Japan, 20–24 July 2003; pp. 1079–1083.
38. Hague, T.; Tillett, N.D.; Wheeler, H. Automated crop and weed monitoring in widely spaced cereals. Precis. Agric. 2006, 7, 21–32.
39. Jordan, C.F. Derivation of leaf-area index from quality of light on the forest floor. Ecology 1969, 50, 663–666.
40. Rouse, J.W.; Haas, R.W.; Schell, J.A.; Deering, D.W.; Harlan, J.C. Monitoring the Vernal Advancement and Retrogradation (Green Wave Effect) of Natural Vegetation; NASA Goddard Space Flight Center: Houston, TX, USA, 1974; pp. 1–8. Available online: https://ntrs.nasa.gov/search.jsp?R=19730017588 (accessed on 1 April 1973).
41. Horning, N. Random forests: An algorithm for image classification and generation of continuous fields data sets. In Proceedings of the International Conference on Geoinformatics for Spatial Infrastructure Development in Earth and Allied Sciences, Osaka, Japan, 9–11 December 2010.
42. Breiman, L. Random forests. Mach. Learn. 2001, 45, 5–32.
43. Neath, A.A.; Cavanaugh, J.E. The Bayesian information criterion: Background, derivation, and applications. Comput. Stat. 2012, 4, 199–203.
44. Schwieder, M.; Leitão, P.; Suess, S.; Senf, C.; Hostert, P. Estimating fractional shrub cover using simulated EnMAP data: A comparison of three machine learning regression techniques. Remote Sens. 2014, 6, 3427–3445.
45. Duan, T.; Chapman, S.C.; Guo, Y.; Zheng, B. Dynamic monitoring of NDVI in wheat agronomy and breeding trials using an unmanned aerial vehicle. Field Crops Res. 2017, 210, 71–80.
46. Yang, G.; Liu, J.; Zhao, C.; Li, Z.; Huang, Y.; Yu, H.; Xu, B.; Yang, X.; Zhu, D.; Zhang, X. Unmanned aerial vehicle remote sensing for field-based crop phenotyping: Current status and perspectives. Front. Plant Sci. 2017, 8, 1111.
47. Fu, Y.; Yang, G.; Wang, J.; Song, X.; Feng, H. Winter wheat biomass estimation based on spectral indices, band depth analysis and partial least squares regression using hyperspectral measurements. Comput. Electron. Agric. 2014, 100, 51–59.
48. Torres-Sanchez, J.; Pena, J.M.; de Castro, A.I.; Lopez-Granados, F. Multi-temporal mapping of the vegetation fraction in early-season wheat fields using images from UAV. Comput. Electron. Agric. 2014, 103, 104–113.
49. Rasmussen, J.; Ntakos, G.; Nielsen, J.; Svensgaard, J.; Poulsen, R.N.; Christensen, S. Are vegetation indices derived from consumer-grade cameras mounted on UAVs sufficiently reliable for assessing experimental plots? Eur. J. Agron. 2016, 74, 75–92.
50. Lebourgeois, V.; Bégué, A.; Labbé, S.; Houlès, M.; Martiné, J.F. A light-weight multi-spectral aerial imaging system for nitrogen crop monitoring. Precis. Agric. 2012, 13, 525–541.
51. Chianucci, F.; Disperati, L.; Guzzi, D.; Bianchini, D.; Nardino, V.; Lastri, C.; Rindinella, A.; Corona, P. Estimation of canopy attributes in beech forests using true colour digital images from a small fixed-wing UAV. Int. J. Appl. Earth Obs. 2016, 47, 60–68.
52. Iersel, W.V.; Straatsma, M.; Addink, E.; Middelkoop, H. Monitoring height and greenness of non-woody floodplain vegetation with UAV time series. ISPRS J. Photogramm. 2018, 141, 112–123.
53. Zhang, B.; Liu, C.; Wang, Y.; Yao, X.; Wang, F.; Wu, J.; King, G.J.; Liu, K. Disruption of a carotenoid cleavage dioxygenase 4 gene converts flower colour from white to yellow in Brassica species. New Phytol. 2015, 206, 1513–1526.
54. Yates, D.J.; Steven, M.D. Reflexion and absorption of solar radiation by flowering canopies of oil-seed rape (Brassica napus L.). J. Agric. Sci. 1987, 109, 495–502.
Figure 1. The general locations of the two experimental sites and overviews of the images obtained by the unmanned aerial vehicle (UAV) remote sensing platform for the oilseed rape fields at Zhejiang University, Hangzhou on 21 March 2017 (a) and at Anhua, Zhuji on 28 March 2018 (b).
Figure 2. Flowchart of image classification of yellow flowers in the rape field by the K-means clustering algorithm in CIE L*a*b* space.
Figure 3. Contour maps of the coefficient of determination (r2) between flower number and the normalized difference spectral index (NDSI) for all spectral band pairs λ1 and λ2 within the spectral region of 600–1000 nm.
Figure 4. The relationships between measured flower number and classified flower coverage area (FCA) on 21 March 2017 at Hangzhou and 28 March 2018 at Zhuji, obtained with the K-means algorithm in Commission Internationale de l'Éclairage (CIE) L*a*b* space (n = 209).
Figure 5. An example of image classification of rape fields at Zhuji on 28 March 2018: (a) the original image of the oilseed rape field and (b) the classification result using the K-means method in CIE L*a*b* space.
Figure 6. The relationships between measured flower number and classified flower coverage area (FCA) for different classification methods. BPNN and SVM represent back propagation neural network and support vector machine, respectively.
Figure 7. Correlation coefficients (r) between UAV variables, including vegetation indices (VIs) and flower coverage area (FCA), and ground-counted flower number of oilseed rape.
Figure 8. Estimation of flower number by random forest (RF) models using individual UAV variables, including vegetation indices (VIs) and flower coverage area (FCA). The coefficient of determination (r2) and the root mean square error of prediction (RMSEP) quantify model performance.
Figure 9. Estimation of flower number by the random forest (RF) model with all UAV variables extracted from Red Green Blue (RGB) and multispectral images (a); the dashed red line is the 1:1 line. Variable importance estimates of the RF model (b). r2 and RMSEP denote the coefficient of determination and the root mean square error of prediction, respectively.
Figure 10. Subsets of variables selected by optimal subset regression (OSR) for all possible models, ordered by the Bayesian information criterion (BIC) (a) and the adjusted coefficient of determination (Adj r2) (b), for the estimation of flower number. The result of the optimal model is shown with the r2 and the root mean square error of prediction (RMSEP) (c).
Figure 11. Flower mapping of rape fields: (a–c) the original Red Green Blue (RGB) images; (d–f) the image classification results by the K-means algorithm in Commission Internationale de l'Éclairage (CIE) L*a*b* space. 21 March 2017 (a,d); 29 March 2017 (b,e); 12 April 2017 (c,f).
Table 1. Vegetation indices (VIs) derived from RGB and multispectral images in this study. R, G and B denote the DN value or reflectance of the red, green and blue bands, respectively; Rλ1 and Rλ2 denote the reflectance at variable bands within the spectral range of 600–1000 nm.

Vegetation Index | Formula | Reference
VIs calculated from RGB images
Visible-band Difference Vegetation Index (VDVI) | (2G − R − B)/(2G + R + B) | [33]
Visible Atmospherically Resistant Index (VARI) | (G − R)/(G + R − B) | [34]
Normalized Green-Red Difference Index (NGRDI) | (G − R)/(G + R) | [34]
Red-Green Ratio Index (RGRI) | R/G | [35]
Modified Green Red Vegetation Index (MGRVI) | (G^2 − R^2)/(G^2 + R^2) | [25]
Excess Green Index (ExG) | 2G − R − B | [36]
Color Index of Vegetation (CIVE) | 0.441R − 0.811G + 0.385B + 18.787 | [37]
Vegetative Index (VEG) | G/(R^a · B^(1 − a)), a = 0.667 | [38]
VIs calculated from multispectral images
Simple Ratio Index (SRI) | Rλ1/Rλ2 | [39]
Normalized Difference Spectral Index (NDSI) | (Rλ1 − Rλ2)/(Rλ1 + Rλ2) | [40]

