
Digital Counts of Maize Plants by Unmanned Aerial Vehicles (UAVs)

Chair of Plant Nutrition, Department of Plant Sciences, Technical University of Munich, Freising 85354, Germany
* Author to whom correspondence should be addressed.
Remote Sens. 2017, 9(6), 544; https://doi.org/10.3390/rs9060544
Submission received: 28 March 2017 / Revised: 22 May 2017 / Accepted: 28 May 2017 / Published: 31 May 2017

Abstract

Precision phenotyping, especially the use of image analysis, allows researchers to gain information on plant properties and plant health. Aerial image detection with unmanned aerial vehicles (UAVs) provides new opportunities in precision farming and precision phenotyping. Precision farming has created a critical need for spatial data on plant density. The plant number not only reflects the final field emergence but also allows a more precise assessment of the final yield parameters. The aim of this work is to advance the use of UAVs and image analysis as a possible high-throughput phenotyping technique. In this study, four maize cultivars were planted in plots with different seeding systems (in rows and equidistantly spaced) and different nitrogen fertilization levels (50, 150 and 250 kg N/ha). The experimental field, encompassing 96 plots, was overflown at a height of 50 m with an octocopter equipped with a 10-megapixel camera taking a picture every 5 s. Images were recorded at BBCH stages 13–15 (the 3- to 5-leaf development stages on the BBCH phenological scale), when the color of young leaves differs from that of older leaves. Close correlations of up to R² = 0.89 were found between in situ and image-based plant counts after applying a decorrelation stretch contrast enhancement procedure, which enhanced the color differences in the images. On average, the error between visually and digitally counted plants was ≤5%. Ground cover, as determined by analyzing green pixels, ranged between 76% and 83% at these stages. However, the correlation between ground cover and digitally counted plants was very low. The presence of weeds and blur in the images represent possible sources of counting errors. In conclusion, the final field emergence of maize can be assessed rapidly, allowing a more precise estimate of the final yield parameters. The use of UAVs and image processing has the potential to optimize farm management and to support field experimentation for agronomic and breeding purposes.


1. Introduction

Unmanned aerial vehicles (UAVs) are very promising instruments in the agricultural sciences [1,2]. Flights are fairly independent of weather conditions and time of day: images can be captured on cloudy days, whereas satellite image recording under similar conditions is not possible [3]. Hence, more information can be obtained because this flexibility extends the range of possible measurement days. Furthermore, UAVs offer time-saving and cheaper image recording, enable flexible and immediate image processing and give an overview of the health of farm systems [4]. In addition to time-saving and cheaper image processing, UAVs offer flexible handling options for purpose-oriented use, e.g., variable flight heights and better image resolution [5,6], and thus represent new opportunities in the agricultural sciences, especially in precision farming and precision phenotyping. Precision farming has created a critical need for spatial data on plant density, crop yield and related soil characteristics [7]. Recent advances in UAV technology offer new opportunities for assessing agricultural plot experiments using UAV imagery [8]. High-throughput phenotyping is expected to improve crop performance and hence accelerate breeding progress [9]. Field-based phenotyping is the most promising approach for delivering the required throughput, in terms of numbers of plants and populations, for a precise description of plant traits in cropping systems [10].
The human eye is a sensitive system that recognizes contrast better than absolute luminance, as well as the structural properties of an object. Image analysis, however, can provide a wealth of metric information about the positions, sizes and interrelationships of objects [11]. Human vision is always coupled with subjective perception; the degree of ground cover in a cropping system can therefore only be assessed relatively, compared with imaging [12], scaling up is difficult, and comparing ground cover with the number of plants is not feasible. Weeds between and within crop rows were successfully recorded using the k-means clustering method [6], and crop row detection was accomplished with a tractor-mounted camera: rows were counted by evaluating pixel values, and their positioning was demonstrated [13]. Burgos-Artizzu et al. [14] and Berge et al. [15] successfully detected weeds in RGB images (with the three channels red, green and blue defining the color space) using a tractor-mounted camera. Additionally, the nitrogen level and the leaf area index (LAI) can be detected from aerial visible and near-infrared images by calculating the Normalized Difference Vegetation Index (NDVI) and the Grassland Drought Index (GDI) [16,17]. Even the detection of hidden objects is possible through image processing based on the Bernoulli distribution [18]. Whereas the detection of weeds, biomass and ground cover with aerial RGB and remote imaging has already been demonstrated in previous works [5,6,12], the counting and segmentation of individual plants has not, and this represents the goal of the present work.
The aim of this work is to advance UAV use and image analysis as a possible high-throughput phenotyping technique. Therefore, image analysis was applied to two different sowing systems, conventional row planting and equidistant planting. In theory, plants growing in triangular (equidistant) planting systems should have better light and water availability, and competition between plants is therefore reduced [19,20,21]. Additionally, soil erosion due to heavy rainfall, which can strongly affect plant cultivation, particularly of maize, could be diminished or even prevented. Moreover, with faster row cover development in triangular plantings, weeds may be suppressed [22], growth potential may be enhanced and intensive pesticide application and soil tillage could be reduced [23].
Determining the plant number per hectare provides an important index for assessing plant density as well as field emergence. Ultimately, the final yield can best be determined by including the exact plant number. Identifying the optimal plant density and row spacing is a critical management decision in maize production and is needed to investigate the grain yield response to plant density and to explore genotype × environment interactions [24,25]. The goal of this work is to use aerial images to detect ground cover and to determine the plant number of different maize cultivars grown at different row spacings.

2. Materials and Methods

The field experiment was conducted in 2016 in Dürnast (48°24′10.3′′N, 11°41′37.5′′E), close to Freising in southern Bavaria, at the experimental station of the Chair of Plant Nutrition of the Technical University of Munich. The silty cambisol is characterized by a homogeneous soil texture across the whole experimental site, which is exposed from north to south. The average annual temperature was 8.1 °C and the average precipitation reached 791 mm. Phenological plant growth proceeded regularly during the season; optimal climatic conditions, with sunny days and sufficient rain, led to above-average yields. The fully randomized block design consisted of 96 plots with four cultivars, three nitrogen levels (50, 150 and 250 kg N ha−1), four replicates and two planting systems: row planting (RP) and triangular planting (TP). The cultivars came from different maturity groups representing different agronomic purposes (Table 1). Conventional farming was applied using the herbicides Roundup PowerFlex (20 April 2016) and Gardo Gold + Callisto (10 June 2016). The different fertilization levels were applied on 1 June 2016.
Aerial images were taken on 16 June 2016, when the plants had reached BBCH stages 13–15 (the 3- to 5-leaf development stages on the BBCH phenological scale). The flying height was 50 m above the field, covering an area of 9000 m², to obtain a seamless orthophoto mosaic and to cover all plots.
The flight direction of the UAV, an MKSET_BASIS_OKTO2 octocopter (KS Model Company Ltd., Hong Kong, China), was perpendicular to the plots. On average, one image covered three plots, as illustrated in Figure 1, and the recorded images overlapped by one third. After the flyover of one plot row, the UAV was navigated back to the first plot of the next row to maintain a perpendicular and centered flight direction over that row.
DIN A4 posters (210 × 297 mm) labeled with the plot number were placed on each plot. Images were taken with a Canon G12 digital camera (Canon, Tokyo, Japan) mounted on the octocopter, with a 1/1.7-inch CCD sensor, 10 megapixels, a 28-mm focal length, an image quality of 180 pixels/inch, a triggering time of 281 ms, an f-stop of 2.97 and a file size of 4.91 MB. Images were captured in auto mode because of changing light conditions and to avoid incorrect ISO (the light sensitivity of the camera sensor), f-stop and shutter speed settings. Despite the stable mounting on the UAV, the externally positioned camera was partly affected by wind and by vibrations of the UAV, in contrast to integrated cameras fixed within a UAV. Additionally, the focus could not be set properly owing to the limited remote trigger, in contrast to integrated cameras whose focus is fixed at infinity. This occasionally produced blurry images, which, however, did not affect the subsequent image analysis. The position of the octocopter was determined by GNSS (Global Navigation Satellite System) and a magnetic compass to maintain the flight direction. The flight height was determined from barometric air pressure, which was calibrated to zero at the beginning of the flight campaign. The UAV is a vertical take-off and landing aircraft with eight brushless external rotor motors. It is remotely controlled via a bidirectional 2.4-GHz link that transmits the battery voltage, the temperature of the motor controller, the Global Positioning System (GPS) reception and the height difference from the starting point. The angle of the camera can be changed remotely. With a lithium polymer battery, the octocopter has a flight time of 16 min when carrying the camera.

2.1. Description of the Image Analysis Process

To detect the number of plants and their spatial distribution in the field, weed-free conditions are required. All plot images were cropped to the correct form using Adobe Photoshop CC (Adobe Systems Software Ireland Limited, Dublin, Ireland). Segmentation of the green pixels and detection of the plant numbers were completed using MATLAB (MathWorks, Natick, MA, USA). The script is attached in Appendix A.

2.2. Creation of Color Histograms and Employment of the Contrast Enhancement Procedure

Color histograms help to judge, correct and optimize the brightness and contrast of images. For RGB images, four histograms are created per image: the red, green and blue channel histograms and the gray histogram, which describes the luminance of the image. Histogram values range from 0 to 255, where 0 is black and 255 is white; the values in between are gray levels. The height of the bars indicates how frequently each color value appears in the image's pixels. The color histogram was created to evaluate the quality of the image and to collect more information for further processing [26]. The decorrstretch contrast enhancement procedure, which is suitable for visual interpretation, was applied to enhance and stretch the color differences in the original picture; it is used on images with significant band-band correlations and produces an image whose bands are decorrelated [27]. Every pixel of the three channels of the original RGB image was transformed into the color eigenspace, where a new, wider, stretched range of color values was created (Figure 2), and then transformed back to the original bands (Figure 2). During this process, all three bands are decorrelated on the basis of their 3 × 3 band correlation matrix and rescaled to equalize the band variances. Additionally, a linear contrast stretch is applied to expand the color range further in all three bands equally and to clip the stretch to limits, because pixel values must lie in the range [0, 255] [26].
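As a minimal sketch of this step (assuming an image file such as 'P12.JPG'; the file name is illustrative, not from the study), the procedure reduces to a single Image Processing Toolbox call:

P = imread('P12.JPG');                      % original RGB image (uint8)
P_contrast = decorrstretch(P, 'Tol', 0.01); % decorrelation stretch with a 1% linear
                                            % contrast stretch; output values remain in [0, 255]
imshowpair(P, P_contrast, 'montage');       % display original and stretched image side by side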

2.3. Creation of a Threshold Value

The HSV color model is a nonlinear transformation of the RGB model that separates the luminance from the color information. Three channels describe the HSV color model: the hue (channel 1), the saturation (channel 2) and the intensity value of an image pixel (channel 3) [28]. The HSV color model is described as a hexacone in which the color values are arranged in a circle, with red at angle 0, green at 2π/3, blue at 4π/3 and red again at 2π. The saturation channel defines the depth or purity of the color and runs from the center of the circle, where S = 0% (white), to the edge of the circle, where S = 100% (complete saturation). Along the perpendicular axis, the value channel is measured between V = 100% and V = 0%; along this axis, the gray scale between white and black is defined. It should be considered that the HSV color model is referenced to the RGB color space, and lightness and hue can be confounded: for example, two saturated colors could be designated as having the same lightness but differ widely in perceived lightness. Expressing brightness, saturation and hue numerically can therefore be problematic. The HSV color model of the already processed (decorrelation-stretched) image was used to select the threshold. To define the threshold, the Color Thresholder app of MATLAB was used once, and the channel limits were implemented in the attached script. The thresholds were defined for the three HSV channels and set as follows:
  • channel 1: channel1Min = 0.115; channel1Max = 0.436;
  • channel 2: channel2Min = 0.526; channel2Max = 1.000;
  • channel 3: channel3Min = 0.627; channel3Max = 1.000;
After thresholding, the pixels outside these limits were set to zero, and the remaining objects could then be counted.
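A condensed sketch of this thresholding step is given below; following the description above, the decorrelation-stretched image P_contrast from Section 2.2 is assumed as input (the full, loop-based version appears in Appendix A):

I = rgb2hsv(P_contrast);                              % convert to the HSV color space
BW = (I(:,:,1) >= 0.115) & (I(:,:,1) <= 0.436) & ...  % channel 1: hue (yellow to light green)
     (I(:,:,2) >= 0.526) & (I(:,:,2) <= 1.000) & ...  % channel 2: saturation
     (I(:,:,3) >= 0.627) & (I(:,:,3) <= 1.000);       % channel 3: value
imshow(BW)                                            % binary mask of the young-leaf pixels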

2.4. Creation of the “Open Area”

With the command bwareaopen(BW, p), connected clusters/objects smaller than the defined pixel count p were removed from the binary image and were not counted [26].
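For example, with p = 5 (the value that gave the smallest counting error in this study, cf. Figure 10a), the opening and the subsequent object count reduce to the following sketch:

Test = bwareaopen(BW, 5);        % remove connected clusters smaller than 5 pixels
Test = imfill(Test, 'holes');    % fill holes inside the remaining objects
cc = bwconncomp(Test);           % label connected components (8-connectivity by default)
numberOfPlants = cc.NumObjects;  % digital plant count for the plot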

2.5. Creation of a Threshold Value to Complete Total Green Pixel Segmentation/Classification

The L*a*b* model is similar to the HSV model and is defined as a rectangular coordinate system with the two color axes a and b and the luminosity L. The L*a*b* color model was used to select the green pixels because only in this model was the distribution of the color area sufficient. The Euclidean distance between two colors in L*a*b* space is approximately proportional to their perceived color difference, which provides a simple metric for clustering. The clustering can be performed in the "a"-"b" plane alone, which carries the color information; the "L" component of the CIE-Lab space represents the luminosity [29]. The command used was "I2 = im2double(I)", which converts the image to double precision, rescaling the data to the range [0, 1] if necessary; the converted image appears identical to the original. Green pixels were segmented and counted to capture the degree of ground cover, and the percentage of green pixels was calculated [26].
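Condensed from the Appendix A script, the ground-cover step reads as follows (thresholds as defined there; the normalization by nnz(P) follows the script):

RGB = im2double(P);                                           % rescale to double precision, [0, 1]
cform = makecform('srgb2lab', 'AdaptedWhitePoint', whitepoint('D65'));
Lab = applycform(RGB, cform);                                 % sRGB to L*a*b*
BW_1 = (Lab(:,:,1) >= 12.502) & (Lab(:,:,1) <= 100.000) & ... % L* (luminosity)
       (Lab(:,:,2) >= -10.414) & (Lab(:,:,2) <= 8.329) & ...  % a* channel
       (Lab(:,:,3) >= -8.447) & (Lab(:,:,3) <= 67.004);       % b* channel
Green_percent = nnz(BW_1)/nnz(P)*100;                         % percentage of green pixels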

2.6. Creation of a Table

At the end of each loop iteration, the information collected from the image was saved in a table, including the number of plants, the number of green pixels and the percentage of green pixels.
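A sketch of this bookkeeping is given below; the variable and file names are illustrative, and the per-plot analysis itself is the Appendix A loop body:

numPlots = 96;
T = zeros(numPlots, 3);                  % columns: plant count, green pixels, percent green
for j = 1:numPlots
    % ... per-plot analysis as in Appendix A, yielding numberOfPlants,
    % numgreenpixel and Green_percent ...
    % T(j,:) = [numberOfPlants, numgreenpixel, Green_percent];
end
results = array2table(T, 'VariableNames', {'Plants', 'GreenPixels', 'GreenPercent'});
writetable(results, 'plot_results.csv'); % export; the file name is illustrative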

3. Results and Discussion

Most of the images were sharp; in some, slight blurriness resulted from the motion of the platform in wind. However, all images were still usable, did not require different processing for image analysis, and could be handled in batch mode. Figure 3 shows an original image, and Figure 4 illustrates the segmentation of the green pixels and thus the ground cover of the plot.
The correlation between the green pixel percentage and the visually counted plants in the plots was very weak (R² = 0.023), suggesting that no relationship between ground cover and plant number existed; the plant number could therefore not be derived from ground cover. Ground cover itself, obtained from the segmentation of the green pixels, could be detected quite well (Figure 4) and indicates the health of the crop. At BBCH stages 13–15, the ground cover ranged between 76% and 83% green pixels for all cultivars (Figure 5).
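Such a coefficient of determination can be computed directly from the Pearson correlation; the vectors below are illustrative placeholders, not study data:

greenPercent = [76 78 83 80];              % ground-cover values (%), placeholders
plantsVisual = [95 102 99 100];            % in situ plant counts, placeholders
r = corrcoef(greenPercent, plantsVisual);  % 2 x 2 correlation matrix
R2 = r(1,2)^2;                             % squared Pearson correlation gives R²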
The equidistantly cropped plots tended to exhibit a higher number of green pixels, indicating enhanced growth and biomass production. Götz [23] and Bullock et al. [30] also detected higher ground cover two months after sowing in equidistant plantings compared with row plantings, and Hoff et al. [31] observed a slightly higher grain yield in equidistant plantings compared with row plantings. The detection of green pixels to assess ground cover and biomass production is considered an adequate and reliable digital technique to replace destructive methods, in line with the observations of Kipp et al. [12]. However, ground cover did not reflect the plant number in the plots, as shown by the low correlation coefficients. An increased number of plants does not automatically result in increased biomass, as shown in Figure 6; Turgut et al. [32] likewise reported no significant increase in dry weight above 85,000 plants/ha. To record the digital number of plants in each image, the original image was processed with the decorrstretch contrast enhancement procedure (Figure 7).
Using the decorrstretch contrast enhancement procedure to produce higher color contrasts in the images (Figure 7) enabled plants to be counted digitally with a close correlation. This offers reliable information about plant emergence, which also serves as a basis for correctly determining the yield per plant (Figure 8). A threshold was used that selects only the yellow and light green pixels, in the range from 0.115 to 0.436 of the hue (channel 1 of the HSV model), from the young leaves located in the center of the plants; it is defined with the HSV channel limits given in the Materials and Methods section. Only the light green and yellow pixels were selected to ensure that overlapping plants were not counted as a single plant (Figure 9).
The pixel size that minimizes the difference between manually and digitally counted plant numbers can be defined with the area opening command bwareaopen(BW, p), as described in the Materials and Methods section. The operation removes all clusters in the binary image that are smaller than p (the defined area). This is illustrated in Figure 10a for p = 5, which resulted in the smallest range of percentage differences between the digitally counted plant number and the visually field-counted plant number serving as a reference. In contrast, p = 10 and p = 3 resulted in a higher spread, shown in the box plots in Figure 10a, which is explained by larger differences between visually and digitally counted plants. For p = 3, more clusters were formed and counted and the plant number was overestimated; for p = 10, the plant number was underestimated because more clusters were removed. The percentage difference between in situ and image-based plant counts was quite small (Figure 10b); including outliers, it ranged within ±15% for all cultivars. The digital plant-counting model worked best for the cultivar Cannavaro, where the percentage difference was less than ±5%, with only one outlier beyond −5%. The cultivars Saludo and Vitallo showed the highest percentage differences between digitally and visually counted plants, slightly exceeding the ±5% range; Vitallo included two outliers extending to the −15% limit. The outliers for Vitallo and Cannavaro are due to their fast, enhanced development in the seedling stage: the plants became too big, and the younger light-green leaves could no longer be separated sufficiently from the older dark-green leaves. The outlier for Lapriora is due to stronger illumination caused by the sun's position, since the plants in the original image were very bright.
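The percentage difference underlying Figure 10 can be sketched as follows; the count vectors are illustrative placeholders, and boxplot requires the Statistics and Machine Learning Toolbox:

digitalCount = [98 101 95 103 99];    % digitally counted plants per plot, placeholders
visualCount  = [100 100 100 100 100]; % in situ reference counts, placeholders
diffPercent = (digitalCount - visualCount)./visualCount*100;
boxplot(diffPercent)                  % spread of counting errors, cf. Figure 10
ylabel('Difference (%)')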
The image processing script can be used at the early leaf development stages within both planting systems, enabling successful segmentation of young plants. A good, clear segmentation depends on the type of object and/or region: a clear differentiation of two neighboring pixels or pixel groups depends on sharp color edges, which define a cluster [33]. The triangular planting system could therefore be assessed slightly better than the row planting system because the intra-row distances between plants were larger (Figure 11). The plants did not overlap too much; thus, plant counting was possible for both planting systems [34]. Higher resolution and image sharpness may improve digital plant counting but were not decisive for detecting plant numbers in an image: blurry pictures did not introduce significant errors in the difference between visually and digitally counted plants.
Another source of error could be the presence of weeds, which can have the same spectral reflectance as maize in the visible spectrum. This would increase the number of digitally counted plants and the difference from the visually counted plants. Yang et al. [35] used fuzzy logic to differentiate the greenness of wheat plants from weeds, defined by three clusters chosen by their position in the field. Another way to decrease such errors is to increase p in the command bwareaopen(BW, p), which erases smaller clusters (Figure 10a). The spread of the differences between digitally and visually counted plants was less than 10% for all cultivars, with only three outliers exceeding the 10% range (Figure 10b). The differences between visually and digitally counted plants can have various causes. Imaging too late in plant development may not allow younger leaves to be separated from older ones. Plants standing too close together, with enhanced overlapping, can lead to fewer plants being counted; this effect appears more often in row planting systems, where the spacing between plants is less regular. A good example is the cultivar Vitallo, which exhibited the largest deviation as a result of enhanced plant growth and thus increased overlapping of plants (Figure 11). Additionally, young, pale-green plant leaves could not be adequately distinguished from older leaves using the decorrstretch contrast enhancement procedure when the plants were beyond the optimal growth stage. However, the median values were close to zero for all cultivars, indicating a close relationship between digitally and visually counted plant numbers. After image acquisition and processing, a field map can be created illustrating the post-emergence of plants. This allows researchers to judge the success and accuracy of seeding management and the uniformity of the plant distribution, depicting irregularities and gaps between or within rows caused by soil erosion, soil compaction or soil fertility [36]. Equally important is the potential yield prediction calculated from the digitally counted plants. For post-emergence breeding purposes, seed quality could be assessed cheaply and quickly. The use of UAVs and image processing in agriculture is a promising tool for answering farm management questions, allowing researchers to optimize management and to support field experimentation in agronomy and breeding activities.

4. Conclusions

The use of UAVs provides time- and cost-saving data for further processing and allows flexible, weather-independent data collection. The results of this study demonstrate the capability of image processing in agricultural fields to detect plant post-emergence. Ground cover detection did not correlate with the plant number at the plot level; counting plants became possible only after introducing the decorrstretch command of MATLAB. Blur and weeds in the images could lead to miscounting, which can be avoided by manually selecting thresholds and clustering pixels. Plant number assessment is only possible during a specific window of early leaf development stages, when the young, light-green leaves differ from the older, dark-green leaves. If the plants overlap, green pixel segmentation of young leaves becomes difficult or no longer feasible. Using an optimized time window enables image analysis in a batch procedure (as seen in the script in Appendix A). The use of UAVs and image processing has the potential to optimize farm management and to support field experimentation for agronomic and breeding purposes.

Acknowledgments

The authors wish to thank Stefan Huber from the Chair of Agricultural Systems Engineering at the Technical University of Munich (TUM) for his technical support and for providing the drone. This research was partially supported by the DFG (German Research Foundation) funded project SCHM 1456/6-1. We appreciate the pertinent comments of the reviewers, which allowed us to improve the precision of the manuscript.

Author Contributions

F.G. and U.S. conceived and designed the experiments; F.G. performed the experiments; F.G. analyzed the data; F.G. and U.S. wrote the paper.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Algorithm A1. MATLAB Script.
clear all
startIdx = 1;                    % first plot number
stopIdx  = 96;                   % last plot number
T = zeros(stopIdx - startIdx + 1, 3);
% start loop over all plot images
for j = startIdx:stopIdx
    String = num2str(j);
    P = imread(['P' String '.JPG']);
    % P = imread('P93.jpg');
    % P = imrotate(P, -90);
    % imshow(P)
    % cut image to the right size
    % P = P(20:631,:,:);
    % imshow(P);
    % create color histogram of the green channel
    imhist(P(:,:,2))
    % increase contrast by decorrelation stretch
    P_contrast = decorrstretch(P, 'Tol', 0.01);
    imshow(P_contrast);
    % 1) create threshold
    % Convert RGB image to chosen color space
    I = rgb2hsv(P);
    % Define thresholds for channel 1 (hue) based on histogram settings
    channel1Min = 0.115;
    channel1Max = 0.436;
    % Define thresholds for channel 2 (saturation) based on histogram settings
    channel2Min = 0.526;
    channel2Max = 1.000;
    % Define thresholds for channel 3 (value) based on histogram settings
    channel3Min = 0.627;
    channel3Max = 1.000;
    % Create mask based on chosen histogram thresholds
    BW = (I(:,:,1) >= channel1Min) & (I(:,:,1) <= channel1Max) & ...
         (I(:,:,2) >= channel2Min) & (I(:,:,2) <= channel2Max) & ...
         (I(:,:,3) >= channel3Min) & (I(:,:,3) <= channel3Max);
    % Initialize output masked image based on input image.
    maskedRGBImage = P;
    % Set background pixels where BW is false to zero.
    maskedRGBImage(repmat(~BW, [1 1 3])) = 0;
    imshow(BW)
    % 2) count plants
    Test = bwareaopen(BW, 5);        % remove clusters smaller than 5 pixels
    Test = imfill(Test, 'holes');    % fill holes inside the remaining objects
    B = bwboundaries(Test);
    imshow(Test)
    hold on
    visboundaries(B)
    cc = bwconncomp(Test);
    graindata = regionprops(cc, 'basic');
    numberOfPlants = cc.NumObjects;
    % convert the green channel to a vector and plot its histogram
    figure(2);
    Histi = reshape(P(:,:,2), [], 1);
    Histi1 = im2double(Histi)*255;
    hist(Histi1, 100);
    title('green-value');
    % 3) create binary image and calculate ground cover
    % Convert RGB image to the L*a*b* color space
    RGB = im2double(P);
    imshow(RGB)
    cform = makecform('srgb2lab', 'AdaptedWhitePoint', whitepoint('D65'));
    I = applycform(RGB, cform);
    % Define thresholds for channel 1 (L*) based on histogram settings
    channel1Min = 12.502;
    channel1Max = 100.000;
    % Define thresholds for channel 2 (a*) based on histogram settings
    channel2Min = -10.414;
    channel2Max = 8.329;
    % Define thresholds for channel 3 (b*) based on histogram settings
    channel3Min = -8.447;
    channel3Max = 67.004;
    % Create mask based on chosen histogram thresholds
    BW_1 = (I(:,:,1) >= channel1Min) & (I(:,:,1) <= channel1Max) & ...
           (I(:,:,2) >= channel2Min) & (I(:,:,2) <= channel2Max) & ...
           (I(:,:,3) >= channel3Min) & (I(:,:,3) <= channel3Max);
    % Initialize output masked image based on input image.
    maskedRGBImage = RGB;
    % Set background pixels where BW_1 is false to zero.
    maskedRGBImage(repmat(~BW_1, [1 1 3])) = 0;
    imshow(BW_1)
    % 3.1) count green pixels
    numtotal = nnz(P);
    numgreenpixel = nnz(BW_1);
    Green_percent = (numgreenpixel/numtotal)*100;
    % save the results for plot j
    T(j - startIdx + 1, 1) = numberOfPlants;
    T(j - startIdx + 1, 2) = numgreenpixel;
    T(j - startIdx + 1, 3) = Green_percent;
    % keep only the results table and the loop bounds
    clearvars('-except', 'T', 'startIdx', 'stopIdx');
end % end of loop

References

  1. Floreano, D.; Wood, R.J. Science, technology and the future of small autonomous drones. Nature 2015, 521, 460–466. [Google Scholar] [CrossRef] [PubMed]
  2. Sankaran, S.; Khot, L.R.; Espinoza, C.Z.; Jarolmasjed, S.; Sathuvalli, V.R.; Vandemark, G.J.; Miklas, P.N.; Carter, A.H.; Pumphrey, M.O.; Knowles, N.R.; et al. Low-altitude, high-resolution aerial imaging systems for row and field crop phenotyping: A review. Eur. J. Agron. 2015, 70, 112–123. [Google Scholar] [CrossRef]
  3. Zhang, C.; Kovacs, J.M. The application of small unmanned aerial systems for precision agriculture: A review. Precis. Agric. 2012, 13, 693–712. [Google Scholar] [CrossRef]
  4. Tripicchio, P.; Satler, M.; Dabisias, G.; Ruffaldi, E.; Avizzano, C.A. Towards smart farming and sustainable agriculture with drones. In Proceedings of the 2015 International Conference on Intelligent Environments IE 2015, Prague, Czech Republic, 15–17 July 2015; pp. 140–143. [Google Scholar]
  5. Pena, J.M.; Torres-Sanchez, J.; de Castro, A.I.; Kelly, M.; Lopez-Granados, F. Weed mapping in early-season maize fields using object-based analysis of unmanned aerial vehicle (UAV) images. PLoS ONE 2013, 8. [Google Scholar] [CrossRef] [PubMed]
  6. Pérez-Ortiz, M.; Peña, J.M.; Gutiérrez, P.A.; Torres-Sánchez, J.; Hervás-Martínez, C.; López-Granados, F. Selecting patterns and features for between- and within-crop-row weed mapping using UAV imagery. Expert Syst. Appl. 2016, 47, 85–94. [Google Scholar] [CrossRef]
  7. Geesing, D.; Diacono, M.; Schmidhalter, U. Site-specific effects of variable water supply and nitrogen fertilisation on winter wheat. J. Plant Nutr. Soil Sci. 2014, 177, 509–523. [Google Scholar] [CrossRef]
  8. Rasmussen, J.; Ntakos, G.; Nielsen, J.; Svensgaard, J.; Poulsen, R.N.; Christensen, S. Are vegetation indices derived from consumer-grade cameras mounted on UAVs sufficiently reliable for assessing experimental plots? Eur. J. Agron. 2016, 74, 75–92. [Google Scholar] [CrossRef]
  9. Barmeier, G.; Schmidhalter, U. High-throughput phenotyping of wheat and barley plants grown in single or few rows in small plots using active and passive spectral proximal sensing. Sensors 2016, 16, 1860. [Google Scholar] [CrossRef] [PubMed]
  10. Winterhalter, L.; Mistele, B.; Schmidhalter, U. Evaluation of active and passive sensor systems in the field to phenotype maize hybrids with high-throughput. Field Crop. Res. 2013, 154, 236–245. [Google Scholar] [CrossRef]
  11. Dharani, T.; Aroquiaraj, I.L.; Mageshwari, V. Diverse image investigation using image metrics for content based image retrieval system. In Proceedings of the 2016 International Conference on Inventive Computation Technologies (ICICT), Tamilnadu, India, 26–27 August 2016; pp. 1–8. [Google Scholar]
  12. Kipp, S.; Mistele, B.; Baresel, P.; Schmidhalter, U. High-throughput phenotyping early plant vigour of winter wheat. Eur. J. Agron. 2014, 52, 271–278. [Google Scholar] [CrossRef]
  13. Romeo, J.; Pajares, G.; Montalvo, M.; Guerrero, J.M.; Guijarro, M.; Ribeiro, A. Crop row detection in maize fields inspired on the human visual perception. Sci. World J. 2012. [Google Scholar] [CrossRef] [PubMed]
  14. Burgos-Artizzu, X.P.; Ribeiro, A.; Guijarro, M.; Pajares, G. Real-time image processing for crop/weed discrimination in maize fields. Comput. Electron. Agric. 2011, 75, 337–346. [Google Scholar] [CrossRef]
  15. Berge, T.W.; René Cederkvist, H.; Aastveit, A.H.; Fykse, H. Simulating the effects of mapping and spraying resolution and threshold level on accuracy of patch spraying decisions and herbicide use based on mapped weed data. Acta Agric. Scand. Sect. B Soil Plant Sci. 2008, 58, 216–229. [Google Scholar] [CrossRef]
  16. Lelong, C.C.D.; Burger, P.; Jubelin, G.; Roux, B.; Labbe, S.; Baret, F. Assessment of unmanned aerial vehicles imagery for quantitative monitoring of wheat crop in small plots. Sensors 2008, 8, 3557–3585. [Google Scholar] [CrossRef] [PubMed]
  17. Gautam, R.; Panigrahi, S. Leaf nitrogen determination of corn plant using aerial images and artificial neural networks. Can. Biosyst. Eng. 2007, 49, 7. [Google Scholar]
  18. Martin, J.; Edwards, H.H.; Burgess, M.A.; Percival, H.F.; Fagan, D.E.; Gardner, B.E.; Ortega-Ortiz, J.G.; Ifju, P.G.; Evers, B.S.; Rambo, T.J. Estimating distribution of hidden objects with drones: From tennis balls to manatees. PLoS ONE 2012, 7. [Google Scholar] [CrossRef] [PubMed]
  19. Weiner, J.; Andersen, S.B.; Wille, W.K.M.; Griepentrog, H.W.; Olsen, J.M. Evolutionary agroecology: The potential for cooperative, high density, weed-suppressing cereals. Evol. Appl. 2010, 3, 473–479. [Google Scholar] [CrossRef] [PubMed]
  20. Maddonni, G.A.; Chelle, M.; Drouet, J.L.; Andrieu, B. Light interception of contrasting azimuth canopies under square and rectangular plant spatial distributions: Simulations and crop measurements. Field Crop. Res. 2001, 70, 1–13. [Google Scholar] [CrossRef]
  21. Maddonni, G.A.; Otegui, M.E.; Cirilo, A.G. Plant population density, row spacing and hybrid effects on maize canopy architecture and light attenuation. Field Crop. Res. 2001, 71, 183–193. [Google Scholar] [CrossRef]
  22. Abdin, O.A.; Zhou, X.M.; Cloutier, D.; Coulman, D.C.; Faris, M.A.; Smith, D.L. Cover crops and interrow tillage for weed control in short season maize (Zea mays). Eur. J. Agron. 2000, 12, 93–102. [Google Scholar] [CrossRef]
  23. Götz, S.; Bernhardt, H. Produktionsvergleich von Gleichstandsaat und Normalsaat bei Silomais. LANDTECHNIK Agric. Eng. 2010, 65, 107–110. [Google Scholar]
  24. Assefa, Y.; Vara Prasad, P.V.; Carter, P.; Hinds, M.; Bhalla, G.; Schon, R.; Jeschke, M.; Paszkiewicz, S.; Ciampitti, I.A. Yield responses to planting density for US modern corn hybrids: A synthesis-analysis. Crop Sci. 2016, 56, 2802–2817. [Google Scholar] [CrossRef]
  25. Testa, G.; Reyneri, A.; Blandino, M. Maize grain yield enhancement through high plant density cultivation with different inter-row and intra-row spacings. Eur. J. Agron. 2016, 72, 28–37. [Google Scholar] [CrossRef]
  26. MathWorks. Image Processing Toolbox™ User’s Guide. MATLAB; The MathWorks Inc.: Natick, MA, USA, 2016; Volume R2016b. [Google Scholar]
  27. Padmapriya, A.; Vigneshnarthi, S. Image processing operations for 3-d image. Int. J. Sci. Res. Publ. 2012, 2, 1–6. [Google Scholar]
  28. Sural, S.; Gang, Q.; Pramanik, S. Segmentation and histogram generation using the HSV color space for image retrieval. In Proceedings of the International Conference on Image Processing, Rochester, NY, USA, 22–25 September 2002; Volume 582, pp. II-589–II-592. [Google Scholar]
  29. Recky, M.; Leberl, F. Windows detection using k-means in CIE-Lab color space. In Proceedings of the 2010 20th International Conference on Pattern Recognition (ICPR), Istanbul, Turkey, 23–26 August 2010; pp. 356–359. [Google Scholar]
  30. Bullock, D.G.; Nielsen, R.L.; Nyquist, W.E. A growth analysis comparison of corn grown in conventional and equidistant plant spacing. Crop Sci. 1988, 28, 254–258. [Google Scholar] [CrossRef]
  31. Hoff, D.J.; Mederski, H.J. Effect of equidistant corn plant spacing on yield. Agron. J. 1960, 52, 295–297. [Google Scholar] [CrossRef]
  32. Turgut, I.; Duman, A.; Bilgili, U.; Acikgoz, E. Alternate row spacing and plant density effects on forage and dry matter yield of corn hybrids (Zea mays L.). J. Agron. Crop Sci. 2005, 191, 146–151. [Google Scholar] [CrossRef]
  33. Solomon, C.; Breckon, T. Fundamentals of Digital Image Processing: A Practical Approach with Examples in MATLAB; Wiley: Hoboken, NJ, USA, 2011. [Google Scholar]
  34. She, T.; Ehsani, R.; Robbins, J.; Leiva, J.N.; Owen, J. Applications of Small UAV Systems for Tree and Nursery Inventory Management. In Proceedings of the 12th International Conference on Precision Agriculture, Sacramento, CA, USA, 20–23 July 2014. [Google Scholar]
  35. Yang, C.; Prasher, S.; Landry, J.; Perret, J.; Ramaswamy, H. Recognition of weeds with image processing and their use with fuzzy logic for precision farming. Can. Agric. Eng. 2000, 42, 195–200. [Google Scholar]
  36. Blackmore, S. The interpretation of trends from multiple yield maps. Comput. Electron. Agric. 2000, 26, 37–51. [Google Scholar] [CrossRef]
Figure 1. Part of the trial design, where the black arrows indicate the flight direction of the unmanned aerial vehicle (UAV) and illustrate the field section captured in one image (in general, three plots). The abbreviations RP and TP indicate row planting and triangular (equidistant) planting.
Figure 2. Flowchart of the theoretical image processing steps in MATLAB: the color scatter plot of the original RGB image is first transformed into a new, wider scatter plot by the decorrstretch contrast enhancement procedure, followed by segmentation of the light green pixels via a threshold selection in the HSV color model (with the three channels hue, saturation and value) and, finally, counting of the plants. Additionally, the L*a*b* color model (where L stands for luminosity and a and b are the axes spanning the color plane) was used to select the green pixels of the original image to detect ground cover.
Figure 3. Nadir view of an RGB image acquired with a drone from an equidistantly planted plot.
Figure 4. Example of ground cover segmentation of the green area. The black-filled area represents the amount of green pixels in the same plot as shown in Figure 3.
Figure 5. Ground cover of four different cultivars at the BBCH 13–15 development stages: segmentation of the percentage of green pixels (%) for different nitrogen application levels (50, 150 and 250 kg N ha−1) and different planting systems, row planting (RP) and triangular planting (TP).
Figure 6. Correlation of the percentage of green pixels (%), indicating the ground cover, with the number of plants counted in the experimental plots.
Figure 7. Image after adapting the contrast with a decorrelation stretch in MATLAB. The function highlights elements by enhancing the color differences, indicated for plot number 12. MATLAB offers various parameters for the decorrelation stretch; we used the command P_contrast = decorrstretch(P, 'Tol', 0.01), where P is the image and 'Tol' with the value 0.01 specifies the linear contrast stretch, which further expands the color range and clips the stretch to limits because pixel values must lie in the range [0, 255].
Figure 8. Correlation of visually counted plants, serving as a reference, with the digitally recorded plant number. The ground cover indicated by the green pixel percentage ranged between 76% and 87% for all cultivars. Significant differences between the nitrogen levels were not observed at the investigated growth stages.
Figure 9. Image after using the Color Thresholder app of MATLAB, selecting the yellow and lime-green pixels of the decorrstretch-enhanced image, as shown for plot number 12.
Figure 10. (a) Box plots illustrating the percentage differences between in situ and image-based plant counts for all plots, depending on the area opening command bwareaopen(BW, p), which allows plants to be counted in a standardized way. A has an open area of p = 3; B shows the range for p = 5 and C for p = 10. The open area is defined by a pixel-count limit: connected components below the limit are not counted. The percentage difference is calculated as difference (%) = (digitally measured plant number − visually counted plant number)/visually counted plant number × 100. The bold line inside each box shows the median, with the upper and lower lines of the box representing the 75th and 25th percentiles; the circles outside the boxes are outliers. (b) Box plots representing the distribution of the percentage differences between visually and digitally counted plant numbers for all plots, with the same conventions as in (a).
Figure 11. Bar chart illustrating the mean percentage differences between visually and digitally determined plant numbers of four cultivars in triangular planting (TP) and row planting (RP) systems. The percentage difference is calculated as (digitally measured plant number − visually counted plant number)/visually counted plant number × 100.
Table 1. Intended use, maturity class and number of the different maize cultivars.

Cultivar     Usage           Maturity Group   FAO Number
Cannavaro    biogas          very late        S 310
Lapriora     corn            early            K 190
Saludo       silage, corn    early            S10, K210
Vitallo      silage          late             S270
