
Using Unmanned Aerial Systems (UAS) and Object-Based Image Analysis (OBIA) for Measuring Plant-Soil Feedback Effects on Crop Productivity

1 Faculty of Geosciences, Utrecht University, P.O. Box 80.115, 3508 TC Utrecht, The Netherlands
2 Faculty of Forestry, University of British Columbia, 2424 Main Mall, Vancouver, BC V6T 1Z4, Canada
3 Laboratory of Geo-Information Science and Remote Sensing, Wageningen University and Research, P.O. Box 47, 6700 AA Wageningen, The Netherlands
4 Soil Biology Group, Environmental Sciences, Wageningen University and Research, P.O. Box 47, 6700 AA Wageningen, The Netherlands
* Author to whom correspondence should be addressed.
Drones 2019, 3(3), 54; https://doi.org/10.3390/drones3030054
Received: 24 May 2019 / Revised: 26 June 2019 / Accepted: 27 June 2019 / Published: 30 June 2019
(This article belongs to the Special Issue UAV/Drones for Agriculture and Forestry)

Abstract

Unmanned aerial system (UAS)-acquired high-resolution optical imagery and object-based image analysis (OBIA) techniques have the potential to provide spatial crop productivity information. In general, plant-soil feedback (PSF) field studies are time-consuming and laborious, which constrains the scale at which they can be performed. Non-destructive methodologies are needed to enable research under actual field conditions and at realistic spatial and temporal scales. In this study, the influence of six winter cover crop (WCC) treatments (monocultures of Raphanus sativus, Lolium perenne, Trifolium repens, and Vicia sativa, and two species mixtures) on the productivity of the succeeding endive (Cichorium endivia) summer crop was investigated by estimating crop volume. A three-dimensional surface model and terrain model were photogrammetrically reconstructed from UAS imagery acquired on 1 July 2015 in Wageningen, the Netherlands. Multi-resolution image segmentation (MIRS) and template matching algorithms were used in an integrated workflow to detect individual crops (accuracy = 99.8%) and delineate the C. endivia crop-covered area (accuracy = 85.4%). Mean crop area (R = 0.61) and crop volume (R = 0.71) estimates had strong positive correlations with in situ measured dry biomass. Productivity differences resulting from the WCC treatments were greater for estimated crop volume than for in situ biomass; the legacy of Raphanus was most beneficial for estimated crop volume, whereas the perennial ryegrass L. perenne treatment resulted in a significantly lower production of C. endivia. The developed workflow has potential for PSF studies as well as precision farming due to its flexibility and scalability. Our findings provide insight into the potential of UAS for determining crop productivity on a large scale.
Keywords: remote sensing; unmanned aerial systems; object-based image analysis; plant-soil feedback; plant productivity; template matching; segmentation; precision agriculture

1. Introduction

Plant-soil feedback (PSF) describes the reciprocal interactions between plants and soil biota [1]. Plants and their associated microorganisms influence soil properties, such as mineral nitrogen concentration and organic matter content, as well as the abundance of plant pathogens and mutualists [2]. The net effect of these changes can enhance or suppress the performance of succeeding plants relative to fallow soil [3]. Understanding PSF mechanisms is necessary to avoid the risk of negative PSF and to generate potential positive PSF by applying well-matching crop rotations in agricultural systems [4,5]. Most PSF field studies depend on time-consuming and laborious destructive sampling methods, which constrains the scale at which these studies can be performed [6]. Therefore, there is a need to develop new non-destructive methodologies that enable the investigation of PSF mechanisms at a high resolution under actual field conditions and at realistic spatial and temporal scales [4,7,8].
Remote sensing is used to study ecological phenomena and for applications in precision farming, which involves the use of sensors and information technologies to bring together data from multiple sources in support of decisions on crop productivity and a more efficient use of farm inputs, such as fertilizers and herbicides [9,10,11,12]. Data can be acquired by unmanned aerial systems (UAS), otherwise known as drones, with a more flexible spatial and temporal resolution compared to other remote sensing platforms [13,14]. Remote sensing technologies with high spatial and temporal resolutions allow testing PSF in the field, which reduces and ultimately removes the need to destructively sample crops and generalize from a limited number of samples [8,15]. This is relevant not only for PSF research but also for assessment of crop productivity, which is based on multiple parameters such as dry biomass, height, and volume [11].
Crop parameters such as canopy cover, canopy biochemical composition, pigment concentration, and vegetation indices can be derived using high-resolution cameras and hyperspectral sensors [16,17]. Recent work showed that a UAS carrying a hyperspectral sensor can be used to characterize plant traits and assess PSF effects of different cover crop treatments on a succeeding main grain crop [8]. Three-dimensional data can be obtained from light detection and ranging (LiDAR) or digital aerial photogrammetry [16]. Both LiDAR [16,17,18] and digital aerial photogrammetry [19,20,21] enable direct measurement of crop dimensions and indirect measurement of above-ground biomass and biophysical parameters. Although digital aerial photogrammetry cannot return points from below the canopy, multispectral UAS are more affordable and require a smaller payload and battery capacity than hyperspectral and LiDAR UAS [22,23,24]. Another advantage is that digital aerial photogrammetry provides high-resolution orthophoto mosaics in addition to accurate digital surface models.
There is potential for integrating crop discrimination methods, which take advantage of the high-resolution orthophoto mosaic, with estimation of crop dimensions. This enables incorporation of meaningful individual crop or plot objects, rather than pre-defined sample plots, in further calculation of crop dimensions. Object detection methods, such as template matching, are used in remote sensing to determine the number of objects in an image and predict their positions [25]. Template matching is among the earliest and simplest of such methods but usually results in a high commission error in more complex images [26]. In recent work, a workflow was developed in which template matching was combined with an object-based image analysis (OBIA) object detection approach to overcome this problem [26,27]. A common basis of OBIA is image segmentation, but it also incorporates other concepts that have been used for decades in remote sensing, such as feature extraction, edge detection, and classification [28,29]. A crucial advantage of an object-based approach over a pixel-based approach is the additional spatial information available for objects, such as distance, morphology, and topology [29,30,31]. Grouped pixels can characterize fields remarkably better than single pixels in high-resolution imagery [32], and several studies have shown that OBIA produces higher accuracies than pixel-by-pixel analysis for thematic mapping [33,34,35,36,37].
In this study, the suitability of UAS-based optical remote sensing to measure differences of crop productivity of a leafy vegetable crop between multiple soil treatments (different cover crop treatments preceding the main crop) was assessed using an OBIA approach. Individual main crop detection and crop area segmentation and classification were performed based on an orthophoto mosaic. Mean crop volume was calculated for the experimental plots using the segments, detected crops, and a digital surface model. The developed methodology demonstrated the suitability of UAS and digital aerial photogrammetry for measuring crop productivity during crop growth in a non-destructive way and for application in large-scale PSF research.

2. Materials and Methods

2.1. Study Area

The study area of approximately 0.3 ha, property of Wageningen University & Research, is located in the eastern part of the Netherlands at around 9 m above sea level (51°59′41.72″ N, 5°39′17.89″ E; Figure 1). A field experiment was established in 2015 to investigate the legacies of different monocultures and mixtures of plant species grown during autumn and winter, i.e., winter cover crop (WCC) treatments, on the succeeding endive (Cichorium endivia) summer crop; more details are provided by Barel et al. (2018) [2]. The study area includes 60 experimental plots of 3 × 3 m, each planted with 10 × 10 endive individuals. The endive plants were four weeks old when planted in early May. Prior to the endive, the different cover crop treatments were grown on the experimental plots from August to February, after which the cover crops were incorporated into the soil. The cover crop treatments comprised monocultures of radish (Raphanus sativus; Rs), perennial ryegrass (Lolium perenne; Lp), white clover (Trifolium repens; Tr), and common vetch (Vicia sativa; Vs), and the species mixtures Lp+Tr and Rs+Vs. In addition, a fallow treatment was used as a reference (Table 1). Each treatment was replicated five times and positioned randomly according to a randomised block design with five parallel blocks.

2.2. Data

Summer main crop biomass was obtained by destructive sampling on 6 July 2015, at the time of harvest, using three plants per experimental unit. The samples were equally distributed, leaving out the outer rows of plants to avoid potential edge effects [2]. Dry biomass (g m−2) was determined by weighing vegetation after it was dried in an oven at 70 °C for 48 h [8]. A UAS flight was undertaken close to harvest on 1 July 2015 at an average speed of 4 m s−1 and an altitude of 60 m. Aerial imagery was acquired along parallel flight lines, with images of 4608 × 3464 px having a forward and lateral overlap of approximately 80%, using a Panasonic GX1 camera with a 14 mm pancake lens. The camera was mounted on an Aerialtronics Altura AT8 octocopter carrying an XSens MTi-G-700 GPS-Inertial Navigation System (INS) [38]. RTK-GPS equipment (Topcon FC-336) was used to register the field’s outer corners, identifiable on the aerial imagery, as ground control points.

2.3. Processing and Analysis

The processing approach developed in this study comprised five steps: photogrammetric pre-processing, template matching, object-based image analysis (OBIA), data fusion, and evaluation (Figure 2). First, UAS derived imagery was pre-processed using digital photogrammetry. Second, multi-resolution image segmentation (MIRS) and template matching were performed to segment main crop covered area and detect individual crops, respectively, on the orthophoto mosaic. Third, data fusion included: stratification of detected crops by main crop area, aggregation of CHM values on the OBIA-object level, and averaging objects’ area and volume by the amount of crops they represent. Finally, detection accuracies were determined and a comparison between crop volume estimates and field samples was undertaken.

2.3.1. Pre-Processing

Geometric calibration and further photogrammetric processing were performed as described by Suomalainen et al. (2014) [38]. The calibration was performed using the Agisoft Lens (v0.4.1) calibration software, which incorporates the shift and rotation of the photo camera and the relative position of the GPS-INS to the UAS frame. Photogrammetric processing followed Agisoft PhotoScan Pro’s workflow (v1.1.2), incorporating a Structure-from-Motion algorithm to find conjugate tie-points between overlapping images and a block bundle adjustment to fit the camera positions and tie-points together [38,39,40]. After photogrammetric processing, four ground control points were used in a final georectification procedure to correct for geometric distortions [41]. A digital surface model and an orthophoto mosaic were built with resolutions of 1.5 cm and 2.9 cm, respectively, covering a surface of 2.25 hectares.
A digital terrain model was created by removing the crop-covered areas, including an additional 30-cm buffer, from the digital surface model and interpolating the non-crop areas. The raster cells were converted to point features, of which 0.1% were randomly selected prior to interpolation. The natural neighbor interpolation technique was used as it is appropriate for point features with an uneven distribution and density; it limits overshoots of local high values and undershoots of local low values [42]. The digital terrain model was subtracted from the digital surface model to calculate the crop height model (CHM; Figure 3). Imagery of the study area was acquired during two separate flights, which resulted in deviating crop heights along the border of the flights. Three experimental plots affected by these boundary-related errors were excluded from further analysis.
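The terrain-interpolation step above can be sketched as follows. This is a minimal illustration, not the study's actual implementation: SciPy's linear `griddata` stands in for the natural neighbor method (which SciPy does not provide), and the function name, sampling fraction default, and grid handling are assumptions for the sketch.

```python
import numpy as np
from scipy.interpolate import griddata

def crop_height_model(dsm, crop_mask, sample_frac=0.001, seed=0):
    """Derive a CHM by interpolating a terrain model under crop areas.

    dsm: 2-D array of surface heights; crop_mask: True where (buffered)
    crop cover was removed. A random subset of the remaining ground
    cells is interpolated back to the full grid, and the resulting
    terrain model is subtracted from the surface model.
    """
    rows, cols = np.indices(dsm.shape)
    idx = np.flatnonzero(~crop_mask)                  # ground cells only
    rng = np.random.default_rng(seed)
    n = max(int(sample_frac * idx.size), 4)           # enough points to triangulate
    pick = rng.choice(idx, size=n, replace=False)
    pts = np.column_stack((rows.ravel()[pick], cols.ravel()[pick]))
    vals = dsm.ravel()[pick]
    # linear interpolation as a stand-in for natural neighbor
    dtm = griddata(pts, vals, (rows, cols), method="linear")
    # outside the convex hull of the samples, fall back to nearest neighbor
    nearest = griddata(pts, vals, (rows, cols), method="nearest")
    dtm = np.where(np.isnan(dtm), nearest, dtm)
    return dsm - dtm                                  # crop height model
```

On a flat terrain the CHM recovers the crop heights directly; on sloping terrain the quality depends on how well the sampled ground points bracket the crop-covered gaps, which is why the paper stresses sufficient vegetation-free area.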

2.3.2. Template Matching

Crops were detected on the orthophoto mosaic using Trimble eCognition Developer’s (v9.3) template matching workflow. The workflow includes building an optimised template in an iterative process and a matching procedure incorporating normalized cross-correlation [43]. The procedure can only be executed on a single band; the green band was selected because it showed the highest contrast between crop objects and background. An initial average template was generated based on eight samples selected according to the guidelines described by Tiede et al. (2017) [26]; crops with different shapes, sizes, shadow directions, and light conditions were incorporated in the template (Figure 4). The matching procedure requires a cross-correlation coefficient threshold, which was set low (R = 0.55) to prevent underdetection.
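The matching step reduces to scanning the single band with a normalized cross-correlation window and keeping positions above the threshold. The sketch below is a plain numpy illustration of that idea (eCognition's iterative template optimisation is not reproduced, and the function name is ours):

```python
import numpy as np

def match_template_ncc(band, template, threshold=0.55):
    """Return (row, col, r) centre positions where the normalized
    cross-correlation between the template and the image window
    meets the threshold. Brute-force scan for clarity, not speed."""
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    hits = []
    for i in range(band.shape[0] - th + 1):
        for j in range(band.shape[1] - tw + 1):
            w = band[i:i + th, j:j + tw]
            wc = w - w.mean()
            denom = t_norm * np.sqrt((wc ** 2).sum())
            if denom == 0:          # flat window: correlation undefined
                continue
            r = float((wc * t).sum() / denom)
            if r >= threshold:
                hits.append((i + th // 2, j + tw // 2, r))
    return hits
```

A low threshold such as 0.55 trades commission errors for fewer omissions, which is exactly why the stratification by OBIA objects (Section 2.3.4) is needed afterwards.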

2.3.3. Object-Based Image Analysis

The region-growing MIRS algorithm was used to create multi-pixel object primitives based on spatial and spectral features [44,45]. The algorithm takes all individual pixels of the orthophoto mosaic as a starting point and merges similar adjacent regions considering a user-defined threshold, i.e., scale parameter, for the maximum internal heterogeneity within the features [30]. Defining the optimal scale parameter is not a standardized process [35]. The parameters, including the scale parameter, were kept at their default values because visual inspection showed that adjusting individual parameters did not lead to improvements, as presented by Nuijten (2018) [46].
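To give a feel for the region-growing idea, the sketch below flood-fills 4-connected neighbours whose value stays within a tolerance of the region seed. This is a deliberate simplification: eCognition's MIRS merges object primitives pairwise by a combined spectral and shape heterogeneity criterion controlled by the scale parameter, which the single-band tolerance here only loosely imitates.

```python
import numpy as np
from collections import deque

def grow_segments(image, max_diff=10.0):
    """Label connected regions of a single band by flood fill.

    max_diff plays the role of a (much simpler) scale parameter:
    a pixel joins a region if its value differs from the region
    seed by less than max_diff. Returns an integer label image.
    """
    labels = np.full(image.shape, -1, dtype=int)
    current = 0
    for start in zip(*np.nonzero(labels < 0)):   # every pixel is a candidate seed
        if labels[start] >= 0:                   # already absorbed by a region
            continue
        seed = image[start]
        q = deque([start])
        labels[start] = current
        while q:
            r, c = q.popleft()
            for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                if (0 <= nr < image.shape[0] and 0 <= nc < image.shape[1]
                        and labels[nr, nc] < 0
                        and abs(image[nr, nc] - seed) < max_diff):
                    labels[nr, nc] = current
                    q.append((nr, nc))
        current += 1
    return labels
```

Raising the tolerance merges more pixels into fewer, more heterogeneous objects, mirroring the effect of a larger scale parameter.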
The object primitives are aggregated during a classification process based on a vegetation index and hierarchy [30]. The visible-band-based triangular greenness index (TGI) developed by Hunt et al. (2011) [47] was applied, which is able to capture crops clearly [48]. Object primitives were merged with adjacent primitives based on an optimum TGI threshold value (TGI = 38) which was found by trial-and-error.
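In its simplified digital-number form, the TGI of Hunt et al. (2011) is the area of a triangle spanned by the red (670 nm), green (550 nm), and blue (480 nm) band values, reducing to TGI = G − 0.39R − 0.61B. A small sketch of the index and the thresholding step follows; note the threshold of 38 from the text presumes the same 8-bit digital-number scaling as the orthophoto mosaic, which is an assumption here.

```python
import numpy as np

def tgi(red, green, blue):
    """Triangular greenness index (Hunt et al., 2011), simplified
    form for band digital numbers: TGI = G - 0.39*R - 0.61*B."""
    return green - 0.39 * red - 0.61 * blue

def classify_vegetation(red, green, blue, threshold=38.0):
    """Binary crop/non-crop decision using the trial-and-error
    threshold reported in the text (TGI = 38)."""
    return tgi(red, green, blue) >= threshold
```

Applied per object primitive (e.g. to its mean band values) rather than per pixel, this drives the merging of adjacent vegetated primitives described above.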
False positives were removed in an automated workflow based on the proximity of crops within plots. Buffers of 10 cm were created around all classified objects; thereafter, objects with intersecting buffers were merged. Only vector objects within a merged region larger than 1 m2 were included in further analysis.
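With Shapely geometries, the buffer-merge-filter rule can be expressed compactly. A sketch under the assumption that object polygons are in metres (the function name and thresholds mirror the text; the actual workflow ran inside eCognition):

```python
from shapely.geometry import Point  # shapely >= 1.8 assumed
from shapely.ops import unary_union

def remove_false_positives(objects, buffer_m=0.10, min_region_m2=1.0):
    """Discard classified objects whose merged neighbourhood is small.

    Each polygon is buffered by 10 cm, intersecting buffers are
    dissolved into regions, and objects belonging to a region of
    1 m2 or less are treated as false positives and dropped.
    """
    buffered = [o.buffer(buffer_m) for o in objects]
    merged = unary_union(buffered)
    regions = (list(merged.geoms)
               if merged.geom_type == "MultiPolygon" else [merged])
    keep = []
    for obj, buf in zip(objects, buffered):
        region = next(r for r in regions if r.intersects(buf))
        if region.area > min_region_m2:
            keep.append(obj)
    return keep
```

The 1 m2 cut-off exploits the plot layout: genuine crops sit in dense 10 × 10 grids, so their merged buffers always exceed the threshold, while isolated misclassifications do not.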

2.3.4. Data Fusion

A concept of stratified template matching [26,27] was applied using objects resulting from image segmentation and classification to exclude irrelevant areas. CHM height values were averaged at the object level; a pixel’s value was assigned to a vector object when its centroid intersected the object. Mean crop volume was calculated from the area of the object, its average CHM height, and the number of detected crops within the object:
Crop volume = (height × area)/crop count
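The formula above is a straightforward per-object aggregation. In the sketch below the centroid-in-object assignment is simplified to a boolean pixel mask, and the function name and unit choices (heights in cm, pixel area in cm2) are ours:

```python
import numpy as np

def mean_crop_volume(chm, object_mask, pixel_area_cm2, crop_count):
    """Mean volume (cm3) per crop for one vector object.

    chm: crop height model in cm; object_mask: True where a pixel's
    centroid falls inside the object. Volume = average height times
    object area, divided by the number of detected crops inside."""
    heights = chm[object_mask]
    area_cm2 = heights.size * pixel_area_cm2   # object area from pixel count
    mean_height = float(heights.mean())        # average crop height (cm)
    return mean_height * area_cm2 / crop_count
```

Dividing by the crop count is what makes objects of different sizes, representing anywhere from 1 to 101 crops, comparable on a per-crop basis.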

2.3.5. Evaluation

The results of MIRS are assessed using a manually drawn binary wall-to-wall map of crop covered area, providing details on areas correctly segmented, oversegmented, and undersegmented. To evaluate the results of MIRS and stratified template matching the following measures were determined: omission (false negatives) and commission errors (false positives), detection rate, and accuracy index (AI). The AI, which quantifies the trade-off between omission and commission error, was calculated as [49]:
AI = 100(1 − (FP + FN)/REF)
where FP and FN are false positives and false negatives, respectively, and REF is the number of reference crops in the study area.
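The accuracy index is a one-line computation; a direct implementation of the formula above (function and argument names are ours):

```python
def accuracy_index(false_pos, false_neg, n_reference):
    """Accuracy index AI = 100 * (1 - (FP + FN) / REF): the percentage
    trade-off between commission (FP) and omission (FN) errors
    relative to the number of reference crops."""
    return 100.0 * (1 - (false_pos + false_neg) / n_reference)
```

Because commission and omission errors both subtract from the index, a detector cannot inflate its AI by over- or under-detecting.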
Pearson’s correlation coefficient was calculated to determine correlation on a plot level between the estimated values and in situ measured biomass. It assumes values to be continuous, normally distributed, and linearly related [50]. Tukey’s honestly significant difference (HSD) test was applied to the estimated and in situ data to determine whether population means between WCC treatments were significantly different (p < 0.05) [51]. Tukey’s HSD test is a common method for multiple pairwise comparisons with data meeting the assumption of normality and homogeneity of variance [52].
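Pearson's correlation coefficient used here is the standard centred dot-product form; a self-contained numpy version (the function name is ours) is:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation: covariance of x and y divided by the
    product of their standard deviations, computed from centred sums."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc * yc).sum() / np.sqrt((xc ** 2).sum() * (yc ** 2).sum()))
```

In practice one would reach for library routines instead; recent SciPy versions, for example, provide `scipy.stats.pearsonr` and a `tukey_hsd` function for the pairwise treatment comparisons described above.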

3. Results

Endive crops were well covered by the OBIA-based vector objects, with a detection rate of 88.3% and an accuracy of 85.4% (Table 2). The commission error was remarkably low; inclusion of bare soil or grass between the crops in the segmentation results was scarce, as illustrated by the four experimental plots in Figure 5. Crops surrounded by around 3–6 cm of bare soil or grass were represented by an individual vector object. Objects represented between 1 and 101 individual crops, with an average of 48.8 individuals. Before automated removal of commission errors, 201 vector objects were segmented correctly while 138 objects were located outside the experimental plots.
Template matching resulted in 21,500 matches; after integration with the vector objects, a high detection rate (99.9%) and overall accuracy (99.8%) were achieved (Table 2).
Figure 6 shows that the statistical relation between in situ crop biomass and crop volume is stronger than that between in situ crop biomass and crop area. The figure includes the Pearson correlation coefficients for crop area (R = 0.61) and crop volume (R = 0.71), indicating positive and strong relations with in situ biomass.
In situ crop biomass and estimated crop volume showed a comparable pattern, but differences between treatments are clearer for estimated crop volume (Figure 7). Based on mean crop volume estimates, C. endivia crops receiving the Rs WCC treatment appeared to be most productive, with a crop volume of 11,281 cm3 (Figure 7, right). Only the Lp and Lp+Tr cover crop treatments resulted in C. endivia plants that were significantly smaller compared to plants grown after the cover crop Rs (Tukey’s test, p < 0.05). The Lp treatment resulted in the lowest C. endivia crop volume of 4,369 cm3, which is significantly different from all other treatments, including its mixture Lp+Tr. Lp is followed by Lp+Tr and Tr, with volumes of 8,351 cm3 and 9,242 cm3, respectively. Although the difference between the monoculture Rs and its mixture Rs+Vs is clearly visible in the graph, it is not significant.

4. Discussion

Effects of WCC treatments on crop productivity can be derived from UAS acquired high-resolution RGB imagery using an integrated OBIA approach, including MIRS and template matching algorithms. This enables PSF research under actual field conditions and at realistic spatial and temporal scales. Labor-intensive destructive sampling methods can be replaced by UAS-based remote sensing approaches. The developed scalable workflow is also interesting for applications in farming as timely and spatial information on crop productivity can be gathered, which can support evidence-based operational management decisions.

4.1. Object-Based Image Analysis

MIRS-based vector objects were a good representation of main crop area, while non-crop areas were correctly excluded. The experimental plots were isolated, which made calculation of crop volume on the plot-level relatively simple. In a non-experimental setting this set-up is likely to be less ideal, but the vector objects can be clipped to investigate local productivity differences within large fields. To investigate productivity differences between different sized areas, vector objects should be averaged based on the number of crops they represent. Individual crops were detected with a high accuracy using the stratified template matching approach, which averts the problem of overdetection typical for template matching [25].
The MIRS algorithm was not capable of building objects for individual crops because object primitives were often representing multiple crops. When segmenting individual crops is desired, we recommend further research on the development of new algorithms. Marker-based segmentation algorithms using detected crops as starting points are worth investigating.

4.2. Crop Height Model

The accuracy of the digital terrain model is highly important for estimating crop volume, as crop heights are around 10–19 cm. The required digital terrain model can only be derived accurately when there are sufficient areas not covered by vegetation and the study area has a planar surface [53]. The accuracy of height measurements from consumer-grade GPS-INS instruments can be insufficient, which likely caused the deviations of heights along the boundary of the two flights [38,54]. Imagery is preferably acquired in a single flight to avoid problems related to such height deviations. Distortions are common for surface models based on digital aerial photogrammetric techniques [55]: underestimated height values occur at the edges of spatial objects, while overestimated values occur at the center [55]. This suggests that volume calculations for vector objects representing individual crops are inappropriate, and it supports using a method that aggregates height values over larger objects representing multiple crops.
A stronger positive correlation was found between mean crop volume estimates and dry biomass than between mean crop area estimates and dry biomass (Figure 6). This shows that using additional three-dimensional photogrammetric data for object-based crop productivity analysis is preferred when DTM accuracies allow it.

4.3. Plant-Soil Feedback

A comparable pattern of C. endivia productivity in response to the legacies of WCC treatments was found between the UAS-based results and the previous study focused on in situ measurements [2]. That study reported that C. endivia biomass increased in response to the legacies of WCC treatments as compared to fallow soil, with the exception of the legacy of Lp. The non-destructive analysis method presented in the current paper showed larger differences between Lp and other WCC treatment effects, as well as between monocultures and their mixtures, in comparison to the destructive method. This demonstrates that UAS-based methods are highly capable of providing detailed information about crop productivity and PSF effects.
More information about relevant PSF pathways influencing subsequent plant productivity is provided by Barel et al. (2018) [2], based on in situ measurements of soil organic matter, potential mineral nitrogen availability, and plant-feeding nematodes. Remote sensing methods cannot determine biotic and abiotic soil properties or prove directly which soil properties account for differences between treatments. However, UAS-based hyperspectral remote sensing enables quantification of plant traits that are related to plant legacies in the soil, as plants respond to altered soil conditions [8,56].

4.4. Scalability

PSF research and practices in precision farming mainly rely on laborious manual procedures, which include destructive sampling and spectral measurements of plants and soil [1,6,57]. UAS data, on the other hand, can be acquired with a flexible temporal and spatial resolution, and systems can be deployed with increasing affordability [14]. This research project confirms that UAS-based analysis can complement or replace destructive methods for measuring crop productivity and allows more frequent observations. A non-destructive method allows studying PSF effects and crop productivity over larger areas and removes the need to destructively sample crops and generalize from a limited number of samples.
Photogrammetric processing can be done automatically [38] and the developed OBIA workflow is straightforward. However, the terrain model’s suitability, the appropriate classification parameters, and the crop template might vary with different plant species and physical environmental conditions. The crops in this study overlapped, had irregular shapes, and included shadows; given that the workflow performed well under these challenging conditions, MIRS and template matching are expected to deliver similar results for other crop species.

5. Conclusions

UAS in combination with digital aerial photogrammetry and OBIA methods enable crop productivity analysis. Stratified template matching and MIRS resulted in a crop detection accuracy of 99.8% and well-delineated main crop areas with 85.4% accuracy. Within an integrated workflow, mean C. endivia crop volume estimates had a strong positive correlation (R = 0.71) with in situ measured dry biomass, stronger than that of mean crop area estimates (R = 0.61). The Lp WCC treatment resulted in a significantly lower production of C. endivia compared to the other WCC treatments. Productivity differences resulting from the WCC treatments were larger for C. endivia crop volume than for field-measured biomass, with crop volume being an important parameter given that the crop is primarily marketed fresh. Although in situ measured biomass of C. endivia was not promoted more by Rs than by most other WCC legacies, the legacy of Rs was most beneficial for estimated crop volume.

Author Contributions

Conceptualization, R.J.G.N., L.K., and G.B.D.D.; Pre-processing, R.J.G.N.; Formal analysis, R.J.G.N.; Investigation, R.J.G.N.; Methodology, R.J.G.N. and L.K.; Resources, L.K. and G.B.D.D.; Supervision, L.K.; Visualization, R.J.G.N.; Writing—original draft, R.J.G.N.; Writing—review & editing, R.J.G.N., L.K., and G.B.D.D.

Funding

This work was supported by an NWO-ALW VIDI to GBDD (grant No. 864.11.003).

Acknowledgments

We would like to thank Janna Barel for the initial establishment of the field experiment, and the staff from Unifarm, Irene Garcia Gonzalez, and Dominika Piwcewicz for their help with the field work. We thank Juha Suomalainen and Bob van der Meij for their contribution to the data acquisition and image pre-processing. This article was based on the dissertation of Rik Nuijten for the program MSc. Geographical Information Management and Applications (GIMA) [46]; we would like to thank all coordinators and supervisors involved. We thank all anonymous reviewers and journal editorial staff for their efforts in improving the quality of this manuscript.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Cortois, R.; Schröder-Georgi, T.; Weigelt, A.; van der Putten, W.; De Deyn, G. Plant–soil feedbacks: Role of plant functional group and plant traits. J. Ecol. 2016, 104, 1608–1617. [Google Scholar] [CrossRef]
  2. Barel, J.; Kuyper, T.; de Boer, W.; Douma, J.; De Deyn, G. Legacy effects of diversity in space and time driven by winter cover crop biomass and nitrogen concentration. J. Appl. Ecol. 2018, 55, 299–310. [Google Scholar] [CrossRef]
  3. Pernilla Brinkman, E.; Van der Putten, W.H.; Bakker, E.; Verhoeven, K.J.F. Plant-soil feedback: Experimental approaches, statistical analyses and ecological interpretations. J. Ecol. 2010, 98, 1063–1073. [Google Scholar] [CrossRef]
  4. van der Putten, W.; Bardgett, R.; Bever, J.; Bezemer, T.; Casper, B.; Fukami, T.; Kardol, P.; Klironomos, J.; Kulmatiski, A.; Schweitzer, J.; et al. Plant-soil feedbacks: The past, the present and future challenges. J. Ecol. 2013, 101, 265–276. [Google Scholar] [CrossRef]
  5. Dias, T.; Dukes, A.; Antunes, P.M. Accounting for soil biotic effects on soil health and crop productivity in the design of crop rotations. J. Sci. Food Agric. 2015, 95, 447–454. [Google Scholar] [CrossRef]
  6. Nebiker, S.; Annen, A.; Scherrer, M.; Oesch, D. A Light-weight Multispectral Sensor for Micro UAV—Opportunities for Very High Resolution Airborne Remote Sensing. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2008, 37, 1193–1200. [Google Scholar]
  7. Kulmatiski, A.; Kardol, P. Getting Plant—Soil Feedbacks out of the Greenhouse: Experimental and Conceptual Approaches. In Progress in Botany; Springer: Berlin/Heidelberg, Germany, 2008; Volume 69, pp. 449–472. [Google Scholar]
  8. van der Meij, B.; Kooistra, L.; Suomalainen, J.; Barel, J.; De Deyn, G. Remote sensing of plant trait responses to field-based plant–soil feedback using UAV-based optical sensors. Biogeosciences 2017, 14, 733–749. [Google Scholar] [CrossRef]
  9. Anderson, K.; Gaston, K.J. Lightweight unmanned aerial vehicles will revolutionize spatial ecology. Front. Ecol. Environ. 2013, 11, 138–146. [Google Scholar] [CrossRef]
  10. Dixon, J.; McCann, M. Precision Agriculture in the 21st Century: Geospatial and Information Technologies in Crop Management; The National Academies Press: Washington, DC, USA, 1997. [Google Scholar]
  11. Mulla, D.J. Twenty five years of remote sensing in precision agriculture: Key advances and remaining knowledge gaps. Biosyst. Eng. 2013, 114, 358–371. [Google Scholar] [CrossRef]
  12. Tian, L. Development of a sensor-based precision herbicide application system. Comput. Electron. Agric. 2002, 36, 133–149. [Google Scholar] [CrossRef]
  13. Feng, Q.; Liu, J.; Gong, J. UAV Remote Sensing for Urban Vegetation Mapping Using Random Forest and Texture Analysis. Remote Sens. 2015, 7, 1074–1094. [Google Scholar] [CrossRef]
  14. Andújar, D.; Ribeiro, A.; Fernández-Quintanilla, C.; Dorado, J. Using depth cameras to extract structural parameters to assess the growth state and yield of cauliflower crops. Comput. Electron. Agric. 2016, 122, 67–73. [Google Scholar] [CrossRef]
  15. Faye, E.; Rebaudo, F.; Yánez-Cajo, D.; Cauvy-Fraunié, S.; Dangles, O. A toolbox for studying thermal heterogeneity across spatial scales: From unmanned aerial vehicle imagery to landscape metrics. Methods Ecol. Evol. 2016, 7, 437–446. [Google Scholar] [CrossRef]
  16. Guo, Q.; Wu, F.; Pang, S.; Zhao, X.; Chen, L.; Liu, J.; Xue, B.; Xu, G.; Li, L.; Jing, H.; et al. Crop 3D—A LiDAR based platform for 3D high-throughput crop phenotyping. Sci. China Life Sci. 2018, 61, 328–339. [Google Scholar] [CrossRef] [PubMed]
  17. Li, L.; Zhang, Q.; Huang, D. A Review of Imaging Techniques for Plant Phenotyping. Sensors 2014, 14, 20078–20111. [Google Scholar] [CrossRef] [PubMed]
  18. Christiansen, M.P.; Laursen, M.S.; Jørgensen, R.N.; Skovsen, S.; Gislum, R. Designing and Testing a UAV Mapping System for Agricultural Field Surveying. Sensors 2017, 17, 2703. [Google Scholar] [CrossRef] [PubMed]
  19. Schirrmann, M.; Giebel, A.; Gleiniger, F.; Pflanz, M.; Lentschke, J.; Dammer, K. Monitoring Agronomic Parameters of Winter Wheat Crops with Low-Cost UAV Imagery. Remote Sens. 2016, 8, 706. [Google Scholar] [CrossRef]
  20. Mathews, A.; Jensen, J.; Mathews, A.J.; Jensen, J.L.R. Visualizing and Quantifying Vineyard Canopy LAI Using an Unmanned Aerial Vehicle (UAV) Collected High Density Structure from Motion Point Cloud. Remote Sens. 2013, 5, 2164–2183. [Google Scholar] [CrossRef]
  21. Ballesteros, R.; Ortega, J.F.; Hernandez, D.; Moreno, M.A. Onion biomass monitoring using UAV-based RGB imaging. Precis. Agric. 2018, 19, 840–857. [Google Scholar] [CrossRef]
  22. White, J.C.; Wulder, M.A.; Vastaranta, M.; Coops, N.C.; Pitt, D.; Woods, M. The utility of image-based point clouds for forest inventory: A comparison with airborne laser scanning. Forests 2013, 4, 518–536. [Google Scholar] [CrossRef]
  23. Colomina, I.; Molina, P. Unmanned aerial systems for photogrammetry and remote sensing: A review. ISPRS J. Photogramm. Remote Sens. 2014, 92, 79–97. [Google Scholar] [CrossRef]
24. Jóźków, G.; Toth, C.K.; Grejner-Brzezinska, D. UAS Topographic Mapping with Velodyne LiDAR Sensor. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, III, 201–208. [Google Scholar] [CrossRef]
  25. Cheng, G.; Han, J. A survey on object detection in optical remote sensing images. ISPRS J. Photogramm. Remote Sens. 2016, 117, 11–28. [Google Scholar] [CrossRef]
  26. Tiede, D.; Krafft, P.; Füreder, P.; Lang, S. Stratified template matching to support refugee camp analysis in OBIA workflows. Remote Sens. 2017, 9, 326. [Google Scholar] [CrossRef]
27. Kalantar, B.; Mansor, S.B.; Shafri, H.Z.M.; Halin, A.A. Integration of template matching and object-based image analysis for semi-automatic oil palm tree counting in UAV images. In Proceedings of the 37th Asian Conference on Remote Sensing, Colombo, Sri Lanka, 17–21 October 2016. [Google Scholar]
  28. Blaschke, T. Object based image analysis for remote sensing. ISPRS J. Photogramm. Remote Sens. 2010, 65, 2–16. [Google Scholar] [CrossRef]
29. Hay, G.J.; Castilla, G. Geographic Object-Based Image Analysis (GEOBIA): A new name for a new discipline. In Object-Based Image Analysis; Springer: Berlin/Heidelberg, Germany, 2008; pp. 75–89. [Google Scholar]
  30. Benz, U.; Hofmann, P.; Willhauck, G.; Lingenfelder, I.; Heynen, M. Multi-resolution, object-oriented fuzzy analysis of remote sensing data for GIS-ready information. ISPRS J. Photogramm. Remote Sens. 2004, 58, 239–258. [Google Scholar] [CrossRef]
  31. Castillejo-González, I.; López-Granados, F.; García-Ferrer, A.; Peña, J.; Jurado-Expósito, M.; de la Orden, M.; González-Audicana, M. Object- and pixel-based analysis for mapping crops and their agro-environmental associated measures using QuickBird imagery. Comput. Electron. Agric. 2009, 68, 207–215. [Google Scholar] [CrossRef]
  32. Yu, Q.; Gong, P.; Clinton, N.; Biging, G.; Kelly, M.; Schirokauer, D. Object-based Detailed Vegetation Classification with Airborne High Spatial Resolution Remote Sensing Imagery. Photogramm. Eng. Remote Sens. 2006, 72, 799–811. [Google Scholar] [CrossRef]
33. Yan, G.; Mas, J.F.; Maathuis, B.H.P.; Xiangmin, Z.; van Dijk, P.M. Comparison of pixel-based and object-oriented image classification approaches—A case study in a coal fire area, Wuda, Inner Mongolia, China. Int. J. Remote Sens. 2006, 27, 4039–4055. [Google Scholar] [CrossRef]
  34. Platt, R.; Rapoza, L. An Evaluation of an Object-Oriented Paradigm for Land Use/Land Cover Classification. Prof. Geogr. 2008, 60, 87–100. [Google Scholar] [CrossRef]
  35. Wang, L.; Sousa, W.; Gong, P. Integration of object-based and pixel-based classification for mapping mangroves with IKONOS imagery. Int. J. Remote Sens. 2004, 25, 5655–5668. [Google Scholar] [CrossRef]
  36. Peña-Barragán, J.; Gutiérrez, P.; Hervás-Martínez, C.; Six, J.; Plant, R.E.; López-Granados, F. Object-Based Image Classification of Summer Crops with Machine Learning Methods. Remote Sens. 2014, 6, 5019–5041. [Google Scholar] [CrossRef]
  37. Peña-Barragán, J.; Kelly, M.; de Castro, A.I.; López-Granados, F. Object-Based Approach for Crop Row Characterization in Uav Images for Site-Specific Weed Management. In Proceedings of the 4th Geographic Object-Based Image Analysis (GEOBIA) Conference, Rio de Janeiro, Brazil, 7–9 May 2012; pp. 426–430. [Google Scholar]
  38. Suomalainen, J.; Anders, N.; Iqbal, S.; Roerink, G.; Franke, J.; Wenting, P.; Hünniger, D.; Bartholomeus, H.; Becker, R.; Kooistra, L. A Lightweight Hyperspectral Mapping System and Photogrammetric Processing Chain for Unmanned Aerial Vehicles. Remote Sens. 2014, 6, 11013–11030. [Google Scholar] [CrossRef]
  39. Jebara, T.; Azarbayejani, A.; Pentland, A. 3D structure from 2D motion. IEEE Signal Process. Mag. 1999, 16, 66–84. [Google Scholar] [CrossRef]
  40. Agisoft LLC. Agisoft PhotoScan Professional Edition 2018; Agisoft LLC: Saint Petersburg, Russia, 2018. [Google Scholar]
41. van der Meij, B. Measuring the Legacy of Plants and Plant Traits Using UAV-Based Optical Sensors. Master's Thesis, Utrecht University, Utrecht, The Netherlands, 2016. [Google Scholar]
42. Boissonnat, J.D.; Cazals, F. Smooth surface reconstruction via natural neighbour interpolation of distance functions. Comput. Geom. Theory Appl. 2002, 22, 185–203. [Google Scholar] [CrossRef]
43. Lewis, J.P. Fast Template Matching. In Proceedings of Vision Interface 95, Quebec City, QC, Canada, 15–19 May 1995; pp. 120–123. [Google Scholar]
  44. Flanders, D.; Hall-Beyer, M.; Pereverzoff, J. Preliminary evaluation of eCognition object-based software for cut block delineation and feature extraction. Can. J. Remote Sens. 2003, 29, 441–452. [Google Scholar] [CrossRef]
  45. Reis, M.S.; de Oliveira, M.A.F.; Korting, T.S.; Pantaleao, E.; Sant’Anna, S.J.S.; Dutra, L.V.; Lu, D. Image segmentation algorithms comparison. In Proceedings of the 2015 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), Milan, Italy, 26–31 July 2015; pp. 4340–4343. [Google Scholar]
46. Nuijten, R.J.G. Developing OBIA Methods for Measuring Plant Traits and the Legacy of Crops Using VHR UAV-Based Optical Sensors. Master's Thesis, Utrecht University, Utrecht, The Netherlands, 2018. [Google Scholar]
  47. Hunt, E.R.; Daughtry, C.S.T.; Eitel, J.U.H.; Long, D.S. Remote Sensing Leaf Chlorophyll Content Using a Visible Band Index. Agron. J. 2011, 103, 1090. [Google Scholar] [CrossRef]
48. McKinnon, T.; Hoff, P. Comparing RGB-Based Vegetation Indices with NDVI for Drone Based Agricultural Sensing; Agribotix LLC: Boulder, CO, USA, 2017; pp. 1–8. [Google Scholar]
  49. Pouliot, D.A.; King, D.J.; Bell, F.W.; Pitt, D.G. Automated tree crown detection and delineation in high-resolution digital camera imagery of coniferous forest regeneration. Remote Sens. Environ. 2002, 82, 322–334. [Google Scholar] [CrossRef]
  50. Soper, H.E.; Young, A.W.; Cave, B.M.; Lee, A.; Pearson, K. On the Distribution of the correlation coefficient in small samples. Biometrika 1932, 24, 382–403. [Google Scholar]
51. Tukey, J.W. Comparing Individual Means in the Analysis of Variance. Biometrics 1949, 5, 99–114. [Google Scholar] [CrossRef] [PubMed]
  52. Ruxton, G.D.; Beauchamp, G. Time for some a priori thinking about post hoc testing. Behav. Ecol. 2008, 19, 690–693. [Google Scholar] [CrossRef]
  53. Goodbody, T.R.H.; Coops, N.C.; Hermosilla, T.; Tompalski, P.; Pelletier, G. Vegetation Phenology Driving Error Variation in Digital Aerial Photogrammetrically Derived Terrain Models. Remote Sens. 2018, 10, 1554. [Google Scholar] [CrossRef]
54. Turner, D.; Lucieer, A.; Watson, C. An Automated Technique for Generating Georectified Mosaics from Ultra-High Resolution Unmanned Aerial Vehicle (UAV) Imagery, Based on Structure from Motion (SfM) Point Clouds. Remote Sens. 2012, 4, 1392–1410. [Google Scholar] [CrossRef]
  55. Ouédraogo, M.; Degré, A.; Debouche, C.; Lisein, J. The evaluation of unmanned aerial system-based photogrammetry and terrestrial laser scanning to generate DEMs of agricultural watersheds. Geomorphology 2014, 214, 339–355. [Google Scholar] [CrossRef]
  56. Tian, Y.C.; Yao, X.; Yang, J.; Cao, W.X.; Hannaway, D.B.; Zhu, Y. Assessing newly developed and published vegetation indices for estimating rice leaf nitrogen concentration with ground- and space-based hyperspectral reflectance. Field Crops Res. 2010, 120, 299–310. [Google Scholar] [CrossRef]
  57. Ciganda, V.; Gitelson, A.; Schepers, J. Non-destructive determination of maize leaf and canopy chlorophyll content. J. Plant Physiol. 2009, 166, 157–167. [Google Scholar] [CrossRef]
Figure 1. Study area near Wageningen; an orthophoto mosaic of 1 July 2015 displayed in a true colour composite. Both maps are projected in WGS 84/UTM zone 31N. In the image on the right the light green squares are plots with endive, the darker green squares are plots with oat.
Figure 2. Methodological flow chart of photogrammetric pre-processing, template matching, object-based image analysis (OBIA), data fusion, and evaluation.
Figure 3. Map showing the crop height model (CHM) for four experimental plots including multi-resolution image segmentation (MIRS) based vector objects representing main crop area. Highest height values occur at the center of spatial objects while lowest height values occur at the edges of objects.
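The CHM in Figure 3 is the difference between the photogrammetric surface and terrain models, and summing its cells gives a per-object crop volume. A minimal sketch of that arithmetic; the 4 × 4 height grid, flat terrain, and 1 cm ground sampling distance are invented for illustration, not values from the study:

```python
import numpy as np

# Hypothetical digital surface and terrain models (metres); real models
# come from the photogrammetric reconstruction of the UAS imagery.
dsm = np.array([[0.30, 0.32, 0.31, 0.30],
                [0.31, 0.40, 0.41, 0.30],
                [0.30, 0.42, 0.43, 0.31],
                [0.30, 0.31, 0.30, 0.30]])
dtm = np.full_like(dsm, 0.30)  # flat terrain for simplicity

# Crop height model: per-cell canopy height above the terrain,
# clipped so small reconstruction errors cannot go negative.
chm = np.clip(dsm - dtm, 0.0, None)

# Crop volume: sum of cell heights times the cell footprint area.
cell_area_m2 = 0.01 * 0.01        # assumed 1 cm ground sampling distance
volume_m3 = chm.sum() * cell_area_m2
volume_cm3 = volume_m3 * 1e6
```

In practice the summation would run only over the cells inside each MIRS-derived crop object, so every object yields its own volume estimate.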
Figure 4. Eight sample templates used for template matching representing crop-objects with different shapes, sizes, and shadow directions: (a) small crop with no overlap, (b) two-sided overlap, (c) three-sided overlap, (d) four-sided overlap, (e) side with no shadow, (f) side with shadow, (g) crop located at plot’s corner, (h) possible error values within image.
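The templates in Figure 4 are matched against the imagery via normalized cross-correlation [43]. A minimal, unoptimized sketch of that score (production code would use an FFT-based or library implementation); the toy image and template are invented:

```python
import numpy as np

def ncc(image, template):
    """Normalized cross-correlation of a template over an image (valid positions only)."""
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    out = np.full((image.shape[0] - th + 1, image.shape[1] - tw + 1), -1.0)
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            w = image[i:i + th, j:j + tw]
            wc = w - w.mean()
            denom = np.sqrt((wc ** 2).sum()) * t_norm
            if denom > 0:  # flat windows carry no correlation signal
                out[i, j] = (wc * t).sum() / denom
    return out

# Toy example: a bright 2x2 "crop" on a dark background, with a
# template cut from around it; the score peaks where they align.
img = np.zeros((6, 6))
img[2:4, 2:4] = 1.0
tmpl = img[1:4, 1:4].copy()
scores = ncc(img, tmpl)
peak = np.unravel_index(np.argmax(scores), scores.shape)
```

Thresholding the score map and keeping local maxima then yields one detection per crop, which is how the stratified matching arrives at individual crop counts.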
Figure 5. Map showing four experimental plots including multi-resolution image segmentation (MIRS) based vector objects representing main crop area (shown with 60% transparency) and detected Cichorium endivia crops resulting from stratified template matching.
Figure 6. Scatterplots including linear model with 95% confidence interval, Pearson correlation coefficient (R-value), and significance level (p-value) of the correlation between in situ dry biomass and unmanned aerial systems (UAS)-based mean crop area estimates (left) and between in situ dry biomass and UAS-based mean crop volume estimates (right).
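The R and p values in Figure 6 come from a standard Pearson correlation. A minimal sketch of how r and the t statistic behind its two-sided significance test are computed; the sample data are invented:

```python
import math

def pearson_r(x, y):
    """Sample Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def t_statistic(r, n):
    """t value with n-2 degrees of freedom, used to test r against zero."""
    return r * math.sqrt((n - 2) / (1 - r * r))

# Invented biomass vs. volume pairs, for illustration only.
x = [1.0, 2.0, 3.0, 4.0]
y = [1.0, 2.0, 3.0, 5.0]
r = pearson_r(x, y)
t = t_statistic(r, len(x))
```

In practice a library routine such as scipy.stats.pearsonr returns r together with the two-sided p value directly.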
Figure 7. Bar plots presenting in situ mean crop dry biomass (g m−2; left) and UAS-based mean crop volume estimates (cm3; right) including Standard Error (SE). Different letters at the bottom of each bar indicate significant differences between the winter cover crop (WCC) treatments, based on Tukey’s test (p < 0.05).
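Figure 7 plots per-treatment means with standard errors before pairwise differences are assessed with Tukey's test [51]. A minimal sketch of the mean ± SE computation; the treatment labels and biomass values are invented, not measurements from the experiment:

```python
import math
from statistics import mean, stdev

# Invented dry-biomass samples (g per m^2) per WCC treatment, for illustration.
plots = {
    "Rs": [210.0, 225.0, 198.0, 217.0],
    "Lp": [140.0, 152.0, 133.0, 147.0],
}

summary = {}
for treatment, values in plots.items():
    # Standard error of the mean: sample standard deviation over sqrt(n).
    se = stdev(values) / math.sqrt(len(values))
    summary[treatment] = (mean(values), se)
```

The Tukey HSD step itself compares every pair of treatment means against a studentized-range critical value; a library routine such as statsmodels' pairwise_tukeyhsd is the usual tool for that.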
Table 1. Experimental field layout.
Field Characteristic | Data
Crop count | 5930
Plot count | 60
Plot dimensions | 3 × 3 m
Crop species | Endive
Cover crop species | Radish (Raphanus sativus; Rs), perennial ryegrass (Lolium perenne; Lp), white clover (Trifolium repens; Tr), and common vetch (Vicia sativa; Vs)
Cover crop treatments | Monocultures: Rs, Lp, Tr, and Vs; mixtures: Rs+Vs and Lp+Tr; fallow
Table 2. Results of crop covered area segmentation and individual crop detection using an object-based image analysis (OBIA) and an integrated template matching (TM) approach.
Metric | OBIA | Stratified TM
True positives | 429.3 m² | 5921 crops
False positives | 14.2 m² | 1 crop
False negatives | 56.9 m² | 9 crops
Detection rate | 88.3% | 99.9%
Accuracy index | 85.4% | 99.8%
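Table 2's summary metrics can be recomputed from its counts. A sketch assuming the detection rate is TP/(TP+FN) and the accuracy index follows Pouliot et al. [49], i.e. (TP − FP)/(TP + FN); the helper names are ours. The OBIA row reproduces exactly, and the template-matching detection rate lands at 99.85%, consistent with the table's rounding:

```python
def detection_rate(tp, fn):
    """Percentage of the reference (area or crop count) that was detected."""
    return 100.0 * tp / (tp + fn)

def accuracy_index(tp, fp, fn):
    """Pouliot-style index penalising both omission (fn) and commission (fp) errors."""
    return 100.0 * (tp - fp) / (tp + fn)

# OBIA delineation (m^2) and stratified template matching (crop counts), from Table 2.
obia_dr = detection_rate(429.3, 56.9)        # ~88.3
obia_ai = accuracy_index(429.3, 14.2, 56.9)  # ~85.4
tm_ai = accuracy_index(5921, 1, 9)           # ~99.8
```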