Leaf Area Estimation of Reconstructed Maize Plants Using a Time-of-Flight Camera Based on Different Scan Directions

1 Institute of Agricultural Engineering, University of Hohenheim, Garbenstrasse 9, 70599 Stuttgart, Germany
2 Laboratorio de Propiedades Físicas (LPF)-TAGALIA, Technical University of Madrid, 28040 Madrid, Spain
* Author to whom correspondence should be addressed.
Robotics 2018, 7(4), 63; https://doi.org/10.3390/robotics7040063
Received: 19 September 2018 / Revised: 8 October 2018 / Accepted: 11 October 2018 / Published: 11 October 2018
(This article belongs to the Special Issue Agricultural and Field Robotics)
Abstract

Leaf area is an important plant parameter for assessing plant status and crop yield. In this paper, a low-cost time-of-flight camera, the Kinect v2, was mounted on a robotic platform to acquire 3-D data of maize plants in a greenhouse. The robotic platform drove through the maize rows and acquired 3-D images that were later registered and stitched. Three different maize row reconstruction approaches were compared: merging point clouds generated from both sides of the row in both directions, merging point clouds scanned from just one side, and merging point clouds scanned from opposite sides of the row in opposite directions. The resulting point cloud was subsampled and rasterized; the normals were then computed and re-oriented with a Fast Marching algorithm. Poisson surface reconstruction was applied to the point cloud, and spurious vertices and faces generated by the algorithm were removed. The results showed that the approach of aligning and merging four point clouds per row and that of merging two point clouds scanned from the same side produced very similar average mean absolute percentage errors of 8.8% and 7.8%, respectively. The worst error, 32.3%, resulted from merging the two point clouds scanned from both sides in opposite directions.
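The evaluation metric used above, the mean absolute percentage error (MAPE), compares estimated leaf areas against reference measurements. A minimal sketch of the computation is shown below; the leaf-area values are hypothetical placeholders for illustration, not data from the study.

```python
def mape(estimated, reference):
    """Mean absolute percentage error (in %) between estimated values
    (e.g. leaf areas from a reconstructed point cloud) and reference
    values (e.g. manually measured leaf areas)."""
    if len(estimated) != len(reference):
        raise ValueError("estimated and reference must have equal length")
    return 100.0 * sum(
        abs(e - r) / r for e, r in zip(estimated, reference)
    ) / len(reference)

# Hypothetical leaf areas in cm^2 for three plants.
est = [410.0, 385.0, 522.0]
ref = [450.0, 400.0, 500.0]
print(round(mape(est, ref), 1))  # → 5.7
```

An average MAPE of 7.8%, as reported for the single-side reconstruction, would thus mean the estimated leaf areas deviated from the reference measurements by 7.8% on average.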
Keywords: 3-D sensors; crop characterization; agricultural robotics; precision farming; plant phenotyping
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Vázquez-Arellano, M.; Reiser, D.; Paraforos, D.S.; Garrido-Izard, M.; Griepentrog, H.W. Leaf Area Estimation of Reconstructed Maize Plants Using a Time-of-Flight Camera Based on Different Scan Directions. Robotics 2018, 7, 63.

Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

Robotics EISSN 2218-6581, published by MDPI AG, Basel, Switzerland.