
Plant Leaf Position Estimation with Computer Vision

Engineering Department, Lancaster University, Lancaster LA1 4YW, UK
Lancaster Environment Centre, Lancaster University, Lancaster LA1 4YW, UK
Author to whom correspondence should be addressed.
Sensors 2020, 20(20), 5933;
Received: 3 September 2020 / Revised: 13 October 2020 / Accepted: 16 October 2020 / Published: 20 October 2020
(This article belongs to the Special Issue Sensing Technologies for Agricultural Automation and Robotics)
Autonomous analysis of plants, for example for phenotyping and health monitoring, often requires the reliable identification and localization of single leaves, a task complicated by their complex and variable shape. Robotic sensor platforms commonly use depth sensors that rely on either infrared light or ultrasound, in addition to imaging. However, infrared methods are affected by ambient light, and ultrasound methods generally have too wide a field of view, making them ineffective for measuring complex and intricate structures. Alternatives include stereoscopic or structured-light scanners, but these can be costly and overly complex to implement. This article presents a fully computer-vision-based solution that estimates the three-dimensional location of all leaves of a subject plant using a single digital camera autonomously positioned by a three-axis linear robot. A custom-trained neural network classifies leaves captured in multiple images of the subject plant. Parallax calculations are then applied to predict leaf depth and, from this, the three-dimensional position. This article demonstrates proof of concept of the method, and initial tests with positioned leaves suggest an expected error of 20 mm. Future modifications are identified to further improve accuracy and utility across different plant canopies.
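The parallax calculation the abstract alludes to follows the standard pinhole-camera relation: when the camera is translated by a known baseline between two images, the depth of a feature is inversely proportional to its apparent pixel shift (disparity). The sketch below illustrates this relation only; the function name, parameter values, and units are illustrative assumptions, not the authors' implementation.

```python
def depth_from_parallax(focal_px: float, baseline_mm: float, disparity_px: float) -> float:
    """Estimate feature depth (mm) from the apparent pixel shift between two views.

    Assumes a pinhole camera translated laterally by `baseline_mm` between
    captures, with `focal_px` the focal length expressed in pixels and
    `disparity_px` the measured shift of the feature (e.g. a leaf centroid).
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_mm / disparity_px

# Hypothetical example: 1000 px focal length, 50 mm camera translation,
# 25 px measured shift of a leaf centroid -> depth of 2000 mm.
depth = depth_from_parallax(focal_px=1000.0, baseline_mm=50.0, disparity_px=25.0)
print(depth)
```

Because depth varies as 1/disparity, small disparity-measurement errors on distant leaves produce proportionally larger depth errors, which is consistent with the ~20 mm error reported for leaves at close range.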
Keywords: neural network; computer vision; depth estimation; position estimation; parallax
Figure 1
MDPI and ACS Style

Beadle, J.; Taylor, C.J.; Ashworth, K.; Cheneler, D. Plant Leaf Position Estimation with Computer Vision. Sensors 2020, 20, 5933.


