Article

Crop Height Measurement System Based on 3D Image and Tilt Sensor Fusion

Wan-Soo Kim, Dae-Hyun Lee, Yong-Joo Kim, Yeon-Soo Kim, Taehyeong Kim, Seong-Un Park, Sung-Soo Kim and Dong-Hyuck Hong

1 Department of Biosystems Machinery Engineering, Chungnam National University, Daejeon 34134, Korea
2 Department of Smart Agricultural Systems, Chungnam National University, Daejeon 34134, Korea
3 Smart Agricultural Machinery R&D Group, Korea Institute of Industrial Technology (KITECH), Gimje 54325, Korea
4 Interdisciplinary Program in Cognitive Science, Seoul National University, Seoul 08826, Korea
5 Reliability Test Team, TYM ICT Co. Ltd., Gongju 32530, Korea
6 Department of Mechatronics Engineering, Chungnam National University, Daejeon 34134, Korea
7 Department of Bio-Industrial Machinery Engineering, Kyungpook National University, Daegu 41566, Korea
* Authors to whom correspondence should be addressed.
Agronomy 2020, 10(11), 1670; https://doi.org/10.3390/agronomy10111670
Submission received: 22 September 2020 / Revised: 28 October 2020 / Accepted: 28 October 2020 / Published: 29 October 2020
(This article belongs to the Special Issue Automation for Digital Farming)

Abstract

Machine-vision-based crop detection is a central issue in digital farming, and crop height is an important factor that should be measured automatically in robot-based cultivation. Three-dimensional (3D) imaging cameras make it possible to measure actual crop height; however, camera tilt caused by irregular ground conditions in farmland prevents accurate height measurement. In this study, stereo-vision-based crop height was measured with compensation for the camera tilt effect. To reproduce the tilt of a camera installed on farm machines (e.g., tractors), we developed a posture tilt simulator for indoor testing that can tilt the camera through pitch and roll rotations. Stereo images were captured under various simulator tilt conditions, and crop height was measured by detecting the crop region in a disparity map generated by matching the stereo images. The measured height was compensated by correcting the position of the region of interest (RoI) in the 3D image through coordinate transformation between the camera coordinates and the simulator coordinates. Tests were conducted with roll and pitch rotations around the simulator coordinates. The results showed that crop height could be measured using stereo vision, and that tilt compensation reduced the average error from 15.6 to 3.9 cm. Thus, the crop height measurement system proposed in this study, based on 3D imaging and a tilt sensor, can contribute to the automatic perception of agricultural robots.

1. Introduction

Sensor fusion is a central issue in digital farming, and machine vision is one of the tools used to measure and analyze the visual information required for autonomous or automated farming systems. Machine vision can be applied to recognize objects in farmland for various purposes, such as plant phenotyping, growth monitoring, disease forecasting, and crop region detection. In particular, crop region detection is an important technique that can make farm machinery suitable for site-specific cultivation by providing a crop elevation map when the vision system can acquire 3D structural data [1].
Crop elevation (i.e., height) is a useful trait related to crop growth and yield. It is typically defined as the shortest distance from ground level to the highest point of the crop region's upper envelope [2], and it is mostly measured manually, which is not only time-consuming but also impossible to do in real time. To address this problem, 3D machine vision has been used to measure crop height automatically, and the Microsoft Kinect camera, which captures 3D imagery including distance to objects, has recently been employed. Several previous studies achieved significant results in crop height measurement for cotton plants [2], cauliflower plants [3], and cherry trees [4]. However, these studies used controlled test environments for stable measurement, without considering the platform posture on uneven terrain, and the Kinect camera operates at infrared (IR) wavelengths, which are sensitive to changes in ambient light. In addition, most of these studies focused on individual plant types using top-view images with a simple background such as soil. Their approaches are therefore difficult to transfer to actual farm machinery, which must look ahead at scenes containing various objects while traveling on the ground [5].
Alternatively, stereo vision can also capture 3D images with depth information, in a way different from the Kinect, and it is less sensitive to ambient light changes [6]. A stereo vision system consists of two cameras arranged in parallel, providing 3D structural information obtained by stereo matching between two plane images taken simultaneously by each camera [7]. Stereo matching produces disparity maps that encode the offset between corresponding points in the left and right images of a stereo pair [8]. Several stereo-vision-based studies have addressed not only phenotyping but also region detection for autonomous navigation. Kise et al. [9] studied the automatic detection of infield crop rows using stereo vision mounted on a tractor, and generated elevation maps of the crop rows for automated tractor guidance. The elevation map compensated for perspective distortion, and the tractor could determine its heading angle from the map. The same authors also created a three-dimensional virtual elevation map of look-ahead terrain using stereo vision, and demonstrated that it could automatically estimate tractor attitude and motion and prevent tractor rollover [7]. Kneip et al. [10] detected crop edges based on crop region elevation for the automated guidance of combine harvesters. The authors tested the proposed approach in different field conditions, and their results showed that the algorithm could accurately detect crop edges by estimating the height and volume of the crop region in real time.
In general, it is hard to keep agricultural machinery stable during cultivation on irregular ground surfaces, whose condition varies with soil texture, moisture content, and hardness. Soil conditions that are unsuitable for cultivation make the machine travel with a tilted posture [11]. Such conditions also tilt the machine vision system and prevent accurate height measurements because the captured images are skewed. Thus, the effect of machine posture should be considered to reduce crop height measurement error in practical applications under natural field conditions. Several studies have proposed tractor posture estimation methods; however, to measure crop height accurately, the tilt of the machine vision system must be corrected based on the machine posture.
In this paper, we present a method for accurate crop height measurement using tilt sensor fusion that can compensate for the camera tilt effect. To reproduce farm machinery camera tilt, we developed a tilt simulator for indoor tests that can be controlled in pitch and roll rotations. A stereo camera was mounted on the simulator, and the height of the look-ahead crop was measured at various pitch and roll angles. The region of interest (RoI) within the crop region measured in a tilted posture was corrected to the one predicted in a stable posture through 3D coordinate transformation, and the corrected crop height was compared with the one before correction. Our work is a preliminary investigation toward practical crop height measurement in the field, and the approach focuses on correcting crop height errors caused by farm machinery rotation during work. Crop shape variety was constrained to reduce the complexity of the study. The contributions of our study are that the proposed system improves the accuracy of machine-vision-based crop height measurement independent of working posture, and that the developed posture simulator can be used in various ways to construct data sets for dynamic posture simulation of farm machinery.

2. Materials and Methods

2.1. Tilt Simulator Design

Figure 1 shows the schematic structure and major components of the tilt simulator used in this study and its base platform, which was designed based on the dimensions of a type of utility tractor mostly used for upland farming in Korea (length 2000 mm × width 1000 mm × height 500 mm). The simulator consisted of four stepping motors with motor control units (MCUs) to control the posture, an inclinometer to measure the roll and pitch angles, a multifunction data acquisition (DAQ) device, and a laptop PC. The stepping motors (A200K-M599-GB10, Autonics, Busan, Korea) with MCUs (MD5-HF28, Autonics, Korea) were located at the four corners of the simulator's rectangular base platform, and each motor controlled the elevation of its corner through a linear guide. Each motor was a brushless five-phase motor rated at 20 Nm torque with 0.72° position control resolution. The inclinometer (SST400, Vigor Technology, Shanghai, China) was positioned at the simulator's center of gravity (CG) and measured roll and pitch angles through calibration between the inclination and the rotated angles. The simulator posture was controlled toward target pitch and roll angles using closed-loop proportional-integral (PI) control, with the inclinometer angle as feedback and experimentally determined PI coefficients (proportional gain: 0.011, integral gain: 28). The location of the inclinometer remained fixed during tilt control because the pitch and roll rotations were centered on the CG. Data collection and control commands were executed using a multifunction DAQ (NI USB-6212, National Instruments, Austin, TX, USA) with various analog and digital I/O channels.
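For concreteness, the following is a minimal sketch of such a closed-loop PI posture controller. Only the gains (proportional 0.011, integral 28) and the feedback structure come from the text; the loop rate, iteration count, and hardware-access functions are hypothetical placeholders.

```cpp
// Minimal sketch of the closed-loop PI posture control described above.
#include <chrono>
#include <thread>

struct PIController {
    double kp, ki;
    double integral = 0.0;
    double update(double setpoint, double measured, double dt) {
        double error = setpoint - measured;  // angle error (deg)
        integral += error * dt;              // accumulated integral term
        return kp * error + ki * integral;   // motor command
    }
};

// Hypothetical hardware wrappers (stubs here; the real system would drive
// these through the NI USB-6212 DAQ channels described above).
static double readInclinometerPitchDeg() { return 0.0; /* stub */ }
static void commandPitchMotors(double /*speed*/) { /* stub */ }

int main() {
    PIController pi{0.011, 28.0};  // gains from the text
    const double dt = 0.01;        // assumed 100 Hz control loop
    const double targetDeg = 5.0;  // example setpoint
    for (int i = 0; i < 1000; ++i) {
        double pitch = readInclinometerPitchDeg();  // SST400 feedback
        commandPitchMotors(pi.update(targetDeg, pitch, dt));
        std::this_thread::sleep_for(std::chrono::duration<double>(dt));
    }
    return 0;
}
```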
To acquire crop images according to the simulator posture, a stereo vision camera (Tara USB 3.0 Stereo Vision Camera, e-con Systems, Chennai, India) was used in this study. It consists of two identical cameras with ON Semiconductor 1/3 inch CMOS digital image sensors, arranged in parallel at a baseline distance of 60 mm. The camera was mounted at the front-center of the simulator, and the acquired images were transmitted to a laptop PC. Table 1 shows the specifications of the equipment used in the tilt simulator.

2.2. Crop Height Measurement

Stereo vision can construct 3D images based on the difference between two plane images of the same scene. The difference in the horizontal (x-axis) pixel position of the target object projected onto the left and right image planes, called disparity, yields the object distance (depth) from the center of the cameras, given the distance between the two camera lenses (baseline) and the shortest distance from the principal focus of the lens to the image plane (focal length), as expressed in Equation (1). Once the distance to the target object is known, its 3D position in the global coordinates can be obtained. The vertical position on the y-axis in the global coordinates is related to crop height and can be calculated using Equation (2).
$$Z = \frac{f \times T}{x_L - x_R}, \qquad (1)$$

where $Z$ is the distance between the camera and the object (mm), $f$ is the focal length (pixels), $T$ is the distance between the left and right cameras (mm), $x_L$ is the x coordinate on the left image (pixels), and $x_R$ is the x coordinate on the right image (pixels).

$$Y_{real} = \frac{T \times Y}{x_L - x_R}, \qquad (2)$$

where $Y_{real}$ is the y-axis value (mm) in the global coordinates and $Y$ is the y-axis value (pixels) in the camera coordinates.
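As a worked example of Equations (1) and (2), the short C++ snippet below computes depth and real-world height from an assumed disparity. The 60 mm baseline is from Table 1; the focal length and pixel coordinates are illustrative values only.

```cpp
// Worked example of Equations (1) and (2).
#include <cstdio>

int main() {
    double f  = 700.0;  // focal length (pixels), assumed value
    double T  = 60.0;   // baseline between cameras (mm), from Table 1
    double xL = 320.0;  // x coordinate in the left image (pixels)
    double xR = 299.0;  // x coordinate in the right image (pixels)
    double Y  = 150.0;  // y coordinate in the camera image (pixels)

    double disparity = xL - xR;        // 21 pixels
    double Z = f * T / disparity;      // Eq. (1): 2000 mm, i.e., the 2 m target distance
    double Yreal = T * Y / disparity;  // Eq. (2): about 428.6 mm
    std::printf("Z = %.1f mm, Y_real = %.1f mm\n", Z, Yreal);
    return 0;
}
```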
Crop height was measured in the 3D image through an image-processing pipeline of several steps, as shown in Figure 2. Stereo images were matched to generate a disparity map, and the disparity map was converted into a depth map giving the distance between the camera and objects. Discontinuities in pixel intensity in the depth map were used to locate the edges between objects, and Canny edge detection was employed to determine the boundaries of the crop regions [12]. Connected component clustering was then conducted to cluster the crop region based on the detected edges [13], and filtering was performed to select the crop region among the clustered regions. The region selected as the crop region had the highest number of pixels and was located close to the bottom-center, considering the location of the crop in the field of view during cultivation. Because crops grow from the ground, the crop region was connected to the ground region, which can cause continuity of pixel intensity between the crop and the ground region behind the crop in the image. Therefore, the ground region included in the crop region was removed using depth-based filtering, with the depth range determined by the distance and width of the target. In this study, the target distance and width were 2 and 0.3 m, respectively, and the depth range of the filter was set from 1.7 to 2.3 m. A bounding box for the RoI was then created as the smallest box containing all pixels in the crop region, and crop height was calculated from the upper boundary of the box, which represents its vertical location in the global coordinates.
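The following C++/OpenCV sketch outlines one possible implementation of this pipeline. OpenCV 3.4 is the library named in Section 2.5, but the block-matcher settings, Canny thresholds, focal length, and the simplified region-selection rule are our own illustrative assumptions, not the authors' values.

```cpp
// Sketch of the height-measurement pipeline: stereo matching -> depth map ->
// depth-band filtering -> edge detection -> connected components -> RoI box.
#include <opencv2/opencv.hpp>
#include <cstdio>

cv::Rect detectCropRoI(const cv::Mat& leftGray, const cv::Mat& rightGray,
                       double f /*px*/, double T /*mm*/) {
    // 1. Stereo matching; StereoBM returns fixed-point disparities (x16).
    cv::Ptr<cv::StereoBM> bm = cv::StereoBM::create(64, 15);
    cv::Mat disp16, disp;
    bm->compute(leftGray, rightGray, disp16);
    disp16.convertTo(disp, CV_32F, 1.0 / 16.0);

    // 2. Disparity -> depth map (mm): Z = f * T / d (Equation (1)).
    cv::Mat depth = f * T / disp;

    // 3. Depth-based filter: keep the 1.7-2.3 m band stated in the text.
    cv::Mat band = (depth > 1700) & (depth < 2300);

    // 4. Canny edges on the depth image, used to cut touching regions apart.
    cv::Mat depth8u, edges;
    cv::normalize(depth, depth8u, 0, 255, cv::NORM_MINMAX, CV_8U, band);
    cv::Canny(depth8u, edges, 50, 150);
    cv::Mat regions = band & ~edges;

    // 5. Connected component clustering; pick the largest component (the text
    //    also weighs proximity to the bottom-center, omitted here for brevity).
    cv::Mat labels, stats, centroids;
    int n = cv::connectedComponentsWithStats(regions, labels, stats, centroids);
    int best = -1, bestArea = 0;
    for (int i = 1; i < n; ++i) {  // label 0 is the background
        int area = stats.at<int>(i, cv::CC_STAT_AREA);
        if (area > bestArea) { bestArea = area; best = i; }
    }
    if (best < 0) return cv::Rect();
    // The top of this box, converted by Equation (2), gives the crop height.
    return cv::Rect(stats.at<int>(best, cv::CC_STAT_LEFT),
                    stats.at<int>(best, cv::CC_STAT_TOP),
                    stats.at<int>(best, cv::CC_STAT_WIDTH),
                    stats.at<int>(best, cv::CC_STAT_HEIGHT));
}

int main() {
    // Hypothetical file names; any rectified stereo pair would do.
    cv::Mat L = cv::imread("left.png", cv::IMREAD_GRAYSCALE);
    cv::Mat R = cv::imread("right.png", cv::IMREAD_GRAYSCALE);
    if (L.empty() || R.empty()) return 1;
    cv::Rect roi = detectCropRoI(L, R, 700.0, 60.0);
    std::printf("RoI: x=%d y=%d w=%d h=%d\n", roi.x, roi.y, roi.width, roi.height);
    return 0;
}
```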

2.3. Tilt-Correction Methods

Figure 3a shows the physical geometry between the simulator coordinates and the camera image plane; the z-axis of the image plane is related to the depth of objects obtained by matching the stereo images. The simulator coordinates had the simulator's CG as the origin, at an altitude of 0.5 m from the ground. The simulator was designed to rotate about the x and z axes, called the pitch and roll rotations, respectively. Yaw rotation (rotation about the y-axis) was not considered in this study because it is a rotation about the vertical axis of the ground, which has little effect on height measurement. The camera was mounted at a position offset $(d_x, d_y, d_z)$ from the simulator's CG, and its coordinates had the same orientation as the simulator coordinates (pure translation), where the translations were 0.5, −0.1, and 1.0 m for $d_x$, $d_y$, and $d_z$ relative to the simulator origin, respectively. In addition, the camera was installed on the simulator as a rigid body, so the relative positions of the two coordinate systems did not change.
Figure 3b shows that the image captured by the camera is expressed in the image plane, and the position and orientation of the captured object change in the image plane according to the simulator tilt. When the simulator was tilted, the camera, which had a rigid connection to the simulator, was also rotated around the rotation axis of the simulator. As the object in the image plane was rotated and translated according to the rotation of the camera coordinates, the coordinates of the object expressed in the image plane changed. For example, when the RoI for the crop region moved to the lower-right of the image plane, the upper boundary of the bounding box was expected to be lowered, and crop height was evaluated as smaller than the actual height (and vice versa). Therefore, crop height was measured after correcting the RoI by transforming the coordinates to the initial state without any rotations [14].
The main idea is that the image plane provides 2D pixel coordinates (x and y axes) with depth (z-axis), so the 3D location of the detected RoI can be represented in the simulator coordinates at the stable posture. The 3D location of the RoI in the camera coordinates $(x_c, y_c, z_c)$ was first transformed into the current simulator coordinates $(x_s', y_s', z_s')$ using the homogeneous translation matrix of Equation (3). Since the current simulator could be tilted rather than in its initial state, the 3D location in the current simulator coordinates was then represented in the initial simulator coordinates $(x_s, y_s, z_s)$, corresponding to the stable posture without pitch and roll rotations. The two coordinate systems are related solely by rotation, without translation, so the 3D location in the current simulator coordinates could be converted into the initial simulator coordinates using the homogeneous roll, pitch, and yaw (RPY) rotation matrix of Equation (4) [15]. The matrix uses three angles for the roll, pitch, and yaw rotations, where the yaw angle was set to 0 due to its above-mentioned negligible effect.
The currently obtained image plane is thus expressed in the initial simulator coordinates with stable posture using the combined homogeneous transformation of Equation (5), and crop height was estimated from the upper boundary of the bounding box based on the reconstructed RoI.
$$T = \begin{bmatrix} 1 & 0 & 0 & d_x \\ 0 & 1 & 0 & d_y \\ 0 & 0 & 1 & d_z \\ 0 & 0 & 0 & 1 \end{bmatrix}, \qquad (3)$$

where $T$ is a homogeneous matrix for pure translation, and $d_x$, $d_y$, and $d_z$ are the translations relative to the x, y, and z axes of the reference coordinates, respectively.

$$RPY = \begin{bmatrix}
\cos\varphi_z \cos\varphi_x & \cos\varphi_z \sin\varphi_x \sin\varphi_y - \sin\varphi_z \cos\varphi_y & \cos\varphi_z \sin\varphi_x \cos\varphi_y + \sin\varphi_z \sin\varphi_y & 0 \\
\sin\varphi_z \cos\varphi_x & \sin\varphi_z \sin\varphi_x \sin\varphi_y + \cos\varphi_z \cos\varphi_y & \sin\varphi_z \sin\varphi_x \cos\varphi_y - \cos\varphi_z \sin\varphi_y & 0 \\
-\sin\varphi_x & \cos\varphi_x \sin\varphi_y & \cos\varphi_x \cos\varphi_y & 0 \\
0 & 0 & 0 & 1
\end{bmatrix}, \qquad (4)$$

where $RPY$ is a homogeneous matrix for pure rotation represented by roll, pitch, and yaw angles; and $\varphi_x$, $\varphi_y$, and $\varphi_z$ are the rotations relative to the x, y, and z axes of the reference coordinates, called roll, pitch, and yaw, respectively.

$$\begin{bmatrix} x_s \\ y_s \\ z_s \\ 1 \end{bmatrix} = RPY(\varphi_x, \varphi_y, \varphi_z)\, T(d_x, d_y, d_z) \begin{bmatrix} x_c \\ y_c \\ z_c \\ 1 \end{bmatrix}, \qquad (5)$$

where $x_s$, $y_s$, and $z_s$ are the simulator coordinates, and $x_c$, $y_c$, and $z_c$ are the camera coordinates.
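To make the correction concrete, below is a minimal sketch of Equations (3) to (5) using OpenCV's small-matrix types. The camera offset (0.5, −0.1, 1.0) m is the value quoted above; the sample tilt angles and the RoI point are illustrative assumptions.

```cpp
// A camera-frame point is translated into the tilted simulator frame with T
// (Eq. (3)) and de-rotated into the initial simulator frame with RPY (Eq. (4)),
// following the combined transformation of Eq. (5).
#include <opencv2/core.hpp>
#include <cmath>
#include <cstdio>

static cv::Matx44d translation(double dx, double dy, double dz) {  // Eq. (3)
    return {1, 0, 0, dx,
            0, 1, 0, dy,
            0, 0, 1, dz,
            0, 0, 0, 1};
}

// Eq. (4): px, py, pz are the rotations about the x, y, and z axes.
static cv::Matx44d rpy(double px, double py, double pz) {
    double cx = std::cos(px), sx = std::sin(px);
    double cy = std::cos(py), sy = std::sin(py);
    double cz = std::cos(pz), sz = std::sin(pz);
    return {cz*cx, cz*sx*sy - sz*cy, cz*sx*cy + sz*sy, 0,
            sz*cx, sz*sx*sy + cz*cy, sz*sx*cy - cz*sy, 0,
            -sx,   cx*sy,            cx*cy,            0,
            0,     0,                0,                1};
}

int main() {
    const double deg = CV_PI / 180.0;
    cv::Matx44d T = translation(0.5, -0.1, 1.0);  // camera offset (m), from the text
    cv::Matx44d R = rpy(10 * deg, 5 * deg, 0.0);  // sample tilt; yaw fixed at 0
    cv::Vec4d pc(0.1, 0.4, 2.0, 1.0);             // sample RoI point, camera frame (m)
    cv::Vec4d ps = R * (T * pc);                  // Eq. (5)
    std::printf("corrected point: (%.3f, %.3f, %.3f)\n", ps[0], ps[1], ps[2]);
    // ps[1] (the y component) feeds the corrected crop height estimate.
    return 0;
}
```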

2.4. Experiments

A potted crop with a height of 71.6 cm and a diameter of 30 cm was selected as the target object, and the stereo images were taken at a distance of approximately 2 m. Field conditions contain various crops with a variety of individual shapes; however, as mentioned above, this work focused on tilt correction, so the target was limited to a simple crop object. The tests were conducted at controlled simulator postures set by target roll and pitch angles. Each rotation was set to −10°, −5°, 0°, 5°, and 10°, giving 25 test conditions in total (5 roll × 5 pitch), and each test was repeated five times per condition. Performance was evaluated by comparing the average measured height with the reference height, and the error was represented as mean absolute error (MAE) [16]. We analyzed the effect of simulator tilt correction statistically, conducting t-tests with tilt correction as a factor. Since the number of samples for each condition was five, normality was tested with the Shapiro–Wilk test, which is commonly used for small sample sizes.
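For reference, a minimal sketch of the MAE metric is shown below; the reference height (71.6 cm) is the crop height stated above, while the sample measurement values are invented for illustration.

```cpp
// Mean absolute error between repeated height measurements and the reference.
#include <cmath>
#include <cstdio>
#include <vector>

double meanAbsoluteError(const std::vector<double>& measured, double reference) {
    double sum = 0.0;
    for (double h : measured) sum += std::fabs(h - reference);
    return measured.empty() ? 0.0 : sum / measured.size();
}

int main() {
    // Five repetitions at one tilt condition (illustrative values, cm).
    std::vector<double> heights = {68.2, 70.1, 66.5, 69.0, 67.4};
    std::printf("MAE = %.2f cm\n", meanAbsoluteError(heights, 71.6));
    return 0;
}
```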

2.5. Implementations

All statistical analyses were performed using SAS (version 9.1, SAS Institute, Cary, NC, USA). The image processing was implemented in Visual Studio (version 2017, Microsoft, Redmond, WA, USA) using the C++ language, the Tara stereo vision library (e-con Systems, Chennai, India), and OpenCV (version 3.4).

3. Results and Discussion

3.1. Tilt Simulator and Image Collection

Figure 4 shows the developed tilt simulator for reproducing the working posture of agricultural machinery. Each motor system consists of a motor, an MCU, and a cam gear, and one motor system was applied to each corner of the rectangular simulator base platform to control its posture. The inclinometer was located at the center of the base platform, and the four motors controlled the tilt around the inclinometer, i.e., the simulator's CG. This gave the inclinometer a fixed position during tilt control, improving the reliability of the control. In field conditions, however, a farm machine's CG is not fixed during actual work due to uneven terrain; tracking the CG during work would require an additional sensor system to measure its 3D coordinates. This challenge is outside the scope of this work and can be addressed with advanced techniques in follow-up studies. Thus, the developed simulator reproduced only the rotation of farm machines, without any coordinate translation.
Motor control for setting the pitch and roll angles was conducted using the experimentally determined PI controller, and the results showed that the tilt angles of the simulator could be controlled within errors of less than 0.3°. A commercial inclinometer (PRO3600, Sincon, Bucheon, Korea) was used as the ground truth for the rotation angle.
Figure 5 shows images taken with the stereo camera at various simulator roll and pitch angles; each image shown was captured by the left camera. The crop was located at the bottom-center of the image under the stable posture of the simulator without any rotations, and the target region remained within the bottom-center region, even if slightly off the center line, when the simulator was rotated. This means that the assumption of our approach (that the target region is located close to the bottom-center of the field of view during cultivation) was satisfied, and the proposed method could cover sufficient rotation ranges (i.e., farm machine tilting).
When the simulator was rotated in roll, the crop in the image translated and slightly rotated: the crop region moved to the left and rotated clockwise as the roll angle increased in the negative direction (and vice versa). In the case of pitch, the vertical position of the crop gradually moved toward the top of the image as the pitch angle increased in the negative direction. Conversely, the crop gradually moved toward the bottom of the image as the pitch angle increased in the positive direction, and in some examples part of the crop fell outside the image boundaries.
Physically, the pitch and roll rotations correspond to altitude differences between the front and rear wheels and between the left and right wheels, respectively. The roll rotation showed symmetry between the rotation directions, whereas the pitch rotation did not. As a result, pitch rotation can produce a larger height measurement error than roll rotation.

3.2. Crop Height Measurement

Crop height was measured at the pitch and roll angles of the simulator, and representative results are shown in Figure 6. A bounding box was created for each result, and the upper boundary of the box, which is related to crop height, was found to coincide with the highest point of the crop region regardless of the simulator's rotations. The bounding box tightly enclosed the RoI in the case of a positive pitch angle; however, the generated bounding box was wider than the target region in each result with a negative pitch angle (including 0°). At pitch angles of 0° or less, the bottom part of the target crop region, which is connected to the ground, was included in the image, allowing ground regions at the same distance to be included. For this reason, the bounding box covered a wider region despite depth-based filtering; however, this had no significant effect on the scope of this study, i.e., crop height measurement, because the upper part of the region, which determines the height, was clearly distinguishable from the background.
Crop height was measured in the range of 38–95 cm depending on the simulator's rotations, even though the crop region and its upper boundary were effectively detected in all test conditions. This wide range resulted from the error introduced as the crop moved and rotated in the scene while the simulator rotated in roll and pitch. The results showed that machine vision tilting, or the posture of the base platform, must be considered to measure crop height accurately when the system is applied to a mobile platform such as farm machinery. The tilting of farm machinery during cultivation causes crops to be perceived as crooked, leading to errors in height measurement even when accurate crop region detection is possible.
To correct the crop region measured by the tilted simulator, the current RoI was reconstructed by representing its location in the initial simulator coordinates, where the simulator posture is stable. The bounding box and crop height were estimated based on the reconstructed RoI, and the results are shown in Figure 7. The upper boundary in each example was effectively detected at the same height as the crop. In addition, the results showed that the crop region, which rotated and moved according to the pitch and roll rotations, was restored to the location (and orientation) it would occupy in an image captured at the stable posture. Tilt correction could thus accurately recover the RoI, and height measurement errors were reduced. The lower part of the crop, which fell outside the image under some pitch rotations, could not be restored; however, the detected region was restored to its original position, and most results showed the reconstructed RoIs located in the lower center of the image (the predicted location even when part of the region was lost). In the reconstructed RoIs, pixels that could not be assigned values by the coordinate transformation (visible as dot patterns and texture artifacts) could be filled through pixel interpolation.
Overall, the proposed tilt correction reliably produced the RoI for measuring crop height. Although the test was conducted using a simple object, the approach could be scaled up to various crop conditions by tuning the region detection to suit the domain.

3.3. Performance Analysis

Figure 8 shows the range of the measured crop height before and after tilt correction; Figure 8a,b show the results before and after correction, respectively. The actual crop height was approximately 71.6 cm, and when the RoI was reconstructed based on the simulator's posture, the heights measured across the pitch and roll angles clustered closely around this actual height, ranging from 62.7 to 73.2 cm after tilt correction. Accounting for the tilt effect reduced the difference between the maximal and minimal values to 19% of that before correction.
Table 2 shows the maximal, minimal, and average height measurement errors before and after correction, and the absolute difference (AD) between the errors before and after correction under each tilt condition (five roll × five pitch levels). The error values are expressed as MAE. The maximal error was reduced by about 23.8 cm (73%), from 32.7 to 8.9 cm, through tilt correction, and the average error was reduced by approximately 11.7 cm (75%), from 15.6 to 3.9 cm. The average MAE after correction was thus approximately 5% of the actual crop height of 71.6 cm, showing that crop height was accurately measured using our approach at a 5% error level.
Several studies on crop height measurement have reported significant results: (1) a mean relative error (MRE) of 5.4% in a Kinect-v2-based study [2], (2) an MRE of 2.63% in a red-green-blue (RGB) digital-camera-based study [17], and (3) an MRE of 5.08% in a light detection and ranging (LIDAR)-based study [18]. This comparison reveals that our results have accuracy similar to that of previous studies, while our method has a practical advantage in that it was examined under various postures, unlike methods that acquire images from a stable posture. In addition, the crop height measurement performance before and after correction for each tilt condition was confirmed by AD analysis; tilt correction across the various tilt conditions showed a performance improvement of up to 29.8 cm, with a mean of 12.1 cm.
Table 3 shows the MAE normality test results for each condition based on the Shapiro–Wilk test. The null hypothesis (i.e., that the data follow a normal distribution) was not rejected, as the p-value was 0.05 or higher in all conditions. Therefore, in this study, the MAE of all crop height measurement conditions was considered to satisfy normality.
Table 4 shows the MAE comparison between the corrected and uncorrected measured heights using t-tests for each roll and pitch angle. The MAE for each test condition was the average over 25 samples, covering five repetitions and five angle levels of the other rotation factor. Crop height measurement performance improved significantly (p < 0.01) through tilt correction for all nonzero pitch rotations. In the case of roll rotation, the difference was significant at the 5% level at −5° and 0°, with no significant difference at the remaining angles (−10°, 5°, and 10°). The t-values before and after tilt correction were larger for pitch rotation than for roll rotation, and the larger the pitch angle, the greater the difference between the two MAEs (t-values for pitch rotation ranged from 0.64 to 36.45, while those for roll rotation were between 1.73 and 2.41). In addition, at 0°, there was no significant difference between the two MAEs in the pitch condition, whereas a difference was observed in the roll condition. These results again indicate that the pitch angle is directly related to the vertical position of the crop in the image, which enhances the tilt correction effect. On the other hand, the height error under roll rotation was relatively small compared to pitch because roll rotates the scene in a plane around the center of the image.
In addition, performance showed an asymmetric tendency with respect to rotation angle. For example, comparing positive and negative pitch angles, MAEs after correction were smaller at positive angles, while the opposite held for the uncorrected values. In the initial condition without rotation, the target was located mostly below the horizontal centerline of the image, and this geometry amplified the crop's movement when the camera was rotated upwards. For this reason, pitch rotation produced a higher uncorrected error at positive angles than at negative ones, and the reversal between positive and negative errors after correction was likely caused by geometry model tolerances or tilt sensor hysteresis.

4. Conclusions

Our approach aimed to achieve accurate crop height measurement using stereo vision and tilt sensor fusion. A tilt simulator was developed to reproduce the working posture of farm machinery, and stereo images of a potted crop placed in front of the simulator were taken under various posture conditions. Crop height was measured by detecting the crop region in a disparity map with an edge detector, depth-based filtering, and connected component clustering. The measured height was corrected for the simulator tilt through RoI reconstruction using coordinate transformation between the camera coordinates and the simulator coordinates. The results showed that the crop height in the frontal scene could be measured with approximately 5% MAE by correcting for the tilt effect of the simulator posture, with correction reducing the average error from 15.6 to 3.9 cm. Since the pitch angle is directly related to the vertical position of the crop in the image, pitch was found to produce relatively high crop height errors compared to roll.
In this study, automatic crop height measurement was conducted with a focus on correcting errors caused by camera tilting, to improve accuracy across various farm machinery postures. The RoI measured under pitching and rolling conditions could be represented in the nonrotated simulator coordinates, which enabled accurate height measurement. Some examples showed the RoI to be wider than the target region, with loss of the bottom of the target region and pixel loss due to the coordinate transformation. However, since our approach measures crop height from the vertical (y-axis) coordinate of the upper boundary, these effects had no significant impact on the measurement.
However, our work was conducted under limited test conditions; for example, a single simple crop sample was used, and a purely rotational posture was assumed. There are various crop types in the field, and their shape varies with growth, even for the same crop. This diversity makes it difficult to extend machine vision systems to field use, because object detection can be interfered with by overlapping crops, maturity color differences, and other complications (e.g., weed infestations). In addition, the center of gravity of farm machinery cannot be treated as a fixed position while traveling on irregular ground, which means that the proposed tilt correction is difficult to use in field conditions with a purely rotational model. To apply the method in field conditions, it should be improved to detect the RoI for various crop types and shape conditions, and the position of the tilt sensor must be compensated in real time, since the center of gravity changes constantly during farming.
Despite these limitations, our approach shows that crop height can be measured automatically with posture-invariant accuracy, and it can practically contribute to improving crop height measurement performance by accounting for factors that occur in the field. Furthermore, whereas several studies have estimated machine posture from 3D images, we estimated the physical properties of objects in a 3D image based on posture sensing, which is significant for precision farming in terms of monitoring spatial information.

Author Contributions

Conceptualization, D.-H.L., Y.-J.K., and S.-U.P.; methodology, D.-H.L. and T.K.; software, T.K.; formal analysis, W.-S.K. and Y.-S.K.; investigation, D.-H.H.; writing—original draft preparation, W.-S.K. and D.-H.L.; writing—review and editing, W.-S.K., D.-H.L., and S.-U.P.; project administration, Y.-J.K. and S.-S.K. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Industrial Strategic Technology Development Program (20003975, development of intelligent 30 kW crawler-based traveling platform for multipurpose farming) funded by the Ministry of Trade, Industry, and Energy (MOTIE, Korea).

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Rovira-Más, F.; Zhang, Q.; Reid, J.F. Stereo vision three-dimensional terrain maps for precision agriculture. Comput. Electron. Agric. 2008, 60, 133–143.
2. Jiang, Y.; Li, C.; Paterson, A.H. High throughput phenotyping of cotton plant height using depth images under field conditions. Comput. Electron. Agric. 2016, 130, 57–68.
3. Andújar, D.; Ribeiro, A.; Fernández-Quintanilla, C.; Dorado, J. Using depth cameras to extract structural parameters to assess the growth state and yield of cauliflower crops. Comput. Electron. Agric. 2016, 122, 67–73.
4. Malekabadi, A.J.; Khojastehpour, M.; Emadi, B. Disparity map computation of tree using stereo vision system and effects of canopy shapes and foliage density. Comput. Electron. Agric. 2019, 156, 627–644.
5. Kim, W.S.; Lee, D.H.; Kim, Y.J.; Kim, T.; Hwang, R.Y.; Lee, H.J. Path detection for autonomous traveling in orchards using patch-based CNN. Comput. Electron. Agric. 2020, 175, 105620.
6. Reid, J.; Searcy, S. Vision-based guidance of an agriculture tractor. IEEE Control Syst. Mag. 1987, 7, 39–43.
7. Kise, M.; Zhang, Q. Sensor-in-the-loop tractor stability control: Look-ahead attitude prediction and field tests. Comput. Electron. Agric. 2006, 52, 107–118.
8. Bleyer, M.; Breiteneder, C. Stereo matching—State-of-the-art and research challenges. In Advanced Topics in Computer Vision; Farinella, G.M., Battiato, S., Cipolla, R., Eds.; Springer: London, UK, 2013; pp. 143–179.
9. Kise, M.; Zhang, Q.; Más, F.R. A stereovision-based crop row detection method for tractor-automated guidance. Biosyst. Eng. 2005, 90, 357–367.
10. Kneip, J.; Fleischmann, P.; Berns, K. Crop edge detection based on stereo vision. Rob. Auton. Syst. 2020, 123, 103323.
11. Roca, J.; Comellas, M.; Pijuan, J.; Nogues, M. Development of an easily adaptable three-point hitch dynamometer for agricultural tractors. Analysis of the disruptive effects on the measurements. Soil Tillage Res. 2019, 194, 104323.
12. Xu, Q.; Varadarajan, S.; Chakrabarti, C.; Karam, L.J. A distributed Canny edge detector: Algorithm and FPGA implementation. IEEE Trans. Image Process. 2014, 23, 2944–2960.
13. He, L.; Ren, X.; Gao, Q.; Zhao, X.; Yao, B.; Chao, Y. The connected-component labeling problem: A review of state-of-the-art algorithms. Pattern Recognit. 2017, 70, 25–43.
14. Fu, J.; Chu, W.; Dixson, R.; Vorburger, T. Three-dimensional image correction of tilted samples through coordinate transformation. Scanning 2008, 30, 41–46.
15. Niku, S.B. Introduction to Robotics: Analysis, Systems, Applications; Prentice Hall: Upper Saddle River, NJ, USA, 2001.
16. Jay, S.; Rabatel, G.; Hadoux, X.; Moura, D.; Gorretta, N. In-field crop row phenotyping from 3D modeling performed using structure from motion. Comput. Electron. Agric. 2015, 110, 70–77.
17. Sritarapipat, T.; Rakwatin, P.; Kasetkasem, T. Automatic rice crop height measurement using a field server and digital image processing. Sensors 2014, 14, 900–926.
18. Zhang, L.; Grift, T.E. A LIDAR-based crop height measurement system for Miscanthus giganteus. Comput. Electron. Agric. 2012, 85, 70–76.
Figure 1. Schematic structure and major components of tilt simulator.
Figure 2. Procedures for stereo-vision-based crop height measurement.
Figure 3. Physical geometry between simulator and stereo vision: (a) coordinate systems of simulator, and (b) effect of simulator rotation on image plane.
Figure 4. Farm machine tilting simulator with a four-stepping-motor-based posture control mechanism: (a) system configuration, (b) pitch rotation, and (c) roll rotation.
Figure 5. Acquired images from left camera of stereo vision according to tilt angles.
Figure 6. Results of crop height measurement using the disparity map.
Figure 7. Results of crop height measurement with simulator's tilt effect correction.
Figure 8. Measured crop heights by pitch and roll angles of stereo vision (a) before and (b) after tilt correction.
Table 1. Specifications of devices used in the tilt simulator.

Item | Specifications
Simulator base dimensions (length × width × height) | 2000 × 1000 × 500 mm
Stepping motor (A200K-M599-GB10) | Type: brushless; phases: 5; rated torque: 20 N·m; rotation speed: 0–180 rpm; power supply: 24 VDC (0.62 A)
Motor control unit (MCU) (MD5-HF28) | Basic step angle: 0.72°/step
Inclinometer (SST400) | Accuracy: up to ±0.006° @ ±5° to ±30°; resolution: 0.0006°; power supply: 9–36 VDC (<100 mA)
DAQ (NI USB-6212) | 16 analog inputs (16-bit resolution, 400 kS/s); two analog outputs (250 kS/s); up to 32 digital inputs and outputs
Stereo vision (Tara USB 3.0 Stereo Vision Camera) | Image sensors: MT9V024 (1/3″); resolution: 640 × 480 pixels; depth range: 500–3000 mm; baseline: 60 mm; weight: 80.5 g
Table 2. Absolute error in measured crop heights by tilt correction.

Measurement | N | Max. (cm) | Min. (cm) | Mean (cm)
Uncorrected height | 75 | 32.7 | 0.5 | 15.6 ± 10.79
Corrected height | 75 | 8.9 | 0.1 | 3.9 ± 2.46
Absolute difference (AD) | 75 | 29.8 | 0.7 | 12.1 ± 9.71
Mean values expressed as average ± standard deviation.
Table 3. Testing for normality of mean absolute errors (MAEs) according to each condition based on the Shapiro–Wilk test.

Rotation | Angle (°) | Uncorrected: Statistic / df / Sig. | Corrected: Statistic / df / Sig.
Pitch | −10 | 0.93 / 5 / 0.59 | 0.930 / 5 / 0.60
Pitch | −5 | 0.89 / 5 / 0.35 | 0.848 / 5 / 0.19
Pitch | 0 | 0.90 / 5 / 0.42 | 0.827 / 5 / 0.13
Pitch | 5 | 0.77 / 5 / 0.05 | 0.951 / 5 / 0.74
Pitch | 10 | 0.95 / 5 / 0.70 | 0.883 / 5 / 0.32
Roll | −10 | 0.98 / 5 / 0.90 | 0.904 / 5 / 0.43
Roll | −5 | 0.99 / 5 / 0.99 | 0.921 / 5 / 0.54
Roll | 0 | 0.99 / 5 / 0.99 | 0.900 / 5 / 0.41
Roll | 5 | 0.99 / 5 / 0.98 | 0.913 / 5 / 0.48
Roll | 10 | 0.96 / 5 / 0.80 | 0.928 / 5 / 0.59
Table 4. Statistical comparison between uncorrected and corrected mean absolute errors (MAEs) by camera-tilting conditions.

Rotation | Angle (°) | Uncorrected MAE (cm) | Corrected MAE (cm) | t-Value | p-Value
Pitch | −10 | 20.1 ± 3.42 | 6.9 ± 1.54 | 7.87 | <0.01 **
Pitch | −5 | 8.9 ± 0.93 | 4.2 ± 1.62 | 5.56 | <0.01 **
Pitch | 0 | 1.1 ± 1.10 | 1.7 ± 2.04 | 0.64 | 0.54
Pitch | 5 | 15.9 ± 1.56 | 2.2 ± 1.86 | 12.69 | <0.01 **
Pitch | 10 | 32.0 ± 0.49 | 4.3 ± 1.63 | 36.45 | <0.01 **
Roll | −10 | 15.2 ± 11.7 | 3.5 ± 2.7 | 2.17 | 0.06
Roll | −5 | 16.6 ± 12.4 | 3.1 ± 1.6 | 2.41 | 0.04 *
Roll | 0 | 15.2 ± 11.7 | 2.6 ± 2.5 | 2.36 | 0.04 *
Roll | 5 | 15.8 ± 12.1 | 3.6 ± 2.3 | 2.23 | 0.06
Roll | 10 | 15.2 ± 11.1 | 6.5 ± 1.6 | 1.73 | 0.12
MAE values expressed as average ± standard deviation; * p < 0.05, ** p < 0.01.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.


