Sensors 2018, 18(11), 4060; https://doi.org/10.3390/s18114060

Article
Evaluation of Object Surface Edge Profiles Detected with a 2-D Laser Scanning Sensor
1 College of Engineering, Nanjing Agricultural University, Nanjing 210031, China
2 USDA-ARS Application Technology Research Unit, Wooster, OH 44691, USA
3 Department of Food, Agricultural and Biological Engineering, The Ohio State University, Wooster, OH 43210, USA
* Authors to whom correspondence should be addressed.
Received: 5 September 2018 / Accepted: 7 November 2018 / Published: 21 November 2018

Abstract:
Canopy edge profile detection is a critical component of plant recognition in variable-rate spray control systems. The accuracy of a high-speed 270° radial laser sensor was evaluated in detecting the surface edge profiles of six complex-shaped objects. These objects were toy balls with a pink smooth surface, light brown rectangular cardboard boxes, basketballs with a black and red textured surface, white smooth cylinders, and two different-sized artificial plants. Evaluations included reconstructed three-dimensional (3-D) images of the object surfaces with data acquired from the laser sensor at four detection heights (0.25, 0.50, 0.75, and 1.00 m) above each object, five sensor travel speeds (1.6, 2.4, 3.2, 4.0, and 4.8 km h−1), and 8 to 15 horizontal distances to the sensor ranging from 0 to 3.5 m. Edge profiles of the six objects detected with the laser sensor were compared with images taken with a digital camera. The edge similarity score (ESS) was significantly affected by the horizontal distance of the objects, and this influence became weaker when the objects were placed closer to each other. The detection heights and travel speeds also influenced the ESS slightly. The overall average ESS ranged from 0.38 to 0.95 for all the objects under all the test conditions, thereby providing baseline information for the integration of the laser sensor into the future development of greenhouse variable-rate spray systems to improve pesticide, irrigation, and nutrition application efficiencies through watering booms.
Keywords:
automation; surface contour detection; greenhouse crop; pesticide spray equipment; variable rate

1. Introduction

The timely application of pesticides is critical to protect crops from insect and disease damage [1]. However, conventional constant-rate sprayers are often reported to waste much of the spray materials, resulting in increased production costs and environmental contamination potential [2,3,4,5,6].
In order to overcome these problems, modern sprayers are expected to automatically control spray outputs precisely to match the plant configuration and foliage density in order to avoid delivering chemicals to non-target areas. In this way, high efficiency and low production costs can be achieved for pesticide spray applications with minimum environmental impact [7,8,9,10,11,12,13,14].
Earlier precision sprayer control systems only manipulated spray outputs in an on/off or intermittent manner according to the presence or absence of target plants [15,16,17]. Chemical application and crop management according to the orchard foliar volume is known as the tree row volume (TRV) concept. Chemical application would be more precise and efficient if the spray could be adjusted in real time based on the foliar volume rather than only on the presence of plants or land area.
Modern variable-rate sprayers are developed by integrating sensor-guided technology to detect plants and determine the chemical amount needed to match the plant canopy volume. Thus, sensors are key components for target structure detection in automatic variable-rate systems.
The ultrasonic sensor is one example. McConnell et al. [18] proposed a system to determine foliage volume by summing tree measurements detected by a vertical array of transducers. Ultrasonic generators have also been used as foliage sensors installed on a discrete grove sprayer to actuate spray nozzles corresponding to the foliage sensed within a spray zone [19]. Giles et al. [20] developed an electronic tree canopy volume measurement system based on commercial ultrasonic range transducers. The performance of that system was calibrated at different stages of leaf development in apple and peach orchards; however, it tended to overestimate tree width and underestimate tree extension. Soon afterwards, an electronic measuring system for orchard tree foliage sensing and mapping was proposed to determine the amount and vertical distribution of sensed centroids in vertical sectors of orchard trees and to create maps of foliar volumes [21]. Recently, an experimental vertical boom spray system integrated with 20 Hz ultrasonic sensors was developed to spray liner trees in nursery production [22,23].
Laser range detection sensors have been shown to be more accurate and robust than ultrasonic and other sensors, with obvious advantages in detecting crops under field conditions. Instruments incorporating laser sensors for crop geometry measurement have been developed and mounted on tractor platforms to adjust the spray output to the large crop variations in orchards and thereby avoid over- or under-dosing of pesticide [24,25,26]. Tumbo et al. [27] compared the performance of ultrasonic and laser sensors in estimating canopy volume in citrus, and found that the laser sensor gave better estimates, especially for partially-defoliated trees or small replants. Furthermore, lasers can operate over a wider range of temperatures and offer faster data acquisition than ultrasonic sensors. Schwartz Electro-Optics, Inc. (SEO) (Orlando, FL, USA) used a ground-based laser sensor to detect tree foliage and produce a two-dimensional profile, but canopy volumes could not be obtained from that detection system. An airborne laser range detection system has also been utilized for canopy measurement, and its results were validated to correlate significantly with ground measurements [28,29,30].
Ultimately, a laser scanning sensor was integrated into the variable-rate spray control system of an air-assisted sprayer [31]. Another 270° radial range laser sensor was verified in the detection of complex-shaped objects and was successfully installed on an air-assisted sprayer [32,33]. However, none of these previous studies addressed canopies as small and dense as greenhouse-grown plants.
In greenhouse spray application, handgun equipment and spray booms are common choices; spray booms have been demonstrated to be more efficient and to produce less deposition variation at different locations in the canopy than handgun equipment [34,35,36].
In greenhouses that use boom spray applications, targets are placed under the spray booms. A simple, inexpensive laser scanning sensor for indoor use may be suitable for variable-rate applications if the sensor could be mounted on the spray boom with its scanning plane facing toward the ground [37].
As laser sensors play an important role in plant volume estimation for precision spraying, it is important to validate the detection accuracy of a laser sensor before integrating it into a real spray system. However, current applications of laser sensors on field sprayers mostly rely on side-view detection of large, tall plants individually, which is not adaptable to the horizontal spray booms used in greenhouses to treat pot-grown plants placed on the ground. In previous research [37], the accuracy of the laser sensor in detecting complex-shaped objects along the X (width or horizontal distance), Y (length or sensor travel), and Z (height or vertical) directions was validated. Under indoor conditions, the dimensions of all the objects detected by the laser sensor matched the actual dimensions in the Y and Z directions very well. The highest root mean square error (RMSE) and coefficient of variation (CV) were 83 mm and 50.9% in the horizontal direction, 41 mm and 15.2% in the travel direction, and 16 mm and 14.0% in the height direction, respectively. However, the canopy information of a plant cannot be fully represented by measurements along those three coordinates alone. Edges are of fundamental importance for characterizing boundaries and preserving the important structural properties in an image, as they correspond to discontinuities in the physical and geometrical properties of scene objects [38,39]. Thus, edge detection is critical for object recognition [40,41] and compensates for the limitations of the three-coordinate measurement of targets in the previous research.
Numerous edge detectors have been reported over the past 50 years. Computing color gradient magnitudes followed by non-maximal suppression is a traditional approach to edge detection [42,43]. The detection of intensity or color gradients has also been developed for edge detection in images with certain orientations [44,45,46]. However, these methods perform poorly on images with blurred and noisy edges. To solve this problem, sophisticated operators such as linear operators have been developed with better immunity to noise for more accurate edge detection. Shen and Castan [47] proposed that the infinite symmetric exponential filter (ISEF) is optimal in detecting mono- and multi-edges. A structured learning approach has also been applied to edge detection and shows surprisingly high computational efficiency [48]. A structure-from-motion task and empirical receiver operating characteristic (ROC) curves have been applied in a framework to evaluate edge detectors [49,50].
With the dimension measurement accuracy of the laser sensor established in previous research, the goal of this research was to evaluate the edge profile detection accuracy of the indoor-use 270° radial range laser scanning sensor before its integration into the greenhouse variable-rate spray system. The primary objectives were (1) using the laser sensor to scan complex-shaped objects at different horizontal distances, sensor speeds, and detection heights under indoor conditions; (2) analyzing the detection resolution of the laser sensor; (3) reconstructing pseudo-color images mapping the 3-D object surfaces and detecting the edge profiles of the paired images from the laser sensor and a digital camera; and (4) validating the edge detection accuracy of the laser sensor with the edge similarity score (ESS) of paired images from the laser sensor and camera.

2. Materials and Methods

2.1. Laser Sensor and Data Acquisition

The laser-scanning sensor used for the tests was an indoor-use sensor with a 10 m detection range (Model UST-10LX, Hokuyo Automatic Co., Ltd., Osaka, Japan). Its light weight (130 g) and small size (50 × 50 × 70 mm) provided flexibility for installation on moving frames. The sensor transmitted 1080 laser data points over the 270° full radial range at 0.25° angular resolution in a 25 ms scanning cycle. The 270° fan-shaped detection plane of the sensor was oriented perpendicular to both the floor and the sensor travel direction, which enabled the laser sensor to continuously scan the bilateral plant surfaces (540 points on each side). The laser transmitter inside the sensor was driven by a precision step motor that rotated evenly at a resolution of 0.25°, with a laser wavelength of 905 nm. The object position and distance to the sensor could be calculated from the reflected signals and the rotation angles of the transmitted laser beams. The laser sensor was connected to an embedded computer (MXE-1005, Fanless Embedded Computer, ADLINK Technology Inc., Taiwan) via an Ethernet interface. A specially-designed program written in VC++ and installed on the embedded computer filtered out irrelevant data from the laser sensor, reconstructed the 3-D pseudo-color object surfaces from gray-scale values, calculated the desired amount of spray for each nozzle, and operated the sprayer. The program was very similar to that reported by Liu and Zhu [32]. The contours of the images for the detected objects were also constructed by the morphological operator and edge detection method [31,32].
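Each 25 ms scan is thus a set of 1080 range readings at 0.25° steps. A minimal sketch of converting one scan into horizontal-distance/height coordinates in the scan plane (the function and field names are illustrative assumptions, not the actual VC++ program):

```python
import math

ANGULAR_RES_DEG = 0.25   # sensor angular resolution
POINTS_PER_SCAN = 1080   # 270 deg / 0.25 deg
SCAN_SPAN_DEG = 270.0

def scan_to_xz(ranges_m, sensor_height_m):
    """Convert one 270-degree scan (1080 ranges, metres) into
    (horizontal_distance, hit_height) pairs in the scan plane.
    Beam 0 is assumed to point at -135 deg from straight down,
    so beam 540 points at the nadir."""
    points = []
    for i, r in enumerate(ranges_m):
        angle_deg = -SCAN_SPAN_DEG / 2 + i * ANGULAR_RES_DEG
        a = math.radians(angle_deg)
        x = r * math.sin(a)                     # lateral (horizontal) distance
        z = sensor_height_m - r * math.cos(a)   # height of hit point above floor
        points.append((x, z))
    return points
```

For example, with the sensor 1.5 m above the floor, a 1.0 m reading on the nadir beam maps to a point 0.5 m above the floor, directly under the sensor.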

2.2. Laboratory Tests

The accuracy of the laser sensor in detecting the edge profiles of complex-shaped objects was validated in a 5 m wide, 8 m long, and 5.5 m high indoor area with 100 lux illuminance. The ambient temperature was 23 °C and the relative humidity was 15%. The sensor was attached to a height-adjustable bracket mounted on a constant-speed track, with its 270° fan-shaped detection plane facing vertically toward the ground. The sensor height could be adjusted from 0 to 1.5 m above the targets. The detection targets were four regularly-shaped objects and two artificial plants (Figure 1). The regularly-shaped objects were toy balls with a pink smooth surface, light brown rectangular cardboard boxes, basketballs with a black and red textured surface, and white smooth cylinders (hereafter referred to as “toy ball”, “rectangular box”, “basketball”, and “cylinder”, respectively). The two artificial plants were named “Plant 1” and “Plant 2”. Real greenhouse plants were not chosen for the test, to avoid dimension changes over the experiment period.
The diameters of the toy balls and basketballs were 233 and 183 mm, respectively. The rectangular boxes were 231 mm wide, 225 mm long, and 115 mm high. The cylinders were 108 mm high and 118 mm in diameter. The artificial plants were 231 mm wide, 232 mm long, and 325 mm high for Plant 1, and 254 mm wide, 277 mm long, and 179 mm high for Plant 2. The edge profiles of all the objects constructed from the digital camera images were used as the standard for comparison with the profiles constructed from the laser sensor images.
During the test, eight identical objects were evenly arranged in a row. The six types of objects were placed in six rows, parallel to each other. The first object in each row was located directly under the sensor travel path, and the last object was 3.5 m from the first. The test consisted of two parts, Part 1 and Part 2; the variables in each part are listed in Table 1. As the nozzle spacing on greenhouse spray booms is usually 0.5 m, the interval distance between objects was set to 0.5 m in Part 1, so that each object was directly below a nozzle on the greenhouse spray boom. Correspondingly, the clear spacing between two objects in the same row was 0.27, 0.31, 0.27, 0.38, 0.27, and 0.25 m for the toy ball, basketball, rectangular box, cylinder, artificial Plant 1, and Plant 2, respectively. To simulate a denser crop, the interval distance was shortened to 0.25 m in Part 2; the spacing between two objects in the same row then became 0.02, 0.06, 0.02, 0.13, 0.02, and 0 m for the toy ball, basketball, rectangular box, cylinder, artificial Plant 1, and Plant 2, respectively. In this research, the vertical distance between the laser sensor and the top of the objects was defined as the detection height.
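The clear spacings above follow directly from the centre-to-centre interval minus the object footprint along the row; a quick check using the object dimensions from Section 2.2 (rounding to centimetres, which reproduces most of the reported values):

```python
def gap_between(interval_m, object_width_m):
    """Clear spacing between two adjacent objects placed at a fixed
    centre-to-centre interval, rounded to 0.01 m."""
    return round(interval_m - object_width_m, 2)

# Part 1: 0.5 m interval; toy ball 0.233 m and cylinder 0.118 m wide
print(gap_between(0.5, 0.233))   # 0.27
print(gap_between(0.5, 0.118))   # 0.38
# Part 2: 0.25 m interval
print(gap_between(0.25, 0.118))  # 0.13
```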

2.3. Data Analysis

2.3.1. Detection Resolution of the Laser Sensor

Because the laser sensor was mounted with its fan-shaped detection plane along the horizontal distance direction and scanned the objects from one side to the other, the detection resolution (DR) was analyzed along the horizontal direction (DRH) and the sensor travel direction (DRS). In the sensor travel direction, because the detection cycle time of the laser sensor was 25 ms, DRS was determined by the sensor travel speed: the faster the speed, the larger the DRS and hence the coarser the resolution.
As the angular resolution of the laser sensor was 0.25°, the distance between two adjacent laser data points varied with the distance from the sensor, the same situation as when using laser sensors to detect vertical objects [32]. Hence, DRH varied with the horizontal distance and detection height. For a flat surface parallel to the ground on one side of the sensor (taking the minus side as an example), the red solid lines in Figure 2 represent the laser beams and the blue bold solid line represents the object surface. It was assumed that two adjacent laser beams, No. 1 and No. 2, hit the object surface at horizontal distances WL and WA, respectively. DRH could then be calculated with Equation (1) from the geometry in Figure 2.
DRH = |WA − WL| = 1000 × H × {tan[(NL + 1) × πα/180] − tan(NL × πα/180)}, (1)
where H is the detection height (m), α is the sensor angular resolution (0.25°), and NL is the index of the laser beam that reaches the surface at horizontal distance WL (taking the minus side as an example).
The following equation was used to calculate the value of NL:
NL = 1 + Int[arctan(WL/H) / (πα/180)]. (2)
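Equations (1) and (2) can be sketched directly. The beam-index convention around the nadir is our reading of Equation (2) (the figure is not reproduced here), but the computed values agree with the reported DRH table to within rounding:

```python
import math

ALPHA_DEG = 0.25  # sensor angular resolution

def beam_index(w_m, h_m, alpha_deg=ALPHA_DEG):
    """N_L from Equation (2): index of the beam hitting the surface at
    horizontal distance w_m for detection height h_m."""
    step_rad = math.radians(alpha_deg)
    return 1 + int(math.atan2(w_m, h_m) / step_rad)

def drh_mm(w_m, h_m, alpha_deg=ALPHA_DEG):
    """DR_H from Equation (1): spacing (mm) between two adjacent beam
    footprints around horizontal distance w_m."""
    step_rad = math.radians(alpha_deg)
    n = beam_index(w_m, h_m, alpha_deg)
    return 1000.0 * h_m * (math.tan((n + 1) * step_rad) - math.tan(n * step_rad))
```

For instance, at a 0.5 m detection height the footprint spacing at a 1.0 m horizontal distance evaluates to about 11 mm, close to the ~10.9 mm reported in Section 3.1, and it grows rapidly with horizontal distance.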

2.3.2. Paired Image Similarity Evaluation

In this test, the edge similarity score (ESS), based on distance field computation, was used to quantify the similarity of paired images from the digital camera and the laser sensor. Distance fields have been widely used in shape representation [51,52], object simplification [53], remeshing [54], sculpting [55], swept volume computation [56], path planning and navigation [57,58], collision and proximity computations [59,60], etc. Different algorithms have been proposed to compute the distance fields of geometric models in 2-D and volumetric models in 3-D. Sud and Manocha [61] presented a fast algorithm (DiFi) to compute a discretized distance field of complex objects. For the image similarity evaluation in this test, the absolute correlation coefficient between the two edge distance fields of the paired images from the laser sensor and the digital camera was used to calculate the ESS, as in Equation (3) [32]:
ESS = |Σ_{j=1}^{n} (αj − ᾱ)(βj − β̄)| / [√(Σ_{j=1}^{n} (αj − ᾱ)²) × √(Σ_{j=1}^{n} (βj − β̄)²)], (3)
where αj and βj (j = 1, 2, ⋯, n) are the pixel values of the edge distance fields of the paired images obtained from the laser sensor and camera, respectively, and ᾱ and β̄ are the mean values of αj and βj. ESS ranges from 0 to 1: 0 means there is no similarity between the edge profiles of the paired images, while 1 means the two profiles are exactly the same. Thus, a greater ESS indicates higher similarity of the paired images.
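A minimal sketch of Equation (3): build an edge distance field for each binary edge map, then take the absolute Pearson correlation of the two fields. The brute-force distance field below stands in for a fast DiFi-style algorithm [61] and is only practical for small images; `numpy` is assumed.

```python
import numpy as np

def distance_field(edge_map):
    """Per-pixel Euclidean distance to the nearest edge pixel.
    edge_map: 2-D boolean array (True = edge pixel). Brute force:
    a production system would use a linear-time distance transform."""
    ys, xs = np.nonzero(edge_map)
    edge_pts = np.stack([ys, xs], axis=1).astype(float)   # (k, 2)
    gy, gx = np.indices(edge_map.shape)
    grid = np.stack([gy, gx], axis=-1).astype(float)      # (h, w, 2)
    d = np.linalg.norm(grid[:, :, None, :] - edge_pts[None, None, :, :], axis=-1)
    return d.min(axis=2)

def ess(edge_a, edge_b):
    """Edge similarity score: |Pearson correlation| of the two
    edge distance fields, per Equation (3)."""
    a = distance_field(edge_a).ravel()
    b = distance_field(edge_b).ravel()
    return abs(np.corrcoef(a, b)[0, 1])
```

Identical edge maps give an ESS of 1, and increasingly dissimilar profiles drive the score toward 0.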

3. Results and Discussion

3.1. Detection Resolution of the Laser Sensor

For the sensor travel direction, DRS was proportional to the speeds because of the 25 ms scanning cycle time. The DRS was 11, 16, 22, 27 and 33 mm at speeds of 1.6, 2.4, 3.2, 4.0 and 4.8 km h−1, respectively. Under ideal conditions, there would be, at most, a single laser beam that could reach the objects when their dimensions in the travel direction were smaller than these resolutions.
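These DRS values follow directly from the 25 ms cycle: the sensor travels (speed × 0.025 s) between successive scans. Truncating to whole millimetres (our reading of how the reported figures were rounded) reproduces them exactly:

```python
SCAN_CYCLE_S = 0.025  # 25 ms per scanning cycle

def drs_mm(speed_kmh):
    """Distance travelled between two successive scans, in mm (truncated)."""
    speed_mm_s = speed_kmh * 1_000_000 / 3600  # km/h -> mm/s
    return int(speed_mm_s * SCAN_CYCLE_S)

print([drs_mm(v) for v in (1.6, 2.4, 3.2, 4.0, 4.8)])  # [11, 16, 22, 27, 33]
```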
The DRH was significantly affected by the sensor detection heights and horizontal distances. The horizontal distance influenced DRH more significantly when the detection was conducted at a relatively low detection height (Figure 3). DRH increased as the object horizontal distance increased at all detection heights. For example, at 0.5 m detection height, DRH was 2.5, 4.4, 10.9, 22.0, 36.6, 56.1, 82.2 and 109.2 mm when the horizontal distance was 0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0 and 3.5 m, respectively. The DRH at lower detection height increased more rapidly with the increase of horizontal distance than DRH at higher detection height. The DRH did not have a constant increase or decrease trend with increase of the detection height. For example, at 0 m horizontal distance, DRH increased from 1.1 to 4.4 mm as the detection height increased from 0.25 to 1.0 m. However, at the 3.5 m horizontal distance, DRH was 211.0, 109.2, 74.2 and 58.3 mm for the detection height of 0.25, 0.5, 0.75 and 1.0 m, respectively. These results revealed that relatively higher detection height could weaken the negative influence of the horizontal distances on DRH and increase the detection accuracy.

3.2. Object Outline Profile Similarity

Figure 4 and Figure 5 illustrate the process of calculating the ESS from the outline profile similarity for the paired images of objects from the camera and the laser sensor. In the reconstructed pseudo-color images from the laser sensor, different colors represent different vertical distances between the sensor and the object surfaces.

3.2.1. Edge Detection of Toy Balls

Travel speed had a minor influence on measurement accuracy over the range of travel speeds evaluated. Table 2, Table 3 and Table 4 show the mean ESS of the toy balls calculated with Equation (3) for the different sensor travel speeds, detection heights, and object types, respectively. The mean ESS of the eight toy balls decreased from 0.59 at 1.6 km h−1 to 0.54 at 4.8 km h−1. As mentioned above, the sensor travel speed slightly influenced the detection resolution along the travel direction; similarly, the ESS was also slightly influenced by the sensor travel speed. For example, for the toy ball located at 0.5 m horizontal distance, the mean ESS was 0.83, 0.82, 0.81, 0.79, and 0.78 at sensor travel speeds of 1.6, 2.4, 3.2, 4.0, and 4.8 km h−1, respectively (Table 2).
As predicted by the calculation of the DR of the laser sensor for objects at different horizontal distances and sensor detection heights, the detection accuracy was affected by both the detection height and the horizontal distance, with the horizontal distance having the stronger influence. For the eight toy balls at detection heights of 0.25, 0.5, 0.75, and 1.0 m, the average ESS was 0.48, 0.51, 0.61, and 0.67, respectively. For all the toy balls, the influence of the four detection heights varied with the horizontal distance. For example, the ESS ranged only from 0.96 to 0.97 for the toy ball at a horizontal distance of 0 m, while the range widened to 0.54–0.76 at a horizontal distance of 1.0 m (Table 3). That is, the detection height only slightly affected the ESS of the toy ball directly under the sensor, and this influence became greater as the horizontal distance increased.
For the eight toy balls under all 20 combinations of detection heights and travel speeds, the mean ESS was 0.96, 0.80, 0.66, 0.54, 0.47, 0.38, 0.37, and 0.35 at horizontal distances of 0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, and 3.5 m, respectively (Table 4). On the other hand, the mean ESS of the toy balls across the horizontal distances ranged from 0.98 to 0.37, from 0.97 to 0.35, from 0.96 to 0.35, from 0.96 to 0.34, and from 0.95 to 0.33 for sensor travel speeds of 1.6, 2.4, 3.2, 4.0, and 4.8 km h−1, respectively (Table 2). The mean ESS for toy balls at detection heights of 0.25, 0.5, 0.75, and 1.0 m ranged from 0.96 to 0.29, from 0.96 to 0.30, from 0.97 to 0.35, and from 0.96 to 0.46, respectively (Table 3). Thus, the horizontal distance influenced the edge detection significantly. All these influences of the detection heights and horizontal distances were caused by the variation in DR.
As the plant arrangement in a real greenhouse cannot be changed, higher detection accuracy of the laser sensor could be obtained with a detection height of 1.0 m or more and a sensor speed of 1.6 km h−1 or less.

3.2.2. Edge Detection of Basketballs

The edge detection accuracy for basketballs was also significantly influenced by the horizontal distance, varied with the detection height, and was slightly affected by the sensor speed. For the basketballs, the mean ESSs at different travel speeds and detection heights are illustrated in Table 5 and Table 6, respectively. The overall mean ESS of the basketballs across the five travel speeds and four detection heights decreased from 0.95 ± 0.61 to 0.38 ± 0.03 as the horizontal distance increased from 0 to 3.5 m. Across the different horizontal distances and detection heights, the mean ESS was 0.61, 0.59, 0.58, 0.57, and 0.56 at sensor travel speeds of 1.6, 2.4, 3.2, 4.0, and 4.8 km h−1, respectively. The mean ESS was 0.49, 0.53, 0.61, and 0.70 across the different speeds and horizontal distances at detection heights of 0.25, 0.5, 0.75, and 1.0 m, respectively.
Similar to the toy balls, the object horizontal distance influenced the ESS greatly. For example, the ESS ranged from 0.96 to 0.37 for the basketballs at different horizontal distances at a sensor travel speed of 3.2 km h−1 (Table 5). In addition, the ESS ranged from 0.36 to 0.95 for the basketballs at different horizontal distances at a sensor detection height of 0.25 m, and from 0.42 to 0.95 at a detection height of 1.0 m (Table 6). Hence, based on this evaluation, the optimal laser sensor settings would be a detection height of 1.0 m and a travel speed of 1.6 km h−1.

3.2.3. Edge Detection of Rectangular Boxes

Similar to the ball objects, the ability of the laser sensor to detect the rectangular boxes was significantly affected by the object horizontal distance and detection height, and decreased as the sensor travel speed increased from 1.6 to 4.8 km h−1. However, the overall ESS was greater than for the two ball types mentioned above, because the rectangular box surface was smoother than the ball surfaces. The mean ESSs of the rectangular boxes at different travel speeds and detection heights are shown in Table 7 and Table 8, respectively. Across all combinations of travel speeds and detection heights, the overall mean ESS decreased from 0.97 ± 0.01 to 0.48 ± 0.06 as the horizontal distance increased from 0 to 3.5 m (Table 4). The ESS decreased only slightly, from 0.73 to 0.67, as the sensor travel speed increased from 1.6 to 4.8 km h−1. The mean ESS was 0.67, 0.68, 0.72, and 0.73 at detection heights of 0.25, 0.5, 0.75, and 1.0 m, respectively.

3.2.4. Edge Detection of Cylinders

Table 9 and Table 10 show the mean ESS of the cylinders at the eight horizontal distances for the different travel speeds and detection heights, respectively. The ESS was 0.61, 0.59, 0.58, 0.57, and 0.56 at sensor travel speeds of 1.6, 2.4, 3.2, 4.0, and 4.8 km h−1, respectively. In addition, the mean ESS was 0.52, 0.52, 0.61, and 0.66 at detection heights of 0.25, 0.5, 0.75, and 1.0 m, respectively. Similar to the other regularly-shaped objects, the ESS of the cylinders was considerably influenced by the horizontal distance and only slightly influenced by the travel speed. For instance, the overall ESS across all conditions varied from 0.31 to 0.96 with horizontal distance (Table 4), but varied only between 0.85 and 0.86 for the cylinders at a horizontal distance of 0.5 m as the travel speed increased from 1.6 to 4.8 km h−1 (Table 9). Because of the variable DR caused by different detection heights and horizontal distances, the ESS also varied with the horizontal distance at a given detection height. For the cylinders at 0 m horizontal distance, the ESS changed only slightly, in the range from 0.95 to 0.96 (Table 10). However, the ESS of the cylinders at horizontal distances greater than 0 m increased with the detection height. For example, the ESS at 1.0 m horizontal distance was 0.58, 0.63, 0.73, and 0.81 when the detection height was 0.25, 0.5, 0.75, and 1.0 m, respectively (Table 10). These results also confirmed that a greater detection height could increase the edge detection accuracy of the laser sensor.

3.2.5. Edge Detection of Two Artificial Plants

The mean ESSs at different travel speeds and detection heights are shown in Table 11 and Table 12 for artificial Plant 1, and in Table 13 and Table 14 for artificial Plant 2, respectively. Compared with the regularly-shaped objects, the ESS of the artificial plants was a little lower because of their irregular architectures. However, the overall mean ESS was still influenced considerably by the horizontal distance. The ESS for artificial Plant 1 was 0.90 ± 0.02, 0.72 ± 0.02, 0.67 ± 0.04, 0.61 ± 0.03, 0.45 ± 0.08, 0.37 ± 0.09, 0.35 ± 0.09, and 0.35 ± 0.08 at horizontal distances of 0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, and 3.5 m, respectively. The overall mean ESS for Plant 2 was slightly higher than that for Plant 1: 0.94 ± 0.01, 0.81 ± 0.05, 0.78 ± 0.04, 0.64 ± 0.04, 0.48 ± 0.08, 0.44 ± 0.06, 0.41 ± 0.11, and 0.38 ± 0.09 at horizontal distances of 0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, and 3.5 m, respectively. This was because Plant 2 had a flatter canopy surface than Plant 1, which increased the laser sensor detection accuracy.
Similar to the regularly-shaped objects, the mean ESS for the artificial plants was slightly influenced by the sensor travel speed and considerably influenced by the detection height. At sensor travel speeds of 1.6, 2.4, 3.2, 4.0, and 4.8 km h−1, the mean ESS was 0.59, 0.57, 0.55, 0.54, and 0.52 for artificial Plant 1 and 0.64, 0.63, 0.61, 0.60, and 0.58 for artificial Plant 2, respectively. In contrast, at detection heights of 0.25, 0.5, 0.75, and 1.0 m, the mean ESS was 0.51, 0.51, 0.58, and 0.61 for artificial Plant 1 and 0.58, 0.59, 0.61, and 0.67 for artificial Plant 2.

3.2.6. Edge Detection of Objects at 0.25 m Apart

Additional edge detections for the objects arranged 0.25 m apart at a 3.2 km h−1 travel speed and 0.5 m detection height showed that the overall mean ESS for all the objects ranged from 0.95 to 0.58 as the horizontal distance increased from 0 to 3.5 m. In general, the edge detection accuracy for the objects placed 0.25 m apart was higher than for the objects arranged 0.5 m apart. The reason was that the laser sensor detected the objects from top to bottom, and the interference of the laser points reflected from the side surfaces of the objects became weaker when the objects were placed closer to each other.
To maximize space usage for optimal greenhouse production, plants are usually placed close to each other. This production practice benefits the application of laser sensors in greenhouses, because the sensor detects closely spaced plants more accurately.

4. Summary

The accuracy of the 270° radial laser sensor in detecting object shapes and sizes was evaluated by the edge similarity of paired images from the laser sensor and a digital camera. The edge detection accuracy was governed by the laser DR, which was influenced by the sensor travel speed, horizontal distance, and sensor detection height. The DRS was proportional to the sensor travel speed, so a high sensor speed coarsened the resolution in the travel direction. The influence of the horizontal distance on DRH was more significant at a relatively low sensor detection height than at a greater detection height. These results revealed that a relatively greater detection height could weaken the negative influence of the horizontal distance on DRH and improve the DR.
The edge detection accuracy of the laser sensor was slightly influenced by the sensor travel speed, and significantly influenced by the horizontal distance and sensor detection height. The ESS decreased slightly as the sensor speed increased from 1.6 to 4.8 km h−1, and decreased significantly as the horizontal distance increased from 0 to 3.5 m. The detection results also showed that the ESS increased as the sensor detection height increased from 0.25 to 1.0 m. Moreover, the influence of the object horizontal distance could be weakened by raising the sensor detection height within the range of 0.25 to 1.0 m. Based on the parameters evaluated in this research, a detection height of 1.0 m and a travel speed of 1.6 km h−1 would optimize the laser sensor detection.
The detection results for the two different-sized artificial plants, as well as the rectangular box, demonstrated that the laser sensor performed better at detecting the edges of objects with flatter canopy surfaces.
The evaluation with smaller spacing between adjacent objects in the same row also showed that the edge detection accuracy increased when the objects were placed closer together. This suggests the sensor is well suited to detecting greenhouse plants, which are usually placed close to one another to maximize space usage for optimal greenhouse production.
In general, the laboratory tests demonstrated that the 270° radial laser scanning sensor was capable of detecting the surface profiles of complex-shaped objects at different horizontal distances and detection heights and at different sensor travel speeds. Future research will integrate this laser sensor and its control system into variable-rate sprayers to automatically control spray outputs in real time, based on plant structures and travel speeds, for greenhouse applications.

Author Contributions

All authors contributed to conceiving the research objectives, approaches, and manuscript preparations and edits. Tingting Yan conducted experiments and data analyses and prepared the manuscript draft.

Funding

This research was funded by the USDA-NIFA Specialty Crop Research Initiative (Grant No. 2015-51181-24253).

Acknowledgments

The authors acknowledge invaluable technical assistance from Adam Clark, Barry Nudd, and Andy Doklovic.

Conflicts of Interest

The authors declare no conflict of interest.

Figure 1. The 270° radial range laser sensor mounted on a constant-speed track to detect six rows of different objects (artificial plant 1, artificial plant 2, basketball, rectangular box, toy ball, and cylinder).
Figure 2. Geometry analysis of the laser beam points transmitted on the object surface on one side of the sensor.
Figure 3. Calculated detection resolution of the laser sensor along the horizontal direction (DRH) with Equation (1) for different horizontal distances and detection heights.
Figure 4. Processing edge similarity score (ESS) for paired images of eight 0.5 m evenly spaced objects obtained from the camera (upper) and laser sensor (lower): (a) toy ball, (b) basketball, (c) rectangular box, (d) cylinder, (e) artificial plant 1, and (f) artificial plant 2. The reconstructed images from the laser sensor were taken at 3.2 km h−1 travel speed and 0.5 m detection height. Different colors represent different vertical distances between the object surfaces and the laser sensor.
Figure 5. Processing ESS for paired images of eight 0.25 m evenly spaced objects obtained from the camera (upper) and laser sensor (lower): (a) toy ball, (b) artificial plant 1. The reconstructed images from the laser sensor were taken at 3.2 km h−1 travel speed and 0.5 m detection height. Different colors represent different vertical distances between the object surfaces and the laser sensor.
Table 1. Test variables in the test for validating the laser sensor detection accuracy for object edge profiles.
| Test Part | Variable | Values |
|---|---|---|
| 1 | Interval distance (m) | 0.5 |
| 1 | Speed (km h−1) | 1.6, 2.4, 3.2, 4.0, 4.8 |
| 1 | Detection height (m) | 0.25, 0.50, 0.75, 1.00 |
| 1 | Replication | 3 |
| 2 | Interval distance (m) | 0.25 |
| 2 | Speed (km h−1) | 3.2 |
| 2 | Detection height (m) | 0.5 |
| 2 | Replication | 3 |
Table 2. Mean edge similarity score (ESS) of toy balls at five travel speeds (1.6, 2.4, 3.2, 4.0 and 4.8 km h−1) across eight horizontal distances and four different detection heights.
| Speed (km h−1) | 0 m | 0.5 m | 1.0 m | 1.5 m | 2.0 m | 2.5 m | 3.0 m | 3.5 m |
|---|---|---|---|---|---|---|---|---|
| 1.6 | 0.98 | 0.83 | 0.69 | 0.58 | 0.49 | 0.41 | 0.39 | 0.37 |
| 2.4 | 0.97 | 0.82 | 0.66 | 0.55 | 0.49 | 0.40 | 0.39 | 0.35 |
| 3.2 | 0.96 | 0.81 | 0.67 | 0.54 | 0.47 | 0.38 | 0.38 | 0.35 |
| 4.0 | 0.96 | 0.79 | 0.63 | 0.51 | 0.47 | 0.38 | 0.35 | 0.34 |
| 4.8 | 0.95 | 0.78 | 0.64 | 0.50 | 0.45 | 0.35 | 0.32 | 0.33 |
Table 3. Mean ESS of toy balls at four detection heights (0.25, 0.5, 0.75 and 1.0 m) across eight horizontal distances and five laser sensor travel speeds.
| Detection Height (m) | 0 m | 0.5 m | 1.0 m | 1.5 m | 2.0 m | 2.5 m | 3.0 m | 3.5 m |
|---|---|---|---|---|---|---|---|---|
| 0.25 | 0.96 | 0.64 | 0.54 | 0.37 | 0.36 | 0.36 | 0.33 | 0.29 |
| 0.5 | 0.96 | 0.74 | 0.61 | 0.43 | 0.36 | 0.33 | 0.34 | 0.30 |
| 0.75 | 0.97 | 0.91 | 0.73 | 0.66 | 0.56 | 0.37 | 0.35 | 0.35 |
| 1.0 | 0.96 | 0.93 | 0.76 | 0.69 | 0.62 | 0.48 | 0.44 | 0.46 |
Table 4. Mean ESS of four regularly-shaped objects (toy balls, basketballs, rectangular boxes and cylinders) and two sized artificial plants (plant 1 and plant 2) across all the combinations of five travel speeds (1.6, 2.4, 3.2, 4.0, and 4.8 km h−1) and four different detection heights (0.25, 0.5, 0.75 and 1.0 m) at eight horizontal distances.
| Object | 0 m | 0.5 m | 1.0 m | 1.5 m | 2.0 m | 2.5 m | 3.0 m | 3.5 m |
|---|---|---|---|---|---|---|---|---|
| Toy ball | 0.96 | 0.80 | 0.66 | 0.54 | 0.47 | 0.38 | 0.37 | 0.35 |
| Basketball | 0.95 | 0.79 | 0.68 | 0.58 | 0.49 | 0.41 | 0.38 | 0.38 |
| Rectangular box | 0.97 | 0.90 | 0.84 | 0.73 | 0.64 | 0.54 | 0.49 | 0.48 |
| Cylinder | 0.96 | 0.85 | 0.69 | 0.61 | 0.51 | 0.37 | 0.33 | 0.31 |
| Plant 1 | 0.90 | 0.72 | 0.67 | 0.61 | 0.45 | 0.37 | 0.35 | 0.35 |
| Plant 2 | 0.94 | 0.81 | 0.78 | 0.64 | 0.48 | 0.44 | 0.41 | 0.38 |
Table 5. Mean ESS of basketballs across four different detection heights (0.25, 0.5, 0.75 and 1.0 m) at eight horizontal distances and five travel speeds.
| Speed (km h−1) | 0 m | 0.5 m | 1.0 m | 1.5 m | 2.0 m | 2.5 m | 3.0 m | 3.5 m |
|---|---|---|---|---|---|---|---|---|
| 1.6 | 0.96 | 0.81 | 0.72 | 0.63 | 0.52 | 0.43 | 0.40 | 0.40 |
| 2.4 | 0.95 | 0.80 | 0.69 | 0.60 | 0.52 | 0.42 | 0.38 | 0.37 |
| 3.2 | 0.96 | 0.79 | 0.68 | 0.58 | 0.48 | 0.40 | 0.37 | 0.37 |
| 4.0 | 0.94 | 0.79 | 0.65 | 0.54 | 0.50 | 0.40 | 0.35 | 0.35 |
| 4.8 | 0.95 | 0.77 | 0.66 | 0.53 | 0.45 | 0.39 | 0.34 | 0.36 |
Table 6. Mean ESS of basketballs across five travel speeds (1.6, 2.4, 3.2, 4.0 and 4.8 km h−1) at eight horizontal distances and four different detection heights.
| Detection Height (m) | 0 m | 0.5 m | 1.0 m | 1.5 m | 2.0 m | 2.5 m | 3.0 m | 3.5 m |
|---|---|---|---|---|---|---|---|---|
| 0.25 | 0.95 | 0.61 | 0.53 | 0.41 | 0.38 | 0.30 | 0.34 | 0.36 |
| 0.5 | 0.95 | 0.71 | 0.64 | 0.50 | 0.43 | 0.35 | 0.35 | 0.35 |
| 0.75 | 0.96 | 0.91 | 0.72 | 0.64 | 0.50 | 0.38 | 0.36 | 0.39 |
| 1.0 | 0.95 | 0.94 | 0.84 | 0.75 | 0.66 | 0.61 | 0.45 | 0.42 |
Table 7. Mean ESS of rectangular boxes across four different detection heights (0.25, 0.5, 0.75 and 1.0 m) at eight horizontal distances and five sensor travel speeds.
| Speed (km h−1) | 0 m | 0.5 m | 1.0 m | 1.5 m | 2.0 m | 2.5 m | 3.0 m | 3.5 m |
|---|---|---|---|---|---|---|---|---|
| 1.6 | 0.98 | 0.92 | 0.86 | 0.79 | 0.68 | 0.60 | 0.50 | 0.51 |
| 2.4 | 0.97 | 0.91 | 0.85 | 0.73 | 0.64 | 0.55 | 0.50 | 0.48 |
| 3.2 | 0.97 | 0.90 | 0.84 | 0.73 | 0.64 | 0.53 | 0.49 | 0.48 |
| 4.0 | 0.97 | 0.90 | 0.84 | 0.71 | 0.62 | 0.51 | 0.48 | 0.47 |
| 4.8 | 0.96 | 0.90 | 0.81 | 0.70 | 0.61 | 0.50 | 0.45 | 0.46 |
Table 8. Mean ESS of rectangular boxes across five travel speeds (1.6, 2.4, 3.2, 4.0 and 4.8 km h−1) at eight horizontal distances and four detection heights.
| Detection Height (m) | 0 m | 0.5 m | 1.0 m | 1.5 m | 2.0 m | 2.5 m | 3.0 m | 3.5 m |
|---|---|---|---|---|---|---|---|---|
| 0.25 | 0.98 | 0.88 | 0.82 | 0.71 | 0.60 | 0.54 | 0.42 | 0.41 |
| 0.5 | 0.98 | 0.91 | 0.82 | 0.72 | 0.64 | 0.49 | 0.44 | 0.43 |
| 0.75 | 0.97 | 0.91 | 0.84 | 0.73 | 0.65 | 0.56 | 0.52 | 0.53 |
| 1.0 | 0.96 | 0.92 | 0.87 | 0.75 | 0.66 | 0.57 | 0.55 | 0.54 |
Table 9. Mean ESS of cylinders across four different detection heights (0.25, 0.5, 0.75 and 1.0 m) at eight horizontal distances and five travel speeds.
| Speed (km h−1) | 0 m | 0.5 m | 1.0 m | 1.5 m | 2.0 m | 2.5 m | 3.0 m | 3.5 m |
|---|---|---|---|---|---|---|---|---|
| 1.6 | 0.97 | 0.86 | 0.74 | 0.66 | 0.55 | 0.39 | 0.35 | 0.33 |
| 2.4 | 0.95 | 0.85 | 0.72 | 0.63 | 0.54 | 0.39 | 0.33 | 0.32 |
| 3.2 | 0.96 | 0.86 | 0.68 | 0.60 | 0.51 | 0.37 | 0.33 | 0.31 |
| 4.0 | 0.94 | 0.85 | 0.66 | 0.59 | 0.49 | 0.36 | 0.33 | 0.29 |
| 4.8 | 0.95 | 0.85 | 0.65 | 0.58 | 0.46 | 0.35 | 0.31 | 0.30 |
Table 10. Mean ESS of cylinders across five travel speeds (1.6, 2.4, 3.2, 4.0 and 4.8 km h−1) at eight horizontal distances and four detection heights.
| Detection Height (m) | 0 m | 0.5 m | 1.0 m | 1.5 m | 2.0 m | 2.5 m | 3.0 m | 3.5 m |
|---|---|---|---|---|---|---|---|---|
| 0.25 | 0.96 | 0.83 | 0.58 | 0.56 | 0.43 | 0.30 | 0.26 | 0.24 |
| 0.5 | 0.95 | 0.83 | 0.63 | 0.56 | 0.47 | 0.30 | 0.23 | 0.22 |
| 0.75 | 0.96 | 0.86 | 0.73 | 0.61 | 0.55 | 0.43 | 0.38 | 0.37 |
| 1.0 | 0.95 | 0.90 | 0.81 | 0.73 | 0.58 | 0.47 | 0.44 | 0.41 |
Table 11. Mean ESS of artificial plant 1 across four detection heights (0.25, 0.5, 0.75 and 1.0 m) at eight horizontal distances and the five travel speeds.
| Speed (km h−1) | 0 m | 0.5 m | 1.0 m | 1.5 m | 2.0 m | 2.5 m | 3.0 m | 3.5 m |
|---|---|---|---|---|---|---|---|---|
| 1.6 | 0.92 | 0.74 | 0.70 | 0.64 | 0.50 | 0.42 | 0.39 | 0.37 |
| 2.4 | 0.91 | 0.73 | 0.69 | 0.63 | 0.47 | 0.42 | 0.37 | 0.37 |
| 3.2 | 0.90 | 0.73 | 0.67 | 0.61 | 0.46 | 0.36 | 0.35 | 0.35 |
| 4.0 | 0.89 | 0.71 | 0.66 | 0.60 | 0.41 | 0.32 | 0.33 | 0.34 |
| 4.8 | 0.88 | 0.71 | 0.64 | 0.57 | 0.40 | 0.32 | 0.32 | 0.31 |
Table 12. Mean ESS of artificial plant 1 across five travel speeds (1.6, 2.4, 3.2, 4.0 and 4.8 km h−1) at eight horizontal distances and four detection heights.
| Detection Height (m) | 0 m | 0.5 m | 1.0 m | 1.5 m | 2.0 m | 2.5 m | 3.0 m | 3.5 m |
|---|---|---|---|---|---|---|---|---|
| 0.25 | 0.91 | 0.69 | 0.66 | 0.60 | 0.38 | 0.26 | 0.27 | 0.29 |
| 0.5 | 0.89 | 0.74 | 0.65 | 0.60 | 0.40 | 0.27 | 0.27 | 0.27 |
| 0.75 | 0.91 | 0.73 | 0.66 | 0.61 | 0.45 | 0.46 | 0.42 | 0.40 |
| 1.0 | 0.89 | 0.73 | 0.71 | 0.64 | 0.56 | 0.47 | 0.45 | 0.44 |
Table 13. Mean ESS of plant 2 across four detection heights (0.25, 0.5, 0.75, and 1.0 m) at eight horizontal distances (0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, and 3.5 m) and five travel speeds (1.6, 2.4, 3.2, 4.0, and 4.8 km h−1).
| Speed (km h−1) | 0 m | 0.5 m | 1.0 m | 1.5 m | 2.0 m | 2.5 m | 3.0 m | 3.5 m |
|---|---|---|---|---|---|---|---|---|
| 1.6 | 0.94 | 0.83 | 0.80 | 0.68 | 0.54 | 0.49 | 0.45 | 0.42 |
| 2.4 | 0.94 | 0.82 | 0.79 | 0.67 | 0.52 | 0.45 | 0.43 | 0.41 |
| 3.2 | 0.94 | 0.81 | 0.80 | 0.63 | 0.47 | 0.45 | 0.40 | 0.37 |
| 4.0 | 0.94 | 0.80 | 0.76 | 0.63 | 0.44 | 0.43 | 0.39 | 0.36 |
| 4.8 | 0.93 | 0.82 | 0.74 | 0.61 | 0.41 | 0.40 | 0.37 | 0.36 |
Table 14. Mean ESS of plant 2 at eight horizontal distances (0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, and 3.5 m) across five travel speeds (1.6, 2.4, 3.2, 4.0, and 4.8 km h−1) at four different detection heights (0.25, 0.5, 0.75, and 1.0 m).
| Detection Height (m) | 0 m | 0.5 m | 1.0 m | 1.5 m | 2.0 m | 2.5 m | 3.0 m | 3.5 m |
|---|---|---|---|---|---|---|---|---|
| 0.25 | 0.94 | 0.74 | 0.77 | 0.65 | 0.43 | 0.41 | 0.35 | 0.33 |
| 0.5 | 0.94 | 0.83 | 0.79 | 0.65 | 0.45 | 0.43 | 0.35 | 0.29 |
| 0.75 | 0.94 | 0.84 | 0.81 | 0.63 | 0.46 | 0.41 | 0.36 | 0.41 |
| 1.0 | 0.94 | 0.85 | 0.74 | 0.65 | 0.56 | 0.53 | 0.58 | 0.51 |

© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).