1. Introduction
Returning straw to the field is an important conservation tillage practice for improving soil fertility worldwide. Covering the soil with returned straw retains moisture and encourages seed germination. However, a straw layer that is too thick decays slowly and hinders seedling growth, reducing crop yield. Qin et al. [1] studied the effects of straw mulching on soil health with 2 cm, 4 cm and 6 cm of straw covering the soil surface. They found that straw mulching thickness had a great influence on regulating soil temperature and humidity, and that the 4 cm treatment best met the needs of grape root growth. Stagnari et al. [2] conducted two-year field trials on the influence of straw mulch amount on wheat growth and concluded that covering with at least 1.5 tons of straw per hectare (a 30% straw cover rate) produced significantly higher yields. It is therefore necessary to study methods for detecting straw mulching thickness and straw coverage unevenness.
Traditional segmentation algorithms for straw coverage detection are mainly based on thresholds or texture features. However, it is difficult to distinguish straw from soil because of their similar colors. Liu et al. [3] proposed a straw coverage detection method based on a DE-GWO multi-threshold image segmentation algorithm; the detection accuracy reached 95%, but the algorithm's stability was poor. Yang et al. [4] calculated the straw coverage rate in the field using the Otsu algorithm for threshold segmentation. Li et al. [5] used the fast Fourier transform and an SVM to automatically recognize corn straw coverage.
To deal with large-scale UAV images, Liu et al. [6] proposed a multi-objective grey wolf optimization algorithm to optimize large-scale image segmentation for aerial images of straw coverage. Liu et al. [7] proposed a semantic segmentation algorithm (DSRA-UNet) to segment straw and soil in UAV images.
In addition, some scholars have conducted straw coverage detection using remote sensing technology. Zhou et al. [8] proposed a deep learning algorithm to detect ground straw coverage under conservation tillage using UAV low-altitude remote sensing images. Cai et al. [9] used optical and SAR remote sensing images to estimate winter wheat residue coverage. Memon et al. [10] assessed wheat straw cover in a rice-wheat cropping system using Landsat satellite data. Riegler-Nurscher et al. [11] proposed a machine learning approach for pixel-wise classification of residue and vegetation cover under field conditions. Laamrani et al. [12] used a mobile device app and proximal remote sensing technologies to assess soil cover fractions on agricultural fields.
These studies have achieved many valuable results, but most of them focus on detecting straw coverage rate, and there are no reports on detecting straw coverage thickness or unevenness.
RGB-D sensors, among which Microsoft's Kinect and Intel's RealSense D435i are very popular, capture depth and color information simultaneously. RGB-D sensors are widely used to obtain three-dimensional information of crops in agriculture [13]. Barnea et al. [14] used depth camera data to construct a three-dimensional fruit detection model for a crop-harvesting robot. Song et al. [15] extracted information from depth images and constructed a three-dimensional reconstruction algorithm for banana pseudo-stem parameter extraction. Andujar et al. [16] used a depth camera to extract structural parameters to assess the growth status and yield of cauliflower crops. Vázquez-Arellano et al. [17] processed the point clouds of maize plants and soil obtained by a depth camera and demonstrated that the RGB-D camera can accurately obtain the stem position and plant height of maize plants. Yang et al. [18] used a Kinect 2.0 camera to obtain 3D information of cucumber seedling trays and calculated the height of individual cucumber seedlings. Wang et al. [19] combined deep learning with depth cameras and proposed an RGB-D information fusion method to achieve UAV environment perception and autonomous obstacle avoidance.
In this study, an outdoor machine vision system was developed with low-cost RGB-D sensors to detect straw covering unevenness. The two most widely used RGB-D sensors (Intel RealSense D435i and Microsoft Kinect V2) were applied to estimate the straw covering unevenness by detecting the depth of the straw coverage.
2. Overall Structure and Working Principle of No-Tillage Planter
The no-tillage planter (2BHMX-6) was developed by the Nanjing Institute of Agricultural Mechanization. It is composed of a wheat straw crushing device, a field cleaning device, a sowing and fertilizing device, a soil covering and compacting device, etc. The overall structure is shown in Figure 1.
The working principle of the no-tillage planter is as follows. The planter is connected to the tractor by a suspension. During operation, the wheat straw covering the area to be sown is first crushed; the crushed straw is then blown to the rear side of the planter by a fan; seeds are sown on the cleaned area; and finally the sown area is evenly covered with the crushed straw [20]. To acquire the straw mulching information in real time, the RGB-D sensor is installed on the rear of the planter with the lens facing vertically downward, and a baffle in front of the sensor prevents flying straw from affecting image capture. Since the sowing area has been cleaned, the ground is relatively flat, which makes depth detection more feasible. Owing to the large amount of straw mulch, detecting the unevenness of the straw coverage is more challenging.
3. Unevenness Detection
3.1. Manual Measurement Method of Straw Covering Unevenness
According to GB/T 24675.6-2021 [21], Conservation tillage equipment, Part 6: Smashed straw machine, six regions of 1 m × 1 m were chosen one by one on the straw-covered field. First, the average mass of the straw covering the six regions was calculated according to Equation (1). Then, the straw coverage unevenness was calculated according to Equation (2). A straw image including the six regions is shown in Figure 2.

$$\bar{M} = \frac{1}{6}\sum_{i=1}^{6} M_i \tag{1}$$

$$U = \frac{1}{\bar{M}}\sqrt{\frac{\sum_{i=1}^{6}\left(M_i - \bar{M}\right)^2}{6-1}} \times 100\% \tag{2}$$
where:
$M_i$—the straw mass of the $i$th region, g;
$\bar{M}$—the average mass of straw, g;
$U$—the unevenness of straw coverage, %.
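For illustration, the manual unevenness calculation of Equations (1) and (2) can be written as a short Python sketch; the mass values below are hypothetical, not measured data:

```python
import numpy as np

def unevenness(masses):
    """Straw covering unevenness per Equations (1) and (2): the sample
    standard deviation of the region masses divided by their mean, in %."""
    m = np.asarray(masses, dtype=float)
    m_bar = m.mean()                                   # Equation (1)
    s = np.sqrt(((m - m_bar) ** 2).sum() / (m.size - 1))
    return s / m_bar * 100.0                           # Equation (2)

# Hypothetical masses (g) of straw collected from six 1 m x 1 m regions
print(f"U = {unevenness([412.0, 389.0, 455.0, 398.0, 430.0, 420.0]):.2f}%")
```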
3.2. Straw Covering Unevenness Detection by RGB-D Sensors
3.2.1. Data Acquisition
The RealSense D435i sensor (Intel, Santa Clara, CA, USA) was used to collect field images in the experimental field of the South Campus of Shandong Agricultural University in Tai'an City, Shandong, China (36.1646° N, 117.1559° E), after the operation of the peanut no-tillage planter with wheat straw mulch, on 12 June 2021 (cloudy). The Kinect V2 (Microsoft, Redmond, WA, USA) was used to collect field images in the 'Baima' experimental field of the Nanjing Institute of Agricultural Mechanization in Nanjing City, Jiangsu, China (31.6051° N, 119.1819° E), after the operation of the no-tillage planter with rice straw mulch, on 1 November 2021 (cloudy).
The RGB-D sensor was mounted on a tripod with the lens perpendicular to the ground, and the distance from the lens to the ground could be adjusted. The RGB-D sensors were connected to a laptop with an Intel(R) Core(TM) i5-8300H CPU @ 2.30 GHz, 16.0 GB RAM, an NVIDIA GeForce GTX 1050 GPU, and the Windows 10 operating system. The software for image acquisition and unevenness computation was developed using MATLAB 2017b and the Intel RealSense SDK 2.0 (the C++ programming language and the OpenGL library were used for the Kinect V2). A total of 288 images were captured with each RGB-D sensor. Both the RealSense D435i and the Kinect V2 can capture RGB and depth images simultaneously. The D435i provides an RGB resolution of 1920 × 1080 and a depth resolution of 1280 × 720, whereas the Kinect V2 provides a depth resolution of 512 × 424 with RGB at 1920 × 1080. The exposure mode was set to automatic, so the exposure time was adjusted automatically.
A 1 m × 1 m frame was laid on the field. To capture a field area of over 1 square meter, the distance from the lens to the ground was set to 1400 mm. First, the RGB-D sensor captured the field image including the frame; then the thickness of the straw coverage was measured using a digital vernier caliper (0–300 mm, Wuxi KBD Tools Co., Ltd., Wuxi, China), and the mass of the straw in the frame was measured with an electronic scale (YP, Shanghai Guangzheng Medical Equipment Co., Ltd., Shanghai, China), each measurement duplicated three times and recorded as $M_i$. The frame was then moved to the next 1 m² area, and so on.
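The acquisition software described above was written in MATLAB with the RealSense SDK; as a minimal illustration only, the following Python sketch (using the pyrealsense2 bindings of the same SDK) captures one pair of depth and color frames at the stated resolutions. It is a sketch of the capture step, not the authors' implementation:

```python
import numpy as np
import pyrealsense2 as rs

# Configure the D435i streams at the resolutions stated above:
# 1280 x 720 depth and 1920 x 1080 RGB, both at 30 fps.
pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 1280, 720, rs.format.z16, 30)
config.enable_stream(rs.stream.color, 1920, 1080, rs.format.bgr8, 30)
profile = pipeline.start(config)

# The depth scale converts raw 16-bit depth units to meters.
depth_scale = profile.get_device().first_depth_sensor().get_depth_scale()

try:
    frames = pipeline.wait_for_frames()
    depth = np.asanyarray(frames.get_depth_frame().get_data())   # uint16
    color = np.asanyarray(frames.get_color_frame().get_data())   # BGR8
    depth_mm = depth.astype(np.float64) * depth_scale * 1000.0   # millimeters
finally:
    pipeline.stop()
```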
3.2.2. Fusion of Depth and Image Data
The geometric calibration, including the intrinsic and extrinsic parameters of the RGB-D sensor, was completed with a regular black-and-white checkerboard. Joint calibration of the depth and color cameras was performed according to the instructions on the official website of each RGB-D sensor. The depth image pixels were projected to camera-space coordinates to obtain the 3D points of the image:

$$d\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = K\begin{bmatrix} X \\ Y \\ Z \end{bmatrix} \tag{3}$$

where $(u, v)$ represents the pixel coordinates in the depth image, $d$ represents the corresponding depth value of the point, $K$ represents the intrinsic parameter matrix, and $(X, Y, Z)$ is the camera-space coordinate corresponding to the pixel point.
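A minimal numpy sketch of this projection is given below; it inverts Equation (3) for every pixel. The intrinsic matrix K shown is a hypothetical placeholder, and real values come from the factory or checkerboard calibration:

```python
import numpy as np

def depth_to_points(depth_mm, K):
    """Project every pixel (u, v) with depth d to camera-space (X, Y, Z)
    by inverting Equation (3): [X, Y, Z]^T = d * K^-1 * [u, v, 1]^T."""
    h, w = depth_mm.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    pix = np.stack([u, v, np.ones_like(u)]).reshape(3, -1)
    pts = np.linalg.inv(K) @ (pix * depth_mm.reshape(1, -1))
    return pts.T.reshape(h, w, 3)  # per-pixel (X, Y, Z), mm

# Hypothetical intrinsics (fx, fy in pixels; cx, cy at the image center)
K = np.array([[920.0,   0.0, 640.0],
              [  0.0, 920.0, 360.0],
              [  0.0,   0.0,   1.0]])
```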
The RGB image and depth image of the straw-covered field are shown in Figure 3. Because the depth value of each pixel is the distance from the sensor to the corresponding point in the scene, the straw thickness can be obtained by subtracting the measured depth from the height of the camera above the ground. Computing the height difference between the ground and the straw over the depth map takes 20–30 ms, so the processing speed meets the requirements of real-time operation. From Figure 3, we can see that the straw thickness varies within 100 mm. The different colors indicate different thicknesses: yellow indicates the bare ground, and dark blue indicates the region covered with the thickest straw.
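A minimal sketch of this thickness computation, assuming the 1400 mm lens-to-ground distance from Section 3.2.1 and treating zero-depth pixels as invalid, might look as follows:

```python
import numpy as np

CAMERA_HEIGHT_MM = 1400.0  # lens-to-ground distance used in the field tests

def thickness_map(depth_mm):
    """Per-pixel straw thickness: camera height above the bare ground minus
    the measured depth. Zero-depth pixels (no depth return) become NaN."""
    thickness = np.where(depth_mm > 0, CAMERA_HEIGHT_MM - depth_mm, np.nan)
    return np.clip(thickness, 0.0, None)  # small negative values are noise
```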
3.3. Detection System of Straw Coverage by RGB-D Sensor
The detection system for straw coverage (shown in Figure 4) was developed in MATLAB 2017b and has the following functions: RGB color image display, ROI region clipping, thickness visualization, depth image data statistics and analysis, real-time acquisition and analysis of depth data from the RGB-D sensor, image saving, ROI volume statistics, thickness statistics, straw coverage calculation, etc.
From Figure 4, we can see that after loading a picture captured by the RGB-D imager, the RGB picture and the thickness visualization are shown. In the thickness visualization, yellow represents the thickest region, about 100 mm thick, and deep blue represents the bare ground with a thickness of 0 mm. Any ROI can be selected, such as the square box in the RGB image of Figure 4, and its average thickness and straw volume are calculated; the thickness at every point can also be queried. In Figure 4, the straw thickness at the cross cursor is 23 mm, and the average thickness is 20.61 mm. In addition, a thickness threshold can be set, the percentage of pixels exceeding the threshold can be calculated, and the straw coverage rate can thus be obtained. Moreover, the detection system supports real-time acquisition of straw coverage depth: it can read the video stream from the sensors and process it in real time.
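The detection system itself was developed in MATLAB; the following Python sketch only illustrates the ROI statistics described above (mean thickness and the share of pixels above a user-set threshold). The default threshold value is hypothetical:

```python
import numpy as np

def roi_statistics(thickness_mm, roi, threshold_mm=5.0):
    """Mean thickness and coverage rate over a rectangular ROI of the
    thickness map; coverage is the share of valid pixels whose thickness
    exceeds the user-set threshold."""
    r0, r1, c0, c1 = roi
    patch = thickness_mm[r0:r1, c0:c1]
    valid = patch[~np.isnan(patch)]
    return {
        "mean_thickness_mm": float(valid.mean()),
        "coverage_percent": float((valid > threshold_mm).mean() * 100.0),
    }
```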
3.4. Results and Discussion
The RealSense D435i was compared with the Kinect V2. The latter requires an additional power supply, so the D435i is better suited to outdoor applications. The RealSense D435i sensor is cost-effective and compact, with no external power supply required; power and data transmission are handled through the built-in USB interface, which makes it very suitable for mobile devices. Therefore, the RealSense sensor is more convenient to install on the no-tillage planter or smashed straw machine to detect the unevenness of straw coverage in the field.
Manual measurement adopts the method specified in the national standard GB/T 24675.6-2021, and the unevenness of the straw covering is calculated by Equations (1) and (2) from each group of six straw mass measurements. The corresponding straw volume is then calculated by the detection system. Since the density of the straw is essentially constant, mass is proportional to volume, so $M_i$ (mass) is substituted by $V_i$ (volume) in Equations (1) and (2). After dimension calibration, the calibration coefficient between the actual area and the pixel area was obtained; with the straw thickness from the RGB-D data, the straw volume can be computed by Equation (4). Thus, the unevenness of the straw covering detected by the RGB-D sensor is calculated by the detection system. Then, the correlation coefficient was analyzed to test the relationship between the straw covering unevenness measured manually and that measured by the depth sensors.

$$V_i = k \, s \sum_{j=1}^{N} h_{ij} \tag{4}$$
where:
$V_i$—the straw volume of the $i$th region, mm³;
$N$—the total number of pixel points of the depth image;
$k$—the calibration coefficient between the actual area and the pixel area;
$s$—the unit pixel area;
$h_{ij}$—the thickness of the $j$th pixel point of the $i$th image, mm;
$\bar{V}$—the average volume of straw, mm³;
$U$—the unevenness of straw coverage, %.
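Under the symbol definitions above, Equation (4) and its use in Equations (1) and (2) can be sketched in Python as follows; the function names are illustrative, not from the authors' code, and the sketch reuses the unevenness() helper from Section 3.1:

```python
import numpy as np

def straw_volume(thickness_mm, k, s):
    """Equation (4): V_i = k * s * sum_j h_ij, i.e. the area-calibration
    coefficient times the unit pixel area times the summed thickness."""
    h = thickness_mm[~np.isnan(thickness_mm)]
    return k * s * h.sum()  # mm^3

# The sensor-side unevenness then reuses Equations (1) and (2) with V_i in
# place of M_i, e.g.: unevenness([straw_volume(t, k, s) for t in maps])
```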
Furthermore, Figure 5 shows the correlation plots of the straw covering unevenness measured manually versus that measured by the RGB-D sensors, with R = 0.93, RMSE = 4.59% and MAPE = 3.86% for the D435i sensor, and R = 0.915, RMSE = 6.53% and MAPE = 13.85% for the Kinect V2. The two measurements correlate well, which shows that both kinds of RGB-D sensors can acquire the unevenness of straw covering effectively.
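For reference, the three agreement metrics reported here can be computed with a few lines of numpy; this sketch assumes the unevenness values are paired per test region:

```python
import numpy as np

def agreement_metrics(manual, sensed):
    """R, RMSE and MAPE between manual and sensor-derived unevenness
    values (both expressed in %), paired per test region."""
    manual = np.asarray(manual, dtype=float)
    sensed = np.asarray(sensed, dtype=float)
    r = np.corrcoef(manual, sensed)[0, 1]
    rmse = np.sqrt(np.mean((sensed - manual) ** 2))
    mape = np.mean(np.abs((sensed - manual) / manual)) * 100.0
    return r, rmse, mape
```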
In addition, this method may provide a reference for depth acquisition and the detection of vegetation coverage. Comparing the two RGB-D sensors, the D435i, which requires no extra power supply and is lighter, has the advantage for installation on the no-tillage planter or other platforms. Both RGB-D sensors are affected by ambient light. LiDAR, another common depth sensor, is less affected by ambient light but costs more than an RGB-D sensor. In the future, it is necessary to conduct real-time detection of straw coverage unevenness with different sensors, under different crop straws and different ambient light conditions.