Article

Detection Method of Straw Mulching Unevenness with RGB-D Sensors

1
Nanjing Institute of Agricultural Mechanization, Ministry of Agriculture and Rural Affairs, Nanjing 210014, China
2
College of Mechanical and Electrical Engineering, Shandong Agricultural University, Tai’an 271018, China
*
Authors to whom correspondence should be addressed.
AgriEngineering 2023, 5(1), 12-19; https://doi.org/10.3390/agriengineering5010002
Submission received: 20 October 2022 / Revised: 12 December 2022 / Accepted: 16 December 2022 / Published: 21 December 2022
(This article belongs to the Special Issue Hyperspectral Imaging Technique in Agriculture)

Abstract

Returning straw to the field is an important conservation tillage practice for increasing land fertility. Detecting the unevenness of the straw covering is vital for evaluating the performance of no-tillage planters, especially those that return the full amount of straw. In this study, two kinds of RGB-D (Red, Green, Blue-Depth) sensors (RealSense D435i and Kinect v2) were applied to estimate straw mulching unevenness by detecting the depth of the straw coverage. Firstly, the overall structure and working principle of a no-tillage planter that returns the full amount of straw are introduced. Secondly, field images were captured with the two RGB-D sensors after the no-tillage planter operation. Thirdly, the straw covering unevenness was computed with a system developed in Matlab. Finally, correlation analysis was conducted to test the relationship between the straw covering unevenness measured manually and by the depth sensors, yielding an R (correlation coefficient) of 0.93, an RMSE (Root Mean Square Error) of 4.59% and a MAPE (Mean Absolute Percentage Error) of 3.86% with the D435i sensor, and an R of 0.915, an RMSE of 6.53% and a MAPE of 13.85% with the Kinect v2, which shows that both kinds of RGB-D sensors can acquire the unevenness of straw covering efficiently. These findings provide a potential way to detect the unevenness of straw coverage, as well as data support for the operational evaluation and improvement of no-tillage planters.

1. Introduction

Returning straw to the field is an important conservation tillage measure for increasing land fertility worldwide. Returning straw to cover the soil holds moisture and encourages seed germination. However, straw that is piled too thick does not decay easily and affects seedling growth, resulting in crop yield reduction. Qin et al. [1] studied the effects of straw mulching on soil health with 2 cm, 4 cm and 6 cm of straw covering the soil surface. They found that straw mulching thickness had a great influence on regulating soil temperature and humidity, and that the 4 cm treatment best met the needs of grape root growth. Stagnari et al. [2] conducted two-year field trials on the influence of straw mulching amount on wheat growth and concluded that covering with at least 1.5 tons of straw per hectare (30% straw cover rate) produced significantly higher yields. It is therefore necessary to study detection methods for straw mulching thickness and straw coverage unevenness.
Traditional segmentation algorithms for straw coverage detection are mainly based on thresholds or texture features. However, it is difficult to distinguish straw from soil because of their similar colors. Liu et al. [3] proposed a straw coverage detection method based on a DE-GWO multi-threshold image segmentation algorithm; the detection accuracy reached 95%, but the algorithm's stability was poor. Yang et al. [4] calculated the straw coverage rate in the field using the Otsu algorithm for threshold segmentation. Li et al. [5] used the fast Fourier transform and an SVM to realize automatic recognition of corn straw coverage.
To deal with large-scale UAV images, Liu et al. [6] proposed a multi-objective grey wolf optimization algorithm to optimize large-scale image segmentation for aerial images of straw coverage. Liu et al. [7] proposed a semantic segmentation algorithm (DSRA-UNet) to segment straw and soil in UAV images.
In addition, some scholars have conducted straw coverage detection using remote sensing technology. Zhou et al. [8] proposed a deep learning algorithm to detect ground straw coverage under conservation tillage using UAV low-altitude remote sensing images. Cai et al. [9] used optical and SAR remote sensing images to estimate winter wheat residue coverage. Memon et al. [10] assessed wheat straw cover in a rice-wheat cropping system using Landsat satellite data. Riegler-Nurscher et al. [11] proposed a machine learning approach for pixel-wise classification of residue and vegetation cover under field conditions. Laamrani et al. [12] used a mobile device "App" and proximal remote sensing technologies to assess soil cover fractions on agricultural fields.
These studies have achieved many valuable results, but most of them focus on detecting the straw coverage rate; there are no reports on the detection of straw coverage thickness or unevenness.
RGB-D sensors, among which Microsoft's Kinect and Intel's RealSense D435i are very popular, allow depth and color information to be captured at the same time. RGB-D sensors are widely used for obtaining three-dimensional information on crops in agriculture [13]. Barnea et al. [14] used depth camera data to construct a three-dimensional fruit detection model for a crop harvesting robot. Song et al. [15] extracted information from depth images and constructed a three-dimensional reconstruction algorithm for banana pseudo-stem parameter extraction. Andujar et al. [16] used a depth camera to extract structural parameters to assess the growth status and yield of cauliflower crops. Vázquez-Arellano et al. [17] processed point clouds of maize plants and soil points obtained by a depth camera and demonstrated that an RGB-D camera can accurately obtain the stem position and plant height of maize plants. Yang et al. [18] used cucumber seedlings as the experimental object, employing a Kinect 2.0 camera to obtain 3D information on a cucumber seedling tray and finally calculating the height of single seedlings. Wang et al. [19] combined a deep learning algorithm with depth cameras and proposed an RGB-D information fusion method to achieve UAV environment perception and autonomous obstacle avoidance.
In this study, an outdoor machine vision system was developed with low-cost RGB-D sensors to detect the straw covering unevenness. The two most widely used RGB-D sensors (Intel RealSense D435i and Microsoft Kinect v2) were applied to estimate the straw covering unevenness by detecting the depth of the straw coverage.

2. Overall Structure and Working Principle of No-Tillage Planter

The no-tillage planter (2BHMX-6) was developed by the Nanjing Institute of Agricultural Mechanization. It is composed of a wheat straw crushing device, a field cleaning device, a sowing and fertilizing device, a soil covering and compacting device, etc. The overall structure is shown in Figure 1.
The working principle of the no-tillage planter is as follows. The planter is connected to the tractor by a suspension. During operation, the wheat straw covering the area to be sown is first crushed; the crushed straw is then blown to the rear side of the planter by a fan; seeds are sown in the cleaned area; and finally the sown area is evenly covered with the crushed straw [20]. To acquire the straw mulching information in real time, the RGB-D sensor is installed at the rear of the planter with the lens pointing vertically downward, and a baffle in front of the sensor prevents flying straw from affecting image capture. Because the sowing area is cleaned, the ground is relatively flat, which makes depth detection more feasible. Owing to the large amount of straw mulching, detecting the unevenness of the straw coverage is more challenging.

3. Unevenness Detection

3.1. Manual Measurement Method of Straw Covering Unevenness

According to GB/T 24675.6-2021 [21] (Conservation tillage equipment, Part 6: Smashed straw machine), six regions of size 1 × 1 m were chosen one by one on the straw-covered field. First, the average mass of the straw covering the six regions was calculated according to Equation (1). Then, the straw coverage unevenness was calculated according to Equation (2). The straw image including the six regions is shown in Figure 2.
$$\bar{M} = \frac{1}{6}\sum_{i=1}^{6} M_i \tag{1}$$

$$F_b = \frac{1}{\bar{M}} \sqrt{\frac{\sum_{i=1}^{6} \left(M_i - \bar{M}\right)^2}{5}} \times 100 \tag{2}$$
where $M_i$ is the straw mass of the $i$th region, g; $\bar{M}$ is the average mass of straw, g; and $F_b$ is the unevenness of straw coverage, %.
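The manual computation in Equations (1) and (2) amounts to the sample standard deviation (divisor 5 = n − 1) of the six region masses, divided by their mean. A minimal sketch in Python with NumPy (not the authors' implementation; the mass values below are hypothetical):

```python
import numpy as np

def straw_unevenness(masses):
    """Unevenness F_b (%) per Equations (1) and (2): the sample standard
    deviation (divisor n - 1 = 5) of the six region masses divided by
    their mean, expressed as a percentage."""
    m = np.asarray(masses, dtype=float)
    m_bar = m.mean()                                      # Equation (1)
    s = np.sqrt(((m - m_bar) ** 2).sum() / (len(m) - 1))  # sample std in Equation (2)
    return s / m_bar * 100.0                              # Equation (2)

# Hypothetical straw masses (g) for the six 1 m x 1 m regions
print(round(straw_unevenness([950, 1020, 980, 1005, 940, 1010]), 2))  # 3.38
```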
Figure 2. Straw image.

3.2. Straw Covering Unevenness Detection by RGB-D Sensors

3.2.1. Data Acquisition

The RealSense D435i sensor (Intel, Santa Clara, CA, USA) was used to collect field images in the experimental field of the South Campus of Shandong Agricultural University in Tai'an City, Shandong, China (36.1646° N, 117.1559° E) after the peanut no-tillage planter operation with wheat straw mulched, on 12 June 2021 (cloudy). The Kinect v2 (Microsoft, Redmond, WA, USA) was used to collect field images in the 'Baima' experimental field of the Nanjing Institute of Agricultural Mechanization in Nanjing City, Jiangsu, China (31.6051° N, 119.1819° E) after the no-tillage planter operation with rice straw mulched, on 1 November 2021 (cloudy).
The RGB-D sensor was mounted on a tripod with the lens pointing vertically at the ground; the distance from the lens to the ground was adjustable. The RGB-D sensors were connected to a laptop with an Intel(R) Core(TM) i5-8300H CPU @ 2.30 GHz, 16.0 GB of RAM, an NVIDIA GeForce GTX 1050 GPU, and the Windows 10 operating system. The software for image acquisition and unevenness computing was developed using Matlab 2017b and the Intel RealSense software development kit (SDK version 2, C++ programming language; the OpenGL library was used for the Kinect v2). A total of 288 images were captured by each RGB-D sensor. Both the RealSense D435i and the Kinect v2 can capture RGB and depth images simultaneously. The D435i provides a resolution of 1920 × 1080 for the RGB image and 1280 × 720 for the depth map, while the Kinect v2 has a depth sensor resolution of 512 × 424 and an RGB resolution of 1920 × 1080. The exposure mode was set to automatic exposure, and the exposure time was adjusted automatically.
A frame of 1 m × 1 m was laid on the field. To capture a field area of over 1 square meter, the distance from the lens to the ground was set to 1400 mm. First, the RGB-D sensor captured the field image including the frame; then the thickness of the straw coverage was measured using a digital vernier caliper (0–300 mm, Wuxi KBD Tools Co., Ltd., Wuxi, China), and the mass of the straw in the frame was measured with an electronic scale (YP, Shanghai Guangzheng Medical Equipment Co., Ltd., Shanghai, China), duplicated three times and recorded as Mi. The frame was then moved to the next area of 1 square meter, and so on.

3.2.2. Fusion of Depth and Image Data

The geometric calibration, including the intrinsic and extrinsic parameters of the RGB-D sensor, was completed with a regular black-and-white checkerboard. Joint calibration of the depth and color cameras was performed according to the instructions on the official website of the RGB-D sensor. The depth image pixels were projected to camera-space coordinates to obtain the 3D points of the image:
$$\begin{bmatrix} u \\ v \\ d \end{bmatrix} = K \begin{bmatrix} X \\ Y \\ Z \end{bmatrix} \tag{3}$$
where (u, v) represents the coordinates of a pixel in the depth image, d represents the corresponding depth value of that point, K represents the intrinsic parameter matrix, and (X, Y, Z) is the actual coordinate corresponding to the pixel.
The RGB image and depth image of the straw-covered field are shown in Figure 3. Because the depth value in the depth image refers to the distance from the sensor to each point in the scene, the straw thickness can be obtained by subtracting the depth value from the height of the camera above the ground. It takes 20–30 ms to calculate the height difference between the ground and the straw in the depth map, so the processing speed can meet real-time requirements. From Figure 3, we can see that the thickness of the straw varies within 100 mm. The different colors indicate different thicknesses: yellow indicates the ground, and dark blue indicates the region covered with the thickest straw.
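The depth-to-thickness conversion and the pixel back-projection can be sketched as follows (an illustrative Python/NumPy sketch, not the authors' Matlab/C++ code; the intrinsic matrix in the usage example is made up, and the back-projection uses the standard pinhole inversion, (X, Y, Z) = d · K⁻¹ · (u, v, 1)):

```python
import numpy as np

def depth_to_thickness(depth_mm, camera_height_mm=1400.0):
    """Per-pixel straw thickness (mm): the depth value is the distance
    from the sensor to the surface, so subtracting it from the
    lens-to-ground height (1400 mm in the experiment) gives the
    thickness of the straw layer."""
    t = camera_height_mm - np.asarray(depth_mm, dtype=float)
    return np.clip(t, 0.0, None)  # clamp noise that dips below ground level

def backproject(u, v, d, K):
    """Map a depth pixel (u, v) with depth d to camera-space (X, Y, Z)
    via the standard pinhole inversion of the projection in Equation (3)."""
    return d * np.linalg.inv(K) @ np.array([u, v, 1.0])

# Hypothetical intrinsics and a 2 x 2 depth patch (mm)
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
print(backproject(320, 240, 1400.0, K))        # principal point maps to (0, 0, 1400)
print(depth_to_thickness([[1400.0, 1380.0],
                          [1450.0, 1300.0]]))  # thicknesses 0, 20, 0, 100 mm
```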

3.3. Detection System of Straw Coverage by RGB-D Sensor

The detection system of straw coverage (shown in Figure 4) was developed in Matlab 2017b, with the following functions: RGB color image display, ROI region clipping, thickness visualization, depth image data statistics and analysis, real-time acquisition and analysis of depth data from the RGB-D image, image saving, ROI volume statistics, thickness statistics, straw coverage calculation, etc.
From Figure 4, we can see that after loading a picture captured by the RGB-D imager, the RGB picture and the thickness visualization are shown. In the thickness visualization, yellow represents the thickest region, with a thickness of about 100 mm, and deep blue shows the ground, with a thickness of 0 mm. Any ROI can be selected, such as the square box in the RGB image of Figure 4, and the average thickness and straw volume within it can be calculated. The thickness at any point can also be obtained: in Figure 4, the straw thickness at the cross cursor is 23 mm, and the average thickness is 20.61 mm. In addition, a thickness threshold can be set, the percentage of pixels exceeding the threshold can be calculated, and the straw coverage can thus be obtained. Moreover, this detection system can be used for real-time acquisition of straw coverage depth, as it is able to read the video stream from the sensors and process it in real time.
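The ROI statistics described above (average thickness and threshold-based coverage) could be computed along these lines (an illustrative Python sketch, not the authors' Matlab system; the 5 mm default threshold is an assumed value):

```python
import numpy as np

def roi_stats(thickness_mm, threshold_mm=5.0):
    """Statistics the detection system reports for a selected ROI:
    mean straw thickness (mm) and straw coverage, i.e. the percentage
    of pixels whose thickness exceeds the threshold."""
    t = np.asarray(thickness_mm, dtype=float)
    return t.mean(), (t > threshold_mm).mean() * 100.0

# Hypothetical 2 x 2 thickness patch (mm): three covered pixels, one bare
mean_t, coverage = roi_stats([[0.0, 10.0], [20.0, 30.0]])
print(mean_t, coverage)  # 15.0 75.0
```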

3.4. Results and Discussion

The RealSense D435i was compared to the Kinect v2. The latter requires an additional power supply, so the D435i is better suited for outdoor application. The RealSense D435i sensor is cost-effective and compact, requires no external power supply, and transmits data through its built-in USB interface, which makes it very suitable for mobile devices. Therefore, it is more convenient to install the RealSense sensor on the no-tillage planter or smashed straw machine to detect the unevenness of straw coverage in the field.
The manual measurement adopts the method specified in the national standard GB/T 24675.6-2021, and the unevenness of the straw covering is calculated by Equations (1) and (2) using each set of six straw mass measurements. The corresponding straw volume is then calculated by the detection system. Since the density of the straw is invariant, mass is proportional to volume, so Mi (mass) is substituted by Vi (volume) in Equations (1) and (2). After the dimension calibration, the calibration coefficient between the actual area and the pixel area was obtained; with the straw thickness from the RGB-D data, the straw volume can be computed with Equation (4). Thus, the unevenness of the straw covering detected by the RGB-D sensor is calculated by the detection system. Then, the correlation coefficient was analyzed to test the relationship between the straw covering unevenness measured manually and by the depth sensors.
$$V_i = \sum_{j=1}^{n} C \cdot A_j \cdot D_{ij} \tag{4}$$

$$\bar{V} = \frac{1}{6}\sum_{i=1}^{6} V_i \tag{5}$$

$$F_b = \frac{1}{\bar{V}} \sqrt{\frac{\sum_{i=1}^{6} \left(V_i - \bar{V}\right)^2}{5}} \times 100 \tag{6}$$
where $V_i$ is the straw volume of the $i$th region, mm³; $n$ is the total number of pixel points of the depth image; $C$ is the calibration coefficient between actual area and pixel area; $A_j$ is a unit pixel area; $D_{ij}$ is the thickness at the $j$th pixel point of the $i$th image, mm; $\bar{V}$ is the average volume of straw, mm³; and $F_b$ is the unevenness of straw coverage, %.
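The volume-based computation of Equations (4)–(6) can be sketched as follows (a Python illustration under the paper's constant-density assumption; the function names and example values are ours, not the authors'):

```python
import numpy as np

def region_volume(thickness_mm, calib_coeff, pixel_area=1.0):
    """Straw volume of one region, Equation (4): per-pixel thickness
    times unit pixel area, summed and scaled by the pixel-to-actual
    area calibration coefficient C."""
    return calib_coeff * pixel_area * np.asarray(thickness_mm, dtype=float).sum()

def volume_unevenness(volumes):
    """Unevenness F_b (%) from the six region volumes, Equations (5)
    and (6); volume stands in for mass because the straw density is
    assumed constant."""
    v = np.asarray(volumes, dtype=float)
    v_bar = v.mean()
    return np.sqrt(((v - v_bar) ** 2).sum() / (len(v) - 1)) / v_bar * 100.0

# Hypothetical volumes (mm^3) of six 1 m x 1 m regions
print(round(volume_unevenness([9.5e6, 1.02e7, 9.8e6, 1.005e7, 9.4e6, 1.01e7]), 2))
```

Because $F_b$ is a ratio, the calibration coefficient cancels between numerator and denominator, so the unevenness is insensitive to the absolute volume scale.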
Furthermore, Figure 5 shows the correlation plots of the straw covering unevenness measured manually versus by the RGB-D sensors, with an R of 0.93, an RMSE of 4.59% and a MAPE of 3.86% for the D435i sensor, and an R of 0.915, an RMSE of 6.53% and a MAPE of 13.85% for the Kinect v2. The values correlate well with each other, which shows that both kinds of RGB-D sensors can acquire the unevenness of straw covering efficiently.
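The agreement metrics reported above (R, RMSE, MAPE) can be computed from paired measurements as follows (a generic Python sketch; the example data are invented for illustration, not the paper's measurements):

```python
import numpy as np

def agreement_metrics(manual, sensor):
    """R (Pearson correlation coefficient), RMSE and MAPE between
    manually measured and sensor-derived unevenness values (both in %)."""
    m = np.asarray(manual, dtype=float)
    s = np.asarray(sensor, dtype=float)
    r = np.corrcoef(m, s)[0, 1]
    rmse = np.sqrt(np.mean((s - m) ** 2))
    mape = np.mean(np.abs((s - m) / m)) * 100.0
    return r, rmse, mape

# Invented paired unevenness values (%), for illustration only
r, rmse, mape = agreement_metrics([10.0, 20.0, 30.0], [12.0, 19.0, 33.0])
```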
In addition, this method may provide a reference for depth acquisition and the detection of vegetation coverage. Of the two RGB-D sensors, the D435i, requiring no extra power supply and being lighter, has the advantage for installation on the no-tillage planter or other platforms. Both RGB-D sensors are affected by ambient light. Lidar, a common depth sensor, is less affected by ambient light but costs more than an RGB-D sensor. In the future, it will be necessary to conduct real-time detection of straw coverage unevenness for different crop straws and under different ambient light conditions with different sensors.

4. Conclusions

Both RGB-D sensors can be used to complete the unevenness detection with an R of over 0.9. The D435i is better suited for in-field detection, as it needs no additional power supply. These results indicate that it is feasible to adopt RGB-D sensors to detect the unevenness of straw covering, providing support for the development of real-time detection equipment installed on no-tillage planters.

Author Contributions

Conceptualization, Y.S. and G.X.; methodology, Y.S. and Z.H.; software, X.G. and X.L.; validation, J.M.; formal analysis, F.W. and X.G.; investigation, X.G. and X.L.; resources, F.G.; data curation, X.L.; writing—original draft preparation, Y.S.; writing—review and editing, Y.S., G.X. and X.G.; visualization, X.G.; supervision, Z.H.; project administration, Z.H.; funding acquisition, Y.S. and Z.H. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Key Laboratory of Modern Agricultural Equipment, Ministry of Agriculture and Rural Affairs, P.R. China.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflicts of interest regarding the publication of this study.

References

  1. Qin, T.Y.; Wang, L.; Zhao, J.S.; Zhou, G.F.; Li, C.H.; Guo, L.Y.; Jiang, G.M. Effects of Straw Mulching Thickness on the Soil Health in a Temperate Organic Vineyard. Agriculture 2022, 12, 1751. [Google Scholar] [CrossRef]
  2. Stagnari, F.; Galieni, A.; Speca, S.; Cafiero, G.; Pisante, M. Effects of straw mulch on growth and yield of durum wheat during transition to conservation agriculture in mediterranean environment. Field Crops Res. 2014, 167, 51–63. [Google Scholar] [CrossRef]
  3. Liu, Y.Y.; Wang, Y.Y.; Yu, H.Y.; Qin, M.X.; Sun, J.H. Detection of straw coverage rate based on multi-threshold image segmentation algorithm. Trans. Chin. Soc. Agric. Mach. 2018, 49, 27–35. [Google Scholar]
  4. Yang, G.; Zhang, H.X.; Fang, T.; Zhang, C.L. Straw recognition and coverage rate detection technology based on improved AdaBoost algorithm. Trans. Chin. Soc. Agric. Mach. 2021, 52, 177–183. [Google Scholar]
  5. Li, J.; Lyu, C.X.; Yuan, Y.W.; Li, Y.S.; Wei, L.G.; Qin, Q.S. Automatic recognition of corn straw coverage based on fast Fourier transform and SVM. Trans. CSAE 2019, 35, 194–201. [Google Scholar]
  6. Liu, Y.Y.; Sun, J.H.; Zhang, S.J.; Yu, H.Y.; Wang, Y.Y. Detection of straw coverage based on multi-threshold and multi-target UAV image segmentation optimization algorithm. Trans. CSAE 2020, 36, 134–143. [Google Scholar]
  7. Liu, Y.Y.; Zhang, S.; Yu, H.Y.; Wang, Y.Y.; Wang, J.M. Straw detection algorithm based on semantic segmentation in complex farm scenarios. Opt. Precis. Eng. 2020, 28, 200–211. [Google Scholar]
  8. Zhou, D.Y.; Li, M.; Li, Y.; Qi, J.T.; Liu, K.; Cong, X.; Tian, X.L. Detection of ground straw coverage under conservation tillage based on deep learning. Comput. Electron. Agric. 2020, 172, 105369. [Google Scholar] [CrossRef]
  9. Cai, W.T.; Zhao, S.H.; Wang, Y.M.; Peng, F.C.; Heo, J.; Duan, Z. Estimation of Winter Wheat Residue Coverage Using Optical and SAR Remote Sensing Images. Remote Sens. 2019, 11, 1163. [Google Scholar] [CrossRef] [Green Version]
  10. Memon, M.S.; Zhou, J.; Sun, C.L.; Jiang, C.X.; Xu, W.Y.; Hu, Q.; Yang, H.X.; Ji, C.Y. Assessment of Wheat Straw Cover and Yield Performance in a Rice-Wheat Cropping System by Using Landsat Satellite Data. Sustainability 2019, 11, 5369. [Google Scholar] [CrossRef] [Green Version]
  11. Riegler-Nurscher, P.; Prankl, J.; Bauer, T.; Strauss, P.; Prankl, H. A machine learning approach for pixel wise classification of residue and vegetation cover under field conditions. Biosyst. Eng. 2018, 169, 188–198. [Google Scholar] [CrossRef]
  12. Laamrani, A.; Pardo Lara, R.; Berg, A.A.; Branson, D.; Joosse, P. Using a Mobile Device “App” and Proximal Remote Sensing Technologies to Assess Soil Cover Fractions on Agricultural Fields. Sensors 2018, 18, 708. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  13. Fu, L.; Gao, F.; Wu, J.; Li, R.; Karkee, M.; Zhang, Q. Application of consumer RGB-D cameras for fruit detection and localization in field: A critical review. Comput. Electron. Agric. 2020, 177, 105687. [Google Scholar] [CrossRef]
  14. Barnea, E.; Mairon, R.; Ben-Shahar, O. Colour-agnostic shape-based 3D fruit detection for crop harvesting robots. Biosyst. Eng. 2016, 146, 57–70. [Google Scholar] [CrossRef]
  15. Song, S.S.; Duan, J.L.; Yang, Z.; Zou, X.J.; Fu, L.H.; Ou, Z.W. A three-dimensional reconstruction algorithm for extracting parameters of the banana pseudo-stem. Optik 2019, 185, 486–496. [Google Scholar] [CrossRef]
  16. Andujar, D.; Ribeiro, A.; Fernandez-Quintanilla, C.; Dorado, J. Using depth cameras to extract structural parameters to assess the growth state and yield of cauliflower crops. Comput. Electron. Agric. 2016, 122, 67–73. [Google Scholar] [CrossRef]
  17. Vazquez-Arellano, M.; Paraforos, D.S.; Reiser, D.; Garrido-Izard, M.; Griepentrog, H.W. Determination of stem position and height of reconstructed maize plants using a time-of-flight camera. Comput. Electron. Agric. 2018, 154, 276–288. [Google Scholar] [CrossRef]
  18. Yang, S.; Gao, W.L.; Mi, J.Q.; Wu, M.L.; Wang, M.J.; Zheng, L.H. Method for measurement of vegetable seedings height based on RGB-D camera. Trans. Chin. Soc. Agric. Mach. 2019, 50, 128–135. [Google Scholar]
  19. Wang, D.S.; Li, W.; Liu, X.G.; Li, N.; Zhang, C.L. UAV environmental perception and autonomous obstacle avoidance: A deep learning and depth camera combined solution. Comput. Electron. Agric. 2020, 175, 105523. [Google Scholar] [CrossRef]
  20. Shao, Y.Y.; Xuan, G.T.; Peng, H.X.; Hu, Z.C.; Chen, Y.Q.; Wu, F. Design on peanut no-tillage planter under coverage of the wheat stubble. In ASABE Annual International Meeting; American Society of Agricultural and Biological Engineers: St. Joseph, MI, USA, 2017. [Google Scholar]
  21. GB/T 24675.6-2021; Conservation Tillage Equipment, Part 6: Smashed Straw Machine. China Standard Press: Beijing, China, 2021.
Figure 1. 1—Depth wheel; 2—Main frame; 3—Crushing straw device; 4—Transverse conveyor; 5—Ground Wheel; 6—Field cleaning (shallow spin); 7—Sowing and fertilizing device; 8—Even spreading device; 9—Intermediate coupling; 10—Gearbox; 11—RGB-D sensor.
Figure 3. Straw coverage image and depth image.
Figure 4. Detection system of straw coverage unevenness with RGB-D sensors.
Figure 5. The correlation coefficient plots of unevenness by manual (a) versus RGB-D (b) sensors.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
