Design and Experiment of an Automatic Row-Oriented Spraying System Based on Machine Vision for Early-Stage Maize Crops
Abstract
1. Introduction
2. Materials and Methods
2.1. System Description
2.1.1. System Composition
2.1.2. Principle of the Visual Row-Oriented Spraying System
2.2. Visual Algorithm Design
2.2.1. Development Environment and Technical Process
2.2.2. Image Pretreatment
2.2.3. Crop Row Identification
2.2.4. Guidance Line Extraction and Tracking
Algorithm 1. Slope-based outlier filter.

Input: candidate line set I (I = Il or Ir) with slopes Oi, i = 1, …, n
Output: inliers in I
Method:
  Initialize: Tj = 0.2
  while true do
    Osum = 0, count = 0
    for i = 1 to n do
      Osum = Osum + Oi
      count = count + 1
    end for
    Omean = Osum / count
    Is = ∅
    for i = 1 to n do
      Si = |Oi − Omean|
      push Si into Is
    end for
    find Smax in Is
    if Smax > Tj then
      i = idx(Smax)
      remove Oi from I, n = n − 1
    else
      break
    end if
  end while
  return I
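For readers who prefer executable code, the snippet below is a minimal Python sketch of the slope-based outlier filter in Algorithm 1. The function name, the plain list-of-slopes representation, and the use of the absolute deviation are our own choices; only the iterative remove-the-worst-slope logic and the 0.2 threshold come from the listing.

```python
def filter_slope_outliers(slopes, threshold=0.2):
    """Iteratively drop the line whose slope deviates most from the mean
    until every remaining deviation is within the threshold."""
    slopes = list(slopes)
    while len(slopes) > 1:
        mean = sum(slopes) / len(slopes)
        deviations = [abs(s - mean) for s in slopes]
        worst = max(range(len(slopes)), key=lambda i: deviations[i])
        if deviations[worst] > threshold:
            slopes.pop(worst)          # remove the outlier and re-check
        else:
            break
    return slopes

# Example: one stray detection among near-parallel crop-row lines is removed.
print(filter_slope_outliers([0.52, 0.48, 0.50, 1.10]))  # -> [0.52, 0.48, 0.50]
```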
2.3. Row Alignment Control Method
2.3.1. Visual Localization
2.3.2. Row-Oriented Delay Compensation Model
2.3.3. Row-Orientation Control Algorithm
2.4. Test Method
2.4.1. Robustness Test of Guidance Line Algorithm
2.4.2. Guidance Line Algorithm Accuracy Test
2.4.3. Kalman Filter Optimization Test
2.4.4. Control System Test
2.4.5. Field Test
2.4.6. Droplet Deposition Test
3. Results
3.1. Robustness Test Results and Analysis
3.2. Accumulated Threshold Influence on Guidance Line Accuracy
3.3. Evaluation of the Kalman Filtering Effect
3.4. Results and Analysis of the Row Alignment Test
3.5. Field Test
3.6. Spraying Performance Test
4. Discussion
5. Conclusions
- To address the pesticide waste and environmental pollution caused by continuous spraying operations at the maize seedling stage, a visual row-oriented spraying system based on automatic navigation technology was developed. Automatic navigation was used for preliminary row alignment, and machine vision was used to achieve accurate row alignment, providing a new approach to precision plant protection operations.
- A Hough transform algorithm was used to detect candidate crop rows, outliers were eliminated with a slope-based threshold before fitting the auxiliary lines, and the guidance line was fitted according to the tangent formula. To address the guidance-line jitter produced by the extraction algorithm during video inference, a Kalman filter was used to track the guidance line, yielding a robust row guidance algorithm (a minimal sketch of such a tracker is given after this list). Based on the requirements of visual localization and the identified system parameters, a row-orientation delay compensation model was established through theoretical analysis, and a vision-based row-orientation control algorithm was designed.
- Test results from the visual row-oriented spraying system show that the row guidance algorithm takes 42 ms on average with an average visual deviation of 2.75 cm, which is sufficient to provide row guidance for the row alignment system. With automatic navigation turned on, the average row-orientation deviation between the row alignment mechanism and the centerline between crop rows is 5.08 cm, and the row-orientation deviation stays within ±15 cm. Compared with previous reports, this performance is improved and meets the requirements of automatic row orientation. In addition, compared with traditional spraying, visual row-oriented spraying reduces inter-row deposition by 20.36% and saves 11.4% of pesticide overall, reducing pesticide waste and improving pesticide utilization. As vehicle speed increases, alignment accuracy decreases. To overcome the limitation that speed places on alignment performance, we will improve the efficiency of the visual algorithm, optimize the hardware system structure, and continue to explore more innovative control schemes.
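As referenced above, the following is a minimal sketch of how a linear Kalman filter can smooth the per-frame guidance line (angle and lateral offset) to suppress jitter. The state layout, noise covariances, frame interval, and class name are illustrative assumptions, not the parameters of the filter used in this work.

```python
import numpy as np

class GuidanceLineKF:
    """Constant-velocity Kalman filter over the guidance-line parameters.
    State x = [angle, offset, d_angle, d_offset]^T;
    measurement z = [angle, offset]^T from the per-frame extraction."""

    def __init__(self, dt=0.042):           # ~42 ms per frame (mean algorithm time)
        self.F = np.eye(4)                   # state transition
        self.F[0, 2] = self.F[1, 3] = dt
        self.H = np.zeros((2, 4))            # measurement matrix
        self.H[0, 0] = self.H[1, 1] = 1.0
        self.Q = np.eye(4) * 1e-3            # process noise (assumed)
        self.R = np.eye(2) * 1e-1            # measurement noise (assumed)
        self.x = np.zeros((4, 1))            # initial state
        self.P = np.eye(4)                   # initial covariance

    def update(self, angle, offset):
        # Predict step
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Correct step with the new measurement
        z = np.array([[angle], [offset]])
        y = z - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return float(self.x[0, 0]), float(self.x[1, 0])  # smoothed angle, offset

# Usage: feed each frame's raw (angle, offset) and act on the smoothed output.
kf = GuidanceLineKF()
smoothed_angle, smoothed_offset = kf.update(angle=1.5, offset=-3.2)
```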
Author Contributions
Funding
Institutional Review Board Statement
Data Availability Statement
Conflicts of Interest
References
Image No. | Growth Height/cm | Illumination Intensity/wLux | Weed Density/(Plant·m−2) | Target Pixel Ratio/% | Algorithm Time/ms | Recognition Success Rate/%
---|---|---|---|---|---|---
1–20 | 30 | 4.57–6.09 | 13–25 | 5.20–9.46 | 36–40 | 95
21–40 | 30 | 7.43–8.54 | 14–28 | 5.5–8.18 | 34–39 | 100
41–60 | 30 | 10.27–11.36 | 14–28 | 6.89–9.57 | 33–44 | 90
61–80 | 40 | 4.57–6.09 | 11–26 | 6.7–8.2 | 34.5–41.3 | 100
81–100 | 40 | 7.43–8.54 | 11–29 | 8.5–11.8 | 35.5–38.9 | 100
101–120 | 40 | 10.27–11.36 | 11–30 | 7.0–12.9 | 35.1–40.1 | 95
121–140 | 80 | 4.57–6.09 | 14–44 | 9.1–13.8 | 30.9–36.4 | 85
141–160 | 80 | 7.43–8.54 | 15–39 | 12–15.5 | 31.3–35.7 | 90
161–180 | 80 | 10.27–11.36 | 17–42 | 11.6–16.3 | 32.5–36.1 | 80
Test Group | Hough Transform Accumulated Threshold | Angle Error Standard Deviation/° | Average Angle Error/° | Offset Error Standard Deviation/cm | Mean Offset Error/cm | Identification Accuracy/% |
---|---|---|---|---|---|---|
1 | 25 | 2.45 | 2.7 | 3.8 | 2.6 | 88.3 |
2 | 35 | 0.74 | 0.9 | 1.8 | 1.1 | 93.3 |
3 | 45 | 2.80 | 3.6 | 4.2 | 3.2 | 86.0 |
Navigation State | Speed/(m·s−1) | Visual Mean Deviation/cm | Visual Standard Deviation/cm | Row-Orientation Mean Deviation/cm | Row-Orientation Standard Deviation/cm | Proportion of Row-Orientation Deviation within ±15 cm/% | Proportion of Row-Orientation Deviation within ±30 cm/%
---|---|---|---|---|---|---|---
Off | 0.27 | 0.37 | 2.78 | 5.22 | 5.08 | 90 | 100
Off | 0.51 | 0.14 | 2.01 | 7.92 | 7.98 | 90 | 100
Off | 0.67 | 0.72 | 5.84 | 8.43 | 8.75 | 86.66 | 100
On | 0.32 | 0.31 | 1.06 | 3.7 | 3.08 | 100 | 100
On | 0.56 | 0.46 | 1.23 | 4.36 | 4.54 | 100 | 100
On | 0.71 | 0.45 | 3.60 | 7.24 | 7.64 | 100 | 100
Deposition/(mL·m−2) | A | B | C | D | E
---|---|---|---|---|---
Full-coverage spraying | 11.49 | 11.43 | 12.31 | 11.46 | 12.15
Row-oriented spraying | 9.67 | 12.11 | 9.60 | 11.54 | 9.34