Advances in Crop Row Detection for Agricultural Robots: Methods, Performance Indicators, and Scene Adaptability
Abstract
1. Introduction
1.1. Research Background
1.2. Research Content
1.3. Main Contributions
2. Conventional Technical Methods and Principles for Crop Row Detection
2.1. Detection Methods Based on Visual Sensors
2.1.1. Traditional Visual Methods
2.1.2. Deep Learning Methods
2.1.3. Summary of Detection Methods Based on Visual Sensors
2.2. LiDAR-Based Detection Methods
2.2.1. Point Cloud Preprocessing Techniques
2.2.2. Row Structure Extraction Methods
2.2.3. Three-Dimensional Feature Calculation
2.2.4. Summary of LiDAR-Based Detection Methods
2.3. Multi-Sensor Fusion Methods
2.3.1. Visual–LiDAR Fusion
2.3.2. Visual–GNSS/IMU Fusion
2.3.3. Other Multi-Sensor Fusion Methods
2.3.4. Summary of Multi-Sensor Fusion Methods
2.4. Summary of Conventional Crop Row Detection Methods and Principles
3. Performance Evaluation Indicators for Crop Row Detection Methods
3.1. Accuracy Indicators
3.1.1. Detection Accuracy
3.1.2. Positioning Error
3.1.3. Summary of Accuracy Indicators
3.2. Efficiency Indicators
3.2.1. Real-Time Performance
3.2.2. Computational Cost
3.2.3. Summary of Efficiency Indicators
3.3. Robustness Indicators
3.3.1. Environmental Adaptability
3.3.2. Scene Fault Tolerance
3.3.3. Summary of Robustness Indicators
3.4. Practical Indicators
3.4.1. Hardware Cost
3.4.2. Deployment Difficulty
3.4.3. Summary of Practical Indicators
3.5. Summary of Performance Evaluation Indicators
4. Comparison of Adaptability in Farmland Scenarios
4.1. Comparison of Methods for Open-Air Scenarios
4.1.1. Simple Scenarios in Open-Air Fields
4.1.2. Complex Scenarios in Open-Air Fields
4.1.3. Challenges and Responses in Open-Air Scenarios
4.2. Comparison of Methods for Facility Agriculture Scenarios
4.2.1. Simple Scenarios in Facility Agriculture
4.2.2. Complex Scenarios in Facility Agriculture
4.2.3. Challenges and Responses in Facility Agriculture Scenarios
4.3. Comparison of Methods for Orchard Scenarios
4.3.1. Simple Scenarios in Orchard Settings
4.3.2. Complex Scenarios in Orchard Settings
4.3.3. Challenges and Responses in Orchard Scenarios
4.4. Comparison of Methods for Special Terrain Scenarios
4.4.1. Slope Farmland Scenarios
4.4.2. Wetland Farmland Scenarios
4.4.3. Other Scenarios
4.5. Summary of Adaptability Comparison in Farmland Scenarios
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Data Availability Statement
Conflicts of Interest
References
- Shi, J.; Bai, Y.; Diao, Z.; Zhou, J.; Yao, X.; Zhang, B. Row Detection Based Navigation and Guidance for Agricultural Robots and Autonomous Vehicles in Row-Crop Fields: Methods and Applications. Agronomy 2023, 13, 1780.
- Zhang, S.; Liu, Y.; Gong, K.; Tian, Y.; Du, Y.; Zhu, Z.; Zhai, Z. A Review of Vision-Based Crop Row Detection Method: Focusing on Field Ground Autonomous Navigation Operations. Comput. Electron. Agric. 2024, 222, 109086.
- Yao, Z.; Zhao, C.; Zhang, T. Agricultural Machinery Automatic Navigation Technology. iScience 2024, 27, 108714.
- Bai, Y.; Zhang, B.; Xu, N.; Zhou, J.; Shi, J.; Diao, Z. Vision-Based Navigation and Guidance for Agricultural Autonomous Vehicles and Robots: A Review. Comput. Electron. Agric. 2023, 205, 107584.
- Wang, T.; Chen, B.; Zhang, Z.; Li, H.; Zhang, M. Applications of Machine Vision in Agricultural Robot Navigation: A Review. Comput. Electron. Agric. 2022, 198, 107085.
- Bonacini, L.; Tronco, M.L.; Higuti, V.A.H.; Velasquez, A.E.B.; Gasparino, M.V.; Peres, H.E.N.; Becker, M. Selection of a Navigation Strategy According to Agricultural Scenarios and Sensor Data Integrity. Agronomy 2023, 13, 925.
- Xie, K.; Zhang, Z.; Zhu, S. Enhanced Agricultural Vehicle Positioning through Ultra-Wideband-Assisted Global Navigation Satellite Systems and Bayesian Integration Techniques. Agriculture 2024, 14, 1396.
- Wang, W.; Qin, J.; Huang, D.; Zhang, F.; Liu, Z.; Wang, Z.; Yang, F. Integrated Navigation Method for Orchard-Dosing Robot Based on LiDAR/IMU/GNSS. Agronomy 2024, 14, 2541.
- Qu, J.; Qiu, Z.; Li, L.; Guo, K.; Li, D. Map Construction and Positioning Method for LiDAR SLAM-Based Navigation of an Agricultural Field Inspection Robot. Agronomy 2024, 14, 2365.
- Wen, J.; Yao, L.; Zhou, J.; Yang, Z.; Xu, L.; Yao, L. Path Tracking Control of Agricultural Automatic Navigation Vehicles Based on an Improved Sparrow Search-Pure Pursuit Algorithm. Agriculture 2025, 15, 1215.
- Su, Z.; Zou, W.; Zhai, C.; Tan, H.; Yang, S.; Qin, X. Design of an Autonomous Orchard Navigation System Based on Multi-Sensor Fusion. Agronomy 2024, 14, 2825.
- Kaewkorn, S.; Ekpanyapong, M.; Thamma, U. High-accuracy position-aware robot for agricultural automation using low-cost IMU-coupled triple-laser-guided (TLG) system. IEEE Access 2021, 9, 54325–54337.
- Cui, X.; Zhu, L.; Zhao, B.; Wang, R.; Han, Z.; Lu, K.; Cui, X. DoubleNet: A Method for Generating Navigation Lines of Unstructured Soil Roads in a Vineyard Based on CNN and Transformer. Agronomy 2025, 15, 544.
- Chen, H.; Xie, H.; Sun, L.; Shang, T. Research on Tractor Optimal Obstacle Avoidance Path Planning for Improving Navigation Accuracy and Avoiding Land Waste. Agriculture 2023, 13, 934.
- Jin, X.; Lin, C.; Ji, J.; Li, W.; Zhang, B.; Suo, H. An Inter-Ridge Navigation Path Extraction Method Based on Res2net50 Segmentation Model. Agriculture 2023, 13, 881.
- Gao, P.; Fang, J.; He, J.; Ma, S.; Wen, G.; Li, Z. GRU–Transformer Hybrid Model for GNSS/INS Integration in Orchard Environments. Agriculture 2025, 15, 1135.
- Yang, T.; Jin, C.; Ni, Y.; Liu, Z.; Chen, M. Path Planning and Control System Design of an Unmanned Weeding Robot. Agriculture 2023, 13, 2001.
- Gai, J.; Guo, Z.; Raj, A.; Tang, L. Robust Crop Row Detection Using Discrete Fourier Transform (DFT) for Vision-Based In-Field Navigation. Comput. Electron. Agric. 2025, 229, 109666.
- Zhang, B.; Zhao, D.; Chen, C.; Li, J.; Zhang, W.; Qi, L.; Wang, S. Extraction of Crop Row Navigation Lines for Soybean Seedlings Based on Calculation of Average Pixel Point Coordinates. Agronomy 2024, 14, 1749.
- Ruangurai, P.; Dailey, M.N.; Ekpanyapong, M.; Soni, P. Optimal Vision-Based Guidance Row Locating for Autonomous Agricultural Machines. Precis. Agric. 2022, 23, 1205–1225.
- Zhou, X.; Zhang, X.; Zhao, R.; Chen, Y.; Liu, X. Navigation Line Extraction Method for Broad-Leaved Plants in the Multi-Period Environments of the High-Ridge Cultivation Mode. Agriculture 2023, 13, 1496.
- Gai, J.; Xiang, L.; Tang, L. Using a Depth Camera for Crop Row Detection and Mapping for Under-Canopy Navigation of Agricultural Robotic Vehicle. Comput. Electron. Agric. 2021, 188, 106301.
- Yun, C.; Kim, H.J.; Jeon, C.W.; Gang, M.; Lee, W.S.; Han, J.G. Stereovision-Based Ridge-Furrow Detection and Tracking for Auto-Guided Cultivator. Comput. Electron. Agric. 2021, 191, 106490.
- Zhang, X.; Chen, B.; Li, J.; Fang, X.; Zhang, C.; Peng, S.; Li, Y. Novel Method for the Visual Navigation Path Detection of Jujube Harvester Autopilot Based on Image Processing. Int. J. Agric. Biol. Eng. 2023, 16, 189–197.
- Li, A.; Wang, C.; Ji, T.; Wang, Q.; Zhang, T. D3-YOLOv10: Improved YOLOv10-Based Lightweight Tomato Detection Algorithm under Facility Scenario. Agriculture 2024, 14, 2268.
- Zhang, Z.; Lu, Y.; Peng, Y.; Yang, M.; Hu, Y. A Lightweight and High-Performance YOLOv5-Based Model for Tea Shoot Detection in Field Conditions. Agronomy 2025, 15, 1122.
- Duan, Y.; Han, W.; Guo, P.; Wei, X. YOLOv8-GDCI: Research on the Phytophthora Blight Detection Method of Different Parts of Chili Based on Improved YOLOv8 Model. Agronomy 2024, 14, 2734.
- Yang, Y.; Zhou, Y.; Yue, X.; Zhang, G.; Wen, X.; Ma, B.; Chen, L. Real-Time Detection of Crop Rows in Maize Fields Based on Autonomous Extraction of ROI. Expert Syst. Appl. 2023, 213, 118826.
- Quan, L.; Guo, Z.; Huang, L.; Xue, Y.; Sun, D.; Chen, T.; Lou, Z. Efficient Extraction of Corn Rows in Diverse Scenarios: A Grid-Based Selection Method for Intelligent Classification. Comput. Electron. Agric. 2024, 218, 108759.
- Li, D.; Li, B.; Kang, S.; Feng, H.; Long, S.; Wang, J. E2CropDet: An Efficient End-to-End Solution to Crop Row Detection. Expert Syst. Appl. 2023, 227, 120345.
- Luo, Y.; Dai, J.; Shi, S.; Xu, Y.; Zou, W.; Zhang, H.; Li, Y. Deep Learning-Based Seedling Row Detection and Localization Using High-Resolution UAV Imagery for Rice Transplanter Operation Quality Evaluation. Remote Sens. 2025, 17, 607.
- Gomez, D.; Selvaraj, M.G.; Casas, J.; Mathiyazhagan, K.; Rodriguez, M.; Assefa, T.; Mlaki, A.; Nyakunga, G.; Kato, F.; Mukankusi, C.; et al. Advancing common bean (Phaseolus vulgaris L.) disease detection with YOLO driven deep learning to enhance agricultural AI. Sci. Rep. 2024, 14, 15596.
- Diao, Z.; Ma, S.; Zhang, D.; Zhang, J.; Guo, P.; He, Z.; Zhang, B. Algorithm for Corn Crop Row Recognition during Different Growth Stages Based on ST-YOLOv8s Network. Agronomy 2024, 14, 1466.
- Liu, T.H.; Zheng, Y.; Lai, J.S.; Cheng, Y.F.; Chen, S.Y.; Mai, B.F.; Xue, Z. Extracting Visual Navigation Line between Pineapple Field Rows Based on an Enhanced YOLOv5. Comput. Electron. Agric. 2024, 217, 108574.
- Li, G.; Le, F.; Si, S.; Cui, L.; Xue, X. Image Segmentation-Based Oilseed Rape Row Detection for Infield Navigation of Agri-Robot. Agronomy 2024, 14, 1886.
- Zhou, X.; Chen, W.; Wei, X. Improved Field Obstacle Detection Algorithm Based on YOLOv8. Agriculture 2024, 14, 2263.
- Shi, J.; Bai, Y.; Zhou, J.; Zhang, B. Multi-Crop Navigation Line Extraction Based on Improved YOLO-V8 and Threshold-DBSCAN under Complex Agricultural Environments. Agriculture 2024, 14, 45.
- Liu, Y.; Guo, Y.; Wang, X.; Yang, Y.; Zhang, J.; An, D.; Bai, T. Crop Root Rows Detection Based on Crop Canopy Image. Agriculture 2024, 14, 969.
- Wang, Q.; Qin, W.; Liu, M.; Zhao, J.; Zhu, Q.; Yin, Y. Semantic Segmentation Model-Based Boundary Line Recognition Method for Wheat Harvesting. Agriculture 2024, 14, 1846.
- Osco, L.P.; de Arruda, M.D.S.; Gonçalves, D.N.; Dias, A.; Batistoti, J.; de Souza, M.; Gonçalves, W.N. A CNN Approach to Simultaneously Count Plants and Detect Plantation-Rows from UAV Imagery. ISPRS J. Photogramm. Remote Sens. 2021, 174, 1–17.
- Lv, R.; Hu, J.; Zhang, T.; Chen, X.; Liu, W. Crop-Free-Ridge Navigation Line Recognition Based on the Lightweight Structure Improvement of YOLOv8. Agriculture 2025, 15, 942.
- Zhang, T.; Zhou, J.; Liu, W.; Yue, R.; Yao, M.; Shi, J.; Hu, J. Seedling-YOLO: High-Efficiency Target Detection Algorithm for Field Broccoli Seedling Transplanting Quality Based on YOLOv7-Tiny. Agronomy 2024, 14, 931.
- Wang, W.; Gong, Y.; Gu, J.; Yang, Q.; Pan, Z.; Zhang, X.; Zhou, M. YOLOv8-TEA: Recognition Method of Tender Shoots of Tea Based on Instance Segmentation Algorithm. Agronomy 2025, 15, 1318.
- Ma, J.; Zhao, Y.; Fan, W.; Liu, J. An Improved YOLOv8 Model for Lotus Seedpod Instance Segmentation in the Lotus Pond Environment. Agronomy 2024, 14, 1325.
- Wang, C.; Chen, X.; Jiao, Z.; Song, S.; Ma, Z. An Improved YOLOP Lane-Line Detection Utilizing Feature Shift Aggregation for Intelligent Agricultural Machinery. Agriculture 2025, 15, 1361.
- Karim, M.R.; Reza, M.N.; Gong, H.; Haque, M.A.; Lee, K.H.; Sung, J.; Chung, S.O. Application of LiDAR Sensors for Crop and Working Environment Recognition in Agriculture: A Review. Remote Sens. 2024, 16, 4623.
- Baltazar, A.R.; Dos Santos, F.N.; De Sousa, M.L.; Moreira, A.P.; Cunha, J.B. 2D LiDAR-Based System for Canopy Sensing in Smart Spraying Applications. IEEE Access 2023, 11, 43583–43591.
- Li, Z.; Xie, D.; Liu, L.; Wang, H.; Chen, L. Inter-Row Information Recognition of Maize in the Middle and Late Stages via LiDAR Supplementary Vision. Front. Plant Sci. 2022, 13, 1024360.
- Yan, P.; Feng, Y.; Han, Q.; Wu, H.; Hu, Z.; Kang, S. Revolutionizing Crop Phenotyping: Enhanced UAV LiDAR Flight Parameter Optimization for Wide-Narrow Row Cultivation. Remote Sens. Environ. 2025, 320, 114638.
- Bhattarai, A.; Scarpin, G.J.; Jakhar, A.; Porter, W.; Hand, L.C.; Snider, J.L.; Bastos, L.M. Optimizing Unmanned Aerial Vehicle LiDAR Data Collection in Cotton Through Flight Settings and Data Processing. Remote Sens. 2025, 17, 1504.
- Zou, R.; Zhang, Y.; Chen, J.; Li, J.; Dai, W.; Mu, S. Density Estimation Method of Mature Wheat Based on Point Cloud Segmentation and Clustering. Comput. Electron. Agric. 2023, 205, 107626.
- Liu, L.; Ji, D.; Zeng, F.; Zhao, Z.; Wang, S. Precision Inter-Row Relative Positioning Method by Using 3D LiDAR in Planted Forests and Orchards. Agronomy 2024, 14, 1279.
- Nehme, H.; Aubry, C.; Solatges, T.; Savatier, X.; Rossi, R.; Boutteau, R. Lidar-Based Structure Tracking for Agricultural Robots: Application to Autonomous Navigation in Vineyards. J. Intell. Robot. Syst. 2021, 103, 61.
- Ban, C.; Wang, L.; Su, T.; Chi, R.; Fu, G. Fusion of Monocular Camera and 3D LiDAR Data for Navigation Line Extraction under Corn Canopy. Comput. Electron. Agric. 2025, 232, 110124.
- Luo, S.; Wen, S.; Zhang, L.; Lan, Y.; Chen, X. Extraction of Crop Canopy Features and Decision-Making for Variable Spraying Based on Unmanned Aerial Vehicle LiDAR Data. Comput. Electron. Agric. 2024, 224, 109197.
- Nazeri, B.; Crawford, M. Detection of Outliers in Lidar Data Acquired by Multiple Platforms over Sorghum and Maize. Remote Sens. 2021, 13, 4445.
- Cruz Ulloa, C.; Krus, A.; Barrientos, A.; Del Cerro, J.; Valero, C. Robotic Fertilisation Using Localisation Systems Based on Point Clouds in Strip-Crop Fields. Agronomy 2021, 11, 11.
- Nazeri, B.; Crawford, M.M.; Tuinstra, M.R. Estimating Leaf Area Index in Row Crops Using Wheel-Based and Airborne Discrete Return Light Detection and Ranging Data. Front. Plant Sci. 2021, 12, 740322.
- Lin, Y.C.; Habib, A. Quality Control and Crop Characterization Framework for Multi-Temporal UAV LiDAR Data over Mechanized Agricultural Fields. Remote Sens. Environ. 2021, 256, 112299.
- Karim, M.R.; Ahmed, S.; Reza, M.N.; Lee, K.H.; Sung, J.; Chung, S.O. Geometric Feature Characterization of Apple Trees from 3D LiDAR Point Cloud Data. J. Imaging 2024, 11, 5.
- Escolà, A.; Peña, J.M.; López-Granados, F.; Rosell-Polo, J.R.; de Castro, A.I.; Gregorio, E.; Torres-Sánchez, J. Mobile Terrestrial Laser Scanner vs. UAV Photogrammetry to Estimate Woody Crop Canopy Parameters–Part 1: Methodology and Comparison in Vineyards. Comput. Electron. Agric. 2023, 212, 108109.
- Xie, B.; Jin, Y.; Faheem, M. Research Progress of Autonomous Navigation Technology for Multi-Agricultural Scenes. Comput. Electron. Agric. 2023, 211, 107963.
- Shi, M.; Feng, X.; Pan, S.; Song, X.; Jiang, L. A Collaborative Path Planning Method for Intelligent Agricultural Machinery Based on Unmanned Aerial Vehicles. Electronics 2023, 12, 3232.
- He, J.; Dong, W.; Tan, Q.; Li, J.; Song, X.; Zhao, R. A Variable-Threshold Segmentation Method for Rice Row Detection Considering Robot Travelling Prior Information. Agriculture 2025, 15, 413.
- Shi, Z.; Bai, Z.; Yi, K.; Qiu, B.; Dong, X.; Wang, Q.; Jiang, C.; Zhang, X.; Huang, X. Vision and 2D Lidar Fusion-Based Navigation Line Extraction for Autonomous Agricultural Robots in Dense Pomegranate Orchards. Sensors 2025, 25, 5432.
- Song, P.; Li, Z.; Yang, M.; Shao, Y.; Pu, Z.; Yang, W.; Zhai, R. Dynamic Detection of Three-Dimensional Crop Phenotypes Based on a Consumer-Grade RGB-D Camera. Front. Plant Sci. 2023, 14, 1097725.
- Li, Y.; Qi, B.; Bao, E.; Tang, Z.; Lian, Y.; Sun, M. Design and Analysis of a Sowing Depth Detection and Control Device for a Wheat Row Planter Based on Fuzzy PID and Multi-Sensor Fusion. Agronomy 2025, 15, 1490.
- Guan, X.; Shi, L.; Yang, W.; Ge, H.; Wei, X.; Ding, Y. Multi-Feature Fusion Recognition and Localization Method for Unmanned Harvesting of Aquatic Vegetables. Agriculture 2024, 14, 971.
- Hu, T.; Wang, W.; Gu, J.; Xia, Z.; Zhang, J.; Wang, B. Research on Apple Object Detection and Localization Method Based on Improved YOLOX and RGB-D Images. Agronomy 2023, 13, 1816.
- Wu, S.; Chen, Z.; Bangura, K.; Jiang, J.; Ma, X.; Li, J.; Qi, L. A Navigation Method for Paddy Field Management Based on Seedlings Coordinate Information. Comput. Electron. Agric. 2023, 215, 108436.
- Mwitta, C.; Rains, G.C.; Burlacu, A.; Mandal, S. The Integration of GPS and Visual Navigation for Autonomous Navigation of an Ackerman Steering Mobile Robot in Cotton Fields. Front. Robot. AI 2024, 11, 1359887.
- Li, Z.; Xu, R.; Li, C.; Fu, L. Visual Navigation and Crop Mapping of a Phenotyping Robot MARS-PhenoBot in Simulation. Smart Agric. Technol. 2025, 11, 100910.
- Li, C.; Wu, J.; Pan, X.; Dou, H.; Zhao, X.; Gao, Y.; Zhai, C. Design and Experiment of a Breakpoint Continuous Spraying System for Automatic-Guidance Boom Sprayers. Agriculture 2023, 13, 2203.
- Chen, X.; Dang, P.; Chen, Y.; Qi, L. A Tactile Recognition Method for Rice Plant Lodging Based on Adaptive Decision Boundary. Comput. Electron. Agric. 2025, 230, 109890.
- Chen, X.; Mao, Y.; Gong, Y.; Qi, L.; Jiang, Y.; Ma, X. Intra-Row Weed Density Evaluation in Rice Field Using Tactile Method. Comput. Electron. Agric. 2022, 193, 106699.
- Chen, X.; Dang, P.; Zhang, E.; Chen, Y.; Tang, C.; Qi, L. Accurate Recognition of Rice Plants Based on Visual and Tactile Sensing. J. Sci. Food Agric. 2024, 104, 4268–4277.
- Gronewold, A.M.; Mulford, P.; Ray, E.; Ray, L.E. Tactile Sensing & Visually-Impaired Navigation in Densely Planted Row Crops, for Precision Fertilization by Small UGVs. Comput. Electron. Agric. 2025, 231, 110003.
- Li, J.; Zhang, M.; Zhang, G.; Ge, D.; Li, M. Real-Time Monitoring System of Seedling Amount in Seedling Box Based on Machine Vision. Agriculture 2023, 13, 371.
- Khan, M.N.; Rahi, A.; Rajendran, V.P.; Al Hasan, M.; Anwar, S. Real-Time Crop Row Detection Using Computer Vision-Application in Agricultural Robots. Front. Artif. Intell. 2024, 7, 1435686.
- Rocha, B.M.; da Fonseca, A.U.; Pedrini, H.; Soares, F. Automatic Detection and Evaluation of Sugarcane Planting Rows in Aerial Images. Inf. Process. Agric. 2023, 10, 400–415.
- De Bortoli, L.; Marsi, S.; Marinello, F.; Gallina, P. Cost-Efficient Algorithm for Autonomous Cultivators: Implementing Template Matching with Field Digital Twins for Precision Agriculture. Comput. Electron. Agric. 2024, 227, 109509.
- He, L.; Liao, K.; Li, Y.; Li, B.; Zhang, J.; Wang, Y.; Fu, X. Extraction of Tobacco Planting Information Based on UAV High-Resolution Remote Sensing Images. Remote Sens. 2024, 16, 359.
- Navone, A.; Martini, M.; Ambrosio, M.; Ostuni, A.; Angarano, S.; Chiaberge, M. GPS-Free Autonomous Navigation in Cluttered Tree Rows with Deep Semantic Segmentation. Robot. Auton. Syst. 2025, 183, 104854.
- Katari, S.; Venkatesh, S.; Stewart, C.; Khanal, S. Integrating Automated Labeling Framework for Enhancing Deep Learning Models to Count Corn Plants Using UAS Imagery. Sensors 2024, 24, 6467.
- De Silva, R.; Cielniak, G.; Gao, J. Vision Based Crop Row Navigation under Varying Field Conditions in Arable Fields. Comput. Electron. Agric. 2024, 217, 108581.
- Kostić, M.M.; Grbović, Ž.; Waqar, R.; Ivošević, B.; Panić, M.; Scarfone, A.; Tagarakis, A.C. Corn Plant In-Row Distance Analysis Based on Unmanned Aerial Vehicle Imagery and Row-Unit Dynamics. Appl. Sci. 2024, 14, 10693.
- Ullah, M.; Islam, F.; Bais, A. Quantifying Consistency of Crop Establishment Using a Lightweight U-Net Deep Learning Architecture and Image Processing Techniques. Comput. Electron. Agric. 2024, 217, 108617.
- Affonso, F.; Tommaselli, F.A.G.; Capezzuto, G.; Gasparino, M.V.; Chowdhary, G.; Becker, M. CROW: A Self-Supervised Crop Row Navigation Algorithm for Agricultural Fields. J. Intell. Robot. Syst. 2025, 111, 28.
- Li, Q.; Zhu, H. Performance Evaluation of 2D LiDAR SLAM Algorithms in Simulated Orchard Environments. Comput. Electron. Agric. 2024, 221, 108994.
- Fujinaga, T. Autonomous Navigation Method for Agricultural Robots in High-Bed Cultivation Environments. Comput. Electron. Agric. 2025, 231, 110001.
- Pan, Y.; Hu, K.; Cao, H.; Kang, H.; Wang, X. A Novel Perception and Semantic Mapping Method for Robot Autonomy in Orchards. Comput. Electron. Agric. 2024, 219, 108769.
- Zhou, Y.; Wang, X.; Wang, Z.; Ye, Y.; Zhu, F.; Yu, K.; Zhao, Y. Rolling 2D Lidar-Based Navigation Line Extraction Method for Modern Orchard Automation. Agronomy 2025, 15, 816.
- Hong, Y.; Ma, R.; Li, C.; Shao, C.; Huang, J.; Zeng, Y.; Chen, Y. Three-Dimensional Localization and Mapping of Multiagricultural Scenes via Hierarchically-Coupled LiDAR-Inertial Odometry. Comput. Electron. Agric. 2024, 227, 109487.
- Li, S.; Miao, Y.; Li, H.; Qiu, R.; Zhang, M. RTMR-LOAM: Real-Time Maize 3D Reconstruction Based on Lidar Odometry and Mapping. Comput. Electron. Agric. 2025, 230, 109820.
- Teng, H.; Wang, Y.; Chatziparaschis, D.; Karydis, K. Adaptive LiDAR Odometry and Mapping for Autonomous Agricultural Mobile Robots in Unmanned Farms. Comput. Electron. Agric. 2025, 232, 110023.
- Ban, C.; Wang, L.; Chi, R.; Su, T.; Ma, Y. A Camera-LiDAR-IMU Fusion Method for Real-Time Extraction of Navigation Line between Maize Field Rows. Comput. Electron. Agric. 2024, 223, 109114.
- Zhu, X.; Zhao, X.; Liu, J.; Feng, W.; Fan, X. Autonomous Navigation and Obstacle Avoidance for Orchard Spraying Robots: A Sensor-Fusion Approach with ArduPilot, ROS, and EKF. Agronomy 2025, 15, 1373.
- Li, Y.; Xiao, L.; Liu, Z.; Liu, M.; Fang, P.; Chen, X.; Yu, J.; Lin, J.; Cai, J. Recognition and Localization of Ratoon Rice Rolled Stubble Rows Based on Monocular Vision and Model Fusion. Front. Plant Sci. 2025, 16, 1533206.
- Jiang, B.; Zhang, J.L.; Su, W.H.; Hu, R. A SPH-YOLOv5x-Based Automatic System for Intra-Row Weed Control in Lettuce. Agronomy 2023, 13, 2915.
- Bah, M.D.; Hafiane, A.; Canals, R. Hierarchical Graph Representation for Unsupervised Crop Row Detection in Images. Expert Syst. Appl. 2023, 216, 119478.
- Sun, J.; Wang, Z.; Ding, S.; Xia, J.; Xing, G. Adaptive Disturbance Observer-Based Fixed Time Nonsingular Terminal Sliding Mode Control for Path-Tracking of Unmanned Agricultural Tractors. Biosyst. Eng. 2024, 246, 96–109.
- Cui, B.; Cui, X.; Wei, X.; Zhu, Y.; Ma, Z.; Zhao, Y.; Liu, Y. Design and Testing of a Tractor Automatic Navigation System Based on Dynamic Path Search and a Fuzzy Stanley Model. Agriculture 2024, 14, 2136.
- Afzaal, H.; Rude, D.; Farooque, A.A.; Randhawa, G.S.; Schumann, A.W.; Krouglicof, N. Improved Crop Row Detection by Employing Attention-Based Vision Transformers and Convolutional Neural Networks with Integrated Depth Modeling for Precise Spatial Accuracy. Smart Agric. Technol. 2025, 11, 100934.
- Gong, H.; Zhuang, W.; Wang, X. Improving the Maize Crop Row Navigation Line Recognition Method of YOLOX. Front. Plant Sci. 2024, 15, 1338228.
- Li, B.; Li, D.; Wei, Z.; Wang, J. Rethinking the Crop Row Detection Pipeline: An End-to-End Method for Crop Row Detection Based on Row-Column Attention. Comput. Electron. Agric. 2024, 225, 109264.
- Wei, J.; Zhang, M.; Wu, C.; Ma, Q.; Wang, W.; Wan, C. Accurate Crop Row Recognition of Maize at the Seedling Stage Using Lightweight Network. Int. J. Agric. Biol. Eng. 2024, 17, 189–198.
- Gómez, A.; Moreno, H.; Andújar, D. Intelligent Inter-and Intra-Row Early Weed Detection in Commercial Maize Crops. Plants 2025, 14, 881.
- Diao, Z.; Guo, P.; Zhang, B.; Zhang, D.; Yan, J.; He, Z.; Zhang, J. Navigation Line Extraction Algorithm for Corn Spraying Robot Based on Improved YOLOv8s Network. Comput. Electron. Agric. 2023, 212, 108049.
- Zhu, C.; Hao, S.; Liu, C.; Wang, Y.; Jia, X.; Xu, J.; Wang, W. An Efficient Computer Vision-Based Dual-Face Target Precision Variable Spraying Robotic System for Foliar Fertilisers. Agronomy 2024, 14, 2770.
- Liang, Z.; Xu, X.; Yang, D.; Liu, Y. The Development of a Lightweight DE-YOLO Model for Detecting Impurities and Broken Rice Grains. Agriculture 2025, 15, 848.
- Jiang, L.; Wang, Y.; Wu, C.; Wu, H. Fruit Distribution Density Estimation in YOLO-Detected Strawberry Images: A Kernel Density and Nearest Neighbor Analysis Approach. Agriculture 2024, 14, 1848.
- Memon, M.S.; Chen, S.; Shen, B.; Liang, R.; Tang, Z.; Wang, S.; Memon, N. Automatic Visual Recognition, Detection and Classification of Weeds in Cotton Fields Based on Machine Vision. Crop Prot. 2025, 187, 106966.
- Zhang, S.; Wei, X.; Liu, C.; Ge, J.; Cui, X.; Wang, F.; Wang, A.; Chen, W. Adaptive Path Tracking and Control System for Unmanned Crawler Harvesters in Paddy Fields. Comput. Electron. Agric. 2025, 230, 109878.
- Lu, E.; Xue, J.; Chen, T.; Jiang, S. Robust Trajectory Tracking Control of an Autonomous Tractor-Trailer Considering Model Parameter Uncertainties and Disturbances. Agriculture 2023, 13, 869.
- Yang, Y.; Shen, X.; An, D.; Han, H.; Tang, W.; Wang, Y.; Chen, L. Crop Row Detection Algorithm Based on 3-D LiDAR: Suitable for Crop Row Detection in Different Periods. IEEE Trans. Instrum. Meas. 2024, 73, 1–13.
- Kong, X.; Guo, Y.; Liang, Z.; Zhang, R.; Hong, Z.; Xue, W. A Method for Recognizing Inter-Row Navigation Lines of Rice Heading Stage Based on Improved ENet Network. Measurement 2025, 241, 115677.
- Zhang, Z.; Lu, Y.; Zhao, Y.; Pan, Q.; Jin, K.; Xu, G.; Hu, Y. Ts-YOLO: An All-Day and Lightweight Tea Canopy Shoots Detection Model. Agronomy 2023, 13, 1411.
- Luo, Y.; Wei, L.; Xu, L.; Zhang, Q.; Liu, J.; Cai, Q.; Zhang, W. Stereo-Vision-Based Multi-Crop Harvesting Edge Detection for Precise Automatic Steering of Combine Harvester. Biosyst. Eng. 2022, 215, 115–128.
- He, Z.; Yuan, F.; Zhou, Y.; Cui, B.; He, Y.; Liu, Y. Stereo Vision Based Broccoli Recognition and Attitude Estimation Method for Field Harvesting. Artif. Intell. Agric. 2025, 15, 526–536.
- Lac, L.; Da Costa, J.P.; Donias, M.; Keresztes, B.; Bardet, A. Crop Stem Detection and Tracking for Precision Hoeing Using Deep Learning. Comput. Electron. Agric. 2022, 192, 106606.
- Guo, P.; Diao, Z.; Zhao, C.; Li, J.; Zhang, R.; Yang, R.; Zhang, B. Navigation Line Extraction Algorithm for Corn Spraying Robot Based on YOLOv8s-CornNet. J. Field Robot. 2024, 41, 1887–1899.
- Yang, K.; Sun, X.; Li, R.; He, Z.; Wang, X.; Wang, C.; Liu, H. A Method for Quantifying Mung Bean Field Planting Layouts Using UAV Images and an Improved YOLOv8-obb Model. Agronomy 2025, 15, 151.
- Lin, Y.; Xia, S.; Wang, L.; Qiao, B.; Han, H.; Wang, L.; He, X.; Liu, Y. Multi-Task Deep Convolutional Neural Network for Weed Detection and Navigation Path Extraction. Comput. Electron. Agric. 2025, 229, 109776.
- Patidar, P.; Soni, P. A Rapid Estimation of Intra-Row Weed Density Using an Integrated CRM, BTSORT and HSV Model across Entire Video Stream of Chilli Crop Canopies. Crop Prot. 2025, 189, 107039.
- Song, Y.; Xu, F.; Yao, Q.; Liu, J.; Yang, S. Navigation Algorithm Based on Semantic Segmentation in Wheat Fields Using an RGB-D Camera. Inf. Process. Agric. 2023, 10, 475–490.
- Costa, I.F.D.; Leite, A.C.; Caarls, W. Data Set Diversity in Crop Row Detection Based on CNN Models for Autonomous Robot Navigation. J. Field Robot. 2025, 42, 525–538.
- Vrochidou, E.; Oustadakis, D.; Kefalas, A.; Papakostas, G.A. Computer Vision in Self-Steering Tractors. Machines 2022, 10, 129.
- Zhao, R.; Yuan, X.; Yang, Z.; Zhang, L. Image-Based Crop Row Detection Utilizing the Hough Transform and DBSCAN Clustering Analysis. IET Image Process. 2024, 18, 1161–1177.
- Pang, Y.; Shi, Y.; Gao, S.; Jiang, F.; Veeranampalayam-Sivakumar, A.N.; Thompson, L.; Luck, J.; Liu, C. Improved Crop Row Detection with Deep Neural Network for Early-Season Maize Stand Count in UAV Imagery. Comput. Electron. Agric. 2020, 178, 105766.
- Liang, X.; Chen, B.; Wei, C.; Zhang, X. Inter-Row Navigation Line Detection for Cotton with Broken Rows. Plant Methods 2022, 18, 90.
- Zhang, C.; Ge, X.; Zheng, Z.; Wu, X.; Wang, W.; Chen, L. A Plant Unit Relates to Missing Seeding Detection and Reseeding for Maize Precision Seeding. Agriculture 2022, 12, 1634.
- Cox, J.; Tsagkopoulos, N.; Rozsypálek, Z.; Krajník, T.; Sklar, E.; Hanheide, M. Visual Teach and Generalise (VTAG)—Exploiting Perceptual Aliasing for Scalable Autonomous Robotic Navigation in Horticultural Environments. Comput. Electron. Agric. 2023, 212, 108054.
- Calera, E.S.; Oliveira, G.C.D.; Araujo, G.L.; Filho, J.I.F.; Toschi, L.; Hernandes, A.C.; Becker, M. Under-Canopy Navigation for an Agricultural Rover Based on Image Data. J. Intell. Robot. Syst. 2023, 108, 29.
- Torres-Sánchez, J.; Escolà, A.; De Castro, A.I.; López-Granados, F.; Rosell-Polo, J.R.; Sebé, F.; Peña, J.M. Mobile Terrestrial Laser Scanner vs. UAV Photogrammetry to Estimate Woody Crop Canopy Parameters–Part 2: Comparison for Different Crops and Training Systems. Comput. Electron. Agric. 2023, 212, 108083.
- Lai Lap Hong, B.; Bin Mohd Izhar, M.A.; Ahmad, N.B. Improved Monte Carlo Localization for Agricultural Mobile Robots with the Normal Distributions Transform. Int. J. Adv. Comput. Sci. Appl. 2025, 16, 1043.
- De Silva, R.; Cielniak, G.; Wang, G.; Gao, J. Deep Learning-Based Crop Row Detection for Infield Navigation of Agri-Robots. J. Field Robot. 2024, 41, 2299–2321.
- Guo, Z.; Geng, Y.; Wang, C.; Xue, Y.; Sun, D.; Lou, Z.; Quan, L. InstaCropNet: An Efficient Unet-Based Architecture for Precise Crop Row Detection in Agricultural Applications. Artif. Intell. Agric. 2024, 12, 85–96.
- Jayathunga, S.; Pearse, G.D.; Watt, M.S. Unsupervised Methodology for Large-Scale Tree Seedling Mapping in Diverse Forestry Settings Using UAV-Based RGB Imagery. Remote Sens. 2023, 15, 5276.
- Rana, S.; Crimaldi, M.; Barretta, D.; Carillo, P.; Cirillo, V.; Maggio, A.; Gerbino, S. GobhiSet: Dataset of Raw, Manually, and Automatically Annotated RGB Images across Phenology of Brassica oleracea var. Botrytis. Data Brief 2024, 54, 110506.
- Roggiolani, G.; Rückin, J.; Popović, M.; Behley, J.; Stachniss, C. Unsupervised Semantic Label Generation in Agricultural Fields. Front. Robot. AI 2025, 12, 1548143.
- Lin, H.; Lu, Y.; Ding, R.; Gou, Y.; Yang, F. Detection of Wheat Seedling Lines in the Complex Environment via Deep Learning. Int. J. Agric. Biol. Eng. 2024, 17, 255–265.
- Feng, A.; Vong, C.N.; Zhou, J.; Conway, L.S.; Zhou, J.; Vories, E.D.; Kitchen, N.R. Developing an Image Processing Pipeline to Improve the Position Accuracy of Single UAV Images. Comput. Electron. Agric. 2023, 206, 107650.
- Sampurno, R.M.; Liu, Z.; Abeyrathna, R.R.D.; Ahamed, T. Intrarow Uncut Weed Detection Using You-Only-Look-Once Instance Segmentation for Orchard Plantations. Sensors 2024, 24, 893.
- Meng, J.; Xian, W.; Li, F.; Li, Z.; Li, J. A Monocular Camera-Based Algorithm for Sugar Beet Crop Row Extraction. Eng. Agrícola 2024, 44, e20240034. [Google Scholar] [CrossRef]
- Gong, H.; Zhuang, W. An Improved Method for Extracting Inter-Row Navigation Lines in Nighttime Maize Crops Using YOLOv7-Tiny. IEEE Access 2024, 12, 27444–27455. [Google Scholar] [CrossRef]
- Shi, Y.; Xu, R.; Qi, Z. MSNet: A Novel Deep Learning Framework for Efficient Missing Seedling Detection in Maize Fields. Appl. Artif. Intell. 2025, 39, 2469372. [Google Scholar] [CrossRef]
- Xue, L.; Xing, M.; Lyu, H. Improved Early-Stage Maize Row Detection Using Unmanned Aerial Vehicle Imagery. ISPRS Int. J. Geo-Inf. 2024, 13, 376. [Google Scholar] [CrossRef]
- Saha, S.; Noguchi, N. Smart Vineyard Row Navigation: A Machine Vision Approach Leveraging YOLOv8. Comput. Electron. Agric. 2025, 229, 109839. [Google Scholar] [CrossRef]
- Yang, M.; Huang, C.; Li, Z.; Shao, Y.; Yuan, J.; Yang, W.; Song, P. Autonomous Navigation Method Based on RGB-D Camera for a Crop Phenotyping Robot. J. Field Robot. 2024, 41, 2663–2675. [Google Scholar] [CrossRef]
- Sun, T.; Le, F.; Cai, C.; Jin, Y.; Xue, X.; Cui, L. Soybean–Corn Seedling Crop Row Detection for Agricultural Autonomous Navigation Based on GD-YOLOv10n-Seg. Agriculture 2025, 15, 796. [Google Scholar] [CrossRef]
- Gong, H.; Wang, X.; Zhuang, W. Research on Real-Time Detection of Maize Seedling Navigation Line Based on Improved YOLOv5s Lightweighting Technology. Agriculture 2024, 14, 124. [Google Scholar] [CrossRef]
- Zheng, K.; Zhao, X.; Han, C.; He, Y.; Zhai, C.; Zhao, C. Design and Experiment of an Automatic Row-Oriented Spraying System Based on Machine Vision for Early-Stage Maize Corps. Agriculture 2023, 13, 691. [Google Scholar] [CrossRef]
- Zhang, T.; Zhou, J.; Liu, W.; Yue, R.; Shi, J.; Zhou, C.; Hu, J. SN-CNN: A Lightweight and Accurate Line Extraction Algorithm for Seedling Navigation in Ridge-Planted Vegetables. Agriculture 2024, 14, 1446. [Google Scholar] [CrossRef]
- Geng, A.; Hu, X.; Liu, J.; Mei, Z.; Zhang, Z.; Yu, W. Development and Testing of Automatic Row Alignment System for Corn Harvesters. Appl. Sci. 2022, 12, 6221. [Google Scholar] [CrossRef]
- Hruska, A.; Hamouz, P. Verification of a Machine Learning Model for Weed Detection in Maize (Zea mays) Using Infrared Imaging. Plant Prot. Sci. 2023, 59, 292–297. [Google Scholar] [CrossRef]
- Yang, Z.; Yang, Y.; Li, C.; Zhou, Y.; Zhang, X.; Yu, Y.; Liu, D. Tasseled Crop Rows Detection Based on Micro-Region of Interest and Logarithmic Transformation. Front. Plant Sci. 2022, 13, 916474. [Google Scholar] [CrossRef]
- Guo, Z.; Quan, L.; Sun, D.; Lou, Z.; Geng, Y.; Chen, T.; Wang, J. Efficient Crop Row Detection Using Transformer-Based Parameter Prediction. Biosyst. Eng. 2024, 246, 13–25. [Google Scholar] [CrossRef]
- Yu, T.; Chen, J.; Gui, Z.; Jia, J.; Li, Y.; Yu, C.; Wu, C. Multi-Scale Cross-Domain Augmentation of Tea Datasets via Enhanced Cycle Adversarial Networks. Agriculture 2025, 15, 1739. [Google Scholar] [CrossRef]
- Deng, L.; Miao, Z.; Zhao, X.; Yang, S.; Gao, Y.; Zhai, C.; Zhao, C. HAD-YOLO: An Accurate and Effective Weed Detection Model Based on Improved YOLOV5 Network. Agronomy 2025, 15, 57. [Google Scholar] [CrossRef]
- Zuo, Z.; Gao, S.; Peng, H.; Xue, Y.; Han, L.; Ma, G.; Mao, H. Lightweight detection of broccoli heads in complex field environments based on LBDC-YOLO. Agronomy 2024, 14, 2359. [Google Scholar] [CrossRef]
- Zhang, Y.W.; Liu, M.N.; Chen, D.; Xu, X.M.; Lu, J.; Lai, H.R.; Yin, Y.X. Development and Testing of Row-Controlled Weeding Intelligent Robot for Corn. J. Field Robot. 2025, 42, 850–866. [Google Scholar] [CrossRef]
- Xiang, M.; Gao, X.; Wang, G.; Qi, J.; Qu, M.; Ma, Z.; Song, K. An Application Oriented All-Round Intelligent Weeding Machine with Enhanced YOLOv5. Biosyst. Eng. 2024, 248, 269–282. [Google Scholar] [CrossRef]
- Wang, B.; Du, X.; Wang, Y.; Mao, H. Multi-Machine Collaboration Realization Conditions and Precise and Efficient Production Mode of Intelligent Agricultural Machinery. Int. J. Agric. Biol. Eng. 2024, 17, 27–36. [Google Scholar] [CrossRef]
- Ulloa, C.C.; Krus, A.; Barrientos, A.; del Cerro, J.; Valero, C. Robotic Fertilization in Strip Cropping Using a CNN Vegetables Detection-Characterization Method. Comput. Electron. Agric. 2022, 193, 106684. [Google Scholar] [CrossRef]
- Yan, Z.; Zhao, Y.; Luo, W.; Ding, X.; Li, K.; He, Z.; Cui, Y. Machine Vision-Based Tomato Plug Tray Missed Seeding Detection and Empty Cell Replanting. Comput. Electron. Agric. 2023, 208, 107800. [Google Scholar] [CrossRef]
- Wang, Y.; Li, T.; Chen, T.; Zhang, X.; Taha, M.F.; Yang, N.; Shi, Q. Cucumber Downy Mildew Disease Prediction Using a CNN-LSTM Approach. Agriculture 2024, 14, 1155. [Google Scholar] [CrossRef]
- Wang, S.; Su, D.; Jiang, Y.; Tan, Y.; Qiao, Y.; Yang, S.; Hu, N. Fusing Vegetation Index and Ridge Segmentation for Robust Vision Based Autonomous Navigation of Agricultural Robots in Vegetable Farms. Comput. Electron. Agric. 2023, 213, 108235. [Google Scholar] [CrossRef]
- Yang, Y.; Xie, H.; Zhang, K.; Wang, Y.; Li, Y.; Zhou, J.; Xu, L. Design, Development, Integration, and Field Evaluation of a Ridge-Planting Strawberry Harvesting Robot. Agriculture 2024, 14, 2126. [Google Scholar] [CrossRef]
- Jin, X.; Tang, L.; Li, R.; Zhao, B.; Ji, J.; Ma, Y. Edge Recognition and Reduced Transplantation Loss of Leafy Vegetable Seedlings with Intel RealsSense D415 Depth Camera. Comput. Electron. Agric. 2022, 198, 107030. [Google Scholar] [CrossRef]
- Huang, P.; Huang, P.; Wang, Z.; Wu, X.; Liu, J.; Zhu, L. Deep-Learning-Based Trunk Perception with Depth Estimation and DWA for Robust Navigation of Robotics in Orchards. Agronomy 2023, 13, 1084. [Google Scholar] [CrossRef]
- Zhang, H.; Meng, Z.; Wen, S.; Liu, G.; Hu, G.; Chen, J.; Zhang, S. Design and Experiment of Active Obstacle Avoidance Control System for Grapevine Interplant Weeding Based on GNSS. Smart Agric. Technol. 2025, 10, 100781. [Google Scholar] [CrossRef]
- Devanna, R.P.; Romeo, L.; Reina, G.; Milella, A. Yield Estimation in Precision Viticulture by Combining Deep Segmentation and Depth-Based Clustering. Comput. Electron. Agric. 2025, 232, 110025. [Google Scholar] [CrossRef]
- Xu, Z.; Liu, J.; Wang, J.; Cai, L.; Jin, Y.; Zhao, S. Realtime Picking Point Decision Algorithm of Trellis Grape for High-Speed Robotic Cut-and-Catch Harvesting. Agronomy 2023, 13, 1618. [Google Scholar] [CrossRef]
- Salas, B.; Salcedo, R.; Garcia-Ruiz, F.; Gil, E. Design, Implementation and Validation of a Sensor-Based Precise Airblast Sprayer to Improve Pesticide Applications in Orchards. Precis. Agric. 2024, 25, 865–888. [Google Scholar] [CrossRef]
- Nakaguchi, V.M.; Abeyrathna, R.R.D.; Liu, Z.; Noguchi, R.; Ahamed, T. Development of a Machine Stereo Vision-Based Autonomous Navigation System for Orchard Speed Sprayers. Comput. Electron. Agric. 2024, 227, 109669. [Google Scholar] [CrossRef]
- Khan, Z.; Liu, H.; Shen, Y.; Zeng, X. Deep learning improved YOLOv8 algorithm: Real-time precise instance segmentation of crown region orchard canopies in natural environment. Comput. Electron. Agric. 2024, 224, 109168. [Google Scholar] [CrossRef]
- Zhang, L.; Li, M.; Zhu, X.; Chen, Y.; Huang, J.; Wang, Z.; Fang, K. Navigation Path Recognition between Rows of Fruit Trees Based on Semantic Segmentation. Comput. Electron. Agric. 2024, 216, 108511. [Google Scholar] [CrossRef]
- Xie, X.; Li, Y.; Zhao, L.; Wang, S.; Han, X. Method for the Fruit Tree Recognition and Navigation in Complex Environment of an Agricultural Robot. Int. J. Agric. Biol. Eng. 2024, 17, 221–229. [Google Scholar] [CrossRef]
- Xu, S.; Rai, R. Vision-Based Autonomous Navigation Stack for Tractors Operating in Peach Orchards. Comput. Electron. Agric. 2024, 217, 108558. [Google Scholar] [CrossRef]
- Yang, Z.; Ouyang, L.; Zhang, Z.; Duan, J.; Yu, J.; Wang, H. Visual Navigation Path Extraction of Orchard Hard Pavement Based on Scanning Method and Neural Network. Comput. Electron. Agric. 2022, 197, 106964. [Google Scholar] [CrossRef]
- Jia, W.; Tai, K.; Dong, X.; Ou, M.; Wang, X. Design of and Experimentation on an Intelligent Intra-Row Obstacle Avoidance and Weeding Machine for Orchards. Agriculture 2025, 15, 947. [Google Scholar] [CrossRef]
- Syed, T.N.; Zhou, J.; Lakhiar, I.A.; Marinello, F.; Gemechu, T.T.; Rottok, L.T.; Jiang, Z. Enhancing Autonomous Orchard Navigation: A Real-Time Convolutional Neural Network-Based Obstacle Classification System for Distinguishing ‘Real’ and ‘Fake’ Obstacles in Agricultural Robotics. Agriculture 2025, 15, 827. [Google Scholar] [CrossRef]
- Cao, G.; Zhang, B.; Li, Y.; Wang, Z.; Diao, Z.; Zhu, Q.; Liang, Z. Environmental Mapping and Path Planning for Robots in Orchard Based on Traversability Analysis, Improved LeGO-LOAM and RRT* Algorithms. Comput. Electron. Agric. 2025, 230, 109889. [Google Scholar] [CrossRef]
- Scalisi, A.; McClymont, L.; Underwood, J.; Morton, P.; Scheding, S.; Goodwin, I. Reliability of a Commercial Platform for Estimating Flower Cluster and Fruit Number, Yield, Tree Geometry and Light Interception in Apple Trees under Different Rootstocks and Row Orientations. Comput. Electron. Agric. 2021, 191, 106519. [Google Scholar] [CrossRef]
- Mao, W.; Murengami, B.; Jiang, H.; Li, R.; He, L.; Fu, L. UAV-Based High-Throughput Phenotyping to Segment Individual Apple Tree Row Based on Geometrical Features of Poles and Colored Point Cloud. J. ASABE 2024, 67, 1231–1240. [Google Scholar] [CrossRef]
- Krklješ, D.; Kitić, G.; Panić, M.; Petes, C.; Filipović, V.; Stefanović, D.; Marko, O. Agrobot Gari, a Multimodal Robotic Solution for Blueberry Production Automation. Comput. Electron. Agric. 2025, 237, 110626. [Google Scholar] [CrossRef]
- Kim, K.; Deb, A.; Cappelleri, D.J. P-AgNav: Range View-Based Autonomous Navigation System for Cornfields. IEEE Robot. Autom. Lett. 2025, 10, 3366–3373. [Google Scholar] [CrossRef]
- Li, H.; Lai, X.; Mo, Y.; He, D.; Wu, T. Pixel-Wise Navigation Line Extraction of Cross-Growth-Stage Seedlings in Complex Sugarcane Fields and Extension to Corn and Rice. Front. Plant Sci. 2025, 15, 1499896. [Google Scholar] [CrossRef]
- Pradhan, N.C.; Sahoo, P.K.; Kushwaha, D.K.; Mani, I.; Srivastava, A.; Sagar, A.; Makwana, Y. A Novel Approach for Development and Evaluation of LiDAR Navigated Electronic Maize Seeding System Using Check Row Quality Index. Sensors 2021, 21, 5934. [Google Scholar] [CrossRef] [PubMed]
- Chen, Z.; Cai, Y.; Liu, Y.; Liang, Z.; Chen, H.; Ma, R.; Qi, L. Towards End-to-End Rice Row Detection in Paddy Fields Exploiting Two-Pathway Instance Segmentation. Comput. Electron. Agric. 2025, 231, 109963. [Google Scholar] [CrossRef]
- Wang, Y.; Fu, Q.; Ma, Z.; Tian, X.; Ji, Z.; Yuan, W.; Su, Z. YOLOv5-AC: A Method of Uncrewed Rice Transplanter Working Quality Detection. Agronomy 2023, 13, 2279. [Google Scholar] [CrossRef]
- Wu, S.; Ma, X.; Jin, Y.; Yang, J.; Zhang, W.; Zhang, H.; Qi, L. A Novel Method for Detecting Missing Seedlings Based on UAV Images and Rice Transplanter Operation Information. Comput. Electron. Agric. 2025, 229, 109789. [Google Scholar] [CrossRef]
- Fu, D.; Chen, Z.; Yao, Z.; Liang, Z.; Cai, Y.; Liu, C.; Qi, L. Vision-Based Trajectory Generation and Tracking Algorithm for Maneuvering of a Paddy Field Robot. Comput. Electron. Agric. 2024, 226, 109368. [Google Scholar] [CrossRef]
- Guan, X.; Shi, L.; Ge, H.; Ding, Y.; Nie, S. Development, Design, and Improvement of an Intelligent Harvesting System for Aquatic Vegetable Brasenia schreberi. Agronomy 2025, 15, 1451. [Google Scholar] [CrossRef]
- Wang, S.; Yu, S.; Zhang, W.; Wang, X.; Li, J. The Seedling Line Extraction of Automatic Weeding Machinery in Paddy Field. Comput. Electron. Agric. 2023, 205, 107648. [Google Scholar] [CrossRef]
- Liu, Q.; Zhao, J. MA-Res U-Net: Design of Soybean Navigation System with Improved U-Net Model. Phyton 2024, 93, 2663. [Google Scholar] [CrossRef]
- Tsiakas, K.; Papadimitriou, A.; Pechlivani, E.M.; Giakoumis, D.; Frangakis, N.; Gasteratos, A.; Tzovaras, D. An Autonomous Navigation Framework for Holonomic Mobile Robots in Confined Agricultural Environments. Robotics 2023, 12, 146. [Google Scholar] [CrossRef]
- Huang, S.; Pan, K.; Wang, S.; Zhu, Y.; Zhang, Q.; Su, X.; Yu, H. Design and Test of an Automatic Navigation Fruit-Picking Platform. Agriculture 2023, 13, 882. [Google Scholar] [CrossRef]
- Ferro, M.V.; Sørensen, C.G.; Catania, P. Comparison of Different Computer Vision Methods for Vineyard Canopy Detection Using UAV Multispectral Images. Comput. Electron. Agric. 2024, 225, 109277. [Google Scholar] [CrossRef]
- Tan, Y.; Su, W.; Zhao, L.; Lai, Q.; Wang, C.; Jiang, J.; Li, P. Navigation Path Extraction for Inter-Row Robots in Panax notoginseng Shade House Based on Im-YOLOv5s. Front. Plant Sci. 2023, 14, 1246717. [Google Scholar] [CrossRef]
- Gao, X.; Wang, G.; Qi, J.; Wang, Q.; Xiang, M.; Song, K.; Zhou, Z. Improved YOLO v7 for Sustainable Agriculture Significantly Improves Precision Rate for Chinese Cabbage (Brassica pekinensis Rupr.) Seedling Belt (CCSB) Detection. Sustainability 2024, 16, 4759. [Google Scholar] [CrossRef]
- Di Gennaro, S.F.; Vannini, G.L.; Berton, A.; Dainelli, R.; Toscano, P.; Matese, A. Missing Plant Detection in Vineyards Using UAV Angled RGB Imagery Acquired in Dormant Period. Drones 2023, 7, 349. [Google Scholar] [CrossRef]
- Lu, Z.; Han, B.; Dong, L.; Zhang, J. COTTON-YOLO: Enhancing Cotton Boll Detection and Counting in Complex Environmental Conditions Using an Advanced YOLO Model. Appl. Sci. 2024, 14, 6650. [Google Scholar] [CrossRef]
| Technical Type | Method | Application Scenarios | Characteristic | Ref. |
|---|---|---|---|---|
| Traditional visual | Machine vision | Monitoring of rice seedling rows | Adapts to seedling row gaps and missing seedlings | [78] |
| | Projection transformation | Agricultural robot navigation | Good robustness in high-weed scenes | [79] |
| | KNN-RGB gradient filtering | Sugarcane field detection | Adapts to the 40-day and 80-day growth stages | [80] |
| | Template matching + digital twin | Measurement of row deviation | Adapts to scenarios with insufficient seedlings and dense weeds | [81] |
| Deep learning | YOLOv8s-EFF + geometric algorithm | Tobacco row-spacing extraction | Adapts to irregular plots and dense planting scenarios | [82] |
| | Deep semantic segmentation | Navigation in high-density canopy orchards | Resists canopy occlusion; enables path planning without GPS | [83] |
| | Automated annotation + VGG16 | Corn row plant counting | Adapts to different land parcels | [84] |
| | Skeleton segmentation | Variable field conditions | Resists interference from dense weeds and discontinuous crop rows | [85] |
| | Mask R-CNN | UAV corn row plant recognition | Detects seeding errors such as missed and repeated sowing | [86] |
| | Lightweight U-Net segmentation | Crop row consistency assessment | Adapts to irregular plots and dense planting scenarios | [87] |
| LiDAR-based technology | Self-supervised deep learning | Canopy navigation | Copes with changes in brightness | [88] |
| | 2D LiDAR SLAM | Orchard environment positioning | Adapts to different terrain roughness and orchard sizes | [89] |
| | LiDAR high-bed cultivation navigation | Strawberry farm high-bed cultivation environment | No reliance on path planning | [90] |
| | LiDAR-based SLAM semantic mapping | Orchard robot autonomous navigation | Integrates terrain analysis; supports phenotype monitoring/harvesting | [91] |
| | Rolling 2D LiDAR navigation-line extraction | Environments occluded by thin trunks and dense foliage | Overcomes traditional LiDAR’s narrow vertical FOV and sparse point clouds | [92] |
| | Hierarchically coupled LiDAR-inertial odometry | Multiple agricultural scenarios | Adapts to different LiDARs and to dense/open environments | [93] |
| | RTMR-LOAM | Corn crop reconstruction | High-density crop morphological parameter measurement | [94] |
| | Adaptive LiDAR odometry and mapping | Autonomous navigation for unmanned farms | Resists motion distortion and dynamic-object interference | [95] |
| Multi-sensor fusion | Camera–LiDAR–IMU fusion | Extraction of inter-row navigation lines | High precision; copes with complex farmland interference | [96] |
| | LiDAR + multi-sensor fusion (EKF) | Navigation of orchard spraying robot | Supports obstacle avoidance and multitasking | [97] |
| | Instance segmentation + depth prediction | Positioning of lodged stubble rows | Adapts to different monocular cameras, reducing cross-device errors | [98] |
| | Fluorescence imaging + computer vision | Soybean seedling-stage positioning | Stable fluorescence signal, unaffected by early-growth interference | [99] |
| Others | Unsupervised graph representation | Agricultural scenarios lacking labeled data | Handles weed aggregations and other inconsistent structures | [100] |
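The traditional visual pipeline summarized above — vegetation-index segmentation followed by line fitting — can be sketched in a few lines. The sketch below is illustrative only: the synthetic image, the Excess Green (ExG) threshold of 20, and the strip-based row grouping are assumptions for the example, not parameters from any of the cited studies.

```python
import numpy as np

def excess_green_mask(rgb, thresh=20):
    """Segment vegetation with the Excess Green index: ExG = 2G - R - B."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return (2 * g - r - b) > thresh

def fit_row_line(mask, col_lo, col_hi):
    """Least-squares line (col = a*row + b) through the per-scanline
    centroids of vegetation pixels inside one vertical image strip."""
    rows, cols = [], []
    for y in range(mask.shape[0]):
        xs = np.flatnonzero(mask[y, col_lo:col_hi])
        if xs.size:
            rows.append(y)
            cols.append(xs.mean() + col_lo)
    a, b = np.polyfit(rows, cols, deg=1)
    return a, b

# Synthetic field: brown soil with two vertical green crop rows at x = 30 and 70.
img = np.zeros((100, 100, 3), dtype=np.uint8)
img[...] = (120, 90, 60)                   # soil background
for x0 in (30, 70):
    img[:, x0 - 2:x0 + 3] = (40, 160, 40)  # crop rows

mask = excess_green_mask(img)
a, b = fit_row_line(mask, 0, 50)           # left strip -> left crop row
print(int(round(b)))                       # row intercept near column 30 → 30
```

The same two-stage structure (binarize vegetation, then fit one line per strip) underlies Hough-transform variants as well; only the line-estimation step differs.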
| Indicator Category | Specific Indicators | Method | Performance Parameters | Ref. |
|---|---|---|---|---|
| Accuracy | Crop IoU | Deep learning + row-structure constraints | Crop IoU = 88.6% | [140] |
| | Angle and distance errors | Improved YOLOv3 | Angle error = 0.75°, distance error = 10.84 px | [141] |
| | Position error | UAV image-processing pipeline | Cotton = 0.32 ± 0.21 m, maize = 0.57 ± 0.28 m | [142] |
| | Segmentation accuracy | YOLOv8n-seg | Superior to YOLOv5n-seg | [143] |
| Efficiency | Processing time | Sugar beet row-extraction algorithm | Single frame = 11.751 ms | [144] |
| | Detection speed | Improved YOLOv7-Tiny | Detection speed = 32.4 fps | [145] |
| | Inference speed | SeedNet/PeakNet | SeedNet = 105 fps, PeakNet = 2295 fps | [146] |
| Robustness | Angular deviation | ROI-based method | Avg angle deviation = 0.456–0.789° | [147] |
| | Lateral error | YOLOv8m-vine-classes | Lateral RMSE ≈ 5 cm | [148] |
| | Walking deviation | PP-LiteSeg semantic segmentation | Avg = 1.33 cm, max = 3.82 cm | [149] |
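Two of the accuracy metrics in the table — angle error and lateral RMSE — can be computed directly from a detected navigation line and its ground truth. The sketch below uses made-up sample values rather than data from the cited studies, and the slope convention (lateral offset per unit of forward travel) is an assumption for the example.

```python
import math

def angle_error_deg(slope_det, slope_gt):
    """Absolute angle between two navigation lines, each given as a slope
    (lateral offset per unit of forward travel)."""
    return abs(math.degrees(math.atan(slope_det) - math.atan(slope_gt)))

def lateral_rmse(offsets_det, offsets_gt):
    """RMSE of lateral offsets (e.g. in cm) sampled along the path."""
    n = len(offsets_det)
    sq = sum((d - g) ** 2 for d, g in zip(offsets_det, offsets_gt))
    return math.sqrt(sq / n)

# Hypothetical detected vs. ground-truth navigation line.
print(round(angle_error_deg(0.02, 0.005), 2))                  # → 0.86
print(round(lateral_rmse([5.0, 4.0, 6.0], [4.5, 4.2, 5.1]), 3))  # → 0.606
```

Studies differ in where along the path the lateral offsets are sampled (per frame, per plant, or per fixed distance), so RMSE values are only comparable under matching sampling conventions.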
| Scenario | Purpose | Method | Performance Parameters | Ref. |
|---|---|---|---|---|
| Soybean field | Light/shadow changes, broken rows, dense weeds | MA-Res U-Net | Deviation = 3°, mIoU > traditional U-Net | [196] |
| Greenhouse | Closed-environment crop inspection | Stereo camera–LiDAR semantic segmentation | Autonomous inspection, suited to confined spaces | [197] |
| Dwarfing orchard | Navigation of high-density planting and harvesting platform | BeiDou navigation + Stanley algorithm | Max straight-line lateral deviation = 101.5 mm | [198] |
| Vineyard | Canopy parameter monitoring | Mask R-CNN/U-Net semantic segmentation | OA, F1, IoU > OBIA | [199] |
| Panax notoginseng shade house | Shade-net shadow recognition | Im-YOLOv5s + least-squares fitting | Max deviation = 1.64°, mAP = 94.9% | [200] |
| Vegetable field | Chinese cabbage seedling-stage image recognition | Improved YOLOv7 | Fitting accuracy = 94.2%, identification rate = 91.3% | [201] |
| Vineyard | Detection of plants missing during the dormant period | Tilted RGB imagery + point-cloud spatial analysis | Overall accuracy = 92.72% | [202] |
| Cotton field | Cotton boll detection and counting | COTTON-YOLO (YOLOv8n) | AP50 up 10.3% | [203] |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Ma, Z.; Wang, X.; Chen, X.; Hu, B.; Li, J. Advances in Crop Row Detection for Agricultural Robots: Methods, Performance Indicators, and Scene Adaptability. Agriculture 2025, 15, 2151. https://doi.org/10.3390/agriculture15202151