A Comprehensive Review of Sensing, Control, and Networking in Agricultural Robots: From Perception to Coordination
Abstract
1. Introduction
2. Review Methodology and Literature Selection
2.1. Literature Retrieval Criteria
Inclusion Criteria
- Relevance to subject of review: An essential requirement for inclusion was the study’s relevance to the core focus on agricultural robot technologies and their application domains. Eligible works needed to address at least one relevant aspect of this review, including classifications of agricultural robots, sensing technologies, control approaches, or networking solutions. Relevance was determined by examining the title, abstract, objectives, and methodology of each study to ensure alignment with the review’s scope.
- Publication timeframe: While the review primarily focused on recent advancements and emerging technologies in the field, earlier works were also included to capture the evolution of developments over time. Consequently, the literature surveyed spans the last decade, covering publications from 2015 to 2025.
- Article type and subject areas: The literature search for this review focused mainly on review and research articles published within the domains of agricultural and biological sciences, computer science, robotics, and engineering. The selected sources included journal articles, conference proceedings, and theses and dissertations.
- Language: To ensure consistency and wide accessibility, the review was limited to publications written in English. This restriction helped maintain clarity and uniformity in analysing the selected body of literature. An illustrative sketch of how these criteria translate into screening filters is given after this list.
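To make the screening procedure concrete, the minimal sketch below applies the four inclusion criteria (topical relevance, the 2015–2025 timeframe, eligible document types, and English language) to bibliographic records. It is purely illustrative: the record fields, keyword list, and document-type labels are assumptions, not the exact query or export format used for this review.

```python
# Illustrative inclusion-criteria filter; field names, keywords, and
# document-type labels are assumptions, not the exact review query.
from dataclasses import dataclass

RELEVANCE_TERMS = {"agricultural robot", "uav", "ugv", "usv",
                   "sensing", "control", "networking"}
ELIGIBLE_TYPES = {"journal article", "conference paper", "thesis"}

@dataclass
class Record:
    title: str
    abstract: str
    year: int
    doc_type: str
    language: str

def is_included(rec: Record) -> bool:
    """Apply the four inclusion criteria described above to one record."""
    text = f"{rec.title} {rec.abstract}".lower()
    relevant = any(term in text for term in RELEVANCE_TERMS)  # relevance to scope
    in_window = 2015 <= rec.year <= 2025                      # publication timeframe
    eligible = rec.doc_type.lower() in ELIGIBLE_TYPES         # article type
    in_english = rec.language.lower() == "english"            # language
    return relevant and in_window and eligible and in_english

# Usage: selected = [r for r in records if is_included(r)]
```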
2.2. Literature Selection Process
2.2.1. Database Search
2.2.2. Keywords Search
2.2.3. Initial Screening
2.2.4. Full-Text Evaluation and Final Selection
2.3. Keyword Analysis
3. Classifications of Agricultural Robots
- Airborne systems, primarily represented by Unmanned Aerial Vehicles (UAVs), which are extensively used for crop monitoring, spraying, and mapping tasks due to their ability to cover large areas efficiently.
- Earthbound systems, commonly referred to as Unmanned Ground Vehicles (UGVs), which operate at the field level to carry out activities such as planting, weeding, harvesting, and crop and soil analyses.
- Aquatic or water-surface systems, known as Unmanned Surface Vehicles (USVs), which are particularly relevant in water-intensive farming systems, aquaculture, and irrigation management.
- Robotic arms and end-effectors, which serve as precision tools for delicate and highly specific operations such as harvesting fruits, pruning, or performing tasks in controlled environments like greenhouses. A compact sketch summarising this taxonomy follows this list.
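As a compact summary of the taxonomy above, the sketch below encodes the four platform classes and their representative tasks as a simple lookup structure. The task lists are drawn only from the descriptions in this section and are illustrative rather than exhaustive.

```python
# Illustrative mapping of agricultural robot classes to representative tasks,
# summarising the taxonomy above (not an exhaustive classification).
from enum import Enum

class PlatformClass(Enum):
    UAV = "Unmanned Aerial Vehicle"
    UGV = "Unmanned Ground Vehicle"
    USV = "Unmanned Surface Vehicle"
    ARM = "Robotic arm / end-effector"

TYPICAL_TASKS = {
    PlatformClass.UAV: ["crop monitoring", "spraying", "mapping"],
    PlatformClass.UGV: ["planting", "weeding", "harvesting", "crop and soil analysis"],
    PlatformClass.USV: ["aquaculture support", "irrigation management"],
    PlatformClass.ARM: ["fruit harvesting", "pruning", "greenhouse tasks"],
}

if __name__ == "__main__":
    for platform, tasks in TYPICAL_TASKS.items():
        print(f"{platform.value}: {', '.join(tasks)}")
```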
3.1. Unmanned Aerial Vehicles (UAVs)
3.1.1. Types of UAV Platforms
Multi-Rotor UAVs
Fixed-Wing UAVs
Hybrid UAVs (VTOL)
Unmanned Helicopters
3.1.2. UAV-Mounted Sensors
3.2. Unmanned Ground Vehicles (UGVs)
3.2.1. UGV Configurations
Wheeled UGVs
Tracked UGVs
Legged Robots
3.2.2. UGV-Mounted Sensors
3.3. Robotic Arms and End-Effectors
3.4. Unmanned Surface Vehicles (USVs)
3.5. Research Focus Trends in Agricultural Robotics (2015–2025)
4. Sensing Technologies for Agricultural Robots
4.1. Navigation Techniques
4.1.1. Localization Methods
4.1.2. Pre-Mapping vs. On-Line Planning
4.1.3. Suitability in Different Farm Environments
4.2. Object Detection
4.3. Obstacle Avoidance Strategies
4.4. Sensing for Agricultural Robots in the Age of Artificial Intelligence (AI)
5. Control Technologies for Agricultural Robots
5.1. Control Methods for UAVs
| Control Method | UAV Type | Application | Citation |
|---|---|---|---|
| Hybrid (PID + PWM) | Quadrotor | Spraying | [174] |
| LQR | Quadrotor | Spraying | [175] |
| Feedback linearization | Quadrotor | Predefined trajectory following | [176] |
| Feedback linearization | Quadrotor | Swarm UAV formation | [177] |
| Feedback linearization | Quadrotor | Swarm UAV formation | [178] |
| Backstepping | Quadrotor | Visual servoing | [179] |
| Hybrid (Adaptive backstepping + Sliding mode) | Multirotor | Spraying | [180] |
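To give a minimal illustration of the hybrid PID + PWM approach in the first row of the table, the sketch below regulates spray flow rate with a PID loop and maps the controller output to a PWM duty cycle driving the pump. The gains, saturation limits, and first-order pump model are illustrative assumptions and are not taken from the cited studies.

```python
# Minimal PID-to-PWM sketch for variable-rate spraying on a rotor UAV,
# in the spirit of the hybrid PID + PWM entry above.
# Gains, limits, and the first-order pump model are illustrative assumptions.

class PID:
    def __init__(self, kp, ki, kd, out_min=0.0, out_max=1.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement, dt):
        error = setpoint - measurement
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        u = self.kp * error + self.ki * self.integral + self.kd * derivative
        return min(max(u, self.out_min), self.out_max)  # clamp to valid PWM duty


def simulate_spray_control(target_flow_lpm=1.2, steps=200, dt=0.05):
    """Track a target flow rate; the PID output is the pump PWM duty cycle (0..1)."""
    pid = PID(kp=0.8, ki=0.6, kd=0.05)
    flow = 0.0          # measured flow rate [L/min]
    pump_gain = 2.0     # assumed flow at 100% duty [L/min]
    tau = 0.5           # assumed pump time constant [s]
    duty = 0.0
    for _ in range(steps):
        duty = pid.update(target_flow_lpm, flow, dt)
        flow += (duty * pump_gain - flow) * dt / tau  # first-order pump response
    return flow, duty


if __name__ == "__main__":
    final_flow, final_duty = simulate_spray_control()
    print(f"final flow ~{final_flow:.2f} L/min at duty {final_duty:.2f}")
```

In a practical sprayer controller the same structure would typically add anti-windup on the integral term and send the duty cycle to the pump driver at a fixed control rate.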
5.2. Control Methods for UGVs
5.3. Control Methods for USVs
5.4. Control Methods for Robotic Arms and End-Effectors
6. Networking Technologies for Agricultural Robots
6.1. Communication Protocols for Agricultural Robotics
6.1.1. Short-to-Medium-Range Wireless Technologies
ZigBee (IEEE 802.15.4)
Bluetooth/BLE (Bluetooth Low Energy)
Wi-Fi (IEEE 802.11 a/b/g/n/ac/ax), Especially Wi-Fi 6
6.1.2. Low-Power Wide-Area Networks (LPWAN)
LoRa/LoRaWAN
NB-IoT, Sigfox, LTE-M, RPMA, WavIoT
- NB-IoT (Narrowband IoT) [247]: Operating on licensed cellular spectrum, NB-IoT offers long-range (up to 10 km in rural areas), ultra-low-power communication with downlink speeds up to 200 kbps and uplink around 10 kbps [240]. It supports high scalability (over 100,000 devices per cell) and provides reliable QoS (Quality of Service). However, it has lower interference immunity than LoRaWAN or Sigfox, relies on existing LTE infrastructure (limiting rural deployment), and entails higher deployment and device costs. NB-IoT is used for livestock tracking and remote sensing in areas with cellular coverage.
- Sigfox [248]: Using unlicensed sub-GHz ISM bands and Ultra Narrow Band (UNB) modulation, Sigfox achieves very long range (up to 20 km) and ultra-low power consumption [240]. Its limitations include very low throughput (10–50 kbps), strict message limits (140 uplink, 4 downlink/day), and small payload sizes. While unsuitable for high-data tasks, it excels in low-power, infrequent sensing applications such as soil monitoring; a short message-budget sketch after this list illustrates the practical effect of these limits.
- LTE-M [249]: Built on existing 4G/LTE infrastructure, LTE-M provides lower latency (around 10 ms) and higher throughput than other LPWANs. However, it consumes more energy per message and is less effective over long distances or through obstacles [240]. It is better suited for urban or peri-urban agricultural use, but insufficient for high-throughput robotics or remote deployments.
- RPMA and WavIoT: RPMA (2.4 GHz) [250] and WavIoT (868 MHz) [251] suffer from higher path loss, at least 9 dB more than Sigfox and LoRaWAN, making them less suitable for rural or obstructed environments [240]. RPMA is less energy-efficient, while WavIoT offers battery life comparable to LoRaWAN and Sigfox. These are generally less favoured for wide-area agricultural deployments.
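To show how the message and throughput limits above constrain field deployments, the short sketch below estimates the minimum reporting interval and daily payload volume for a duty-limited link, using Sigfox's 140 uplink messages per day as the example; the 12-byte payload size is an assumed value for illustration.

```python
# Rough message-budget estimate for a duty-limited LPWAN link.
# The 140 uplink messages/day figure for Sigfox comes from the text above;
# the 12-byte payload size is an illustrative assumption.

def lpwan_budget(max_uplinks_per_day: int, payload_bytes: int):
    """Return (minimum reporting interval in minutes, daily payload in bytes)."""
    interval_min = 24 * 60 / max_uplinks_per_day
    daily_bytes = max_uplinks_per_day * payload_bytes
    return interval_min, daily_bytes

if __name__ == "__main__":
    interval, volume = lpwan_budget(max_uplinks_per_day=140, payload_bytes=12)
    # Roughly one report every ~10.3 min and 1680 bytes/day per node.
    print(f"Sigfox-style link: one report every ~{interval:.1f} min, {volume} bytes/day")
```

Budgets of this kind make clear why such protocols suit slowly varying measurements (e.g., soil moisture) rather than telemetry-heavy robot coordination.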
6.1.3. Cellular Networks (4G, 5G, and Emerging 6G)
4G/LTE-Advanced
5G (Including URLLC, eMBB, mMTC)
6G/Beyond-5G
6.1.4. Comparative Analysis of Communication Protocols
6.2. Swarm Robotics Networking
6.2.1. Ad-Hoc/MANET/FANET Approaches
Mobile Ad Hoc Networks (MANET) and Flying Ad Hoc Networks (FANET)
IEEE 802.15.4/ZigBee in Swarms
LoRaWAN in Swarm Contexts
6.2.2. Hybrid and Multi-Layer Networking Designs for Swarms
7. Conclusions: Limitations and Future Prospects of Agricultural Robots
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Amin, A.; Wang, X.; Zhang, Y.; Tianhua, L.; Chen, Y.; Zheng, J.; Shi, Y.; Abdelhamid, M.A. A Comprehensive Review of Applications of Robotics and Artificial Intelligence in Agricultural Operations. Stud. Inform. Control 2023, 32, 59–70. [Google Scholar] [CrossRef]
- Botta, A.; Cavallone, P.; Baglieri, L.; Colucci, G.; Tagliavini, L.; Quaglia, G. A Review of Robots, Perception, and Tasks in Precision Agriculture. Appl. Mech. 2022, 3, 830–854. [Google Scholar] [CrossRef]
- Liu, L.; Yang, F.; Liu, X.; Du, Y.; Li, X.; Li, G.; Chen, D.; Zhu, Z.; Song, Z. A Review of the Current Status and Common Key Technologies for Agricultural Field Robots. Comput. Electron. Agric. 2024, 227, 109630. [Google Scholar] [CrossRef]
- Xie, D.; Chen, L.; Liu, L.; Chen, L.; Wang, H. Actuators and Sensors for Application in Agricultural Robots: A Review. Machines 2022, 10, 913. [Google Scholar] [CrossRef]
- Hernández, H.A.; Mondragón, I.F.; González, S.R.; Pedraza, L.F. Reconfigurable Agricultural Robotics: Control Strategies, Communication, and Applications. Comput. Electron. Agric. 2025, 234, 110161. [Google Scholar] [CrossRef]
- Peng, Y.; Liu, J.; Xie, B.; Shan, H.; He, M.; Hou, G.; Jin, Y. Research Progress of Urban Dual-Arm Humanoid Grape Harvesting Robot. In Proceedings of the 2021 IEEE 11th Annual International Conference on CYBER Technology in Automation, Control, and Intelligent Systems (CYBER), Jiaxing, China, 27–31 July 2021; pp. 879–885. [Google Scholar]
- Jiang, S.; Wang, S.; Yi, Z.; Zhang, M.; Lv, X. Autonomous Navigation System of Greenhouse Mobile Robot Based on 3D Lidar and 2D Lidar SLAM. Front. Plant Sci. 2022, 13, 815218. [Google Scholar] [CrossRef]
- Rovira-Más, F.; Saiz-Rubio, V.; Cuenca-Cuenca, A. Augmented Perception for Agricultural Robots Navigation. IEEE Sens. J. 2021, 21, 11712–11727. [Google Scholar] [CrossRef]
- Upadhyay, A.; Chandel, N.S.; Singh, K.P.; Chakraborty, S.K.; Nandede, B.M.; Kumar, M.; Subeesh, A.; Upendar, K.; Salem, A.; Elbeltagi, A. Deep Learning and Computer Vision in Plant Disease Detection: A Comprehensive Review of Techniques, Models, and Trends in Precision Agriculture. Artif. Intell. Rev. 2025, 58, 92. [Google Scholar] [CrossRef]
- Nkwocha, C.L.; Chandel, A.K. Towards an End-to-End Digital Framework for Precision Crop Disease Diagnosis and Management Based on Emerging Sensing and Computing Technologies: State over Past Decade and Prospects. Computers 2025, 14, 443. [Google Scholar] [CrossRef]
- Ashwini, C.; Sellam, V. EOS-3D-DCNN: Ebola Optimization Search-Based 3D-Dense Convolutional Neural Network for Corn Leaf Disease Prediction. Neural Comput. Appl. 2023, 35, 11125–11139. [Google Scholar] [CrossRef] [PubMed]
- Singla, A.; Nehra, A.; Joshi, K.; Kumar, A.; Tuteja, N.; Varshney, R.K.; Gill, S.S.; Gill, R. Exploration of Machine Learning Approaches for Automated Crop Disease Detection. Curr. Plant Biol. 2024, 40, 100382. [Google Scholar] [CrossRef]
- Das, S.; Chapman, S.; Christopher, J.; Choudhury, M.R.; Menzies, N.W.; Apan, A.; Dang, Y.P. UAV-Thermal Imaging: A Technological Breakthrough for Monitoring and Quantifying Crop Abiotic Stress to Help Sustain Productivity on Sodic Soils—A Case Review on Wheat. Remote Sens. Appl. Soc. Environ. 2021, 23, 100583. [Google Scholar] [CrossRef]
- Singh, A.P.; Yerudkar, A.; Mariani, V.; Iannelli, L.; Glielmo, L. A Bibliometric Review of the Use of Unmanned Aerial Vehicles in Precision Agriculture and Precision Viticulture for Sensing Applications. Remote Sens. 2022, 14, 1604. [Google Scholar] [CrossRef]
- Madroñal, D.; Palumbo, F.; Capotondi, A.; Marongiu, A. Unmanned Vehicles in Smart Farming: A Survey and a Glance at Future Horizons. In Proceedings of the 2021 Drone Systems Engineering and Rapid Simulation and Performance Evaluation: Methods and Tools Proceedings, Budapest, Hungary, 18–20 February 2021; Association for Computing Machinery: New York, NY, USA, 2021; pp. 1–8. [Google Scholar]
- Zhang, H.; Wang, L.; Tian, T.; Yin, J. A Review of Unmanned Aerial Vehicle Low-Altitude Remote Sensing (UAV-LARS) Use in Agricultural Monitoring in China. Remote Sens. 2021, 13, 1221. [Google Scholar] [CrossRef]
- Daponte, P.; De Vito, L.; Glielmo, L.; Iannelli, L.; Liuzza, D.; Picariello, F.; Silano, G. A Review on the Use of Drones for Precision Agriculture. IOP Conf. Ser. Earth Environ. Sci. 2019, 275, 012022. [Google Scholar] [CrossRef]
- Huang, H.; Yang, A.; Tang, Y.; Zhuang, J.; Hou, C.; Tan, Z.; Dananjayan, S.; He, Y.; Guo, Q.; Luo, S. Deep Color Calibration for UAV Imagery in Crop Monitoring Using Semantic Style Transfer with Local to Global Attention. Int. J. Appl. Earth Obs. Geoinf. 2021, 104, 102590. [Google Scholar] [CrossRef]
- Negash, L.; Kim, H.-Y.; Choi, H.-L. Emerging UAV Applications in Agriculture. In Proceedings of the 2019 7th International Conference on Robot Intelligence Technology and Applications (RiTA), Daejeon, Republic of Korea, 1–3 November 2019; IEEE: Piscataway, NJ, USA; pp. 254–257. [Google Scholar]
- Inoue, Y. Satellite- and Drone-Based Remote Sensing of Crops and Soils for Smart Farming—A Review. Soil Sci. Plant Nutr. 2020, 66, 798–810. [Google Scholar] [CrossRef]
- Panday, U.S.; Pratihast, A.K.; Aryal, J.; Kayastha, R.B. A Review on Drone-Based Data Solutions for Cereal Crops. Drones 2020, 4, 41. [Google Scholar] [CrossRef]
- Rahman, M.F.F.; Fan, S.; Zhang, Y.; Chen, L. A Comparative Study on Application of Unmanned Aerial Vehicle Systems in Agriculture. Agriculture 2021, 11, 22. [Google Scholar] [CrossRef]
- Spoorthi, S.; Shadaksharappa, B.; Suraj, S.; Manasa, V.K. Freyr Drone: Pesticide/Fertilizers Spraying Drone—An Agricultural Approach. In Proceedings of the 2017 2nd International Conference on Computing and Communications Technologies (ICCCT), Chennai, India, 23–24 February 2017; pp. 252–255. [Google Scholar]
- Shilin, W.; Jianli, S.; Xiongkui, H.; Le, S.; Xiaonan, W.; Changling, W.; Zhichong, W.; Yun, L. Performances Evaluation of Four Typical Unmanned Aerial Vehicles Used for Pesticide Application in China. Int. J. Agric. Biol. Eng. 2017, 10, 22–31. [Google Scholar] [CrossRef]
- Fernández-Guisuraga, J.M.; Sanz-Ablanedo, E.; Suárez-Seoane, S.; Calvo, L. Using Unmanned Aerial Vehicles in Postfire Vegetation Survey Campaigns through Large and Heterogeneous Areas: Opportunities and Challenges. Sensors 2018, 18, 586. [Google Scholar] [CrossRef] [PubMed]
- Guo, Q.; Zhu, Y.; Tang, Y.; Hou, C.; Fang, M.; Chen, X. Numerical Simulation of the Effects of Downwash Airflow and Crosswinds on the Spray Performance of Quad-Rotor Agricultural UAVs. Smart Agric. Technol. 2025, 11, 100940. [Google Scholar] [CrossRef]
- Xiao, X.; Qu, W.; Xia, G.-S.; Xu, M.; Shao, Z.; Gong, J.; Li, D. A Novel Real-Time Matching and Pose Reconstruction Method for Low-Overlap Agricultural UAV Images with Repetitive Textures. ISPRS J. Photogramm. Remote Sens. 2025, 226, 54–75. [Google Scholar] [CrossRef]
- Demir, S.; Dedeoğlu, M.; Başayiğit, L. Yield Prediction Models of Organic Oil Rose Farming with Agricultural Unmanned Aerial Vehicles (UAVs) Images and Machine Learning Algorithms. Remote Sens. Appl. Soc. Environ. 2024, 33, 101131. [Google Scholar] [CrossRef]
- Singh, P.K.; Sharma, A. An Intelligent WSN-UAV-Based IoT Framework for Precision Agriculture Application. Comput. Electr. Eng. 2022, 100, 107912. [Google Scholar] [CrossRef]
- Park, M.; Lee, S.; Lee, S. Dynamic Topology Reconstruction Protocol for UAV Swarm Networking. Symmetry 2020, 12, 1111. [Google Scholar] [CrossRef]
- Zhang, Z.; Zhu, L. A Review on Unmanned Aerial Vehicle Remote Sensing: Platforms, Sensors, Data Processing Methods, and Applications. Drones 2023, 7, 398. [Google Scholar] [CrossRef]
- Ahmed, F.; Mohanta, J.C.; Keshari, A.; Yadav, P.S. Recent Advances in Unmanned Aerial Vehicles: A Review. Arab. J. Sci. Eng. 2022, 47, 7963–7984. [Google Scholar] [CrossRef]
- Shekh, M.; Rani, S.; Datta, R. Review on Design, Development, and Implementation of an Unmanned Aerial Vehicle for Various Applications. Int. J. Intell. Robot. Appl. 2025, 9, 299–318. [Google Scholar] [CrossRef]
- Guo, X.; Shao, Q.; Li, Y.; Wang, Y.; Wang, D.; Liu, J.; Fan, J.; Yang, F. Application of UAV Remote Sensing for a Population Census of Large Wild Herbivores—Taking the Headwater Region of the Yellow River as an Example. Remote Sens. 2018, 10, 1041. [Google Scholar] [CrossRef]
- Mammarella, M.; Capello, E.; Dabbene, F.; Guglieri, G. Sample-Based SMPC for Tracking Control of Fixed-Wing UAV. IEEE Control Syst. Lett. 2018, 2, 611–616. [Google Scholar] [CrossRef]
- Pfeifer, C.; Barbosa, A.; Mustafa, O.; Peter, H.-U.; Rümmler, M.-C.; Brenning, A. Using Fixed-Wing UAV for Detecting and Mapping the Distribution and Abundance of Penguins on the South Shetlands Islands, Antarctica. Drones 2019, 3, 39. [Google Scholar] [CrossRef]
- Divazi, A.; Askari, R.; Roohi, E. Experimental and Numerical Investigation on the Spraying Performance of an Agricultural Unmanned Aerial Vehicle. Aerosp. Sci. Technol. 2025, 160, 110083. [Google Scholar] [CrossRef]
- Kovalev, I.V.; Kovalev, D.I.; Astanakulov, K.D.; Podoplelova, V.A.; Borovinsky, D.V.; Shaporova, Z.E. Productivity Analysis of Agricultural UAVs by Field Crop Spraying. IOP Conf. Ser. Earth Environ. Sci. 2023, 1284, 012026. [Google Scholar] [CrossRef]
- Ukaegbu, U.F.; Tartibu, L.K.; Okwu, M.O.; Olayode, I.O. Development of a Light-Weight Unmanned Aerial Vehicle for Precision Agriculture. Sensors 2021, 21, 4417. [Google Scholar] [CrossRef]
- Abramov, N.V.; Semizorov, S.A.; Sherstobitov, S.V.; Gunger, M.V.; Petukhov, D.A. Digitization of Agricultural Land Using an Unmanned Aerial Vehicle. IOP Conf. Ser. Earth Environ. Sci. 2020, 548, 032002. [Google Scholar] [CrossRef]
- Chen, P.-C.; Chiang, Y.-C.; Weng, P.-Y. Imaging Using Unmanned Aerial Vehicles for Agriculture Land Use Classification. Agriculture 2020, 10, 416. [Google Scholar] [CrossRef]
- Yang, M.-D.; Boubin, J.G.; Tsai, H.P.; Tseng, H.-H.; Hsu, Y.-C.; Stewart, C.C. Adaptive Autonomous UAV Scouting for Rice Lodging Assessment Using Edge Computing with Deep Learning EDANet. Comput. Electron. Agric. 2020, 179, 105817. [Google Scholar] [CrossRef]
- Ju, C.; Son, H.I. Multiple UAV Systems for Agricultural Applications: Control, Implementation, and Evaluation. Electronics 2018, 7, 162. [Google Scholar] [CrossRef]
- Psirofonia, P.; Samaritakis, V.; Eliopoulos, P.; Potamitis, I. Use of Unmanned Aerial Vehicles for Agricultural Applications with Emphasis on Crop Protection: Three Novel Case-Studies. Int. J. Agric. Sci. Technol. 2017, 5, 30–39. [Google Scholar] [CrossRef]
- Torres-Sánchez, J.; López-Granados, F.; Serrano, N.; Arquero, O.; Peña, J.M. High-Throughput 3-D Monitoring of Agricultural-Tree Plantations with Unmanned Aerial Vehicle (UAV) Technology. PLoS ONE 2015, 10, e0130479. [Google Scholar] [CrossRef] [PubMed]
- Vega, F.A.; Ramírez, F.C.; Saiz, M.P.; Rosúa, F.O. Multi-Temporal Imaging Using an Unmanned Aerial Vehicle for Monitoring a Sunflower Crop. Biosyst. Eng. 2015, 132, 19–27. [Google Scholar] [CrossRef]
- Khan, N.; Ray, R.L.; Sargani, G.R.; Ihtisham, M.; Khayyam, M.; Ismail, S. Current Progress and Future Prospects of Agriculture Technology: Gateway to Sustainable Agriculture. Sustainability 2021, 13, 4883. [Google Scholar] [CrossRef]
- Ferreira, F.; Faria, J.; Azevedo, A.; Marques, A.L. Product Lifecycle Management in Knowledge Intensive Collaborative Environments: An Application to Automotive Industry. Int. J. Inf. Manag. 2017, 37, 1474–1487. [Google Scholar] [CrossRef]
- Singh, S.; Vaishnav, R.; Gautam, S.; Banerjee, S. Agricultural Robotics: A Comprehensive Review of Applications, Challenges and Future Prospects. In Proceedings of the 2024 2nd International Conference on Artificial Intelligence and Machine Learning Applications Theme: Healthcare and Internet of Things (AIMLA), Namakkal, India, 15–16 March 2024; pp. 1–8. [Google Scholar]
- Teng, H.; Wang, Y.; Chatziparaschis, D.; Karydis, K. Adaptive LiDAR Odometry and Mapping for Autonomous Agricultural Mobile Robots in Unmanned Farms. Comput. Electron. Agric. 2025, 232, 110023. [Google Scholar] [CrossRef]
- Linford, J.; Haghshenas-Jaryani, M. A Ground Robotic System for Crops and Soil Monitoring and Data Collection in New Mexico Chile Pepper Farms. Discov. Agric. 2024, 2, 101. [Google Scholar] [CrossRef]
- Wang, S.; Zhou, H.; Zhang, C.; Ge, L.; Li, W.; Yuan, T.; Zhang, W.; Zhang, J. Design, Development and Evaluation of Latex Harvesting Robot Based on Flexible Toggle. Robot. Auton. Syst. 2022, 147, 103906. [Google Scholar] [CrossRef]
- Hemanth Kumar, N.; Suresh, R.; Balappa, B.U. Development of an Unmanned Ground Vehicle for Pesticide Spraying in Chilli Crop. In Proceedings of the 2023 IEEE Technology & Engineering Management Conference-Asia Pacific (TEMSCON-ASPAC), Bengaluru, India, 14–16 December 2023; pp. 1–5. [Google Scholar]
- Xu, R.; Li, C. A Review of High-Throughput Field Phenotyping Systems: Focusing on Ground Robots. Plant Phenomics 2022, 2022, 9760269. [Google Scholar] [CrossRef]
- Fernandes, H.R.; Polania, E.C.M.; Garcia, A.P.; Mendonza, O.B.; Albiero, D. Agricultural Unmanned Ground Vehicles: A Review from the Stability Point of View. Rev. Ciênc. Agronômica 2021, 51, e20207761. [Google Scholar] [CrossRef]
- Bonadies, S.; Lefcourt, A.; Gadsden, S.A. A Survey of Unmanned Ground Vehicles with Applications to Agricultural and Environmental Sensing. In Proceedings of SPIE, Baltimore, MD, USA, 17–21 April 2016; Valasek, J., Thomasson, J.A., Eds.; SPIE: Bellingham, WA, USA, 2016; Volume 9866, p. 98660Q. [Google Scholar]
- Roshanianfard, A.; Noguchi, N.; Okamoto, H.; Ishii, K. A Review of Autonomous Agricultural Vehicles (The Experience of Hokkaido University). J. Terramechanics 2020, 91, 155–183. [Google Scholar] [CrossRef]
- Etezadi, H.; Eshkabilov, S. A Comprehensive Overview of Control Algorithms, Sensors, Actuators, and Communication Tools of Autonomous All-Terrain Vehicles in Agriculture. Agriculture 2024, 14, 163. [Google Scholar] [CrossRef]
- Pham, V.; Malladi, B.; Moreno, F.; Gonzalez, C.; Bhandari, S.; Raheja, A. Collaboration between Aerial and Ground Robots for Weed Detection and Removal. In Precision Agriculture '25; Wageningen Academic: Wageningen, The Netherlands, 2025. [Google Scholar]
- Pour Arab, D.; Spisser, M.; Essert, C. 3D Hybrid Path Planning for Optimized Coverage of Agricultural Fields: A Novel Approach for Wheeled Robots. J. Field Robot. 2025, 42, 455–473. [Google Scholar] [CrossRef]
- Zhang, Y.; Shen, Y.; Liu, H.; He, S.; Khan, Z. A Composite Sliding Mode Controller with Extended Disturbance Observer for 4WSS Agricultural Robots in Unstructured Farmlands. Comput. Electron. Agric. 2025, 232, 110069. [Google Scholar] [CrossRef]
- Zhang, Z.; Li, Z.; Yang, M.; Cui, J.; Shao, Y.; Ding, Y.; Yang, W.; Qiao, W.; Song, P. An Autonomous Navigation Method for Field Phenotyping Robot Based on Ground-Air Collaboration. Artif. Intell. Agric. 2025, 15, 610–621. [Google Scholar] [CrossRef]
- Banić, M.; Stojanović, L.; Perić, M.; Rangelov, D.; Pavlović, V.; Miltenović, A.; Simonović, M. AgAR: A Multipurpose Robotic Platform for the Digital Transformation of Agriculture. In Proceedings of the 11th International Scientific Conference IRMES 2025, Vrnjačka Banja, Serbia, 19–21 June 2025; Faculty of Mechanical Engineering, University of Niš: Niš, Serbia, 2025; pp. XXIII–XXXI. [Google Scholar]
- Dokic, K.; Kukina, H.; Mikolcevic, H. A Low-Cost Agriculture Robot for Dataset Creation-Software and Hardware Solutions. In Proceedings of the 2024 1st International Conference on Innovative Engineering Sciences and Technological Research (ICIESTR), Muscat, Oman, 14–15 May 2024; pp. 1–6. [Google Scholar]
- de Silva, R.; Cielniak, G.; Wang, G.; Gao, J. Deep Learning-Based Crop Row Detection for Infield Navigation of Agri-Robots. J. Field Robot. 2024, 41, 2299–2321. [Google Scholar] [CrossRef]
- Raikwar, S.; Fehrmann, J.; Herlitzius, T. Navigation and Control Development for a Four-Wheel-Steered Mobile Orchard Robot Using Model-Based Design. Comput. Electron. Agric. 2022, 202, 107410. [Google Scholar] [CrossRef]
- Shojaei, K. Intelligent Coordinated Control of an Autonomous Tractor-Trailer and a Combine Harvester. Eur. J. Control 2021, 59, 82–98. [Google Scholar] [CrossRef]
- Skoczeń, M.; Ochman, M.; Spyra, K.; Nikodem, M.; Krata, D.; Panek, M.; Pawłowski, A. Obstacle Detection System for Agricultural Mobile Robot Application Using RGB-D Cameras. Sensors 2021, 21, 5292. [Google Scholar] [CrossRef] [PubMed]
- Gai, J.; Xiang, L.; Tang, L. Using a Depth Camera for Crop Row Detection and Mapping for Under-Canopy Navigation of Agricultural Robotic Vehicle. Comput. Electron. Agric. 2021, 188, 106301. [Google Scholar] [CrossRef]
- Mammarella, M.; Comba, L.; Biglia, A.; Dabbene, F.; Gay, P. Cooperative Agricultural Operations of Aerial and Ground Unmanned Vehicles. In Proceedings of the 2020 IEEE International Workshop on Metrology for Agriculture and Forestry (MetroAgriFor), Trento, Italy, 4–6 November 2020; pp. 224–229. [Google Scholar]
- Utstumo, T.; Urdal, F.; Brevik, A.; Dørum, J.; Netland, J.; Overskeid, Ø.; Berge, T.W.; Gravdahl, J.T. Robotic In-Row Weed Control in Vegetables. Comput. Electron. Agric. 2018, 154, 36–45. [Google Scholar] [CrossRef]
- Mueller-Sim, T.; Jenkins, M.; Abel, J.; Kantor, G. The Robotanist: A Ground-Based Agricultural Robot for High-Throughput Crop Phenotyping. In Proceedings of the 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 29 May–3 June 2017; pp. 3634–3639. [Google Scholar]
- Gonzalez-de-Santos, P.; Ribeiro, A.; Fernandez-Quintanilla, C.; Lopez-Granados, F.; Brandstoetter, M.; Tomic, S.; Pedrazzi, S.; Peruzzi, A.; Pajares, G.; Kaplanis, G.; et al. Fleets of Robots for Environmentally-Safe Pest Control in Agriculture. Precis. Agric. 2017, 18, 574–614. [Google Scholar] [CrossRef]
- Kayacan, E.; Ramon, H.; Saeys, W. Robust Trajectory Tracking Error Model-Based Predictive Control for Unmanned Ground Vehicles. IEEE/ASME Trans. Mechatron. 2016, 21, 806–814. [Google Scholar] [CrossRef]
- Yin, H.; Sun, Q.; Ren, X.; Guo, J.; Yang, Y.; Wei, Y.; Huang, B.; Chai, X.; Zhong, M. Development, Integration, and Field Evaluation of an Autonomous Citrus-harvesting Robot. J. Field Robot. 2023, 40, 1363–1387. [Google Scholar] [CrossRef]
- Davidson, J.R.; Mo, C. Mechanical Design and Initial Performance Testing of an Apple-Picking End-Effector. In Proceedings of the ASME 2015 International Mechanical Engineering Congress and Exposition, Houston, TX, USA, 13–19 November 2015; Volume 4A: Dynamics, Vibration, and Control. American Society of Mechanical Engineers: New York, NY, USA, 2015. [Google Scholar]
- Kaleem, A.; Hussain, S.; Aqib, M.; Cheema, M.J.M.; Saleem, S.R.; Farooq, U. Development Challenges of Fruit-Harvesting Robotic Arms: A Critical Review. AgriEngineering 2023, 5, 2216–2237. [Google Scholar] [CrossRef]
- Xiao, X.; Wang, Y.; Jiang, Y. Review of Research Advances in Fruit and Vegetable Harvesting Robots. J. Electr. Eng. Technol. 2024, 19, 773–789. [Google Scholar] [CrossRef]
- Wang, Z.; Xun, Y.; Wang, Y.; Yang, Q. Review of Smart Robots for Fruit and Vegetable Picking in Agriculture. Int. J. Agric. Biol. Eng. 2022, 15, 33–54. [Google Scholar] [CrossRef]
- Gao, J.; Zhang, F.; Zhang, J.; Yuan, T.; Yin, J.; Guo, H.; Yang, C. Development and Evaluation of a Pneumatic Finger-like End-Effector for Cherry Tomato Harvesting Robot in Greenhouse. Comput. Electron. Agric. 2022, 197, 106879. [Google Scholar] [CrossRef]
- Gao, J.; Zhang, F.; Zhang, J.; Guo, H.; Gao, J. Picking Patterns Evaluation for Cherry Tomato Robotic Harvesting End-Effector Design. Biosyst. Eng. 2024, 239, 1–12. [Google Scholar] [CrossRef]
- Yeshmukhametov, A.; Koganezawa, K.; Yamamoto, Y.; Buribayev, Z.; Mukhtar, Z.; Amirgaliyev, Y. Development of Continuum Robot Arm and Gripper for Harvesting Cherry Tomatoes. Appl. Sci. 2022, 12, 6922. [Google Scholar] [CrossRef]
- Chen, M.; Chen, F.; Zhou, W.; Zuo, R. Design of Flexible Spherical Fruit and Vegetable Picking End-Effector Based on Vision Recognition. J. Phys. Conf. Ser. 2022, 2246, 012060. [Google Scholar] [CrossRef]
- Arikapudi, R.; Vougioukas, S.G. Robotic Tree-Fruit Harvesting with Arrays of Cartesian Arms: A Study of Fruit Pick Cycle Times. Comput. Electron. Agric. 2023, 211, 108023. [Google Scholar] [CrossRef]
- Sun, Q.; Zhong, M.; Chai, X.; Zeng, Z.; Yin, H.; Zhou, G.; Sun, T. Citrus Pose Estimation from an RGB Image for Automated Harvesting. Comput. Electron. Agric. 2023, 211, 108022. [Google Scholar] [CrossRef]
- Ji, W.; Tang, C.; Xu, B.; He, G. Contact Force Modeling and Variable Damping Impedance Control of Apple Harvesting Robot. Comput. Electron. Agric. 2022, 198, 107026. [Google Scholar] [CrossRef]
- Xiao, X.; Wang, Y.; Jiang, Y. End-Effectors Developed for Citrus and Other Spherical Crops. Appl. Sci. 2022, 12, 7945. [Google Scholar] [CrossRef]
- Fan, P.; Yan, B.; Wang, M.; Lei, X.; Liu, Z.; Yang, F. Three-Finger Grasp Planning and Experimental Analysis of Picking Patterns for Robotic Apple Harvesting. Comput. Electron. Agric. 2021, 188, 106353. [Google Scholar] [CrossRef]
- Roshanianfard, A. Development of a Harvesting Robot for Heavy-Weight Crop. Doctoral Dissertation, Hokkaido University, Sapporo, Japan, 2018. [Google Scholar] [CrossRef]
- Rahul, K.; Raheman, H.; Paradkar, V. Design of a 4 DOF Parallel Robot Arm and the Firmware Implementation on Embedded System to Transplant Pot Seedlings. Artif. Intell. Agric. 2020, 4, 172–183. [Google Scholar] [CrossRef]
- Xiong, Y.; Peng, C.; Grimstad, L.; From, P.J.; Isler, V. Development and Field Evaluation of a Strawberry Harvesting Robot with a Cable-Driven Gripper. Comput. Electron. Agric. 2019, 157, 392–402. [Google Scholar] [CrossRef]
- Huang, M.; He, L.; Choi, D.; Pecchia, J.; Li, Y. Picking Dynamic Analysis for Robotic Harvesting of Agaricus Bisporus Mushrooms. Comput. Electron. Agric. 2021, 185, 106145. [Google Scholar] [CrossRef]
- De Preter, A.; Anthonis, J.; De Baerdemaeker, J. Development of a Robot for Harvesting Strawberries. IFAC-Pap. 2018, 51, 14–19. [Google Scholar] [CrossRef]
- Roshanianfard, A.; Noguchi, N. Development of a 5DOF Robotic Arm (RAVebots-1) Applied to Heavy Products Harvesting. IFAC-Pap. 2016, 49, 155–160. [Google Scholar] [CrossRef]
- Kaizu, Y.; Shimada, T.; Takahashi, Y.; Igarashi, S.; Yamada, H.; Furuhashi, K.; Imou, K. Development of a Small Electric Robot Boat for Mowing Aquatic Weeds. Trans. ASABE 2021, 64, 1073–1082. [Google Scholar] [CrossRef]
- Moro, S.; Uchida, H.; Kato, K.; Nomura, K.; Seikine, S.; Yamano, T. Development of an Automatic Operation Control System for a Weeding Robot in Paddy Fields to Track a Target Path and Speed. Eng. Agric. Environ. Food 2023, 16, 101–112. [Google Scholar] [CrossRef] [PubMed]
- Murugaraj, G.; Selva Kumar, S.; Pillai, A.S.; Bharatiraja, C. Implementation of In-Row Weeding Robot with Novel Wheel, Assembly and Wheel Angle Adjustment for Slurry Paddy Field. Mater. Today Proc. 2022, 65, 215–220. [Google Scholar] [CrossRef]
- Liu, Y.; Noguchi, N.; Liang, L. Development of a Positioning System Using UAV-Based Computer Vision for an Airboat Navigation in Paddy Field. Comput. Electron. Agric. 2019, 162, 126–133. [Google Scholar] [CrossRef]
- Liu, Y.; Noguchi, N. Development of an Unmanned Surface Vehicle for Autonomous Navigation in a Paddy Field. Eng. Agric. Environ. Food 2016, 9, 21–26. [Google Scholar] [CrossRef]
- Liu, Y.; Noguchi, N.; Ali, R.F. Simulation and Test of an Agricultural Unmanned Airboat Maneuverability Model. Int. J. Agric. Biol. Eng. 2017, 10, 88–96. [Google Scholar]
- Bechar, A.; Vigneault, C. Agricultural Robots for Field Operations: Concepts and Components. Biosyst. Eng. 2016, 149, 94–111. [Google Scholar] [CrossRef]
- Qu, J.; Zhang, Z.; Qin, Z.; Guo, K.; Li, D. Applications of Autonomous Navigation Technologies for Unmanned Agricultural Tractors: A Review. Machines 2024, 12, 218. [Google Scholar] [CrossRef]
- Huang, Y.; Fu, J.; Xu, S.; Han, T.; Liu, Y. Research on Integrated Navigation System of Agricultural Machinery Based on RTK-BDS/INS. Agriculture 2022, 12, 1169. [Google Scholar] [CrossRef]
- Sun, T.; Le, F.; Cai, C.; Jin, Y.; Xue, X.; Cui, L. Soybean–Corn Seedling Crop Row Detection for Agricultural Autonomous Navigation Based on GD-YOLOv10n-Seg. Agriculture 2025, 15, 796. [Google Scholar] [CrossRef]
- Kim, K.; Deb, A.; Cappelleri, D.J. P-AgNav: Range View-Based Autonomous Navigation System for Cornfields. IEEE Robot. Autom. Lett. 2025, 10, 3366–3373. [Google Scholar] [CrossRef]
- Mansur, H.; Gadhwal, M.; Abon, J.E.; Flippo, D. Mapping for Autonomous Navigation of Agricultural Robots Through Crop Rows Using UAV. Agriculture 2025, 15, 882. [Google Scholar] [CrossRef]
- Chen, J.; Li, X.; Zhang, X. SLDF: A Semantic Line Detection Framework for Robot Guidance. Signal Process. Image Commun. 2023, 115, 116970. [Google Scholar] [CrossRef]
- Yu, J.; Zhang, J.; Shu, A.; Chen, Y.; Chen, J.; Yang, Y.; Tang, W.; Zhang, Y. Study of Convolutional Neural Network-Based Semantic Segmentation Methods on Edge Intelligence Devices for Field Agricultural Robot Navigation Line Extraction. Comput. Electron. Agric. 2023, 209, 107811. [Google Scholar] [CrossRef]
- Chen, J.; Qiang, H.; Wu, J.; Xu, G.; Wang, Z. Navigation Path Extraction for Greenhouse Cucumber-Picking Robots Using the Prediction-Point Hough Transform. Comput. Electron. Agric. 2021, 180, 105911. [Google Scholar] [CrossRef]
- Nkwocha, C.L.; Wang, N. Deep Learning-Based Semantic Segmentation with Novel Navigation Line Extraction for Autonomous Agricultural Robots. Discov. Artif. Intell. 2025, 5, 73. [Google Scholar] [CrossRef]
- Zhao, Y.; Gong, L.; Huang, Y.; Liu, C. A Review of Key Techniques of Vision-Based Control for Harvesting Robot. Comput. Electron. Agric. 2016, 127, 311–323. [Google Scholar] [CrossRef]
- Lili, W.; Bo, Z.; Jinwei, F.; Xiaoan, H.; Shu, W.; Yashuo, L.; Qiangbing, Z.; Chongfeng, W. Development of a Tomato Harvesting Robot Used in Greenhouse. Int. J. Agric. Biol. Eng. 2017, 10, 140–149. [Google Scholar] [CrossRef]
- Gongal, A.; Amatya, S.; Karkee, M.; Zhang, Q.; Lewis, K. Sensors and Systems for Fruit Detection and Localization: A Review. Comput. Electron. Agric. 2015, 116, 8–19. [Google Scholar] [CrossRef]
- Patel, K.K.; Pathare, P.B. Principle and Applications of Near-Infrared Imaging for Fruit Quality Assessment—An Overview. Int. J. Food Sci. Technol. 2024, 59, 3436–3450. [Google Scholar] [CrossRef]
- Wu, G.; Li, B.; Zhu, Q.; Huang, M.; Guo, Y. Using Color and 3D Geometry Features to Segment Fruit Point Cloud and Improve Fruit Recognition Accuracy. Comput. Electron. Agric. 2020, 174, 105475. [Google Scholar] [CrossRef]
- Sa, I.; Lehnert, C.; English, A.; McCool, C.; Dayoub, F.; Upcroft, B.; Perez, T. Peduncle Detection of Sweet Pepper for Autonomous Crop Harvesting—Combined Color and 3-D Information. IEEE Robot. Autom. Lett. 2017, 2, 765–772. [Google Scholar] [CrossRef]
- Lin, G.; Tang, Y.; Zou, X.; Xiong, J.; Fang, Y. Color-, Depth-, and Shape-Based 3D Fruit Detection. Precis. Agric. 2020, 21, 1–17. [Google Scholar] [CrossRef]
- Luo, L.; Tang, Y.; Lu, Q.; Chen, X.; Zhang, P.; Zou, X. A Vision Methodology for Harvesting Robot to Detect Cutting Points on Peduncles of Double Overlapping Grape Clusters in a Vineyard. Comput. Ind. 2018, 99, 130–139. [Google Scholar] [CrossRef]
- Méndez, V.; Velasco, J.; Rodríguez, F.; Berenguel, M.; Martínez, A.; Guzmán, J.L. In-Field Estimation of Orange Number and Size by 3D Laser Scanning. Agronomy 2019, 9, 885. [Google Scholar] [CrossRef]
- Fu, L.; Feng, Y.; Majeed, Y.; Zhang, X.; Zhang, J.; Karkee, M.; Zhang, Q. Kiwifruit Detection in Field Images Using Faster R-CNN with ZFNet. IFAC-Pap. 2018, 51, 45–50. [Google Scholar] [CrossRef]
- Badgujar, C.M.; Poulose, A.; Gan, H. Agricultural Object Detection with You Only Look Once (YOLO) Algorithm: A Bibliometric and Systematic Literature Review. Comput. Electron. Agric. 2024, 223, 109090. [Google Scholar] [CrossRef]
- Onishi, Y.; Yoshida, T.; Kurita, H.; Fukao, T.; Arihara, H.; Iwai, A. An Automated Fruit Harvesting Robot by Using Deep Learning. ROBOMECH J. 2019, 6, 13. [Google Scholar] [CrossRef]
- Birrell, S.; Hughes, J.; Cai, J.Y.; Iida, F. A Field-Tested Robotic Harvesting System for Iceberg Lettuce. J. Field Robot. 2020, 37, 225–245. [Google Scholar] [CrossRef] [PubMed]
- Koirala, A.; Walsh, K.B.; Wang, Z.; McCarthy, C. Deep Learning–Method Overview and Review of Use for Fruit Detection and Yield Estimation. Comput. Electron. Agric. 2019, 162, 219–234. [Google Scholar] [CrossRef]
- Lin, G.; Tang, Y.; Zou, X.; Cheng, J.; Xiong, J. Fruit Detection in Natural Environment Using Partial Shape Matching and Probabilistic Hough Transform. Precis. Agric. 2020, 21, 160–177. [Google Scholar] [CrossRef]
- Yu, Y.; Zhang, K.; Yang, L.; Zhang, D. Fruit Detection for Strawberry Harvesting Robot in Non-Structural Environment Based on Mask-RCNN. Comput. Electron. Agric. 2019, 163, 104846. [Google Scholar] [CrossRef]
- Yang, C.H.; Xiong, L.Y.; Wang, Z.; Wang, Y.; Shi, G.; Kuremot, T.; Zhao, W.H.; Yang, Y. Integrated Detection of Citrus Fruits and Branches Using a Convolutional Neural Network. Comput. Electron. Agric. 2020, 174, 105469. [Google Scholar] [CrossRef]
- Wang, W.; Lin, C.; Shui, H.; Zhang, K.; Zhai, R. Adaptive Symmetry Self-Matching for 3D Point Cloud Completion of Occluded Tomato Fruits in Complex Canopy Environments. Plants 2025, 14, 2080. [Google Scholar] [CrossRef] [PubMed]
- Zhao, H.; Tang, Z.; Li, Z.; Dong, Y.; Si, Y.; Lu, M.; Panoutsos, G. Real-Time Object Detection and Robotic Manipulation for Agriculture Using a YOLO-Based Learning Approach. In Proceedings of the 2024 IEEE International Conference on Industrial Technology (ICIT), Bristol, UK, 25–27 March 2024; pp. 1–6. [Google Scholar]
- Hu, N.; Su, D.; Wang, S.; Nyamsuren, P.; Qiao, Y.; Jiang, Y.; Cai, Y. LettuceTrack: Detection and Tracking of Lettuce for Robotic Precision Spray in Agriculture. Front. Plant Sci. 2022, 13, 1003243. [Google Scholar] [CrossRef] [PubMed]
- Reina, G.; Milella, A.; Rouveure, R.; Nielsen, M.; Worst, R.; Blas, M.R. Ambient Awareness for Agricultural Robotic Vehicles. Biosyst. Eng. 2016, 146, 114–132. [Google Scholar] [CrossRef]
- Yan, J.; Liu, Y. A Stereo Visual Obstacle Detection Approach Using Fuzzy Logic and Neural Network in Agriculture. In Proceedings of the 2018 IEEE International Conference on Robotics and Biomimetics (ROBIO), Kuala Lumpur, Malaysia, 12–15 December 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 1539–1544. [Google Scholar]
- Qiu, Z.; Zhao, N.; Zhou, L.; Wang, M.; Yang, L.; Fang, H.; He, Y.; Liu, Y. Vision-Based Moving Obstacle Detection and Tracking in Paddy Field Using Improved Yolov3 and Deep SORT. Sensors 2020, 20, 4082. [Google Scholar] [CrossRef]
- Xu, H.; Li, S.; Ji, Y.; Cao, R.; Zhang, M. Dynamic Obstacle Detection Based on Panoramic Vision in the Moving State of Agricultural Machineries. Comput. Electron. Agric. 2021, 184, 106104. [Google Scholar] [CrossRef]
- Bayar, G.; Bergerman, M.; Koku, A.B.; Konukseven, E.İ. Localization and Control of an Autonomous Orchard Vehicle. Comput. Electron. Agric. 2015, 115, 118–128. [Google Scholar] [CrossRef]
- Ball, D.; Upcroft, B.; Wyeth, G.; Corke, P.; English, A.; Ross, P.; Patten, T.; Fitch, R.; Sukkarieh, S.; Bate, A. Vision-based Obstacle Detection and Navigation for an Agricultural Robot. J. Field Robot. 2016, 33, 1107–1130. [Google Scholar] [CrossRef]
- Xue, J.; Xia, C.; Zou, J. A Velocity Control Strategy for Collision Avoidance of Autonomous Agricultural Vehicles. Auton. Robots 2020, 44, 1047–1063. [Google Scholar] [CrossRef]
- Liu, C.; Zhao, X.; Du, Y.; Cao, C.; Zhu, Z.; Mao, E. Research on Static Path Planning Method of Small Obstacles for Automatic Navigation of Agricultural Machinery. IFAC-Pap. 2018, 51, 673–677. [Google Scholar] [CrossRef]
- Liu, Z.; Lü, Z.; Zheng, W.; Zhang, W.; Cheng, X. Design of Obstacle Avoidance Controller for Agricultural Tractor Based on ROS. Int. J. Agric. Biol. Eng. 2019, 12, 8. [Google Scholar] [CrossRef]
- Chen, H.; Xie, H.; Sun, L.; Shang, T. Research on Tractor Optimal Obstacle Avoidance Path Planning for Improving Navigation Accuracy and Avoiding Land Waste. Agriculture 2023, 13, 934. [Google Scholar] [CrossRef]
- Cui, J.; Zhang, X.; Fan, X.; Feng, W.; Li, P.; Wu, Y. Path Planning of Autonomous Agricultural Machineries in Complex Rural Road. J. Eng. 2020, 2020, 239–245. [Google Scholar] [CrossRef]
- Santos, L.C.; Santos, F.N.; Valente, A.; Sobreira, H.; Sarmento, J.; Petry, M. Collision Avoidance Considering Iterative Bézier Based Approach for Steep Slope Terrains. IEEE Access 2022, 10, 25005–25015. [Google Scholar] [CrossRef]
- Wang, Y.J.; Pan, G.T.; Xue, C.L.; Yang, F.Z. Research on Model of Laser Navigation System and Obstacle Avoidance for Orchard Unmanned Vehicle. In Proceedings of the 2019 2nd International Conference on Informatics, Control and Automation, Hangzhou, China, 26–27 May 2019. [Google Scholar]
- Yang, J.; Ni, J.; Li, Y.; Wen, J.; Chen, D. The Intelligent Path Planning System of Agricultural Robot via Reinforcement Learning. Sensors 2022, 22, 4316. [Google Scholar] [CrossRef] [PubMed]
- Bansal, A.; Sikka, K.; Sharma, G.; Chellappa, R.; Divakaran, A. Zero-Shot Object Detection. In Proceedings of the European Conference on Computer Vision (ECCV), Munich, Germany, 8–14 September 2018; Springer: Berlin/Heidelberg, Germany, 2018; pp. 384–400. [Google Scholar]
- Zhu, P.; Wang, H.; Saligrama, V. Zero Shot Detection. IEEE Trans. Circuits Syst. Video Technol. 2020, 30, 998–1010. [Google Scholar] [CrossRef]
- Hossain, M.S.; Rahman, M.; Rahman, A.; Mohsin Kabir, M.; Mridha, M.F.; Huang, J.; Shin, J. Automatic Navigation and Self-Driving Technology in Agricultural Machinery: A State-of-the-Art Systematic Review. IEEE Access 2025, 13, 94370–94401. [Google Scholar] [CrossRef]
- Chen, D.; Qi, X.; Zheng, Y.; Lu, Y.; Huang, Y.; Li, Z. Synthetic Data Augmentation by Diffusion Probabilistic Models to Enhance Weed Recognition. Comput. Electron. Agric. 2024, 216, 108517. [Google Scholar] [CrossRef]
- De Clercq, D.; Nehring, E.; Mayne, H.; Mahdi, A. Large Language Models Can Help Boost Food Production, but Be Mindful of Their Risks. Front. Artif. Intell. 2024, 7, 1326153. [Google Scholar] [CrossRef]
- Sun, L.; Jha, D.K.; Hori, C.; Jain, S.; Corcodel, R.; Zhu, X.; Tomizuka, M.; Romeres, D. Interactive Planning Using Large Language Models for Partially Observable Robotic Tasks. In Proceedings of the 2024 IEEE International Conference on Robotics and Automation (ICRA), Yokohama, Japan, 13–17 May 2024. [Google Scholar]
- Hori, C.; Kambara, M.; Sugiura, K.; Ota, K.; Khurana, S.; Jain, S.; Corcodel, R.; Jha, D.K.; Romeres, D.; Le Roux, J. Interactive Robot Action Replanning Using Multimodal LLM Trained from Human Demonstration Videos. In Proceedings of the ICASSP 2025—2025 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Hyderabad, India, 6–11 April 2025. [Google Scholar]
- Li, P.; An, Z.; Abrar, S.; Zhou, L. Large Language Models for Multi-Robot Systems: A Survey. arXiv 2025, arXiv:2502.03814. [Google Scholar] [CrossRef]
- Zhu, H.; Qin, S.; Su, M.; Lin, C.; Li, A.; Gao, J. Harnessing Large Vision and Language Models in Agriculture: A Review. Front. Plant Sci. 2025, 16, 1579355. [Google Scholar] [CrossRef] [PubMed]
- How, J.P.; Frazzoli, E.; Chowdhary, G.V. Linear Flight Control Techniques for Unmanned Aerial Vehicles. In Handbook of Unmanned Aerial Vehicles; Valavanis, K.P., Vachtsevanos, G.J., Eds.; Springer: Dordrecht, The Netherlands, 2015; pp. 529–576. ISBN 978-90-481-9707-1. [Google Scholar]
- Ren, Z.; Zheng, H.; Chen, J.; Chen, T.; Xie, P.; Xu, Y.; Deng, J.; Wang, H.; Sun, M.; Jiao, W. Integrating UAV, UGV and UAV-UGV Collaboration in Future Industrialized Agriculture: Analysis, Opportunities and Challenges. Comput. Electron. Agric. 2024, 227, 109631. [Google Scholar] [CrossRef]
- Bretas, I.L.; Dubeux, J.C.B., Jr.; Cruz, P.J.R.; Oduor, K.T.; Queiroz, L.D.; Valente, D.S.M.; Chizzotti, F.H.M. Precision Livestock Farming Applied to Grazingland Monitoring and Management—A Review. Agron. J. 2024, 116, 1164–1186. [Google Scholar] [CrossRef]
- Kim, J.; Kim, S.; Ju, C.; Son, H.I. Unmanned Aerial Vehicles in Agriculture: A Review of Perspective of Platform, Control, and Applications. IEEE Access 2019, 7, 105100–105115. [Google Scholar] [CrossRef]
- Yu, S.; Zhu, J.; Zhou, J.; Cheng, J.; Bian, X.; Shen, J.; Wang, P. Key Technology Progress of Plant-Protection UAVs Applied to Mountain Orchards: A Review. Agronomy 2022, 12, 2828. [Google Scholar] [CrossRef]
- Li, P.; Liu, D.; Baldi, S. Plug-and-Play Adaptation in Autopilot Architectures for Unmanned Aerial Vehicles. In Proceedings of the IECON 2021—47th Annual Conference of the IEEE Industrial Electronics Society, Toronto, ON, Canada, 13–16 October 2021; pp. 1–6. [Google Scholar]
- Ulus, S.; Ikbal, E. Lateral and Longitudinal Dynamics Control of a Fixed Wing UAV by Using PID Controller. In Proceedings of the 4th International Conference on Engineering and Natural Sciences, Kiev, Ukraine, 2–6 May 2018. [Google Scholar]
- Wei, X.; XianYu, W.; Jiazhen, L.; Yasheng, Y. Design of Anti-Load Perturbation Flight Trajectory Stability Controller for Agricultural UAV. Front. Plant Sci. 2023, 14, 1030203. [Google Scholar] [CrossRef] [PubMed]
- Surur, K.; Kabir, I.; Ahmad, G.; Abido, M.A. Optimal Gain Scheduling for Fault-Tolerant Control of Quadrotor UAV Using Genetic Algorithm-Based Neural Network. Arab. J. Sci. Eng. 2025. [Google Scholar] [CrossRef]
- Wu, H.; Liu, D.; Zhao, Y.; Liu, Z.; Liang, Y.; Liu, Z.; Huang, T.; Liang, K.; Xie, S.; Li, J. Establishment and Verification of the UAV Coupled Rotor Airflow Backward Tilt Angle Controller. Drones 2024, 8, 146. [Google Scholar] [CrossRef]
- Lotufo, M.A.; Colangelo, L.; Perez-Montenegro, C.; Novara, C.; Canuto, E. Embedded Model Control for UAV Quadrotor via Feedback Linearization. IFAC-Pap. 2016, 49, 266–271. [Google Scholar] [CrossRef]
- Shen, Z.; Tsuchiya, T. Singular Zone in Quadrotor Yaw–Position Feedback Linearization. Drones 2022, 6, 84. [Google Scholar] [CrossRef]
- Lee, D.; Ha, C.; Zuo, Z. Backstepping Control of Quadrotor-Type UAVs and Its Application to Teleoperation over the Internet. In Intelligent Autonomous Systems 12: Volume 2 Proceedings of the 12th International Conference IAS-12, Jeju Island, Republic of Korea, 26–29 June 2012; Lee, S., Cho, H., Yoon, K.-J., Lee, J., Eds.; Springer: Berlin/Heidelberg, Germany, 2013; pp. 217–225. [Google Scholar]
- Saibi, A.; Boushaki, R.; Belaidi, H. Backstepping Control of Drone. Eng. Proc. 2022, 14, 4. [Google Scholar] [CrossRef]
- Bhowmick, P.; Bhadra, S.; Panda, A. A Two-Loop Group Formation Tracking Control Scheme for Networked Tri-Rotor UAVs Using an ARE-Based Approach. Asian J. Control 2022, 24, 2834–2849. [Google Scholar] [CrossRef]
- Sierra-García, J.E.; Santos, M. Intelligent Control of an UAV with a Cable-Suspended Load Using a Neural Network Estimator. Expert Syst. Appl. 2021, 183, 115380. [Google Scholar] [CrossRef]
- Sun, Z.; Xiao, M.; Li, D.; Chu, J. Tracking Controller Design for Quadrotor UAVs under External Disturbances Using a High-Order Sliding Mode-Assisted Disturbance Observer. Meas. Control 2025, 58, 155–167. [Google Scholar] [CrossRef]
- Maaruf, M.; Ahmad, S.S.; Hamanah, W.M.; Baraean, A.M.; Shafiul Alam, M.; Abido, M.A.; Shafiullah, M. Advanced Optimization Methods for Nonlinear Backstepping Controllers for Quadrotor-Slung Load Systems. IEEE Access 2025, 13, 66607–66621. [Google Scholar] [CrossRef]
- Ijaz, S.; Shi, Y.; Khan, Y.A.; Khodaverdian, M.; Javaid, U. Robust Adaptive Control Law Design for Enhanced Stability of Agriculture UAV Used for Pesticide Spraying. Aerosp. Sci. Technol. 2024, 155, 109676. [Google Scholar] [CrossRef]
- Lachowiec, J.; Feldman, M.J.; Matias, F.I.; LeBauer, D.; Gregory, A. Adoption of Unoccupied Aerial Systems in Agricultural Research. Plant Phenome J. 2024, 7, e20098. [Google Scholar] [CrossRef]
- Wen, S.; Zhang, Q.; Deng, J.; Lan, Y.; Yin, X.; Shan, J. Design and Experiment of a Variable Spray System for Unmanned Aerial Vehicles Based on PID and PWM Control. Appl. Sci. 2018, 8, 2482. [Google Scholar] [CrossRef]
- Yadava, R.; Aslam, A. Farming System: Quadcopter Fabrication and Development. In Advances in Engineering Design; Sharma, R., Kannojiya, R., Garg, N., Gautam, S.S., Eds.; Springer Nature: Singapore, 2023; pp. 285–293. [Google Scholar]
- Martins, L.; Cardeira, C.; Oliveira, P. Feedback Linearization with Zero Dynamics Stabilization for Quadrotor Control. J. Intell. Robot. Syst. 2020, 101, 7. [Google Scholar] [CrossRef]
- Villa, D.K.D.; Brandão, A.S.; Sarcinelli-Filho, M. Path-Following and Attitude Control of a Payload Using Multiple Quadrotors. In Proceedings of the 2019 19th International Conference on Advanced Robotics (ICAR), Belo Horizonte, Brazil, 2–6 December 2019; pp. 535–540. [Google Scholar]
- Xu, X.; Watanabe, K.; Nagai, I. Feedback Linearization Control for a Tandem Rotor UAV Robot Equipped with Two 2-DOF Tiltable Coaxial-Rotors. Artif. Life Robot. 2021, 26, 259–268. [Google Scholar] [CrossRef]
- Li, J.; Xie, H.; Low, K.H.; Yong, J.; Li, B. Image-Based Visual Servoing of Rotorcrafts to Planar Visual Targets of Arbitrary Orientation. IEEE Robot. Autom. Lett. 2021, 6, 7861–7868. [Google Scholar] [CrossRef]
- Shi, Y.; Ijaz, S.; He, Z.; Xu, Z.; Javaid, U.; Xia, Y. Adaptive Backstepping Integral Sliding Mode Control of Multirotor UAV System Used for Smart Agriculture. In Proceedings of the 2024 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Kuching, Malaysia, 6–10 October 2024; pp. 303–308. [Google Scholar]
- Mohamed, A.; El-Gindy, M.; Ren, J. Advanced Control Techniques for Unmanned Ground Vehicle: Literature Survey. Int. J. Veh. Perform. 2018, 4, 46–73. [Google Scholar] [CrossRef]
- Ao, X.; Wang, L.-M.; Hou, J.-X.; Xue, Y.-Q.; Rao, S.-J.; Zhou, Z.-Y.; Jia, F.-X.; Zhang, Z.-Y.; Li, L.-M. Road Recognition and Stability Control for Unmanned Ground Vehicles on Complex Terrain. IEEE Access 2023, 11, 77689–77702. [Google Scholar] [CrossRef]
- Wang, Q.; He, J.; Lu, C.; Wang, C.; Lin, H.; Yang, H.; Li, H.; Wu, Z. Modelling and Control Methods in Path Tracking Control for Autonomous Agricultural Vehicles: A Review of State of the Art and Challenges. Appl. Sci. 2023, 13, 7155. [Google Scholar] [CrossRef]
- Azimi, A.; Shamshiri, R.R.; Ghasemzadeh, A. Adaptive Dynamic Programming for Robust Path Tracking in an Agricultural Robot Using Critic Neural Networks. Agric. Eng. 2025, 80, 1–15. [Google Scholar] [CrossRef]
- Liu, L.; Wang, X.; Yang, X.; Liu, H.; Li, J.; Wang, P. Path Planning Techniques for Mobile Robots: Review and Prospect. Expert Syst. Appl. 2023, 227, 120254. [Google Scholar] [CrossRef]
- Utstumo, T.; Berge, T.W.; Gravdahl, J.T. Non-Linear Model Predictive Control for Constrained Robot Navigation in Row Crops. In Proceedings of the 2015 IEEE International Conference on Industrial Technology (ICIT), Seville, Spain, 17–19 March 2015; IEEE: Piscataway, NJ, USA, 2015; pp. 357–362. [Google Scholar]
- Soitinaho, R.; Oksanen, T. Local Navigation and Obstacle Avoidance for an Agricultural Tractor With Nonlinear Model Predictive Control. IEEE Trans. Control Syst. Technol. 2023, 31, 2043–2054. [Google Scholar] [CrossRef]
- Song, Y.; Xue, J.; Zhang, T.; Sun, X.; Sun, H.; Gao, W.; Chen, Q. Path Tracking Control of Crawler Tractor Based on Adaptive Adjustment of Lookahead Distance Using Sparrow Search Algorithm. Comput. Electron. Agric. 2025, 234, 110219. [Google Scholar] [CrossRef]
- Wen, J.; Yao, L.; Zhou, J.; Yang, Z.; Xu, L.; Yao, L. Path Tracking Control of Agricultural Automatic Navigation Vehicles Based on an Improved Sparrow Search-Pure Pursuit Algorithm. Agriculture 2025, 15, 1215. [Google Scholar] [CrossRef]
- Hoffmann, G.M.; Tomlin, C.J.; Montemerlo, M.; Thrun, S. Autonomous Automobile Trajectory Tracking for Off-Road Driving: Controller Design, Experimental Validation and Racing. In Proceedings of the 2007 American Control Conference, New York, NY, USA, 9–13 July 2007; pp. 2296–2301. [Google Scholar]
- Wang, L.; Zhai, Z.; Zhu, Z.; Mao, E. Path Tracking Control of an Autonomous Tractor Using Improved Stanley Controller Optimized with Multiple-Population Genetic Algorithm. Actuators 2022, 11, 22. [Google Scholar] [CrossRef]
- Sun, Y.; Cui, B.; Ji, F.; Wei, X.; Zhu, Y. The Full-Field Path Tracking of Agricultural Machinery Based on PSO-Enhanced Fuzzy Stanley Model. Appl. Sci. 2022, 12, 7683. [Google Scholar] [CrossRef]
- Cui, B.; Cui, X.; Wei, X.; Zhu, Y.; Ma, Z.; Zhao, Y.; Liu, Y. Design and Testing of a Tractor Automatic Navigation System Based on Dynamic Path Search and a Fuzzy Stanley Model. Agriculture 2024, 14, 2136. [Google Scholar] [CrossRef]
- Wang, J.; Yang, L.; Cen, H.; He, Y.; Liu, Y. Dynamic Obstacle Avoidance Control Based on a Novel Dynamic Window Approach for Agricultural Robots. Comput. Ind. 2025, 167, 104272. [Google Scholar] [CrossRef]
- Qun, R. Intelligent Control Technology Of Agricultural Greenhouse Operation Robot Based On Fuzzy Pid Path Tracking Algorithm. INMATEH Agric. Eng. 2020, 62, 181–190. [Google Scholar] [CrossRef]
- Jiao, J.; Chen, J.; Qiao, Y.; Wang, W.; Wang, C.; Gu, L. Single Neuron PID Control of Agricultural Robot Steering System Based on Online Identification. In Proceedings of the 2018 IEEE Fourth International Conference on Big Data Computing Service and Applications (BigDataService), Bamberg, Germany, 26–29 March 2018; pp. 193–199. [Google Scholar]
- Gökçe, B.; Koca, Y.B.; Aslan, Y.; Gökçe, C.O. Particle Swarm Optimization-Based Optimal PID Control of an Agricultural Mobile Robot; “Prof. Marin Drinov” Publishing House of Bulgarian Academy of Sciences: Sofia, Bulgaria, 2021. [Google Scholar]
- Huang, P.; Zhang, Z.; Luo, X. Feedforward-plus-Proportional–Integral–Derivative Controller for Agricultural Robot Turning in Headland. Int. J. Adv. Robot. Syst. 2020, 17, 1729881419897678. [Google Scholar] [CrossRef]
- Liu, J.; Wu, X.; Quan, L.; Xu, H.; Hua, Y. Fuzzy Adaptive PID Control for Path Tracking of Field Intelligent Weeding Machine. AIP Adv. 2024, 14, 035045. [Google Scholar] [CrossRef]
- Mekonen, E.A.; Kassahun, E.; Tigabu, K.; Bekele, M.; Yehule, A. Model Predictive Controller Design for Precision Agricultural Robot. In Proceedings of the 2024 International Conference on Information and Communication Technology for Development for Africa (ICT4DA), Bahir Dar, Ethiopia, 18–20 November 2024; IEEE: Piscataway, NJ, USA; pp. 49–54. [Google Scholar]
- Mehndiratta, M.; Kayacan, E.; Patel, S.; Kayacan, E.; Chowdhary, G. Learning-Based Fast Nonlinear Model Predictive Control for Custom-Made 3D Printed Ground and Aerial Robots. In Handbook of Model Predictive Control; Raković, S.V., Levine, W.S., Eds.; Springer International Publishing: Cham, Switzerland, 2019; pp. 581–605. ISBN 978-3-319-77489-3. [Google Scholar]
- Zhang, Z.; Kayacan, E.; Thompson, B.; Chowdhary, G. High Precision Control and Deep Learning-Based Corn Stand Counting Algorithms for Agricultural Robot. Auton. Robots 2020, 44, 1289–1302. [Google Scholar] [CrossRef]
- Mitsuhashi, T.; Chida, Y.; Tanemura, M. Autonomous Travel of Lettuce Harvester Using Model Predictive Control. IFAC-Pap. 2019, 52, 155–160. [Google Scholar] [CrossRef]
- Wang, L.; Liu, M. Path Tracking Control for Autonomous Harvesting Robots Based on Improved Double Arc Path Planning Algorithm. J. Intell. Robot. Syst. 2020, 100, 899–909. [Google Scholar] [CrossRef]
- Kulathunga, G.; Yilmaz, A.; Huang, Z.; Hroob, I.; Singh, J.; Guevara, L.; Cielniak, G.; Hanheide, M. Navigating Narrow Spaces: A Comprehensive Framework for Agricultural Robots. IEEE Robot. Autom. Lett. 2025, 10, 9296–9303. [Google Scholar] [CrossRef]
- Li, Z.; Wang, W.; Zhang, C.; Zheng, Q.; Liu, L. Fault-Tolerant Control Based on Fractional Sliding Mode: Crawler Plant Protection Robot. Comput. Electr. Eng. 2023, 105, 108527. [Google Scholar] [CrossRef]
- Jiao, J.; Wang, W.; He, Y.; Wu, Y.; Zhang, F.; Gu, L. Adaptive Fuzzy Sliding Mode-Based Steering Control of Agricultural Tracked Robot. In Fuzzy Systems and Data Mining V; IOS Press: Amsterdam, The Netherlands, 2019; pp. 243–254. [Google Scholar]
- Din, A.; Ismail, M.Y.; Shah, B.; Babar, M.; Ali, F.; Baig, S.U. A Deep Reinforcement Learning-Based Multi-Agent Area Coverage Control for Smart Agriculture. Comput. Electr. Eng. 2022, 101, 108089. [Google Scholar] [CrossRef]
- Gökçe, C.O. Single-Layer Neural-Network Based Control of Agricultural Mobile Robot. Meas. Control 2023, 56, 1446–1454. [Google Scholar] [CrossRef]
- Liu, Y.; Wang, J.; Shi, Y.; He, Z.; Liu, F.; Kong, W.; He, Y. Unmanned Airboat Technology and Applications in Environment and Agriculture. Comput. Electron. Agric. 2022, 197, 106920. [Google Scholar] [CrossRef]
- Xu, Q. USV Course Controller Optimization Based on Elitism Estimation of Distribution Algorithm. In Proceedings of the 2014 IEEE Chinese Guidance, Navigation and Control Conference, Yantai, China, 8–10 August 2014; pp. 958–961. [Google Scholar]
- Liu, T.; Dong, Z.; Du, H.; Song, L.; Mao, Y. Path Following Control of the Underactuated USV Based On the Improved Line-of-Sight Guidance Algorithm. Pol. Marit. Res. 2017, 24, 3–11. [Google Scholar] [CrossRef]
- Li, L.; Wu, D.; Huang, Y.; Yuan, Z.-M. A Path Planning Strategy Unified with a COLREGS Collision Avoidance Function Based on Deep Reinforcement Learning and Artificial Potential Field. Appl. Ocean Res. 2021, 113, 102759. [Google Scholar] [CrossRef]
- Zhu, Z.; Hu, C.; Zhu, C.; Zhu, Y.; Sheng, Y. An Improved Dueling Deep Double-Q Network Based on Prioritized Experience Replay for Path Planning of Unmanned Surface Vehicles. J. Mar. Sci. Eng. 2021, 9, 1267. [Google Scholar] [CrossRef]
- Nugroho, H.; Xan, C.J.; Yee, T.J.; Yong, L.K.; Quan, L.Z.; Rusydi, M.I. Control System Development of Unmanned Surface Vehicles (USVs) with Fuzzy Logic Controller. In Proceedings of the 13th National Technical Seminar on Unmanned System Technology 2023—Volume 2; Md. Zain, Z., Ismail, Z.H., Li, H., Xiang, X., Karri, R.R., Eds.; Springer Nature: Singapore, 2024; pp. 83–94. [Google Scholar]
- Liu, Y.; Noguchi, N.; Yusa, T. Development of an Unmanned Surface Vehicle Platform for Autonomous Navigation in Paddy Field. IFAC Proc. Vol. 2014, 47, 11553–11558. [Google Scholar] [CrossRef]
- Temilolorun, A.; Singh, Y. Towards Design and Development of a Low-Cost Unmanned Surface Vehicle for Aquaculture Water Quality Monitoring in Shallow Water Environments. arXiv 2024, arXiv:2410.09513. [Google Scholar] [CrossRef]
- Griffiths, N.A.; Levi, P.S.; Riggs, J.S.; DeRolph, C.R.; Fortner, A.M.; Richards, J.K. Sensor-Equipped Unmanned Surface Vehicle for High-Resolution Mapping of Water Quality in Low- to Mid-Order Streams. ACS EST Water 2022, 2, 425–435. [Google Scholar] [CrossRef]
- Yanes Luis, S.; Peralta, F.; Tapia Córdoba, A.; Rodríguez Del Nozal, Á.; Toral Marín, S.; Gutiérrez Reina, D. An Evolutionary Multi-Objective Path Planning of a Fleet of ASVs for Patrolling Water Resources. Eng. Appl. Artif. Intell. 2022, 112, 104852. [Google Scholar] [CrossRef]
- Nguyen, A.; Ore, J.-P.; Castro-Bolinaga, C.; Hall, S.G.; Young, S. Towards Autonomous, Optimal Water Sampling with Aerial and Surface Vehicles for Rapid Water Quality Assessment. J. ASABE 2024, 67, 91–98. [Google Scholar] [CrossRef]
- Huang, H.; Wang, R.; Huang, F.; Chen, J. Analysis and Realization of a Self-Adaptive Grasper Grasping for Non-Destructive Picking of Fruits and Vegetables. Comput. Electron. Agric. 2025, 232, 110119. [Google Scholar] [CrossRef]
- Woon Choi, D.; Hyeon Park, J.; Yoo, J.-H.; Ko, K. AI-Driven Adaptive Grasping and Precise Detaching Robot for Efficient Citrus Harvesting. Comput. Electron. Agric. 2025, 232, 110131. [Google Scholar] [CrossRef]
- Palmieri, J.; Di Lillo, P.; Chiaverini, S.; Marino, A. A Comprehensive Control Architecture for Semi-Autonomous Dual-Arm Robots in Agriculture Settings. Control Eng. Pract. 2025, 163, 106394. [Google Scholar] [CrossRef]
- Jin, T.; Han, X. Robotic Arms in Precision Agriculture: A Comprehensive Review of the Technologies, Applications, Challenges, and Future Prospects. Comput. Electron. Agric. 2024, 221, 108938. [Google Scholar] [CrossRef]
- Kolhalkar, N.R.; Pandit, A.A.; Kedar, S.A.; Yedukondalu, G. Artificial Intelligence Algorithms for Robotic Harvesting of Agricultural Produce. In Proceedings of the 2025 1st International Conference on AIML-Applications for Engineering & Technology (ICAET), Pune, India, 16–17 January 2025; IEEE: Piscataway, NJ, USA; pp. 1–6. [Google Scholar]
- Jin, T.; Han, X.; Wang, P.; Lyu, Y.; Chang, E.; Jeong, H.; Xiang, L. Performance Evaluation of Robotic Harvester with Integrated Real-Time Perception and Path Planning for Dwarf Hedge-Planted Apple Orchard. Agriculture 2025, 15, 1593. [Google Scholar] [CrossRef]
- Ali Hassan, M.; Cao, Z.; Man, Z. End Effector Position Control of Pantograph Type Robot Using Sliding Mode Controller. In Proceedings of the 2022 Australian & New Zealand Control Conference (ANZCC), Gold Coast, Australia, 24–25 November 2022; pp. 156–160. [Google Scholar]
- Liu, Z.; Lv, Z.; Zheng, W.; Wang, X. Trajectory Control of Two-Degree-of-Freedom Sweet Potato Transplanting Robot Arm. IEEE Access 2022, 10, 26294–26306. [Google Scholar] [CrossRef]
- Mueangprasert, M.; Chermprayong, P.; Boonlong, K. Robot Arm Movement Control by Model-Based Reinforcement Learning Using Machine Learning Regression Techniques and Particle Swarm Optimization. In Proceedings of the 2023 Third International Symposium on Instrumentation, Control, Artificial Intelligence, and Robotics (ICA-SYMP), Bangkok, Thailand, 18–20 January 2023; pp. 83–86. [Google Scholar]
- Li, M.; Liu, P. A Bionic Adaptive End-Effector with Rope-Driven Fingers for Pear Fruit Harvesting. Comput. Electron. Agric. 2023, 211, 107952. [Google Scholar] [CrossRef]
- Wei, Z.; Zhao, C.; Huang, Y.; Fu, X.; Li, J.; Li, G. A Super-Hydrophobic Tactile Sensor for Damage-Free Fruit Grasping. Comput. Electron. Agric. 2025, 239, 111043. [Google Scholar] [CrossRef]
- Kumar, S.; Mohan, S.; Skitova, V. Designing and Implementing a Versatile Agricultural Robot: A Vehicle Manipulator System for Efficient Multitasking in Farming Operations. Machines 2023, 11, 776. [Google Scholar] [CrossRef]
- Sriram, A.; R, A.R.; Krishnan, R.; Jagadeesh, S.; Gnanasekaran, K. IoT-Enabled 6DOF Robotic Arm with Inverse Kinematic Control: Design and Implementation. In Proceedings of the 2023 IEEE World Conference on Applied Intelligence and Computing (AIC), Virtual, 29–30 July 2023; pp. 795–800. [Google Scholar]
- Yoshida, T.; Onishi, Y.; Kawahara, T.; Fukao, T. Automated Harvesting by a Dual-Arm Fruit Harvesting Robot. ROBOMECH J. 2022, 9, 19. [Google Scholar] [CrossRef]
- Mapes, J.; Dai, A.; Xu, Y.; Agehara, S. Harvesting End-Effector Design and Picking Control. In Proceedings of the 2021 IEEE Symposium Series on Computational Intelligence (SSCI), Orlando, FL, USA, 5–7 December 2021; pp. 1–6. [Google Scholar]
- Seno, K.; Abe, T.; Tomori, H. Development of 2-DOF Manipulator Using Straight-Fiber-Type Pneumatic Artificial Muscle for Agriculture. J. Robot. Mechatron. 2025, 37, 64–75. [Google Scholar] [CrossRef]
- MarketsandMarkets. Smart Agriculture Market Size, Share and Trends, 2025. Available online: https://www.marketsandmarkets.com/Market-Reports/smart-agriculture-market-239736790.html (accessed on 11 July 2025).
- La Rocca, P.; Guennebaud, G.; Bugeau, A. To What Extent Can Current French Mobile Network Support Agricultural Robots? arXiv 2025, arXiv:2505.10044. [Google Scholar] [CrossRef]
- IEEE 802.15.4-2020; IEEE Standards Association. Available online: https://standards.ieee.org/ieee/802.15.4/7029/ (accessed on 26 October 2025).
- Aldhaheri, L.; Alshehhi, N.; Manzil, I.I.J.; Khalil, R.A.; Javaid, S.; Saeed, N.; Alouini, M.-S. LoRa Communication for Agriculture 4.0: Opportunities, Challenges, and Future Directions. IEEE Internet Things J. 2024. [Google Scholar] [CrossRef]
- Zhivkov, T.; Sklar, E.I. 5G on the Farm: Evaluating Wireless Network Capabilities for Agricultural Robotics. arXiv 2022, arXiv:2301.01600. [Google Scholar] [CrossRef]
- Bluetooth/BLE Core Specification. Bluetooth® Technol. Website 2024. Available online: https://www.bluetooth.com/specifications/specs/core-specification-6-0/ (accessed on 26 October 2025).
- IEEE 802.11ax-2021; IEEE Standards Association. Available online: https://standards.ieee.org/ieee/802.11ax/7180/ (accessed on 26 October 2025).
- Ahmad, S.J.; Yasmin, S.; Khandoker, R.; Chowdhury, F.; Rahman, S.; Khatun, A.; Rajvor, P. A Case Study: A Review On Agriculture Robot. J. Emerg. Technol. Innov. Res. 2024, 11, b254–b263. [Google Scholar]
- Bicamumakuba, E.; Habineza, E.; Samsuzzaman, S.; Reza, M.N.; Chung, S.-O. IoT-Enabled LoRaWAN Gateway for Monitoring and Predicting Spatial Environmental Parameters in Smart Greenhouses: A Review. Precis. Agric. Sci. Technol. 2025, 7, 28–46. [Google Scholar] [CrossRef]
- Bailey, J.K. IoT and Generative AI for Enhanced Data-Driven Agriculture. Ph.D. Thesis, Purdue University Graduate School, West Lafayette, IN, USA, 2025. [Google Scholar]
- Nair, K.K.; Abu-Mahfouz, A.M.; Lefophane, S. Analysis of the Narrow Band Internet of Things (NB-IoT) Technology. In Proceedings of the 2019 Conference on Information Communications Technology and Society (ICTAS), Durban, South Africa, 6–8 March 2019; IEEE: Durban, South Africa, 2019; pp. 1–6. [Google Scholar] [CrossRef]
- Lauridsen, M.; Vejlgaard, B.; Kovacs, I.Z.; Nguyen, H.; Mogensen, P. Interference Measurements in the European 868 MHz ISM Band with Focus on LoRa and SigFox. In Proceedings of the 2017 IEEE Wireless Communications and Networking Conference (WCNC), San Francisco, CA, USA, 19–22 March 2017; IEEE: San Francisco, CA, USA, 2017; pp. 1–6. [Google Scholar] [CrossRef]
- LTE-M. Global LTE-M Connectivity | Emnify. Available online: https://www.emnify.com/iot-supernetwork/global-iot-coverage/lte-m (accessed on 26 October 2025).
- RPMA—The World’s Premier IoT Solutions Provider. Available online: https://rpmanetworks.com/ (accessed on 26 October 2025).
- WAVIoT—LPWAN Solutions for IoT and M2M. Available online: https://waviot.com/ (accessed on 26 October 2025).
- 4G/LTE-Advanced. LTE vs LTE Advanced: Is 4G LTE Different from LTE Advanced?—Commsbrief. Available online: https://commsbrief.com/lte-vs-lte-advanced-is-4g-lte-different-from-lte-advanced/ (accessed on 26 October 2025).
- Akhila, S.; Hemavathi. 5G Ultra-Reliable Low-Latency Communication: Use Cases, Concepts and Challenges. In Proceedings of the 2023 10th International Conference on Computing for Sustainable Global Development (INDIACom), New Delhi, India, 15–17 March 2023; pp. 53–58. [Google Scholar]
- Kim, H. Enhanced Mobile Broadband Communication Systems. In Design and Optimization for 5G Wireless Communications; IEEE: New York, NY, USA, 2020; pp. 239–302. ISBN 978-1-119-49444-7. [Google Scholar] [CrossRef]
- Dutkiewicz, E.; Costa-Perez, X.; Kovacs, I.Z.; Mueck, M. Massive Machine-Type Communications. IEEE Netw. 2017, 31, 6–7. [Google Scholar] [CrossRef]
- TechInformed. Agriculture Gets Boost from Bots and Portable 5G Network. Available online: https://techinformed.com/agriculture-gets-boost-from-bots-and-portable-5g-network/ (accessed on 11 July 2025).
- TU Dresden. Digitalization for a Sustainable Future in Agriculture: Successful Completion of the LANDNETZ Collaborative Project. Available online: https://tu-dresden.de/ing/maschinenwesen/die-fakultaet/news/digitalisierung-fuer-eine-nachhaltigere-landwirtschaft-der-zukunft-erfolgreicher-abschluss-des-verbundprojektes-landnetz?set_language=en (accessed on 11 July 2025).
- Lindenschmitt, D.; Fischer, C.; Haussmann, S.; Kalter, M.; Kallfass, I.; Schotten, H. Agricultural On-Demand Networks for 6G Enabled by THz Communication. arXiv 2024, arXiv:2408.15665. [Google Scholar] [CrossRef]
- Cheraghi, A.R.; Shahzad, S.; Graffi, K. Past, Present, and Future of Swarm Robotics. In Proceedings of the SAI Intelligent Systems Conference, Virtual, 2–3 September 2021; Springer: Cham, Switzerland, 2021; pp. 190–233. [Google Scholar]

| Parameters | Multi-Rotor UAVs | Fixed-Wing UAVs | Hybrid (VTOL) UAVs | Unmanned Helicopters |
|---|---|---|---|---|
| Weight | Typically <15 kg | Typically <25 kg | Typically <30 kg | Typically <50 kg |
| Payload capacity | Low to moderate (1–5 kg) | Moderate to high (5–20 kg) | Moderate (2–10 kg) | High (up to 30 kg) |
| Endurance | Short (15–45 min) | Long (1–3 h) | Moderate to long (45 min–2 h) | Long (1–3 h) |
| Manoeuvrability | Excellent (precise hover, agile) | Moderate (requires forward motion) | High (hover + forward flight) | High (can hover and perform complex movements) |
| Operational flexibility | High (can operate in small fields, vertical take-off & landing) | Medium (needs runway or catapult for launch) | Very high (vertical take-off and landing + long range) | High (suitable for spraying and imaging) |
| Required expertise | Low to moderate | Moderate to high | High | High |
| Adoption rate | Very high (most common) | Moderate | Low to moderate | Low (specialised use) |
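
The endurance and manoeuvrability figures above translate directly into field capacity per sortie. The short sketch below estimates the area one flight can cover from an assumed cruise speed, sensor swath width, and endurance; all numeric values are placeholders for illustration, not measurements from any cited study.

```python
def coverage_per_flight_ha(speed_m_s: float, swath_m: float, endurance_min: float,
                           overlap: float = 0.3) -> float:
    """Rough area covered in one flight (hectares), discounting lateral image overlap."""
    effective_swath = swath_m * (1.0 - overlap)       # usable strip width per pass
    area_m2 = speed_m_s * 60.0 * endurance_min * effective_swath
    return area_m2 / 10_000.0                         # m^2 -> hectares

# Illustrative comparison using mid-range endurance values from the table above.
print(f"Multi-rotor (~30 min): {coverage_per_flight_ha(5.0, 20.0, 30.0):.1f} ha")
print(f"Fixed-wing  (~2 h)   : {coverage_per_flight_ha(15.0, 40.0, 120.0):.1f} ha")
```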
| Sensor | Primary Function | Key Features | Applications in Agriculture | Spectral Range |
|---|---|---|---|---|
| Digital (RGB) camera | Captures images in the visible light spectrum (Red, Green, Blue) | Cost-effective, lightweight, provides true-colour imagery | Crop and soil mapping, plant counting, canopy cover estimation, land use classification | 400–700 nm |
| Thermal camera | Detects infrared radiation to measure temperature differences | Sensitive to temperature variation, useful for stress mapping | Water stress detection, irrigation planning, plant health assessment, pest/disease hotspot identification | ~7.5–14 µm (long-wave IR) |
| Multispectral camera | Captures data in a few discrete spectral bands, including visible and near-infrared | Limited bands (e.g., 4–10), good for vegetation indices | Vegetation health monitoring (NDVI), crop stress detection, precision farming, yield estimation | 400–1000 nm |
| Hyperspectral camera | Captures data across hundreds of narrow spectral bands across visible to shortwave IR | High spectral resolution, detailed material characterization | Crop disease detection, nutrient analysis, soil property mapping, species identification | 400–2500 nm |
| LiDAR | Uses laser pulses to measure distance and create 3D models | Provides accurate 3D data, works in low-light conditions | Crop height measurement, biomass estimation, terrain modelling, canopy structure analysis | 905 nm & 1550 nm |
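
Several of the sensors above feed vegetation indices, the most common being NDVI, computed per pixel from the red and near-infrared bands of a multispectral camera. The snippet below is a generic NumPy formulation for illustration; it is not tied to any particular camera or driver named in the table.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray, eps: float = 1e-6) -> np.ndarray:
    """NDVI = (NIR - Red) / (NIR + Red), clipped to the valid [-1, 1] range."""
    nir = nir.astype(np.float32)
    red = red.astype(np.float32)
    index = (nir - red) / (nir + red + eps)   # eps guards against division by zero
    return np.clip(index, -1.0, 1.0)

# Example with synthetic 8-bit band values (dense canopy reflects strongly in NIR).
nir_band = np.array([[200, 180], [60, 50]], dtype=np.uint8)
red_band = np.array([[40, 60], [55, 48]], dtype=np.uint8)
print(ndvi(nir_band, red_band))   # values near 0.5-0.7 indicate healthy vegetation
```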
| Reference | Configuration | Application | Sensors | Algorithms | Key Outcomes |
|---|---|---|---|---|---|
| [37] | Multi-rotor UAV | Precision spraying to protect plants from pests and diseases | N/A | Eulerian-Lagrangian modeling approach, Multi-Reference Frame (MRF) method, Discrete Phase Modeling (DPM), and turbulence models like SST k-ω for numerical simulations | Propeller-atomizer distance affects spraying efficiency |
| [38] | Unmanned helicopter, rotary-wing UAV, and multi-rotor UAV | Spraying field crops in precision agriculture | RGB and multispectral cameras | Efficient matching algorithms for UAV operations, including route planning and decision-making for swarm deployment | Developed a mathematical model to optimize UAV performance for precision crop spraying |
| [39] | Multi-rotor UAV (quadcopter) | Weed detection and selective herbicide spraying | GPS, flight controller sensors, and Raspberry Pi camera | Deep learning algorithms to detect and classify weeds | A quadcopter was built to detect and selectively spray herbicides; the developed system demonstrated improved weed management |
| [40] | Fixed-wing UAV | Digitization of agricultural land | RGB camera | Quantum GIS (QGIS), Agisoft Metashape Professional, and Sputnik Agro | UAVs provide accurate, efficient field digitization and mapping, outperforming traditional and satellite methods. |
| [41] | Multi-rotor and Fixed-wing UAVs | High-resolution aerial and multispectral imaging for agricultural land use classification | RGB and Multispectral cameras | Maximum Likelihood Method and Single Feature Probability for image classification | Achieved nearly 90% accuracy in agricultural land use classification |
| [42] | Fixed-wing and Rotary-wing UAVs | Rice lodging assessment | RGB cameras | EDANet deep learning model for semantic segmentation and machine learning algorithms for autonomous UAV scouting | The study achieved 99.25% accuracy in rice lodging prediction; scouting time reduced by 35% compared to conventional methods. |
| [43] | Multi-rotor UAV | Remote sensing in agriculture | RGB cameras | Distributed swarm control algorithm for multi-UAV systems | Multi-UAV system significantly improves efficiency, reduces working time, and solves battery shortage issues compared to single-UAV systems |
| [44] | Multi-rotor UAV | Detecting pest infestation symptoms on olive and palm trees, mapping plantations, and cooperating with e-traps for targeted pesticide spraying | Fully stabilized 3-axis 1080p full HD video camera | Image stitching (MapsMadeEasy), mission planning (Pix4Dcapture), and UAV simulation/control (DroneKit-Python, DroneKit-SITL, MAVProxy) | Demonstrated UAVs’ effectiveness in detecting crop infestations, mapping affected areas, and enabling targeted pesticide spraying |
| [23] | Multi-rotor UAV (quadcopter) | Spraying pesticides and fertilizers in agricultural fields | Accelerometer, gyroscope, magnetometer, and GPS | N/A | Successfully developed the FREYR Drone, a GPS-enabled, Android-controlled quadcopter for pesticide application |
| [45] | Multi-rotor UAV (quadcopter) | 3D monitoring of agricultural tree plantations | Visible-light and multispectral cameras | Object-Based Image Analysis (OBIA) algorithms for image segmentation, classification, and geometric feature extraction | UAV and OBIA technology achieved accurate 3D monitoring of agricultural trees, enabling efficient crop management. |
| [46] | Multi-rotor UAV | Multi-temporal imaging to monitor a sunflower crop and estimate NDVI | Tetracam ADC Lite digital camera with multispectral sensors | Maximum Likelihood Classification (MLC) for image classification and linear regression models | NDVI from UAV-acquired multispectral images can reliably predict sunflower crop yield, aerial biomass, and nitrogen content |
| UGV Configuration | Advantages | Disadvantages | Use-Case Scenario |
|---|---|---|---|
| Wheel-type UGV | Mechanically simple, energy-efficient, and fast on firm, even ground; low maintenance | Limited traction on soft, muddy, or steep terrain; wheels can slip and compact soil | Row-crop fields, orchards, and greenhouses with firm, relatively level surfaces |
| Track-type UGV | Low ground pressure and strong traction on soft, wet, or uneven terrain; stable platform for heavier payloads | Slower, higher power consumption, and more complex drivetrain; skid turns disturb soil | Paddy fields, muddy or freshly tilled soils, and sloped terrain |
| Legged robot | Can step over obstacles and traverse highly irregular terrain with minimal ground contact | Complex control, low travel speed, limited payload, and high cost; still largely experimental | Steep, rocky, or cluttered environments and research or phenotyping tasks where wheels and tracks struggle |
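
Most of the wheeled and tracked platforms compared above are steered differentially (differential or skid steering), so their planar motion reduces to the unicycle model sketched below. The track width, wheel speeds, and time step are placeholder values chosen only for illustration.

```python
import math

def diff_drive_step(x, y, theta, v_left, v_right, track_width, dt):
    """Advance a differential/skid-steer UGV pose one time step from wheel speeds (m/s)."""
    v = 0.5 * (v_left + v_right)              # forward speed of the chassis centre
    omega = (v_right - v_left) / track_width  # yaw rate from the wheel speed difference
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return x, y, theta

# Hypothetical robot (0.6 m track) arcing gently left for 5 s at 10 Hz.
pose = (0.0, 0.0, 0.0)
for _ in range(50):
    pose = diff_drive_step(*pose, v_left=0.45, v_right=0.55, track_width=0.6, dt=0.1)
print(f"pose after 5 s: x={pose[0]:.2f} m, y={pose[1]:.2f} m, heading={math.degrees(pose[2]):.1f} deg")
```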
| Reference | Configuration | Application | Sensors | Algorithms | Key Outcomes |
|---|---|---|---|---|---|
| [59] | Wheeled UGV with differential steering mechanism | Collaboration with UAV for automatic weed detection and removal | RGB cameras, LIDAR, and GNSS | SSD MobileNetV1 and YOLOv8 machine-learning algorithms | Successfully demonstrated real-time collaboration between UAVs and UGVs for automated weed detection and removal |
| [60] | Wheeled UGV | Complete coverage path planning (CCPP) in agricultural fields | N/A | H-CCPP algorithm, combining features from O-CCPP and Fields2Cover, along with Dubins and Reeds-Shepp methods for turn generation | The study introduced H-CCPP, offering faster computation and better slope optimization than O-CCPP |
| [61] | Four-wheel self-steering (with differential steering) UGV | Path tracking of agricultural robots in unstructured farmlands | RTK-GNSS, angle sensor | Extended disturbance observer-based sliding mode controller (EDO-SMC) | The designed EDO-SMC method suggests sufficient robustness in control of the UGV, with more minor offsets indicating good performance |
| [62] | Wheeled UGV with four-wheel independent steering drive capabilities, including Ackermann model, Crab model, and Rotate model | Autonomous navigation for high-throughput phenotypic data collection | Visible light camera and RTK-GPS module | SegFormer-B0 semantic segmentation model, Douglas-Peucker algorithm for path simplification, and the Pure Pursuit algorithm for path tracking | High-precision autonomous navigation for phenotyping robots with lateral errors mostly within 2 cm in field environments. |
| [63] | Wheeled UGV with skid-steering mechanism | Agricultural applications, including tasks like ploughing, seeding, mowing, spraying, crop monitoring, and robotic harvesting | Stereo cameras, LiDAR, and IMU sensors | Control algorithms integrated with ROS2 | Developed a modular robot compatible with standard implements for versatile and stable operation on complex terrains |
| [64] | Wheeled UGV with differential steering mechanism | Agricultural dataset creation by capturing images of plants | RGB camera | ArUco marker detection algorithms for navigation and positioning | Developed a low-cost autonomous robot to efficiently create agricultural image datasets |
| [65] | Wheeled UGV with differential steering mechanism | Autonomous navigation for detecting and following crop rows in sugar beet fields | Depth camera, tracking camera, and RTK-GPS | U-Net deep learning-based segmentation algorithm for crop row detection, Triangle Scan Method (TSM), and proportional controller for visual servoing | Robust crop row detection with an average angular error of 1.65° and displacement error of 11.99 pixels, outperforming the baseline |
| [66] | Wheeled UGV with four-wheel steering (4WS) mechanism | Autonomous navigation in GNSS-denied environments: orchards and vineyards | Wheel and steering encoders | Pure Pursuit for path tracking and Vector Field Histogram (VFH) for obstacle avoidance | Developed a navigation algorithm enabling autonomous orchard robot operation in GNSS-denied environments |
| [8] | Wheeled UGV with Ackermann steering mechanism | Autonomous navigation in vineyards for field monitoring tasks | 3D stereoscopic cameras, multi-beam lidar, and ultrasonic sensors | Perception-based navigation algorithms: Augmented Perception Obstacle Map (APOM), 3D density mapping, and occupancy matrix calculations | Augmented perception combining 3D vision, lidar, and ultrasonics enhances autonomous navigation stability and safety in vineyard rows |
| [67] | Wheeled UGV | Autonomous crop harvesting | N/A | Neural adaptive PID control, multi-layer neural networks, and the prescribed performance control (PPC) technique | Neural adaptive PID controller ensures tractor-trailer tracking with collision avoidance, connectivity, and robustness. |
| [68] | Wheeled UGV with skid-steering mechanism | Autonomous lawn mowing in agricultural applications | Depth cameras, RP-Lidar-S1, Piksi Multi RTK module, SICK incremental encoders, and Xsens Mti-7 IMU | DeepLabv3+ for semantic segmentation, point cloud reconstruction, and occupancy grid mapping | Improved obstacle detection accuracy with a 38 cm average error |
| [69] | Wheeled UGV with central articulated steering mechanism | Crop row detection and mapping for under-canopy navigation of agricultural robots | Depth camera, RTK-GPS module, IMU, wheel and steering encoders | RANSAC, Slicing-based clustering, Linear programming, Bayesian mapping, and Kalman filter | Achieved reliable crop row mapping (MAE: 3.4 cm in corn, 3.6 cm in sorghum) and inter-row positioning (MAE: 5.0 cm in corn, 4.2 cm in sorghum) |
| [70] | Wheeled UGV with four-wheel Ackermann Steering Mechanism (ASM) | Spraying and shredding operations within vineyard rows | Proximity sensors, mapping and navigation sensors | Dynamic-Window Approach (DWA), Rapid Random Exploring Tree (RRT), and tracking controllers | Cooperative use of UAVs and UGVs in complex agricultural scenarios; developed innovative path planning and control systems |
| [71] | Three-wheeled robot with an off-center rear castor wheel with differential drive mechanism | Precision in-row weed control in vegetable crops | RGB camera, Forward-facing camera for row detection, GPS module, and Wheel encoders | Support Vector Machine (SVM), Extended Kalman Filter (EKF), Hough Transform, Line-following algorithm, and motion estimation | The UGV achieved over 90% reduction in herbicide use while effectively controlling weeds |
| [72] | Wheeled UGV with skid-steering mechanism | High-throughput crop phenotyping | LiDARs, RTK GPS, RGB cameras, inertial measurement unit (IMU), time-of-flight sensor, custom stereo camera, and fish-eye cameras | Pure Pursuit algorithm, Unscented Kalman Filter (UKF), RANSAC cylinder detection, and Extended Kalman Filter (EKF) | The Robotanist: demonstrated a contact-based automated phenotyping using a ground robot capable of autonomous navigation in Sorghum plots |
| [73] | Wheeled UGV with electro-hydraulic steering system | Pest and weed control | GNSS receivers, cameras, ultrasonic sensors, laser range finders, and inertial measurement units (IMU) | Simulated Annealing, NSGA-II, Genetic Algorithms for planning, OBIA for weed detection, Hough Transformation, and Theil-Sen Estimator for crop row detection | Multi-robot system reduced pesticide and herbicide use, improved precision in crop management |
| [74] | Wheeled UGV with electro-hydraulic steering mechanism | Accurate trajectory tracking between crop rows in challenging field conditions | RTK GPS antennas, potentiometer, inductive sensor, and wheel encoder | Robust trajectory tracking error-based Model Predictive Control (MPC) | Achieved accurate trajectory tracking for an autonomous tractor–trailer system |
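
Several of the UGV studies above detect crop rows by fitting lines to plant or point-cloud detections, for example the RANSAC-based row mapping in [69]. The sketch below is a generic RANSAC line fit over 2D plant positions; the thresholds, iteration count, and synthetic data are illustrative assumptions, not values from the cited works.

```python
import numpy as np

def ransac_line(points: np.ndarray, n_iters: int = 200, inlier_tol: float = 0.05,
                rng: np.random.Generator | None = None):
    """Fit a 2D line with RANSAC; returns (point_on_line, unit_direction, inlier_mask)."""
    rng = rng or np.random.default_rng(0)
    best_inliers = np.zeros(len(points), dtype=bool)
    best_model = None
    for _ in range(n_iters):
        i, j = rng.choice(len(points), size=2, replace=False)
        p0, p1 = points[i], points[j]
        direction = p1 - p0
        norm = np.linalg.norm(direction)
        if norm < 1e-9:
            continue
        direction = direction / norm
        # Perpendicular distance of every point to the candidate line.
        diffs = points - p0
        dists = np.abs(diffs[:, 0] * direction[1] - diffs[:, 1] * direction[0])
        inliers = dists < inlier_tol
        if inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, (p0, direction)
    return best_model[0], best_model[1], best_inliers

# Synthetic crop row along y ~ 0.5 m with noise, plus a few weed-like outliers.
rng = np.random.default_rng(1)
xs = np.linspace(0.0, 5.0, 40)
row = np.column_stack([xs, 0.5 + rng.normal(0.0, 0.02, xs.size)])
outliers = rng.uniform([0.0, 0.0], [5.0, 2.0], size=(8, 2))
p0, d, mask = ransac_line(np.vstack([row, outliers]))
print(f"row direction: {d}, inliers: {mask.sum()}/{mask.size}")
```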
| Reference | Configuration | Application | Sensors | Algorithms | Key Outcomes |
|---|---|---|---|---|---|
| [84] | Parallel Cartesian robot arms | Tree fruit harvesting | Multiple sensors | Path planning optimization algorithms | Picking success rate of 90% with cycle time reduced to 4.5–6 s using multiple arms operating in parallel |
| [85] | RGB image-based pose estimation system | Citrus pose estimation and harvesting | RGB cameras | Pose estimation algorithms combined with end-effector adjustment | System achieved 85% success rate with average picking time of 12 s |
| [80] | Pneumatic finger-like end-effector | Cherry tomato harvesting in greenhouse | Pressure sensors, RGB-D camera | Hand-picking dynamic measurement system, Arduino control | Average cycle time of picking single cherry tomato: 6.4 s |
| [86] | Contact force modelling system | Apple harvesting | Force sensors | Variable damping impedance control, Burgers model | Improved force control and dynamic performance compared to traditional impedance control |
| [87] | Various end-effector types | Citrus harvesting | RGB cameras | Improved YOLOv3 for citrus detection and localization | Picking success rate of 87% with average picking time of 10 s per fruit |
| [88] | Robotic arm system | Apple harvesting | RTK-GPS, IMU | Machine vision algorithm | Optimal performance according to feedback from reverse kinematic equation algorithm |
| [18] | Thin-film pressure sensor system | Cherry tomato harvesting in greenhouse | High-precision thin-film pressure sensor, six-axis attitude sensor | Pressure-sensing algorithms | Enhanced precision in fruit handling and damage reduction |
| [89] | Multi-finger gripper | Pumpkin harvesting | N/A | Denavit–Hartenberg (D-H) method | Designed end-effector can harvest different varieties of pumpkin with sufficient capability |
| [90] | 4 DOF Gripper system | Pot seedlings transportation | Position sensors | 3D Bresenham algorithm, Region-based inverse kinematic equations | Cycle time for pickup and dropping of each seedling: 3.5 s with 93.3% success rate |
| [91] | Cable-driven gripper | Strawberry harvesting | RGB-D camera | Machine vision algorithm | Average cycle time of picking: 7.5 s; Success rate of 96.8% for isolated strawberries |
| [92] | Bending mechanism | Agaricus bisporus mushrooms | Force sensors | Force optimization algorithms | Bending method required least picking force and least picking time for detaching mushrooms |
| [93] | Custom robotic arm gripper | Strawberry harvesting | RGB cameras, Wheel encoders, gyroscope, and ultra-wideband (UWB) indoor positioning system | Trilateration algorithm, machine vision algorithms | Robot capable of picking partially surrounded strawberries with success rates ranging from 50.0% to 97.1% on first attempt |
| [94] | PLC-controlled system | Strawberry harvesting | Machine vision sensors | Denavit–Hartenberg method, reverse kinematics | Current prototype picked strawberry in 4 s |
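
Several of the harvesting-arm studies above describe their kinematics with the Denavit–Hartenberg (D-H) convention (e.g., [89,94]). The snippet below is the textbook D-H homogeneous transform, chained here for a simple two-joint planar arm; the link lengths and joint angles are illustrative and are not taken from any cited prototype.

```python
import numpy as np

def dh_transform(theta: float, d: float, a: float, alpha: float) -> np.ndarray:
    """Standard Denavit-Hartenberg homogeneous transform for one joint/link."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(joint_angles, link_lengths):
    """End-effector pose of a planar serial arm (all alpha = 0, d = 0)."""
    T = np.eye(4)
    for theta, a in zip(joint_angles, link_lengths):
        T = T @ dh_transform(theta, d=0.0, a=a, alpha=0.0)
    return T

# Hypothetical 2-DOF arm with 0.4 m and 0.3 m links reaching toward a fruit.
pose = forward_kinematics([np.deg2rad(30), np.deg2rad(45)], [0.4, 0.3])
print("end-effector position (x, y):", pose[:2, 3])
```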
| Navigation Method | Typical Accuracy | Strengths and Applications | Limitations and Suitable Environments |
|---|---|---|---|
| RTK GNSS (GPS) | ~2–5 cm (with RTK) | Absolute positioning; well-suited for large open fields and straight rows (e.g., tractor guidance). Provides global coordinates for precise coverage. | Signal dropout under canopy or indoors; requires clear sky view and base station. No info on crop-relative position. Best for open fields; unreliable in orchards, greenhouses. |
| Vision-Based (Camera) | ~5–10 cm relative (depends on feature) | Low-cost sensor, rich information (color/texture) for row following and visual odometry. Effective under canopy or in orchards where GPS fails; can detect crop alignments and landmarks for in-row navigation. | Sensitive to lighting changes and occlusion. Requires robust image processing or learning algorithms. Limited range and field of view. Works best in structured rows with consistent visual cues; challenged by night, fog, or uniform fields (e.g., mature wheat). |
| LiDAR-Based SLAM | ~1–10 cm locally (high-resolution mapping) | Provides precise distance measurements and 3D mapping. Excellent for obstacle detection and mapping in varied terrain. Not affected by light; useful for orchard navigation (detecting tree rows) and unstructured fields (creating maps). | High sensor cost and data rate. Performance drops in heavy dust or rain. Cannot identify object type (sees shape only). Suited for environments with geometric structure or when visual data is insufficient; may be overkill for simple open fields due to cost. |
| Multi-Sensor Fusion | N/A (improves consistency) | Combines complementary sensors (e.g., GNSS + INS + camera) to mitigate individual weaknesses. Yields robust localization, e.g., GPS provides global fix, IMU smooths short-term motion, camera/LiDAR corrects drift. Enhances reliability in diverse conditions. | Increased system complexity (calibration and synchronization required). Still constrained by environmental limitations (e.g., if all sensors degrade in certain conditions). Used across all environments: essential for high reliability in real farms but requires careful integration and tuning. |
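
The multi-sensor fusion row above is easiest to see in a minimal filter: the sketch below fuses a drifting odometry/IMU prediction with intermittent GNSS fixes through a one-dimensional Kalman update per axis. It is a didactic illustration under simplified assumptions (white noise, no bias states, scalar position), not a field-ready localizer; the noise variances and sampling pattern are placeholders.

```python
import random

class Fused1D:
    """Minimal 1-D Kalman filter: predict with odometry, correct with GNSS."""
    def __init__(self, x0: float = 0.0, p0: float = 1.0):
        self.x, self.p = x0, p0          # state estimate and its variance

    def predict(self, dx_odom: float, q: float = 0.02):
        self.x += dx_odom                # dead-reckoned displacement (wheel/IMU)
        self.p += q                      # process noise grows the uncertainty

    def correct(self, z_gnss: float, r: float = 0.05):
        k = self.p / (self.p + r)        # Kalman gain
        self.x += k * (z_gnss - self.x)  # pull estimate toward the GNSS fix
        self.p *= (1.0 - k)

# Robot moves ~0.5 m per step; a GNSS fix arrives only every 5th step.
random.seed(0)
f, truth = Fused1D(), 0.0
for step in range(1, 21):
    truth += 0.5
    f.predict(0.5 + random.gauss(0.0, 0.05))           # noisy odometry increment
    if step % 5 == 0:
        f.correct(truth + random.gauss(0.0, 0.03))     # noisy GNSS fix
print(f"truth {truth:.2f} m, fused estimate {f.x:.2f} m, std {f.p ** 0.5:.3f} m")
```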
| Sensor | Advantages | Limitations |
|---|---|---|
| Vision (Camera, Stereo) | Provides rich visual and depth information for obstacle recognition and localization. Cameras can detect texture and colour (useful to identify obstacle type, e.g., animal vs. rock) and stereoscopic vision gives 3D structure. | Affected by lighting (night requires illumination, glare can blind cameras) and weather (fog, heavy rain). Depth range from stereo is moderate and accuracy decreases with distance. Best suited for moderate speeds and known obstacle appearances; often combined with learning algorithms for classification. |
| LiDAR | Highly accurate distance measurements and 3D mapping of obstacles. Effective day or night, not dependent on ambient light. Particularly good for structural obstacles (walls, trees) and creating a local map for path planning. | Expensive and power-demanding at high performance. Can be degraded by dust, smoke, or rain (loss of returns). Provides shape but no inherent ability to distinguish material or colour (e.g., cannot tell a black tarp from a water puddle except by shape). Typically used on larger platforms or when precise obstacle contours are needed (e.g., navigating close to tree rows or equipment). |
| Radar | Uses radio waves to detect obstacles at relatively long range and in all weather conditions. Robust to dust, fog, and rain where optical sensors struggle. Automotive-style radars can detect large obstacles (vehicles, humans) and measure their relative speed, useful for detecting moving hazards. | Lower resolution, small or thin objects (e.g., wires, slender plant stems) may be missed. Less effective for precise shape or terrain profiling. Often used as a complementary sensor to cover the cases when vision/LiDAR are blinded by weather. |
| Ultrasonic | Emits high-frequency sound pulses; good for short-range obstacle detection (a few meters). Inexpensive and simple; commonly used on small robots or tractors as proximity sensors (e.g., to stop if an object is very close). They work in darkness and are not affected by colour or transparency of objects. | Very limited range and cone of detection, and poor angular resolution (hard to know exact direction of obstacle). Can be triggered by wind noise or certain ambient sounds. Suitable as a safety bumper or for slow-moving platforms in clutter where fine resolution is not needed beyond “object is near.” |
| Infrared/Thermal | Detects heat signatures, enabling obstacle sensing in the dark and potential identification of warm-blooded animals or humans as distinct from the cooler crop environment. Thermal cameras have been used to detect living obstacles (people, livestock) even through mild obstructions like dust. Also useful for finding stressed plants or fires. | Lower spatial resolution and influenced by ambient temperature changes (a hot day can reduce contrast between objects and background). Not typically used as a primary obstacle sensor, but rather for specific detection tasks (e.g., wildlife detection to avoid collisions). |
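
A common way to exploit the complementary strengths listed above is a simple safety layer that slows or stops the platform when any trustworthy range reading falls below a threshold. The function below is a hedged illustration of that gating logic; the sensor names, validity flags, and thresholds are assumptions made for the example, not specifications drawn from the table.

```python
from dataclasses import dataclass

@dataclass
class RangeReading:
    sensor: str        # e.g. "lidar", "ultrasonic", "radar"
    distance_m: float  # measured range to the nearest return
    valid: bool        # driver-level validity flag (e.g. weak returns in dust)

def safety_command(readings: list[RangeReading],
                   stop_dist: float = 1.0, slow_dist: float = 3.0) -> str:
    """Return 'stop', 'slow', or 'go' from the closest valid obstacle range."""
    valid = [r.distance_m for r in readings if r.valid]
    if not valid:
        return "stop"                      # fail-safe: no trustworthy data
    nearest = min(valid)
    if nearest < stop_dist:
        return "stop"
    if nearest < slow_dist:
        return "slow"
    return "go"

# Example: ultrasonic sees something close while the LiDAR is partially blinded by dust.
readings = [RangeReading("lidar", 8.2, valid=False),
            RangeReading("ultrasonic", 0.8, valid=True),
            RangeReading("radar", 7.5, valid=True)]
print(safety_command(readings))   # -> "stop"
```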
| Control Method | Agricultural Vehicle | Working Scenario | Note | Source |
|---|---|---|---|---|
| PID | Crawler-type robot | Greenhouse | Fuzzy PID | [195] |
| | Two drive wheels and two caster wheels | Seeding and fertilizing operation | Online Particle Swarm Optimization Continuously Tuned PID | [196] |
| | Four-wheel skid-steer agrobot | Simulation | Particle swarm optimized PID | [197] |
| | Crawler-type robot | Headland turning | Feedforward PID | [198] |
| | Four-wheel weeder | Simulation of path tracking control | Fuzzy adaptive PID | [199] |
| Model Predictive Control | Differential drive wheeled robot | Mathematical model simulation for robot trajectory tracking | MPC | [200] |
| | Wheeled mobile robot for weed control | Row crop navigation | Nonlinear MPC | [186] |
| | TerraSentia robot | High precision path tracking in the presence of unknown wheel–terrain interaction | Learning-based Nonlinear MPC | [201,202] |
| | Crawler-type robot | Lettuce harvesting | MPC | [203] |
| PPA | Two-wheeled robot model | Simulation of path tracking control for autonomous harvesting robots | Large-angle steering control and PPA | [204] |
| | Non-holonomic robot | Autonomous navigation in narrow space (greenhouse) | Adaptive trajectory tracking and regulated PPA control | [205] |
| | Crawler-type robot | Autonomous full-coverage path planning | PPA based on linear interpolation | |
| Sliding Mode Control | Crawler-type robot | Plant protection | Fractional-order sliding mode control | [206] |
| | Crawler-type robot | Steering control of robot | Fuzzy sliding mode control | [207] |
| | Autonomous agricultural vehicle | Simulation of path tracking | First- and second-order sliding mode control | |
| Learning/AI-based control | Multi-agent system | Patrolling for crop health monitoring | Centralized Convolutional Neural Network (CNN)-based Dual Deep Q-learning (DDQN) | [208] |
| | Differential-driven mobile robot | Automatic robot control in farms and greenhouses (simulation) | Single-layer neural-network controller | [209] |
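
Because the pure pursuit algorithm (PPA) recurs throughout the table above, its geometric core is sketched below: the controller picks a look-ahead point on the reference path and commands a curvature proportional to that point's lateral offset in the robot frame. The path, look-ahead distance, and starting pose are illustrative assumptions, not a reproduction of any cited implementation.

```python
import math

def pure_pursuit_curvature(pose, lookahead_pt):
    """Curvature (1/m) steering a robot at 'pose' (x, y, heading) toward 'lookahead_pt'."""
    x, y, theta = pose
    dx, dy = lookahead_pt[0] - x, lookahead_pt[1] - y
    # Transform the look-ahead point into the robot frame.
    local_x = math.cos(-theta) * dx - math.sin(-theta) * dy
    local_y = math.sin(-theta) * dx + math.cos(-theta) * dy
    ld2 = local_x ** 2 + local_y ** 2
    if ld2 < 1e-9:
        return 0.0
    return 2.0 * local_y / ld2        # classic pure pursuit curvature formula

def find_lookahead(path, pose, lookahead=1.0):
    """First path point at least 'lookahead' metres away from the robot (naive search)."""
    for px, py in path:
        if math.hypot(px - pose[0], py - pose[1]) >= lookahead:
            return (px, py)
    return path[-1]

# Straight reference row along y = 0; robot starts offset 0.5 m to the side.
path = [(i * 0.1, 0.0) for i in range(200)]
pose = (0.0, 0.5, 0.0)
target = find_lookahead(path, pose)
kappa = pure_pursuit_curvature(pose, target)
print(f"look-ahead point {target}, commanded curvature {kappa:.3f} 1/m")
```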
| Control Methods | Application | Note | Source |
|---|---|---|---|
| PID control | Autonomous navigation to a predefined navigation map in a paddy field | Achieved in-system RMS lateral error ≤ 0.45 m and RMS heading error ≤ 4.4 degrees in map-based navigation | [216] |
| Extended Kalman Filter (EKF) + PWM | Aquaculture water quality monitoring | Low-cost USV design with a total cost of approximately $1118. Communication uses an RC system for control and a local wireless network (2.4 GHz ISM band) for telemetry/ROS data | [217] |
| Differential control | High-resolution water quality mapping in low- to mid-order streams | Small pontoon-style USV (AquaBOT/HyDrone) with a high payload capacity (16 kg). Collects data with higher spatial resolution than manual grab sampling | [218] |
| Multi-objective evolutionary approach (NSGA-II) and genetic algorithms (GAs) | Water Monitoring/Patrolling | Multi-agent or fleet optimization. Uses a graph-based formulation and messy individual representation for variable path lengths | [219] |
| Multiple Traveling Salesperson Problem (MTSP) Formulation and Guided Local Search Metaheuristic | Mariculture Water Quality Sampling | USV is equipped with in situ water quality sensors. The method can reach near-optimal solutions in approximately 30 s. USV tours cover a larger spatial extent to maximize spatial information gain | [220] |
| Control Domain | Control Method | Source |
|---|---|---|
| High-Level Architecture & Safety | Hierarchical Quadratic Programming (HQP), Control Barrier Functions (CBFs), Admittance Control, Hand-Guiding Control, Finite-State Machine (FSM) | [223] |
| Motion & Trajectory Execution | Inverse Kinematics (IK), Rapidly-exploring Random Tree (RRT) Algorithms, DSA-BiTRRT Algorithm, Sliding Mode Control (SMC), Linear Model Predictive Control (LMPC), Model-Based Reinforcement Learning (MBRL) | [224,225,226,228,232,233,234] |
| End-Effector Grasping & Sensing | Sensor-Based Grasping Force Control, Posture Dynamic Monitoring, Binary Code Feedback Motion Stop, Pneumatic Actuation Control (SF-PAM), Vision-Based Depth Control | [221,230,235,236] |
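
For the motion and trajectory execution methods above, inverse kinematics has a closed-form solution in the simplest planar two-link case, which is often sufficient to illustrate how a harvesting arm reaches a detected fruit. The sketch below derives the elbow-down joint angles for a desired end-effector position; the link lengths and target point are illustrative assumptions, not parameters of any cited manipulator.

```python
import math

def planar_2link_ik(x: float, y: float, l1: float, l2: float, elbow_up: bool = False):
    """Closed-form inverse kinematics for a planar 2-link arm; returns (theta1, theta2) in radians."""
    r2 = x * x + y * y
    c2 = (r2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target outside the reachable workspace")
    s2 = math.sqrt(1.0 - c2 * c2)
    if elbow_up:
        s2 = -s2
    theta2 = math.atan2(s2, c2)
    theta1 = math.atan2(y, x) - math.atan2(l2 * s2, l1 + l2 * c2)
    return theta1, theta2

# Hypothetical 0.4 m / 0.3 m arm reaching a fruit detected at (0.5, 0.2) m.
t1, t2 = planar_2link_ik(0.5, 0.2, 0.4, 0.3)
print(f"joint angles: {math.degrees(t1):.1f} deg, {math.degrees(t2):.1f} deg")
```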
| Technology | Range | Throughput | Latency | Power | Notes |
|---|---|---|---|---|---|
| ZigBee/802.15.4 | ∼10–100 m | 20–250 kbps | Low (∼tens ms) | Very low | Mesh, many nodes, field sensors |
| BLE | ∼10–100 m | ∼1 Mb/s | Low | Low | Mobile device interfaces, livestock tracking |
| Wi-Fi 6 | ∼100–300 m | 100+ Mb/s | Low (∼ms) | Moderate | Edge video, robot control in greenhouses/fields |
| LoRa/LoRaWAN | Several km | ∼10–50 kbps | High (∼s) | Very low | Environmental sensors, wide area monitoring |
| NB-IoT/Sigfox | Several km (up to tens of km rural) | ∼10 kbps | High (∼s) | Very low | Simple remote device tracking or sensing |
| 4G/LTE | Wide area | ∼100 Mb/s | ∼10 ms | Moderate | Baseline cellular connectivity for robots |
| 5G (private/public) | Wide area | 1+ Gbps | <10 ms | Moderate | Real-time control, video, multi-robot coordination |
| THz/6G | Field/Very remote | multi-Gbps | Ultra-low | TBD | Experimental; future infrastructure for rural 6G |
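
To make the trade-offs in the table actionable, the short helper below filters candidate technologies against an application's minimum throughput and maximum latency, using rough order-of-magnitude figures transcribed from the table; the numbers are indicative only, not measured link budgets or vendor specifications.

```python
# Rough, indicative figures distilled from the table above (orders of magnitude only).
TECHNOLOGIES = {
    "ZigBee/802.15.4": {"throughput_kbps": 250,       "latency_ms": 50},
    "BLE":             {"throughput_kbps": 1_000,     "latency_ms": 50},
    "Wi-Fi 6":         {"throughput_kbps": 100_000,   "latency_ms": 5},
    "LoRa/LoRaWAN":    {"throughput_kbps": 50,        "latency_ms": 2_000},
    "NB-IoT/Sigfox":   {"throughput_kbps": 10,        "latency_ms": 2_000},
    "4G/LTE":          {"throughput_kbps": 100_000,   "latency_ms": 10},
    "5G":              {"throughput_kbps": 1_000_000, "latency_ms": 5},
}

def candidate_links(min_throughput_kbps: float, max_latency_ms: float) -> list[str]:
    """Technologies from the table that meet both requirements."""
    return [name for name, spec in TECHNOLOGIES.items()
            if spec["throughput_kbps"] >= min_throughput_kbps
            and spec["latency_ms"] <= max_latency_ms]

# Example: tele-operated UGV video feed (~5 Mb/s) with sub-50 ms control latency.
print(candidate_links(min_throughput_kbps=5_000, max_latency_ms=50))
# Example: battery-powered soil-moisture node sending a few readings per hour.
print(candidate_links(min_throughput_kbps=1, max_latency_ms=5_000))
```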