Search Results (237)

Search Parameters:
Keywords = autonomous agricultural robots

27 pages, 4407 KB  
Systematic Review
Artificial Intelligence in Agri-Robotics: A Systematic Review of Trends and Emerging Directions Leveraging Bibliometric Tools
by Simona Casini, Pietro Ducange, Francesco Marcelloni and Lorenzo Pollini
Robotics 2026, 15(1), 24; https://doi.org/10.3390/robotics15010024 - 15 Jan 2026
Viewed by 327
Abstract
Agricultural robotics and artificial intelligence (AI) are becoming essential to building more sustainable, efficient, and resilient food systems. As climate change, food security pressures, and labour shortages intensify, the integration of intelligent technologies in agriculture has gained strategic importance. This systematic review provides a consolidated assessment of AI and robotics research in agriculture from 2000 to 2025, identifying major trends, methodological trajectories, and underexplored domains. A structured search was conducted in the Scopus database—which was selected for its broad coverage of engineering, computer science, and agricultural technology—and records were screened using predefined inclusion and exclusion criteria across title, abstract, keywords, and eligibility levels. The final dataset was analysed through descriptive statistics and science-mapping techniques (VOSviewer, SciMAT). Out of 4894 retrieved records, 3673 studies met the eligibility criteria and were included. As with all bibliometric reviews, the synthesis reflects the scope of indexed publications and available metadata, and potential selection bias was mitigated through a multi-stage screening workflow. The analysis revealed four dominant research themes: deep-learning-based perception, UAV-enabled remote sensing, data-driven decision systems, and precision agriculture. Several strategically relevant but underdeveloped areas also emerged, including soft manipulation, multimodal sensing, sim-to-real transfer, and adaptive autonomy. Geographical patterns highlight a strong concentration of research in China and India, reflecting agricultural scale and investment dynamics. Overall, the field appears technologically mature in perception and aerial sensing but remains limited in physical interaction, uncertainty-aware control, and long-term autonomous operation. These gaps indicate concrete opportunities for advancing next-generation AI-driven robotic systems in agriculture. 
Funding sources are reported in the full manuscript. Full article
(This article belongs to the Special Issue Smart Agriculture with AI and Robotics)

5 pages, 1197 KB  
Proceeding Paper
Experimental Assessment of Autonomous Fleet Operations for Precision Viticulture Under Real Vineyard Conditions
by Gavriela Asiminari, Vasileios Moysiadis, Dimitrios Kateris, Aristotelis C. Tagarakis, Athanasios Balafoutis and Dionysis Bochtis
Proceedings 2026, 134(1), 47; https://doi.org/10.3390/proceedings2026134047 - 14 Jan 2026
Viewed by 107
Abstract
The increase in global population and climatic instability places unprecedented demands on agricultural productivity. Autonomous robotic systems, specifically unmanned aerial vehicles (UAVs) and unmanned ground vehicles (UGVs), provide potential solutions by enhancing precision viticulture operations. This work presents the experimental evaluation of a heterogeneous robotic fleet of UGVs and UAVs operating autonomously under real-world vineyard conditions. Over the course of a full growing season, the fleet demonstrated effective autonomous navigation, environment sensing, and data acquisition. More than 4 UGV missions and 10 UAV flights were successfully completed, achieving a 95% data acquisition rate and mapping resolution of 2.5 cm/pixel. Vegetation indices and thermal imagery enabled accurate detection of water stress and crop vigor. These capabilities enabled high-resolution mapping and agricultural task execution, contributing significantly to operational efficiency and sustainability in viticulture. Full article

31 pages, 21618 KB  
Article
Cohesion-Based Flocking Formation Using Potential Linked Nodes Model for Multi-Robot Agricultural Swarms
by Kevin Marlon Soza-Mamani, Marcelo Saavedra Alcoba, Felipe Torres and Alvaro Javier Prado-Romo
Agriculture 2026, 16(2), 155; https://doi.org/10.3390/agriculture16020155 - 8 Jan 2026
Viewed by 262
Abstract
Accurately modeling and representing the collective dynamics of large-scale robotic systems remains one of the fundamental challenges in swarm robotics. Within the context of agricultural robotics, swarm-based coordination schemes enable scalable and adaptive control of multi-robot teams performing tasks such as crop monitoring and autonomous field maintenance. This paper introduces a cohesive Potential Linked Nodes (PLNs) framework, an adjustable formation structure that employs Artificial Potential Fields (APFs), and virtual node–link interactions to regulate swarm cohesion and coordinated motion (CM). The proposed model governs swarm formation, modulates structural integrity, and enhances responsiveness to external perturbations. The PLN framework facilitates swarm stability, maintaining high cohesion and adaptability while the system’s tunable parameters enable online adjustment of inter-agent coupling strength and formation rigidity. Comprehensive simulation experiments were conducted to assess the performance of the model under multiple swarm conditions, including static aggregation and dynamic flocking behavior using differential-drive mobile robots. Additional tests within a simulated cropping environment were performed to evaluate the framework’s stability and cohesiveness under agricultural constraints. Swarm cohesion and formation stability were quantitatively analyzed using density-based and inter-robot distance metrics. The experimental results demonstrate that the PLN model effectively maintains formation integrity and cohesive stability throughout all scenarios. Full article
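The node–link potential idea behind such frameworks can be illustrated with a minimal sketch. This is a generic artificial-potential-field spring model with made-up gains and rest length, not the authors' PLN formulation: each virtual link attracts robots that drift beyond a rest distance and repels robots that crowd inside it.

```python
import numpy as np

def apf_link_force(p_i, p_j, rest_length=1.0, k_att=0.8, k_rep=0.5):
    """Spring-like force on robot i from a linked neighbour j:
    attractive beyond the rest length, repulsive inside it.
    (Gains and rest length are illustrative assumptions.)"""
    delta = p_j - p_i
    dist = np.linalg.norm(delta)
    if dist < 1e-9:
        return np.zeros_like(delta)
    direction = delta / dist
    if dist > rest_length:
        return k_att * (dist - rest_length) * direction   # pull together
    return -k_rep * (rest_length - dist) * direction      # push apart

def swarm_step(positions, links, dt=0.1):
    """One explicit-Euler update of all robots under the link forces."""
    forces = np.zeros_like(positions)
    for i, j in links:
        f = apf_link_force(positions[i], positions[j])
        forces[i] += f
        forces[j] -= f
    return positions + dt * forces
```

Tuning `k_att` and `k_rep` online corresponds to the abstract's adjustable coupling strength and formation rigidity.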

21 pages, 19413 KB  
Article
Efficient Real-Time Row Detection and Navigation Using LaneATT for Greenhouse Environments
by Ricardo Navarro Gómez, Joel Milla, Paolo Alfonso Reyes Ramírez, Jesús Arturo Escobedo Cabello and Alfonso Gómez-Espinosa
Agriculture 2026, 16(1), 111; https://doi.org/10.3390/agriculture16010111 - 31 Dec 2025
Viewed by 395
Abstract
This study introduces an efficient real-time lane detection and navigation system for greenhouse environments, leveraging the LaneATT architecture. Designed for deployment on the Jetson Xavier NX edge computing platform, the system utilizes an RGB camera to enable autonomous navigation in greenhouse rows. From real-world agricultural environments, data were collected and annotated to train the model, achieving 90% accuracy, 91% F1 Score, and an inference speed of 48 ms per frame. The LaneATT-based vision system was trained and validated in greenhouse environments under heterogeneous illumination conditions and across multiple phenological stages of crop development. The navigation system was validated using a commercial skid-steering mobile robot operating within an experimental greenhouse environment under actual operating conditions. The proposed solution minimizes computational overhead, making it highly suitable for deployment on edge devices within resource-constrained environments. Furthermore, experimental results demonstrate robust performance, with precise lane detection and rapid response times on embedded systems. Full article
(This article belongs to the Section Artificial Intelligence and Digital Agriculture)
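Turning a detected row line into wheel commands for a skid-steer base can be sketched as a simple proportional law. The abstract does not specify the paper's controller, so the gains, track width, and sign conventions below are assumptions for illustration only.

```python
def skid_steer_speeds(offset_m, heading_err_rad, v=0.4,
                      k_off=1.5, k_head=2.0, half_track=0.25):
    """Map a row-centre lateral offset (m, positive = robot left of centre)
    and heading error (rad) to (left, right) wheel speeds for a
    skid-steer base. All gains and geometry are illustrative."""
    omega = -(k_off * offset_m + k_head * heading_err_rad)  # turn rate, rad/s
    return v - omega * half_track, v + omega * half_track
```

On the row centreline with zero heading error, both wheels run at the cruise speed; any offset or misalignment produces a differential speed that steers the robot back.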

39 pages, 3635 KB  
Review
Application of Navigation Path Planning and Trajectory Tracking Control Methods for Agricultural Robots
by Fan Ye, Feixiang Le, Longfei Cui, Shaobo Han, Jingxing Gao, Junzhe Qu and Xinyu Xue
Agriculture 2026, 16(1), 64; https://doi.org/10.3390/agriculture16010064 - 27 Dec 2025
Viewed by 500
Abstract
Autonomous navigation is a core enabler of smart agriculture, where path planning and trajectory tracking control play essential roles in achieving efficient and precise operations. Path planning determines operational efficiency and coverage completeness, while trajectory tracking directly affects task accuracy and system robustness. This paper presents a systematic review of agricultural robot navigation research published between 2020 and 2025, based on literature retrieved from major databases including Web of Science and EI Compendex (ultimately including 95 papers). Research advances in global planning (coverage and point-to-point), local planning (obstacle avoidance and replanning), multi-robot cooperative planning, and classical, advanced, and learning-based trajectory tracking control methods are comprehensively summarized. Particular attention is given to their application and limitations in typical agricultural scenarios such as open-fields, orchards, greenhouses, and hilly slopes. Despite notable progress, key challenges remain, including limited algorithm comparability, weak cross-scenario generalization, and insufficient long-term validation. To address these issues, a scenario-driven “scenario–constraint–performance” adaptive framework is proposed to systematically align navigation methods with environmental and operational conditions, providing practical guidance for developing scalable and engineering-ready agricultural robot navigation systems. Full article
(This article belongs to the Section Agricultural Technology)

19 pages, 9601 KB  
Article
Lightweight Transformer and Faster Convolution for Efficient Strawberry Detection
by Jieyan Wu, Jinlai Zhang, Liuqi Tan, You Wu and Kai Gao
Appl. Sci. 2026, 16(1), 293; https://doi.org/10.3390/app16010293 - 27 Dec 2025
Viewed by 188
Abstract
The agricultural system faces the formidable challenge of efficiently harvesting strawberries, a labor-intensive process that has long relied on manual labor. The advent of autonomous harvesting robot systems offers a transformative solution, but their success hinges on the accuracy and efficiency of strawberry detection. In this paper, we present DPViT-YOLOV8, a novel approach that leverages advancements in computer vision and deep learning to significantly enhance strawberry detection. DPViT-YOLOV8 integrates the EfficientViT backbone for multi-scale linear attention, the Dynamic Head mechanism for unified object detection heads with attention, and the proposed C2f_Faster module for enhanced computational efficiency into the YOLOV8 architecture. We meticulously curate and annotate a diverse dataset of strawberry images on a farm. A rigorous evaluation demonstrates that DPViT-YOLOV8 outperforms baseline models, achieving superior Mean Average Precision (mAP), precision, and recall. Additionally, an ablation study highlights the individual contributions of each enhancement. Qualitative results showcase the model’s proficiency in locating ripe strawberries in real-world agricultural settings. Notably, DPViT-YOLOV8 maintains computational efficiency, reducing inference time and FLOPS compared to the baseline YOLOV8. Our research bridges the gap between computer vision and agriculture systems, offering a powerful tool to accelerate the adoption of autonomous strawberry harvesting, reduce labor costs, and ensure the sustainability of strawberry farming. Full article
(This article belongs to the Section Agricultural Science and Technology)
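The mAP, precision, and recall figures reported above all rest on matching predicted boxes to ground truth by intersection-over-union. A minimal IoU function (standard definition, not code from the paper) looks like this:

```python
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0
```

A detection typically counts as a true positive when its IoU with a ground-truth strawberry box exceeds a threshold such as 0.5, the basis of mAP@0.5.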

5 pages, 180 KB  
Editorial
Advanced Autonomous Systems and the Artificial Intelligence Stage
by Liviu Marian Ungureanu and Iulian-Sorin Munteanu
Technologies 2026, 14(1), 9; https://doi.org/10.3390/technologies14010009 - 23 Dec 2025
Viewed by 336
Abstract
This Editorial presents an integrative overview of the Special Issue “Advanced Autonomous Systems and Artificial Intelligence Stage”, which assembles fifteen peer-reviewed articles dedicated to the recent evolution of AI-enabled and autonomous systems. The contributions span a broad spectrum of domains, including renewable energy and power systems, intelligent transportation, agricultural robotics, clinical and assistive technologies, mobile robotic platforms, and space robotics. Across these diverse applications, the collection highlights core research themes such as robust perception and navigation, semantic and multimodal sensing, resource-efficient embedded inference, human–machine interaction, sustainable infrastructures, and validation frameworks for safety-critical systems. Several articles demonstrate how physical modeling, hybrid control architectures, deep learning, and data-driven methods can be combined to enhance operational robustness, reliability, and autonomy in real-world environments. Other works address challenges related to fall detection, predictive maintenance, teleoperation safety, and the deployment of intelligent systems in large-scale or mission-critical contexts. Overall, this Special Issue offers a consolidated and rigorous academic synthesis of current advances in Autonomous Systems and Artificial Intelligence, providing researchers and practitioners with a valuable reference for understanding emerging trends, practical implementations, and future research directions. Full article
(This article belongs to the Special Issue Advanced Autonomous Systems and Artificial Intelligence Stage)
19 pages, 8575 KB  
Article
RobotOBchain: Neighbor Observation for Byzantine Detection in Multi-Robot Systems
by Jie Luo, Yumeng Guo, Tiancheng Cao and Wuyang Zhu
Electronics 2025, 14(24), 4815; https://doi.org/10.3390/electronics14244815 - 7 Dec 2025
Viewed by 313
Abstract
Multi-robot systems are increasingly deployed in critical applications such as search and rescue, precision agriculture, and autonomous transportation. However, the presence of Byzantine robots—agents that intentionally transmit false or misleading information—can severely compromise mission success and system safety, highlighting the urgent need for robust fault-tolerant coordination mechanisms. To address the challenge of Byzantine faults in multi-robot systems, we propose a novel approach utilizing a blockchain-based framework, termed RobotOBchain (Robot Observation Blockchain). RobotOBchain permanently records each robot’s own state information and its observed neighboring robots’ states at every time step. By leveraging smart contracts encoded within the blockchain, our method automatically detects state inconsistencies or conflicts among recorded observations, enabling early identification of intentionally deceptive Byzantine robots. Experimental validation demonstrates that RobotOBchain achieves 100% consistent Byzantine identification across all robots, maintains estimation errors within 3% of ground-truth, and exhibits robust tolerance to up to 50% malicious agents. These results significantly surpass the performance of classical W-MSR algorithms, while eliminating the dependency on predefined fault bounds. The framework’s demonstrated capabilities indicate strong potential for practical deployment in dynamic and safety-critical multi-robot applications. Full article
(This article belongs to the Special Issue Coordination and Communication of Multi-Robot Systems)
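The core consistency check can be sketched without the blockchain machinery: compare each robot's self-reported state against what its neighbours observed, and flag robots whose reports disagree with the neighbour consensus. The tolerance, data shapes, and median aggregation below are assumptions for illustration, not the paper's smart-contract logic.

```python
from statistics import median

def flag_byzantine(self_reports, observations, tol=0.05):
    """self_reports: {robot_id: reported_value}.
    observations: list of (observer_id, target_id, observed_value).
    Flags a robot when the median of its neighbours' observations
    deviates from its own report by more than a relative tolerance."""
    seen = {}
    for _, target, value in observations:
        seen.setdefault(target, []).append(value)
    flagged = set()
    for robot, reported in self_reports.items():
        obs = seen.get(robot)
        if not obs:
            continue  # no neighbour ever observed this robot
        m = median(obs)
        if abs(m - reported) > tol * max(abs(m), 1e-9):
            flagged.add(robot)
    return flagged
```

In the paper the records are immutable blockchain entries and the check runs inside smart contracts, which is what makes the detection tamper-evident and consistent across all honest robots.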

26 pages, 20055 KB  
Article
Design and Development of a Neural Network-Based End-Effector for Disease Detection in Plants with 7-DOF Robot Integration
by Harol Toro, Hector Moncada, Kristhian Dierik Gonzales, Cristian Moreno, Claudia L. Garzón-Castro and Jose Luis Ordoñez-Avila
Processes 2025, 13(12), 3934; https://doi.org/10.3390/pr13123934 - 5 Dec 2025
Viewed by 504
Abstract
This study presents the design and development of an intelligent end-effector integrated into a custom 7-degree-of-freedom (DOF) robotic arm for monitoring the health status of tomato plants during their growth stages. The robotic system combines five rotational and two prismatic joints, enabling both horizontal reach and vertical adaptability to inspect plants of varying heights without repositioning the robot’s base. The integrated vision module employs a YOLOv5 neural network trained with 7864 images of tomato leaves, including both healthy and diseased samples. Image preprocessing included normalization and data augmentation to enhance robustness under natural lighting conditions. The optimized model achieved a detection accuracy of 90.2% and a mean average precision (mAP) of 92.3%, demonstrating high reliability in real-time disease classification. The end-effector, fabricated using additive manufacturing, incorporates a Raspberry Pi 4 for onboard processing, allowing autonomous operation in agricultural environments. The experimental results validate the feasibility of combining a custom 7-DOF robotic structure with a deep learning-based detector for continuous plant monitoring. This research contributes to the field of agricultural robotics by providing a flexible and precise platform capable of early disease detection in dynamic cultivation conditions, promoting sustainable and data-driven crop management. Full article

30 pages, 7942 KB  
Article
Research on Agricultural Autonomous Positioning and Navigation System Based on LIO-SAM and Apriltag Fusion
by Xianping Guan, Hongrui Ge, Shicheng Nie and Yuhan Ding
Agronomy 2025, 15(12), 2731; https://doi.org/10.3390/agronomy15122731 - 27 Nov 2025
Viewed by 924
Abstract
The application of autonomous navigation in intelligent agriculture is becoming more and more extensive. Traditional navigation schemes in greenhouses, orchards, and other agricultural environments often have problems such as the inability to deal with an uneven illumination distribution, complex layout, highly repetitive and similar structures, and difficulty in receiving GNSS (Global Navigation Satellite System) signals. In order to solve this problem, this paper proposes a new tightly coupled LiDAR (Light Detection and Ranging) inertial odometry SLAM (LIO-SAM) framework named April-LIO-SAM. The framework innovatively uses Apriltag, a two-dimensional bar code widely used for precise positioning, pose estimation, and scene recognition of objects, as a global positioning beacon to replace GNSS and provide absolute pose observation. The system uses three-dimensional LiDAR (VLP-16) and an IMU (inertial measurement unit) to collect environmental data and uses Apriltag as absolute coordinates instead of GNSS to solve the problem of unreliable GNSS signal reception in greenhouses, orchards, and other agricultural environments. The SLAM trajectories and navigation performance were validated in a carefully built greenhouse and orchard environment. The experimental results show that the navigation map developed by April-LIO-SAM yields a root mean square error of 0.057 m. The average positioning errors are 0.041 m, 0.049 m, 0.056 m, and 0.070 m, respectively, at Apriltag spacings of 3 m, 5 m, and 7 m. The navigation experimental results indicate that, at speeds of 0.4, 0.3, and 0.2 m/s, the average lateral deviation is less than 0.053 m, with a standard deviation below 0.034 m. The average heading deviation is less than 2.3°, with a standard deviation below 1.6°. Positioning stability experiments under interference conditions such as illumination changes and occlusion were also carried out. They verified that the system maintained good stability under complex external conditions, with positioning error fluctuations within 3.0 mm. The results confirm that the positioning and navigation accuracy of the mobile robot supports continuous operation in facility environments. Full article
(This article belongs to the Special Issue Research Progress in Agricultural Robots in Arable Farming)
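The 0.057 m figure above is the standard root-mean-square error between the estimated and reference trajectories. A minimal sketch of the metric (standard definition, assuming already-matched 2-D points):

```python
import math

def trajectory_rmse(estimated, reference):
    """Root mean square error between matched 2-D trajectory points."""
    assert len(estimated) == len(reference)
    sq = [(ex - rx) ** 2 + (ey - ry) ** 2
          for (ex, ey), (rx, ry) in zip(estimated, reference)]
    return math.sqrt(sum(sq) / len(sq))
```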

21 pages, 5173 KB  
Article
EdgeFormer-YOLO: A Lightweight Multi-Attention Framework for Real-Time Red-Fruit Detection in Complex Orchard Environments
by Zhiyuan Xu, Tianjun Luo, Yinyi Lai, Yuheng Liu and Wenbin Kang
Mathematics 2025, 13(23), 3790; https://doi.org/10.3390/math13233790 - 26 Nov 2025
Cited by 1 | Viewed by 552
Abstract
Accurate and efficient detection of red fruits in complex orchard environments is crucial for the autonomous operation of agricultural harvesting robots. However, existing methods still face challenges such as high false negative rates, poor localization accuracy, and difficulties in edge deployment in real-world scenarios involving occlusion, strong light reflection, and drastic scale changes. To address these issues, this paper proposes a lightweight multi-attention detection framework, EdgeFormer-YOLO. While maintaining the efficiency of the YOLO series’ single-stage detection architecture, it introduces a multi-head self-attention mechanism (MHSA) to enhance the global modeling capability for occluded fruits and employs a hierarchical feature fusion strategy to improve multi-scale detection robustness. To further adapt to the quantitative deployment requirements of edge devices, the model introduces the arsinh activation function, improving numerical stability and convergence speed while maintaining a non-zero gradient. On the red fruit dataset, EdgeFormer-YOLO achieves 95.7% mAP@0.5, a 2.2 percentage point improvement over the YOLOv8n baseline, while maintaining 90.0% precision and 92.5% recall. Furthermore, on the edge GPU, the model achieves an inference speed of 148.78 FPS with a size of 6.35 MB, 3.21 M parameters, and a computational overhead of 4.18 GFLOPs, outperforming some existing mainstream lightweight YOLO variants in both speed and mAP@50. Experimental results demonstrate that EdgeFormer-YOLO possesses comprehensive advantages in real-time performance, robustness, and deployment feasibility in complex orchard environments, providing a viable technical path for agricultural robot vision systems. Full article
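The arsinh activation cited above has a closed form and an everywhere non-zero gradient, the property the abstract credits for numerical stability under quantized deployment. A minimal sketch of the function and its derivative (standard identities, not code from the paper):

```python
import math

def arsinh(x):
    """Inverse hyperbolic sine: log(x + sqrt(x^2 + 1)).
    Smooth, odd, and unbounded, unlike saturating activations."""
    return math.log(x + math.sqrt(x * x + 1.0))

def arsinh_grad(x):
    """Derivative 1 / sqrt(x^2 + 1): strictly positive for all x,
    so gradients never vanish exactly."""
    return 1.0 / math.sqrt(x * x + 1.0)
```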

42 pages, 68297 KB  
Review
AI-Driven Cooperative Control for Autonomous Tractors and Implements: A Comprehensive Review
by Hongjie Jia, Weipeng Chen, Zhihao Su, Yaozu Sun, Zhengpeng Qian and Longxia Huang
AgriEngineering 2025, 7(11), 394; https://doi.org/10.3390/agriengineering7110394 - 20 Nov 2025
Viewed by 1646
Abstract
Artificial intelligence (AI) is driving the evolution of autonomous agriculture towards multi-agent collaborative control, breaking through the limitations of traditional isolated automation. Although existing research has focused on hierarchical control and perception-decision-making technologies for agricultural machinery, the overall integration of these elements in building a resilient physical perception collaborative system is still insufficient. This paper systematically reviews the progress of AI-driven tractor-implement cooperative control from 2018 to 2025, focusing on four major technical pillars: (1) perception-decision-execution hierarchical architecture, (2) distributed multi-agent collaborative framework, (3) physical perception modeling and adaptive control, and (4) staged operation applications (such as collaborative harvesting). The research reveals core challenges such as real-time collaborative planning, perception robustness under environmental disturbances, and collaborative control and safety assurance under operational disturbances. To this end, three solutions are proposed: an AI framework for formalizing agronomic constraints and mechanical dynamics; a disturbance-resistant adaptive tractor-implement cooperative control strategy; and a real-time collaborative ecosystem integrating neuromorphic computing and FarmOS. Finally, a research roadmap is summarized with agronomic constraint reinforcement learning, self-reconfigurable collaboration, and biomechanical mechatronic systems as the core. By integrating the scattered progress in AI, robotics and agronomy, we provide theoretical foundation and practical guidance for scalable and sustainable autonomous farm systems. Full article

5587 KB  
Proceeding Paper
Towards Autonomous Raised Bed Flower Pollination with IoT and Robotics
by Rusira Thamuditha Karunarathna, Chathupa Wickramarathne, Mohamed Akmal Mohamed Alavi, Chamath Shanaka Wickrama Arachchi, Kapila Dissanayaka, Bhagya Nathali Silva and Ruchire Eranga Wijesinghe
Eng. Proc. 2025, 118(1), 55; https://doi.org/10.3390/ECSA-12-26572 - 7 Nov 2025
Viewed by 194
Abstract
Strawberries, a high-value crop with growing demand, face increasing challenges from labour shortages, declining pollinator populations, and the limitations of inconsistent manual pollination. This paper presents an IoT-enabled robotic system designed to automate strawberry pollination in open-field raised-bed environments with minimal human intervention. The system consists of a mobile rover equipped with an ESP32-CAM for image capture and a robotic arm mounted on an Arduino Uno, capable of controlled X, Y, and Z positioning to perform targeted pollination. Images of strawberry beds are transmitted to a locally deployed server, which uses a lightweight detection model to identify flowers. System components communicate asynchronously via HTTP and I2C protocols, and the onboard event-driven architecture enables responsive behaviour while minimizing RAM and power usage, which is an essential requirement for low-cost, field-deployable robotics. The server also manages multi-rover scheduling through a custom priority queue designed for low-end hardware. In controlled load tests, the scheduler improved average response time by 6.9% and handled 2.4% more requests compared to the default queueing system, while maintaining stability. Preliminary field tests demonstrate successful flower identification and reliable arm positioning under real-world conditions. Although full system yield measurements are ongoing, current results validate the core design’s functional feasibility. Unlike previous systems that focus on greenhouse deployments or simpler navigation approaches, this work emphasizes modularity, affordability, and adaptability for small and medium farms, particularly in resource-constrained agricultural regions such as Sri Lanka. This study presents a promising step toward autonomous and scalable pollination systems that integrate embedded systems, robotics, and IoT for practical use in precision agriculture. Full article
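The multi-rover scheduling idea can be sketched with a heap-backed priority queue. The paper's custom queue for low-end hardware is not specified in the abstract, so the class below is a generic illustration: lower priority values are served first, with FIFO order among equal priorities.

```python
import heapq
import itertools

class RoverScheduler:
    """Minimal priority-queue scheduler for multi-rover requests.
    Illustrative only; the paper's custom queue is not described here."""

    def __init__(self):
        self._heap = []
        self._tie = itertools.count()  # breaks ties FIFO among equal priorities

    def submit(self, priority, request):
        heapq.heappush(self._heap, (priority, next(self._tie), request))

    def next_request(self):
        return heapq.heappop(self._heap)[2] if self._heap else None
```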

17 pages, 1288 KB  
Article
Effects of Staggered Application of Chemical Defoliants on Cotton Fiber Quality
by Aashish Karki, Michael W. Marshall, Gilbert Miller, Van Patiluna, Jun Luo, Edward Barnes and Joe Mari Maja
AgriEngineering 2025, 7(11), 372; https://doi.org/10.3390/agriengineering7110372 - 4 Nov 2025
Viewed by 673
Abstract
Chemical defoliation is an important management practice in cotton to facilitate mechanical harvesting and leaf removal and maintain lint quality. Recent advances in precision agriculture have enabled the development of autonomous robotic platforms with a targeted side-spraying system that can achieve good canopy penetration while preventing soil compaction and crop mechanical damage. A side-wise spraying system allows for application of defoliant at different canopy heights. However, information on the effects of staggered defoliation on cotton fiber quality is limited. Thus, field research was conducted to evaluate the effects of various staggered application timing intervals (15, 10, 8, 5, and 3 days) on fiber quality and compare them with standard over-the-top broadcast applications. Staggered defoliation affected fiber length, with significant differences observed for upper half mean length, fiber length based on weight, and upper quartile length. Fiber maturity was also influenced by staggered defoliation timing, with a 15-day interval resulting in the lowest micronaire and higher immature fiber content. The effects of staggered defoliation on other parameters, such as strength, uniformity, and trash characteristics, varied across locations. The findings highlight the potential of robotic systems for chemical spraying and emphasize the need for further research on more precise and targeted application of defoliants to improve fiber quality. Full article

19 pages, 3577 KB  
Article
Orchard Robot Navigation via an Improved RTAB-Map Algorithm
by Jinxing Niu, Le Zhang, Tao Zhang, Jinpeng Guan and Shuheng Shi
Appl. Sci. 2025, 15(21), 11673; https://doi.org/10.3390/app152111673 - 31 Oct 2025
Viewed by 1372
Abstract
To address issues such as low visual SLAM (Simultaneous Localization and Mapping) positioning accuracy and poor map construction robustness caused by light variations, foliage occlusion, and texture repetition in unstructured orchard environments, this paper proposes an orchard robot navigation method based on an improved RTAB-Map algorithm. By integrating ORB-SLAM3 as the visual odometry module within the RTAB-Map framework, the system achieves significantly improved accuracy and stability in pose estimation. During the post-processing stage of map generation, a height filtering strategy is proposed to effectively filter out low-hanging branch point clouds, thereby generating raster maps that better meet navigation requirements. The navigation layer integrates the ROS (Robot Operating System) Navigation framework, employing the A* algorithm for global path planning while incorporating the TEB (Timed Elastic Band) algorithm to achieve real-time local obstacle avoidance and dynamic adjustment. Experimental results demonstrate that the improved system exhibits higher mapping consistency in simulated orchard environments, with the odometry’s absolute trajectory error reduced by approximately 45.5%. The robot can reliably plan paths and traverse areas with low-hanging branches. This study provides a solution for autonomous navigation in agricultural settings that balances precision with practicality. Full article
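The global planner named above is the classical A* search over an occupancy grid. A compact, generic version (4-connected grid, Manhattan heuristic; not the paper's ROS implementation) can be sketched as:

```python
import heapq
import itertools

def astar(grid, start, goal):
    """A* over a 4-connected occupancy grid (1 = obstacle, 0 = free).
    Returns the path as a list of (row, col) cells, or None if blocked."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan
    tie = itertools.count()  # heap tie-breaker, keeps entries comparable
    open_set = [(h(start), next(tie), start, None)]
    came = {}               # cell -> parent, filled when a cell is settled
    g_cost = {start: 0}
    while open_set:
        _, _, cur, parent = heapq.heappop(open_set)
        if cur in came:
            continue        # already settled with a cheaper cost
        came[cur] = parent
        if cur == goal:
            path = [cur]
            while came[cur] is not None:
                cur = came[cur]
                path.append(cur)
            return path[::-1]
        r, c = cur
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and not grid[nr][nc]:
                ng = g_cost[cur] + 1
                if ng < g_cost.get(nxt, float("inf")):
                    g_cost[nxt] = ng
                    heapq.heappush(open_set, (ng + h(nxt), next(tie), nxt, cur))
    return None
```

In the system described, a plan like this runs once over the global raster map, while the TEB local planner continuously deforms the path around newly sensed obstacles such as low-hanging branches.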
