Search Results (113)

Search Parameters:
Keywords = robotic weeding

24 pages, 17213 KiB  
Review
Empowering Smart Soybean Farming with Deep Learning: Progress, Challenges, and Future Perspectives
by Huihui Sun, Hao-Qi Chu, Yi-Ming Qin, Pingfan Hu and Rui-Feng Wang
Agronomy 2025, 15(8), 1831; https://doi.org/10.3390/agronomy15081831 - 28 Jul 2025
Viewed by 228
Abstract
This review comprehensively examines the application of deep learning technologies across the entire soybean production chain, encompassing areas such as disease and pest identification, weed detection, crop phenotype recognition, yield prediction, and intelligent operations. By systematically analyzing mainstream deep learning models, optimization strategies (e.g., model lightweighting, transfer learning), and sensor data fusion techniques, the review identifies their roles and performances in complex agricultural environments. It also highlights key challenges including data quality limitations, difficulties in real-world deployment, and the lack of standardized evaluation benchmarks. In response, promising directions such as reinforcement learning, self-supervised learning, interpretable AI, and multi-source data fusion are proposed. Specifically for soybean automation, future advancements are expected in areas such as high-precision disease and weed localization, real-time decision-making for variable-rate spraying and harvesting, and the integration of deep learning with robotics and edge computing to enable autonomous field operations. This review provides valuable insights and future prospects for promoting intelligent, efficient, and sustainable development in soybean production through deep learning.
(This article belongs to the Section Precision and Digital Agriculture)

35 pages, 6030 KiB  
Review
Common Ragweed—Ambrosia artemisiifolia L.: A Review with Special Regards to the Latest Results in Protection Methods, Herbicide Resistance, New Tools and Methods
by Bence Knolmajer, Ildikó Jócsák, János Taller, Sándor Keszthelyi and Gabriella Kazinczi
Agronomy 2025, 15(8), 1765; https://doi.org/10.3390/agronomy15081765 - 23 Jul 2025
Viewed by 360
Abstract
Common ragweed (Ambrosia artemisiifolia L.) has been identified as one of the most harmful invasive weed species in Europe due to its allergenic pollen and competitive growth in diverse habitats. In the first part of this review [Common Ragweed—Ambrosia artemisiifolia L.: A Review with Special Regards to the Latest Results in Biology and Ecology], its biological characteristics and ecological behavior were described in detail. In the current paper, control strategies are summarized, focusing on integrated weed management adapted to the specific habitat where the species causes damage—arable land, semi-natural vegetation, urban areas, or along linear infrastructures. A range of management methods is reviewed, including agrotechnical, mechanical, physical, thermal, biological, and chemical approaches. Particular attention is given to the spread of herbicide resistance and the need for diversified, habitat-specific interventions. Among biological control options, the potential of Ophraella communa LeSage, a leaf beetle native to North America, is highlighted. Furthermore, innovative technologies such as UAV-assisted weed mapping, site-specific herbicide application, and autonomous weeding robots are discussed as environmentally sustainable tools. The role of legal regulations and pollen monitoring networks—particularly those implemented in Hungary—is also emphasized. By combining traditional and advanced methods within a coordinated framework, effective and ecologically sound ragweed control can be achieved.
(This article belongs to the Section Weed Science and Weed Management)

22 pages, 9981 KiB  
Article
Design and Experiment of Autonomous Shield-Cutting End-Effector for Dual-Zone Maize Field Weeding
by Yunxiang Li, Yinsong Qu, Yuan Fang, Jie Yang and Yanfeng Lu
Agriculture 2025, 15(14), 1549; https://doi.org/10.3390/agriculture15141549 - 18 Jul 2025
Viewed by 262
Abstract
This study presented an autonomous shield-cutting end-effector for maize surrounding weeding (SEMSW), addressing the challenges of the low weed removal rate (WRR) and high seedling damage rate (SDR) in northern China's 3–5 leaf stage maize. The SEMSW integrated seedling positioning, robotic arm control, and precision weeding functionalities: a seedling positioning sensor identified maize seedlings and weeds, guiding XYZ translational motions to align the robotic arm. The seedling-shielding anti-cutting mechanism (SAM) enclosed crop stems, while the contour-adaptive weeding mechanism (CWM) activated two-stage retractable blades (TRWBs) for inter/intra-row weeding operations. The following key design parameters were determined: 150 mm inner diameter for the seedling-shielding disc; 30 mm minimum inscribed-circle for retractable clamping units (RCUs); 40 mm ground clearance for SAM; 170 mm shielding height; and 100 mm minimum inscribed-circle diameter for the TRWB. Mathematical optimization defined the shape-following weeding cam (SWC) contour and TRWB dimensional chain. Kinematic/dynamic models were introduced alongside an adaptive sliding mode controller, ensuring lateral translation error convergence. A YOLOv8 model achieved 0.951 precision, 0.95 mAP50, and 0.819 mAP50-95, striking a balance between detection accuracy and localization precision. Field trials of the prototype showed 88.3% WRR and 2.2% SDR, meeting northern China's agronomic standards.

28 pages, 8982 KiB  
Article
Decision-Level Multi-Sensor Fusion to Improve Limitations of Single-Camera-Based CNN Classification in Precision Farming: Application in Weed Detection
by Md. Nazmuzzaman Khan, Adibuzzaman Rahi, Mohammad Al Hasan and Sohel Anwar
Computation 2025, 13(7), 174; https://doi.org/10.3390/computation13070174 - 18 Jul 2025
Viewed by 276
Abstract
The United States is the world's leading producer and consumer of corn, a crop worth an estimated USD 50 billion per year. There is a pressing need for novel and efficient techniques that enhance the identification and eradication of weeds in a manner that is both environmentally sustainable and economically advantageous. Weed classification for autonomous agricultural robots is a challenging task for a single-camera-based system due to noise, vibration, and occlusion. To address this issue, this paper presents a multi-camera system with decision-level sensor fusion that overcomes the limitations of a single-camera setup. A convolutional neural network (CNN) pre-trained on the ImageNet dataset was re-trained on a limited weed dataset to classify three weed species frequently encountered in corn fields: Xanthium strumarium (Common Cocklebur), Amaranthus retroflexus (Redroot Pigweed), and Ambrosia trifida (Giant Ragweed). The test results showed that the re-trained VGG16 with a transfer-learning-based classifier exhibited acceptable accuracy (99% training, 97% validation, 94% testing accuracy), and its inference time for weed classification from the video feed was suitable for real-time implementation. However, classification accuracy from a single-camera video feed deteriorated under noise, vibration, and partial occlusion of weeds, making it insufficiently reliable for the spray system of an agricultural robot (AgBot). To improve the accuracy of the weed classification system and overcome these shortcomings of single-sensor CNN classification, an improved Dempster–Shafer (DS)-based decision-level multi-sensor fusion algorithm was developed and implemented. The proposed algorithm improves CNN-based weed classification when the weed is partially occluded. It can also detect a faulty sensor within an array of sensors and improves overall classification accuracy by penalizing the evidence from the faulty sensor. Overall, the proposed fusion algorithm showed robust results in challenging scenarios, overcoming the limitations of a single-sensor-based system.
(This article belongs to the Special Issue Moving Object Detection Using Computational Methods and Modeling)
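The decision-level fusion step described above rests on Dempster's rule of combination, which can be sketched in a few lines. The three weed classes are from the paper, but the mass values and the restriction to singleton hypotheses are illustrative assumptions; the authors' improved algorithm additionally penalizes evidence from faulty sensors, which is omitted here.

```python
# Minimal sketch of Dempster's rule of combination for two mass
# functions over a frame of discernment (singleton hypotheses only).

def combine(m1, m2):
    """Fuse two basic probability assignments (dicts: hypothesis -> mass)."""
    conflict = 0.0
    fused = {}
    for h1, v1 in m1.items():
        for h2, v2 in m2.items():
            if h1 == h2:
                fused[h1] = fused.get(h1, 0.0) + v1 * v2
            else:
                conflict += v1 * v2  # mass on incompatible hypotheses
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be fused")
    k = 1.0 - conflict  # normalization constant
    return {h: v / k for h, v in fused.items()}

# Two cameras reporting (assumed) beliefs over the three weed classes:
cam1 = {"cocklebur": 0.7, "pigweed": 0.2, "ragweed": 0.1}
cam2 = {"cocklebur": 0.6, "pigweed": 0.3, "ragweed": 0.1}
fused = combine(cam1, cam2)
best = max(fused, key=fused.get)
```

Note how agreement between sources concentrates mass: neither camera gives cocklebur more than 0.7, but the fused belief exceeds 0.85.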

28 pages, 5813 KiB  
Article
YOLO-SW: A Real-Time Weed Detection Model for Soybean Fields Using Swin Transformer and RT-DETR
by Yizhou Shuai, Jingsha Shi, Yi Li, Shaohao Zhou, Lihua Zhang and Jiong Mu
Agronomy 2025, 15(7), 1712; https://doi.org/10.3390/agronomy15071712 - 16 Jul 2025
Cited by 1 | Viewed by 426
Abstract
Accurate weed detection in soybean fields is essential for enhancing crop yield and reducing herbicide usage. This study proposes YOLO-SW, an improved version of YOLOv8, to address the challenge of detecting weeds that closely resemble the background in natural environments. The research integrates three key advancements: the Swin Transformer backbone, which leverages local window self-attention to achieve linear O(N) computational complexity for efficient global context capture; the CARAFE dynamic upsampling operator, which enhances small-target localization through context-aware kernel generation; and the RT-DETR encoder, which enables end-to-end detection via IoU-aware query selection, eliminating the need for complex post-processing. Additionally, a dataset of six common soybean weeds was expanded to 12,500 images through simulated fog, rain, and snow augmentation, effectively resolving data imbalance and boosting model robustness. The experimental results highlight both technical superiority and practical relevance: YOLO-SW achieves 92.3% mAP@50 (3.8% higher than YOLOv8), with recognition accuracy and recall improvements of 4.2% and 3.9%, respectively. Critically, on the NVIDIA Jetson AGX Orin platform, it delivers a real-time inference speed of 59 FPS, making it suitable for deployment on intelligent weeding robots. This low-power, high-precision solution bridges the gap between deep learning and precision agriculture and enables targeted herbicide application, contributing directly to sustainable farming practices and environmental protection.

39 pages, 14267 KiB  
Review
Smart Precision Weeding in Agriculture Using 5IR Technologies
by Chaw Thiri San and Vijay Kakani
Electronics 2025, 14(13), 2517; https://doi.org/10.3390/electronics14132517 - 20 Jun 2025
Viewed by 722
Abstract
The rise of smart precision weeding driven by Fifth Industrial Revolution (5IR) technologies represents a major leap for sustainable agriculture. Modern weeding systems are becoming more efficient, intelligently autonomous, and environmentally responsible through the introduction of artificial intelligence (AI), robotics, the Internet of Things (IoT), 5G connectivity, and edge computing. This review provides a comprehensive analysis of traditional and contemporary weeding techniques, focusing on the technological innovations paving the way for smart systems. Primarily, this work investigates the application of 5IR technologies in weed detection and decision-making, with particular emphasis on AI-driven models, drone-robot integration, GPS-guided practices, and intelligent sensor networks. Additionally, the work outlines key commercial solutions, sustainability metrics, data-driven decision support systems, and blockchain-traceable practices. Prominent challenges for global agricultural equity, including cost, scalability, policy alignment, and adoption barriers in low-resource environments, are also discussed. The paper concludes with strategic recommendations and future research directions, highlighting the potential of 5IR technologies for smart precision weeding.

33 pages, 2741 KiB  
Review
Deep Learning in Multimodal Fusion for Sustainable Plant Care: A Comprehensive Review
by Zhi-Xiang Yang, Yusi Li, Rui-Feng Wang, Pingfan Hu and Wen-Hao Su
Sustainability 2025, 17(12), 5255; https://doi.org/10.3390/su17125255 - 6 Jun 2025
Cited by 6 | Viewed by 992
Abstract
With the advancement of Agriculture 4.0 and the ongoing transition toward sustainable and intelligent agricultural systems, deep learning-based multimodal fusion technologies have emerged as a driving force for crop monitoring, plant management, and resource conservation. This article systematically reviews research progress from three perspectives: technical frameworks, application scenarios, and sustainability-driven challenges. At the technical framework level, it outlines an integrated system encompassing data acquisition, feature fusion, and decision optimization, thereby covering the full pipeline of perception, analysis, and decision making essential for sustainable practices. Regarding application scenarios, it focuses on three major tasks—disease diagnosis, maturity and yield prediction, and weed identification—evaluating how deep learning-driven multisource data integration enhances precision and efficiency in sustainable farming operations. It further discusses the efficient translation of detection outcomes into eco-friendly field practices through agricultural navigation systems, harvesting and plant protection robots, and intelligent resource management strategies based on feedback-driven monitoring. In addressing challenges and future directions, the article highlights key bottlenecks such as data heterogeneity, real-time processing limitations, and insufficient model generalization, and proposes potential solutions including cross-modal generative models and federated learning to support more resilient, sustainable agricultural systems. This work offers a comprehensive three-dimensional analysis across technology, application, and sustainability challenges, providing theoretical insights and practical guidance for the intelligent and sustainable transformation of modern agriculture through multimodal fusion.

28 pages, 962 KiB  
Review
Precision Weeding in Agriculture: A Comprehensive Review of Intelligent Laser Robots Leveraging Deep Learning Techniques
by Chengming Wang, Caixia Song, Tong Xu and Runze Jiang
Agriculture 2025, 15(11), 1213; https://doi.org/10.3390/agriculture15111213 - 1 Jun 2025
Viewed by 1146
Abstract
With the advancement of modern agriculture, intelligent laser robots driven by deep learning have emerged as an effective solution to address the limitations of traditional weeding methods. These robots offer precise and efficient weed control, crucial for boosting agricultural productivity. This paper provides a comprehensive review of recent research on laser weeding applications using intelligent robots. Firstly, we introduce the content analysis method employed to organize the reviewed literature. Subsequently, we present the workflow of weeding systems, emphasizing key technologies such as the perception, decision-making, and execution layers. A detailed discussion follows on the application of deep learning algorithms, including Convolutional Neural Networks (CNNs), YOLO, and Faster R-CNN, in weed control. Here, we show that these algorithms can achieve high accuracy in weed detection, with YOLO demonstrating particularly fast and accurate performance. Furthermore, we analyze the challenges and open problems associated with deep learning detection systems and explore future trends in this research field. By summarizing the role of intelligent laser robots powered by deep learning, we aim to provide insights for researchers and practitioners in agriculture, fostering further innovation and development in this promising area.
(This article belongs to the Section Artificial Intelligence and Digital Agriculture)

40 pages, 3280 KiB  
Review
Precision Weed Control Using Unmanned Aerial Vehicles and Robots: Assessing Feasibility, Bottlenecks, and Recommendations for Scaling
by Shanmugam Vijayakumar, Palanisamy Shanmugapriya, Pasoubady Saravanane, Thanakkan Ramesh, Varunseelan Murugaiyan and Selvaraj Ilakkiya
NDT 2025, 3(2), 10; https://doi.org/10.3390/ndt3020010 - 16 May 2025
Viewed by 2060
Abstract
Weeds cause significant yield and economic losses by competing with crops and increasing production costs. Compounding these challenges are labor shortages, herbicide resistance, and environmental pollution, making weed management increasingly difficult. In response, precision weed control (PWC) technologies, such as robots and unmanned aerial vehicles (UAVs), have emerged as innovative solutions. These tools offer farmers high precision (±1 cm spatial accuracy), enabling efficient and sustainable weed management. Herbicide spraying robots, mechanical weeding robots, and laser-based weeders are deployed on large-scale farms in developed countries. Similarly, UAVs are gaining popularity in many countries, particularly in Asia, for weed monitoring and herbicide application. Despite advancements in robotic and UAV weed control, their large-scale adoption remains limited. The reasons for this slow uptake and the barriers to widespread implementation are not fully understood. To address this knowledge gap, our review analyzes 155 articles and provides a comprehensive understanding of PWC challenges and needed interventions for scaling. This review revealed that AI-driven weed mapping in robots and UAVs struggles with data (quality, diversity, bias) and technical (computation, deployment, cost) barriers. Improved data (collection, processing, synthesis, bias mitigation) and efficient, affordable technology (edge/hybrid computing, lightweight algorithms, centralized computing resources, energy-efficient hardware) are required to improve AI-driven weed mapping adoption. Specifically, robotic weed control adoption is hindered by challenges in weed recognition, navigation complexity, limited battery life, data management (connectivity), fragmented farms, high costs, and limited digital literacy. Scaling requires advancements in weed detection and energy efficiency, development of affordable robots with shared service models, enhanced farmer training, improved rural connectivity, and precise engineering solutions. Similarly, UAV adoption in agriculture faces hurdles such as regulations (permits), limited payload and battery life, weather dependency, spray drift, sensor accuracy, lack of skilled operators, high initial and operational costs, and absence of standardized protocol. Scaling requires financing (subsidies, loans), favorable regulations (streamlined permits, online training), infrastructure development (service providers, hiring centers), technological innovation (interchangeable sensors, multipurpose UAVs), and capacity building (farmer training programs, awareness initiatives).

21 pages, 14425 KiB  
Review
Progress and Challenges in Research on Key Technologies for Laser Weed Control Robot-to-Target System
by Rui Lu, Daode Zhang, Siqi Wang and Xinyu Hu
Agronomy 2025, 15(5), 1015; https://doi.org/10.3390/agronomy15051015 - 23 Apr 2025
Viewed by 1038
Abstract
The development of precise and sustainable agriculture has made non-chemical, highly selective laser weed control technology a hot research topic. The core of this technology lies in the targeting system, which consists of three interrelated key technologies: target identification, dynamic positioning, and precise removal, which jointly determine the overall performance of the weed control system. In this paper, the key technologies of the targeting system are systematically analyzed to clarify the coupling relationships among them and their roles in performance optimization. This review systematically compares the mainstream recognition algorithms against the needs of laser weeding for specific plant parts, reveals the performance bottlenecks of existing algorithms in the laser weeding environment, and points out new research directions, such as developing algorithms that recognize the weed apical growth zone. The influence of laser beam control technology on weeding accuracy is analyzed, the advantages of vibroseis technology are explored, and the applicability problems of existing vibroseis technology in farmland environments are revealed, such as the shift of the irradiation point caused by ground undulation. The key laws of laser parameter optimization are summarized, guiding the optimal design of the system. Through a systematic summary and in-depth analysis of related research, this review reveals the key challenges facing the development of laser weeding technology and provides a prospective outlook on future research directions, aiming to promote the development of laser weed control technology toward high efficiency, precision, and intelligence.

30 pages, 24057 KiB  
Article
Enhancing Autonomous Orchard Navigation: A Real-Time Convolutional Neural Network-Based Obstacle Classification System for Distinguishing ‘Real’ and ‘Fake’ Obstacles in Agricultural Robotics
by Tabinda Naz Syed, Jun Zhou, Imran Ali Lakhiar, Francesco Marinello, Tamiru Tesfaye Gemechu, Luke Toroitich Rottok and Zhizhen Jiang
Agriculture 2025, 15(8), 827; https://doi.org/10.3390/agriculture15080827 - 10 Apr 2025
Cited by 3 | Viewed by 920
Abstract
Autonomous navigation in agricultural environments requires precise obstacle classification to ensure collision-free movement. This study proposes a convolutional neural network (CNN)-based model designed to enhance obstacle classification for agricultural robots, particularly in orchards. Building upon a previously developed YOLOv8n-based real-time detection system, the model incorporates Ghost Modules and Squeeze-and-Excitation (SE) blocks to enhance feature extraction while maintaining computational efficiency. Obstacles are categorized as "Real"—those that physically impact navigation, such as tree trunks and persons—and "Fake"—those that do not, such as tall weeds and tree branches—allowing for precise navigation decisions. The model was trained on separate orchard and campus datasets, fine-tuned using Hyperband optimization, and evaluated on an external test set to assess generalization to unseen obstacles. The model's robustness was tested under varied lighting conditions, including low-light scenarios, to ensure real-world applicability. Computational efficiency was analyzed based on inference speed, memory consumption, and hardware requirements. Comparative analysis against state-of-the-art classification models (VGG16, ResNet50, MobileNetV3, DenseNet121, EfficientNetB0, and InceptionV3) confirmed the proposed model's superior precision (p), recall (r), and F1-score, particularly in complex orchard scenarios. The model maintained strong generalization across diverse environmental conditions, including varying illumination and previously unseen obstacles. Furthermore, computational analysis revealed that the orchard-combined model achieved the highest inference speed at 2.31 FPS while maintaining a strong balance between accuracy and efficiency. When deployed in real-time, the model achieved 95.0% classification accuracy in orchards and 92.0% in campus environments. The real-time system demonstrated a false positive rate of 8.0% in the campus environment and 2.0% in the orchard, with a consistent false negative rate of 8.0% across both environments. These results validate the model's effectiveness for real-time obstacle differentiation in agricultural settings. Its strong generalization, robustness to unseen obstacles, and computational efficiency make it well-suited for deployment in precision agriculture. Future work will focus on enhancing inference speed, improving performance under occlusion, and expanding dataset diversity to further strengthen real-world applicability.

19 pages, 13823 KiB  
Article
Autonomous Agricultural Robot Using YOLOv8 and ByteTrack for Weed Detection and Destruction
by Ardin Bajraktari and Hayrettin Toylan
Machines 2025, 13(3), 219; https://doi.org/10.3390/machines13030219 - 7 Mar 2025
Cited by 1 | Viewed by 2090
Abstract
Automating agricultural machinery presents a significant opportunity to lower costs and enhance efficiency in both current and future field operations. The detection and destruction of weeds in agricultural areas via robots can be given as an example of this process. Deep learning algorithms can accurately detect weeds in agricultural fields. Additionally, robotic systems can effectively eliminate these weeds. However, the high computational demands of deep learning-based weed detection algorithms pose challenges for their use in real-time applications. This study proposes a vision-based autonomous agricultural robot that leverages the YOLOv8 model in combination with ByteTrack to achieve effective real-time weed detection. A dataset of 4126 images was used to create YOLO models, with 80% of the images designated for training, 10% for validation, and 10% for testing. Six different YOLO object detectors were trained and tested for weed detection. Among these models, YOLOv8 stands out, achieving a precision of 93.8%, a recall of 86.5%, and a mAP@0.5 detection accuracy of 92.1%. With an object detection speed of 18 FPS and the advantages of the ByteTrack integrated object tracking algorithm, YOLOv8 was selected as the most suitable model. Additionally, the YOLOv8-ByteTrack model, developed for weed detection, was deployed on an agricultural robot with autonomous driving capabilities integrated with ROS. This system facilitates real-time weed detection and destruction, enhancing the efficiency of weed management in agricultural practices.
(This article belongs to the Section Robotics, Mechatronics and Intelligent Machines)
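The 80/10/10 train/validation/test split described for the 4126-image dataset can be reproduced with a plain shuffle-and-slice; the fixed seed and the file-name scheme below are illustrative assumptions, not details from the paper.

```python
# Split a list of image paths into train/validation/test subsets
# (80%/10%/10%, matching the 4126-image weed dataset described above).
import random

def split_dataset(items, train=0.8, val=0.1, seed=42):
    items = list(items)
    random.Random(seed).shuffle(items)  # reproducible shuffle
    n_train = int(len(items) * train)
    n_val = int(len(items) * val)
    return (items[:n_train],
            items[n_train:n_train + n_val],
            items[n_train + n_val:])

images = [f"img_{i:04d}.jpg" for i in range(4126)]  # hypothetical file names
train_set, val_set, test_set = split_dataset(images)
# 3300 train, 412 validation, 414 test images
```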

16 pages, 7077 KiB  
Article
A Variable-Threshold Segmentation Method for Rice Row Detection Considering Robot Travelling Prior Information
by Jing He, Wenhao Dong, Qingneng Tan, Jianing Li, Xianwen Song and Runmao Zhao
Agriculture 2025, 15(4), 413; https://doi.org/10.3390/agriculture15040413 - 15 Feb 2025
Viewed by 726
Abstract
Accurate rice row detection is critical for autonomous agricultural machinery navigation in complex paddy environments. Existing methods struggle with terrain unevenness, water reflections, and weed interference. This study aimed to develop a robust rice row detection method by integrating multi-sensor data and leveraging robot travelling prior information. A 3D point cloud acquisition system combining 2D LiDAR, AHRS, and RTK-GNSS was designed. A variable-threshold segmentation method, dynamically adjusted based on real-time posture perception, was proposed to handle terrain variations. Additionally, a clustering algorithm incorporating rice row spacing and robot path constraints was developed to filter noise and classify seedlings. Experiments in dryland with simulated seedlings and real paddy fields demonstrated high accuracy: maximum absolute errors of 59.41 mm (dryland) and 69.36 mm (paddy), with standard deviations of 14.79 mm and 19.18 mm, respectively. The method achieved a 0.6489° mean angular error, outperforming existing algorithms. The fusion of posture-aware thresholding and path-based clustering effectively addresses the challenges in complex rice fields. This work enhances the automation of field management, offering a reliable solution for precision agriculture in unstructured environments. Its technical framework can be adapted to other row crop systems, promoting sustainable mechanization in global rice production.
(This article belongs to the Section Artificial Intelligence and Digital Agriculture)
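The row-spacing and path constraints described in this abstract can be illustrated with a minimal sketch (all names, spacings, and tolerances below are illustrative assumptions, not values from the paper): expected row centres are derived from the robot's planned path, and points too far from every centre are rejected as noise.

```python
import numpy as np

def assign_rows(x, row_spacing=0.3, path_offset=0.0, tol=0.08):
    """Assign lateral seedling coordinates (metres) to expected row centres.

    Expected centres lie at path_offset + k * row_spacing, following the
    robot's planned path; points farther than `tol` from every centre are
    flagged as noise (e.g. weeds or water reflections).
    """
    k = np.round((x - path_offset) / row_spacing).astype(int)
    residual = np.abs(x - (path_offset + k * row_spacing))
    return k, residual <= tol

# Toy lateral offsets: two on-row points, one in the adjacent rows, one weed.
rows, is_seedling = assign_rows(np.array([0.02, 0.31, -0.29, 0.14]))
```

The point at 0.14 m lies near no expected centre and is discarded, mimicking how path constraints suppress weed interference without any appearance model.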
38 pages, 15114 KiB  
Article
YS3AM: Adaptive 3D Reconstruction and Harvesting Target Detection for Clustered Green Asparagus
by Si Mu, Jian Liu, Ping Zhang, Jin Yuan and Xuemei Liu
Agriculture 2025, 15(4), 407; https://doi.org/10.3390/agriculture15040407 - 14 Feb 2025
Cited by 2 | Viewed by 729
Abstract
Green asparagus grows in clusters, which can cause overlaps with weeds and immature stems, making it difficult to identify suitable harvest targets and cutting points. Extracting precise stem details in complex spatial arrangements is a challenge. This paper explored the YS3AM (Yolo-SAM-3D-Adaptive-Modeling) method for detecting green asparagus and performing 3D adaptive-section modeling using a depth camera, which could benefit harvesting path planning for selective harvesting robots. Firstly, the model was developed and deployed to extract bounding boxes for individual asparagus stems within clusters. Secondly, the stems inside these bounding boxes were segmented, and binary masks were generated. Thirdly, high-quality depth images were obtained through pixel block completion. Finally, a novel 3D reconstruction method, based on adaptive section modeling and combining the mask and depth data, was proposed, and an evaluation method was introduced to assess modeling accuracy. Experimental validation on 1095 field images showed high-performance detection (precision: 98.75%, recall: 95.46%, F1: 0.97) and robust 3D modeling (103 asparagus stems, average RMSE: length 0.74, depth: 1.105) under varying illumination conditions. The system achieved a processing speed of 22 ms per stem, enabling real-time operation. The results demonstrated that the 3D model accurately represents the spatial distribution of clustered green asparagus, enabling precise identification of harvest targets and cutting points. This model provided essential spatial pathways for end-effector path planning, thereby fulfilling the operational requirements for efficient green asparagus harvesting robots. Full article
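The mask-plus-depth fusion step underlying such a reconstruction can be sketched as a generic pinhole back-projection (the intrinsics `fx`, `fy`, `cx`, `cy` and all variable names are illustrative assumptions; the paper's adaptive section modeling would build on a point cloud like this, not this exact code):

```python
import numpy as np

def backproject(depth, mask, fx, fy, cx, cy):
    """Back-project masked depth pixels into 3D camera coordinates with a
    standard pinhole model, fusing a binary stem mask with the completed
    depth image into an N x 3 point cloud."""
    v, u = np.nonzero(mask)          # pixel rows/columns inside the mask
    z = depth[v, u]                  # depth (metres) at those pixels
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=1)

depth = np.full((4, 4), 2.0)         # toy depth image: 2 m everywhere
mask = np.zeros((4, 4), dtype=bool)
mask[1, 2] = True                    # one masked stem pixel
pts = backproject(depth, mask, fx=1.0, fy=1.0, cx=2.0, cy=2.0)
```

Each masked pixel yields one 3D point; fitting sections along the resulting stem cloud is then a separate modeling step.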
26 pages, 16808 KiB  
Article
Design and Experimental Evaluation of a Smart Intra-Row Weed Control System for Open-Field Cabbage
by Shenyu Zheng, Xueguan Zhao, Hao Fu, Haoran Tan, Changyuan Zhai and Liping Chen
Agronomy 2025, 15(1), 112; https://doi.org/10.3390/agronomy15010112 - 4 Jan 2025
Cited by 4 | Viewed by 1404
Abstract
To address the complex structure, limited modularity, and insufficient responsiveness of traditional hydraulically driven inter-plant mechanical weeding equipment, this study designed and developed an electric swing-type opening and closing intra-row weeding control system. The system integrates deep learning technology for accurate identification and localization of cabbage, enabling precise control and dynamic obstacle avoidance for the weeding knives. The system's performance was comprehensively evaluated through laboratory and field experiments. Laboratory experiments demonstrated that, under conditions of low speed and large plant spacing, the system achieved a weeding accuracy of 96.67%, with a minimum crop injury rate of 0.83%. However, as the operational speed increased, the weeding accuracy decreased while the crop injury rate increased. Two-way ANOVA results indicated that operational speed significantly affected both weeding accuracy and crop injury rate, whereas plant spacing had a significant effect on weeding accuracy but no significant effect on crop injury rate. Field experiment results further confirmed that the system maintained high weeding accuracy and crop protection under varying speed conditions. At a low speed of 0.1 m/s, the weeding accuracy was 96.00%, with a crop injury rate of 1.57%. However, as the speed increased to 0.5 m/s, the weeding accuracy dropped to 81.79%, while the crop injury rate rose to 5.49%. These experimental results verified the system's adaptability and reliability in complex field environments, providing technical support for the adoption of intelligent mechanical weeding systems. Future research will focus on optimizing control algorithms and feedback mechanisms to enhance the system's dynamic response capability and adaptability, thereby advancing the development of sustainable agriculture and precision field management. Full article
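The knife open/close timing implied by such a system can be sketched as simple kinematics (a hypothetical simplification: the real system must additionally compensate for detection latency and actuator response, which this sketch ignores, and `safety` is an invented parameter):

```python
def knife_schedule(plant_y, speed, safety=0.05):
    """Return (t_open, t_close) in seconds: the weeding knives must swing
    open `safety` metres before a crop plant detected `plant_y` metres
    ahead, and close `safety` metres past it, at forward speed `speed` m/s.
    """
    if speed <= 0:
        raise ValueError("forward speed must be positive")
    return (plant_y - safety) / speed, (plant_y + safety) / speed

# Plant detected 0.5 m ahead while travelling at the paper's low speed.
t_open, t_close = knife_schedule(plant_y=0.5, speed=0.1)
```

The sketch also makes the speed/accuracy trade-off reported above intuitive: at higher speeds the open window shrinks in time, so fixed actuation delays consume a larger fraction of it, raising the crop injury rate.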