AI, Sensors and Robotics for Smart Agriculture

A topical collection in Agronomy (ISSN 2073-4395). This collection belongs to the section "Precision and Digital Agriculture".


Editors


Collection Editor
College of Artificial Intelligence, Nanjing Agricultural University, Nanjing 210095, China
Interests: quality and safety assessment of agricultural products; harvesting robots; robot vision; robotic grasping; spectral analysis and modeling; robotic systems and their applications in agriculture

Topical Collection Information

Dear Colleagues,

Food security is an enormous issue for human society: traditional labor-based agricultural production cannot meet its growing needs. With the continuous progress of artificial intelligence (AI), sensors, and robotics, smart agriculture is gradually being applied to agricultural production across the world. The purpose of smart agriculture is to enhance agricultural production efficiency, improve production and management methods, implement green production, and preserve the ecological environment.

Smart agriculture represents the deep integration of IoT technology with traditional agriculture. The IoT will elevate the future of agriculture to a new level, with the utilization of smart agriculture becoming increasingly common among farmers. By employing the Internet of Things, sensor technology, and agricultural robots, smart agriculture can achieve the precise control and scientific management of production and operation processes, as well as the intelligent control of agricultural cultivation, and promote agriculture's transformation toward intensive and large-scale production. This Collection aims to share recent studies and developments in the application of AI, sensors, and robots in smart agriculture.

Dr. Baohua Zhang
Dr. Yongliang Qiao
Collection Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the collection website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Agronomy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • panchromatic, multispectral, and hyperspectral approaches
  • field phenotyping and yield estimation
  • disease and stress detection
  • computer vision
  • robot sensing systems
  • artificial intelligence and machine learning
  • sensor fusion in agri-robotics
  • variable-rate applications
  • farm management information systems
  • remote sensing
  • ICT applications
  • agri-robotics navigation and awareness

Published Papers (3 papers)

2025


22 pages, 118441 KiB  
Article
CBLN-YOLO: An Improved YOLO11n-Seg Network for Cotton Topping in Fields
by Yufei Xie and Liping Chen
Agronomy 2025, 15(4), 996; https://doi.org/10.3390/agronomy15040996 - 21 Apr 2025
Abstract
In cotton topping operations, the topping machine's positioning of the top bud depends on the recognition algorithm, and the output of traditional target detection algorithms contains much irrelevant information that hinders accurate positioning. To obtain a more efficient recognition algorithm, we propose CBLN-YOLO, a top-bud segmentation algorithm based on the YOLO11n-seg model. First, the standard convolution and multi-head self-attention (MHSA) mechanisms in YOLO11n-seg are replaced with linear deformable convolution (LDConv) and coordinate attention (CA) to reduce the parameter growth rate of the original model and better mine detailed features of the top buds. In the neck, the feature pyramid network (FPN) is reconstructed using an enhanced interlayer feature correlation (EFC) module, and regression loss is calculated using the Inner-CIoU loss function. When tested on a self-built dataset, CBLN-YOLO achieves mAP@0.5 values of 98.3% for detection and 95.8% for segmentation, higher than traditional segmentation models. CBLN-YOLO also shows strong robustness across different weather conditions and times of day, and its recognition speed reaches 135 frames per second, providing strong support for cotton top-bud positioning in the field environment.
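The Inner-CIoU regression loss named above extends standard CIoU by measuring overlap on scaled auxiliary ("inner") boxes; the CIoU core it builds on can be sketched in plain Python. The function name and corner-coordinate box convention here are illustrative assumptions, not taken from the paper:

```python
import math

def ciou(box_a, box_b):
    """Complete-IoU between two boxes given as (x1, y1, x2, y2)."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # intersection area
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    area_a = (ax2 - ax1) * (ay2 - ay1)
    area_b = (bx2 - bx1) * (by2 - by1)
    iou = inter / (area_a + area_b - inter)
    # center-distance penalty, normalized by the enclosing-box diagonal
    rho2 = ((ax1 + ax2 - bx1 - bx2) / 2) ** 2 + ((ay1 + ay2 - by1 - by2) / 2) ** 2
    cw = max(ax2, bx2) - min(ax1, bx1)
    ch = max(ay2, by2) - min(ay1, by1)
    c2 = cw ** 2 + ch ** 2
    # aspect-ratio consistency term
    v = (4 / math.pi ** 2) * (math.atan((ax2 - ax1) / (ay2 - ay1))
                              - math.atan((bx2 - bx1) / (by2 - by1))) ** 2
    alpha = v / (1 - iou + v + 1e-9)
    return iou - rho2 / c2 - alpha * v
```

The loss used in training is typically 1 - CIoU; Inner-CIoU additionally shrinks or grows both boxes by a scale ratio before the overlap term, which the paper credits with faster convergence on small targets such as top buds.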

23 pages, 15990 KiB  
Article
A Lightweight Model for Shine Muscat Grape Detection in Complex Environments Based on the YOLOv8 Architecture
by Changlei Tian, Zhanchong Liu, Haosen Chen, Fanglong Dong, Xiaoxiang Liu and Cong Lin
Agronomy 2025, 15(1), 174; https://doi.org/10.3390/agronomy15010174 - 13 Jan 2025
Abstract
Automated harvesting of "Sunshine Rose" grapes requires accurate detection and classification of grape clusters under challenging orchard conditions, such as occlusion and variable lighting, while ensuring that the model can be deployed on resource- and computation-constrained edge devices. This study addresses these challenges by proposing a lightweight YOLOv8-based model, incorporating DualConv and the novel C2f-GND module to enhance feature extraction and reduce computational complexity. Evaluated on the newly developed Shine-Muscat-Complex dataset of 4715 images, the proposed model achieved a 2.6% improvement in mean Average Precision (mAP) over YOLOv8n while reducing parameters by 36.8%, FLOPs by 34.1%, and inference time by 15%. Compared with the latest YOLOv11n, our model achieved a 3.3% improvement in mAP, with reductions of 26.4% in parameters, 14.3% in FLOPs, and 14.6% in inference time, demonstrating comprehensive enhancements. These results highlight the potential of our model for accurate and efficient deployment on resource-constrained edge devices, providing an algorithmic foundation for the automated harvesting of "Sunshine Rose" grapes.
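DualConv cuts parameters by pairing a grouped k×k convolution with a 1×1 pointwise convolution over the same input. A rough weight count under that formulation makes the savings concrete; this is a sketch of the general idea with illustrative function names, and the exact DualConv bookkeeping in the paper may differ:

```python
def conv_params(c_in, c_out, k):
    """Weights in a standard dense k x k convolution (no bias)."""
    return c_in * c_out * k * k

def dualconv_params(c_in, c_out, groups, k=3):
    """DualConv-style layer: a k x k group convolution (each filter
    sees c_in / groups channels) in parallel with a dense 1 x 1
    pointwise convolution over all input channels."""
    return (c_in // groups) * c_out * k * k + c_in * c_out
```

For a 256-to-256-channel layer with 4 groups, the dense 3×3 convolution needs 589,824 weights while the DualConv-style layer needs 212,992, roughly a 64% reduction at that one layer, which is where parameter and FLOP savings of the kind reported above come from.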

2024


21 pages, 15422 KiB  
Article
A Lightweight Model for Weed Detection Based on the Improved YOLOv8s Network in Maize Fields
by Jinyong Huang, Xu Xia, Zhihua Diao, Xingyi Li, Suna Zhao, Jingcheng Zhang, Baohua Zhang and Guoqiang Li
Agronomy 2024, 14(12), 3062; https://doi.org/10.3390/agronomy14123062 - 22 Dec 2024
Abstract
To address the computational intensity and deployment difficulties of weed detection models, a lightweight target detection model for weeds in maize fields based on YOLOv8s was proposed in this study. Firstly, a lightweight network, designated Dualconv High Performance GPU Net (D-PP-HGNet), was constructed on the foundation of the High Performance GPU Net (PP-HGNet) framework. Dualconv was introduced to reduce the computation required to achieve a lightweight design. Furthermore, an Adaptive Feature Aggregation Module (AFAM) and Global Max Pooling were incorporated to augment the extraction of salient features in complex scenarios. The newly created network was then used to reconstruct the YOLOv8s backbone. Secondly, a four-stage inverted residual moving block (iRMB) was employed to construct a lightweight iDEMA module, which replaced the original C2f feature extraction module in the neck to improve model performance and accuracy. Finally, Dualconv was employed instead of conventional convolution for downsampling, further diminishing the network load. The new model was fully verified on the established field weed dataset. The test results showed a notable improvement in detection performance over YOLOv8s: accuracy improved from 91.2% to 95.8%, recall from 87.9% to 93.2%, and mAP@0.5 from 90.8% to 94.5%. Furthermore, the GFLOPs and model size were reduced to 12.7 G and 9.1 MB, respectively, decreases of 57.4% and 59.2% compared with the original model. Compared with prevalent target detection models such as Faster R-CNN, YOLOv5s, and YOLOv8l, the new model showed superior accuracy while remaining lightweight. The model proposed in this paper effectively reduces the hardware cost of accurate weed identification in maize fields with limited resources.
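The reported reductions let one back out the baseline figures: 12.7 GFLOPs after a 57.4% cut implies the unmodified YOLOv8s ran at roughly 29.8 GFLOPs, and 9.1 MB after a 59.2% cut implies roughly a 22.3 MB original. A one-line check with a hypothetical helper name:

```python
def original_from_reduction(new_value, pct_reduction):
    """Recover the baseline from the reduced value and the reported
    percentage decrease, inverting new = old * (1 - pct / 100)."""
    return new_value / (1 - pct_reduction / 100)

# Baseline implied by the abstract's numbers for the weed model:
gflops_before = original_from_reduction(12.7, 57.4)  # GFLOPs of unmodified YOLOv8s
size_before = original_from_reduction(9.1, 59.2)     # model size in MB
```

This kind of sanity arithmetic is a quick way to check that a paper's absolute and percentage figures are mutually consistent.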
