Plant Diagnosis and Monitoring for Agricultural Production

A special issue of Agriculture (ISSN 2077-0472). This special issue belongs to the section "Artificial Intelligence and Digital Agriculture".

Deadline for manuscript submissions: 15 February 2026 | Viewed by 1703

Special Issue Editors


Dr. Zongmei Gao
Guest Editor
Center for Precision and Automated Agricultural Systems, Department of Biological Systems Engineering, Washington State University, Prosser, WA 99350, USA
Interests: data mining; machine learning; image processing; deep learning; data fusion

Dr. Yanlei Xu
Guest Editor
College of Information Technology, Jilin Agricultural University, Changchun 130118, China
Interests: computer vision; pest and disease identification; machine learning; deep learning

Dr. Yuanyuan Shao
Guest Editor
College of Mechanical and Electrical Engineering, Shandong Agricultural University, Tai’an 271018, China
Interests: intelligent agriculture; agricultural product detection; hyperspectral image processing; deep learning; agricultural machinery

Special Issue Information

Dear Colleagues,

Modern agriculture demands transformative advances in detecting and addressing plant health challenges to ensure productivity and ecological balance. This Special Issue focuses on digital tools for plant diagnosis and real-time monitoring that enable the precise detection of diseases, nutrient deficiencies, and environmental stressors. Innovations such as hyperspectral imaging, IoT-enabled sensor networks, UAV-based remote sensing, and AI-driven predictive models support early disease detection, nutrient deficiency identification, and real-time phenotyping across diverse cropping systems. By leveraging machine learning, edge computing, and geospatial analytics, these solutions empower farmers to implement targeted interventions, reduce agrochemical use, and enhance climate resilience while preserving soil health.

We invite contributions exploring novel techniques in plant phenotyping, automated disease diagnosis, predictive modeling for crop stress, and scalable monitoring systems. Submissions addressing precision farming, IoT-enabled solutions, machine learning applications, big data analytics, drone technology, and innovations for sustainable agriculture are especially welcome. Selected papers will be featured in a Special Issue showcasing advancements in digital agriculture, with opportunities for publication in indexed journals or proceedings. Join us in shaping the future of agriculture through technology-driven solutions and contributing to this dynamic and impactful research domain.

Dr. Zongmei Gao
Dr. Yanlei Xu
Dr. Yuanyuan Shao
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Agriculture is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • precision agriculture
  • remote sensing
  • IoT in farming
  • machine learning in agriculture
  • big data analytics
  • drone technology
  • sustainable farming
  • geospatial analytics
  • crop monitoring
  • soil health management

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • Reprint: MDPI Books provides the opportunity to republish successful Special Issues in book format, both online and in print.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (3 papers)


Research

23 pages, 15968 KB  
Article
YOLOv8n-RMB: UAV Imagery Rubber Milk Bowl Detection Model for Autonomous Robots’ Natural Latex Harvest
by Yunfan Wang, Lin Yang, Pengze Zhong, Xin Yang, Chuanchuan Su, Yi Zhang and Aamir Hussain
Agriculture 2025, 15(19), 2075; https://doi.org/10.3390/agriculture15192075 - 3 Oct 2025
Viewed by 418
Abstract
Natural latex harvest is pushing the boundaries of unmanned agricultural production in rubber milk collection via integrated robots in hilly and mountainous regions, such as the fixed and mobile tapping robots widely deployed in forests. Given the harsh working conditions and complex natural environments surrounding rubber trees, real-time, precise assessment of rubber milk yield status has become a key requirement for improving the efficiency and autonomous management of such large-scale automatic tapping robots. However, traditional manual detection of rubber milk yield status is limited under conditions involving complex terrain, dense forest backgrounds, irregular surface geometries of rubber milk, and frequent occlusion of rubber milk bowls (RMBs) by vegetation. To address this issue, this study presents an unmanned aerial vehicle (UAV) imagery-based rubber milk yield state detection method, termed YOLOv8n-RMB, for unstructured field environments, replacing manual inspection. The proposed method improves the original YOLOv8n through structural enhancements across the backbone, neck, and head of the network. First, a receptive field attention convolution (RFAConv) module is embedded in the backbone to improve the model’s ability to extract target-relevant features in visually complex environments. Second, in the neck, a bidirectional feature pyramid network (BiFPN) strengthens the fusion of features across multiple spatial scales. Third, in the head, the content-aware dynamic upsampling module DySample enhances the reconstruction of spatial details and the preservation of object boundaries. Finally, the detection framework is integrated with the BoT-SORT tracking algorithm to achieve continuous multi-object association and dynamic state monitoring based on the filling status of RMBs.
Experimental evaluation shows that the proposed YOLOv8n-RMB model achieves an AP@0.5 of 94.9%, an AP@0.5:0.95 of 89.7%, a precision of 91.3%, and a recall of 91.9%, improvements of 2.7%, 2.9%, 3.9%, and 9.7%, respectively, over the original YOLOv8n. In addition, the total number of parameters is kept within 3.0 million, and the computational cost is limited to 8.3 GFLOPs. The model meets the requirements of yield assessment tasks in resource-limited computing environments for both fixed and mobile tapping robots in rubber plantations. Full article
(This article belongs to the Special Issue Plant Diagnosis and Monitoring for Agricultural Production)
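The AP@0.5 and AP@0.5:0.95 figures reported above are standard detection metrics: a predicted box counts as a true positive when its intersection-over-union (IoU) with a ground-truth box exceeds the threshold (0.5, or averaged over thresholds from 0.5 to 0.95). A minimal, library-free sketch of the IoU computation these metrics rest on (the box format and function name are illustrative, not from the paper):

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Overlap rectangle; width/height clamp to zero when boxes are disjoint.
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    return inter / union if union > 0 else 0.0

# A detection "matches" a ground-truth bowl at AP@0.5 if IoU >= 0.5.
pred = (10, 10, 50, 50)
truth = (20, 20, 60, 60)
score = iou(pred, truth)  # overlap 30*30 = 900, union 1600+1600-900 = 2300
```

AP@0.5:0.95 simply repeats the matching at IoU thresholds 0.5, 0.55, …, 0.95 and averages the resulting average precisions, which is why it is the stricter of the two numbers.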

24 pages, 14851 KB  
Article
LiteFocus-YOLO: An Efficient Network for Identifying Dense Tassels in Field Environments
by Heyang Wang, Jinghuan Hu, Yunlong Ji, Chong Peng, Yu Bao, Hang Zhu, Caocan Zhu, Mengchao Chen, Ye Mu and Hongyu Guo
Agriculture 2025, 15(19), 2036; https://doi.org/10.3390/agriculture15192036 - 28 Sep 2025
Viewed by 287
Abstract
High-efficiency and precise detection of crop ears in the field is a core component of intelligent agricultural yield estimation. However, challenges such as overlapping ears caused by dense planting, complex background interference, and blurred boundaries of small targets severely limit the accuracy and practicality of existing detection models. This paper introduces LiteFocus-YOLO (LF-YOLO), an efficient small-object detection model. By synergistically enhancing feature expression through cross-scale texture optimization and attention mechanisms, it achieves high-precision identification of maize tassels and wheat ears. The model incorporates two novel components: a Lightweight Target-Aware Attention Module (LTAM), which strengthens high-frequency feature expression for small targets while reducing background interference, enhancing robustness in densely occluded scenes; and a Cross-Feature Fusion Module (CFFM), which addresses semantic detail loss through deep-shallow feature fusion, improving small-target localization accuracy. Performance was validated on a drone-based maize tassel dataset. Results show that LF-YOLO achieved an mAP50 of 97.9%, with mAP50 scores of 94.6% and 95.7% on the publicly available maize tassel and wheat ear datasets, respectively. It generalizes across different crops while maintaining high accuracy and recall. Compared to current mainstream object detection models, LF-YOLO delivers higher precision at lower computational cost, providing efficient technical support for dense small-object detection tasks in agricultural fields. Full article
(This article belongs to the Special Issue Plant Diagnosis and Monitoring for Agricultural Production)
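The paper does not publish the internals of its CFFM, but the generic deep-shallow fusion pattern it builds on — upsampling a coarse, semantically rich feature map to the resolution of a fine, detail-rich one and combining the two — can be sketched in plain NumPy. This illustrates the pattern only, not the module itself; all names and shapes are assumptions:

```python
import numpy as np

def fuse_deep_shallow(shallow, deep):
    """Generic FPN-style fusion: upsample the deep map to the shallow
    map's spatial size, then concatenate along the channel axis.

    shallow: (C1, H, W) fine-detail features
    deep:    (C2, H//2, W//2) coarse semantic features
    returns: (C1 + C2, H, W) fused features
    """
    c2, h2, w2 = deep.shape
    _, h, w = shallow.shape
    # Nearest-neighbour upsampling by integer repetition along H and W.
    up = deep.repeat(h // h2, axis=1).repeat(w // w2, axis=2)
    return np.concatenate([shallow, up], axis=0)

shallow = np.zeros((8, 16, 16))   # fine, low-level features
deep = np.ones((16, 8, 8))        # coarse, high-level features
fused = fuse_deep_shallow(shallow, deep)  # shape (24, 16, 16)
```

The fused map retains the shallow layer's spatial detail (important for small tassels) while carrying the deep layer's semantics, which is the intuition behind recovering "semantic detail loss" for small-target localization.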

23 pages, 63827 KB  
Article
A Two-Stage Weed Detection and Localization Method for Lily Fields Targeting Laser Weeding
by Yanlei Xu, Chao Liu, Jiahao Liang, Xiaomin Ji and Jian Li
Agriculture 2025, 15(18), 1967; https://doi.org/10.3390/agriculture15181967 - 18 Sep 2025
Viewed by 412
Abstract
The cultivation of edible lilies is highly susceptible to weed infestation during the growth period, and the application of herbicides is often impractical, leading to the rampant growth of diverse weed species. Laser weeding, recognized as an efficient and precise method for field weed management, presents a novel solution to the weed challenges in lily fields. Accurate localization of weed regions and optimal selection of laser targeting points are crucial technologies for successful laser weeding. In this study, we propose a two-stage weed detection and localization method specifically designed for lily fields. In the first stage, we introduce an enhanced detection model named YOLO-Morse, aimed at identifying and removing lily plants. YOLO-Morse is built upon the YOLOv8 architecture and integrates the RCS-MAS backbone, the SPD-Conv spatial enhancement module, and an adaptive focal loss function (ATFL) to enhance detection accuracy under sample imbalance and complex backgrounds. Experimental results indicate that YOLO-Morse achieves a mean average precision (mAP) of 86%, a 3.2% improvement over the original YOLOv8, and enables stable identification of lily regions. A ResNet-based segmentation network then performs semantic segmentation on the detected lily targets, and the segmented results are used to mask the lily areas in the original RGB image, generating weed-only images. In the second stage, these weed-only images are analyzed in the HSV color space, combined with morphological processing, to precisely extract green weed regions.
The centroid of the weed coordinate set is automatically selected as the laser targeting point. The proposed system exhibits strong weed detection performance, achieving a precision, recall, and F1-score of 94.97%, 90.00%, and 92.42%, respectively. The two-stage approach significantly enhances multi-weed detection in complex environments, improving detection accuracy while maintaining operational efficiency and cost-effectiveness, and offers a precise, efficient, and intelligent laser weeding solution for weed management in lily fields. Although certain limitations remain, such as environmental lighting variation, leaf occlusion, and computational resource constraints, the method shows significant potential for broader application in other high-value crops. Full article
(This article belongs to the Special Issue Plant Diagnosis and Monitoring for Agricultural Production)
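The second-stage pipeline described in this abstract — thresholding green pixels in HSV space and taking the centroid of the weed region as the laser aim point — can be sketched without the paper's code. The hue/saturation thresholds and function name below are illustrative assumptions (the study's actual values are not given here), and the morphological cleanup step is omitted for brevity:

```python
import colorsys

def weed_aim_point(rgb_pixels):
    """Pick a laser aim point as the centroid of 'green' pixels.

    rgb_pixels: dict mapping (x, y) -> (r, g, b), channels in 0..255.
    Returns the (x, y) centroid of pixels whose HSV hue falls in an
    assumed green band, or None when no green pixel is found.
    """
    xs, ys = [], []
    for (x, y), (r, g, b) in rgb_pixels.items():
        h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        # Illustrative green band: hue ~80-170 degrees, with minimum
        # saturation/value to reject soil and shadow pixels.
        if 80 / 360 <= h <= 170 / 360 and s >= 0.25 and v >= 0.2:
            xs.append(x)
            ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# Two bright-green "weed" pixels and one brownish "soil" pixel.
pixels = {(0, 0): (30, 200, 40), (4, 2): (20, 180, 30), (9, 9): (120, 90, 60)}
point = weed_aim_point(pixels)  # centroid of the two green pixels: (2.0, 1.0)
```

In practice the mask would be built per-image with vectorized operations and cleaned with morphological opening/closing before the centroid is computed, as the abstract describes.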
