Search Results (73)

Search Parameters:
Keywords = crop disease and pest identification

24 pages, 17213 KiB  
Review
Empowering Smart Soybean Farming with Deep Learning: Progress, Challenges, and Future Perspectives
by Huihui Sun, Hao-Qi Chu, Yi-Ming Qin, Pingfan Hu and Rui-Feng Wang
Agronomy 2025, 15(8), 1831; https://doi.org/10.3390/agronomy15081831 - 28 Jul 2025
Viewed by 426
Abstract
This review comprehensively examines the application of deep learning technologies across the entire soybean production chain, encompassing areas such as disease and pest identification, weed detection, crop phenotype recognition, yield prediction, and intelligent operations. By systematically analyzing mainstream deep learning models, optimization strategies (e.g., model lightweighting, transfer learning), and sensor data fusion techniques, the review identifies their roles and performances in complex agricultural environments. It also highlights key challenges including data quality limitations, difficulties in real-world deployment, and the lack of standardized evaluation benchmarks. In response, promising directions such as reinforcement learning, self-supervised learning, interpretable AI, and multi-source data fusion are proposed. Specifically for soybean automation, future advancements are expected in areas such as high-precision disease and weed localization, real-time decision-making for variable-rate spraying and harvesting, and the integration of deep learning with robotics and edge computing to enable autonomous field operations. This review provides valuable insights and future prospects for promoting intelligent, efficient, and sustainable development in soybean production through deep learning.
(This article belongs to the Section Precision and Digital Agriculture)
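
As a rough aside on the "model lightweighting" strategy the review highlights, the sketch below (generic PyTorch, not code from any surveyed soybean model; the layer sizes are arbitrary) shows how replacing a standard convolution with a depthwise-separable pair cuts the parameter count by roughly 8x.

```python
# Minimal sketch of model lightweighting via depthwise-separable convolution.
# The 64 -> 128 channel sizes are arbitrary illustrations, not from any cited model.
import torch.nn as nn

standard = nn.Conv2d(64, 128, kernel_size=3, padding=1)
separable = nn.Sequential(
    nn.Conv2d(64, 64, kernel_size=3, padding=1, groups=64),  # depthwise: one 3x3 filter per channel
    nn.Conv2d(64, 128, kernel_size=1),                       # pointwise: 1x1 channel mixing
)

def n_params(m):
    return sum(p.numel() for p in m.parameters())

print(n_params(standard), n_params(separable))  # 73856 vs. 8960 parameters
```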

18 pages, 2795 KiB  
Article
Study on the Detection of Chlorophyll Content in Tomato Leaves Based on RGB Images
by Xuehui Zhang, Huijiao Yu, Jun Yan and Xianyong Meng
Horticulturae 2025, 11(6), 593; https://doi.org/10.3390/horticulturae11060593 - 26 May 2025
Viewed by 917
Abstract
Chlorophyll is a key substance in plant photosynthesis, and its content detection methods are of great significance in the field of agricultural AI. These methods provide important technical support for crop growth monitoring, pest and disease identification, and yield prediction, playing a crucial role in improving agricultural productivity and the level of intelligence in farming. This paper aims to explore an efficient and low-cost non-destructive method for detecting chlorophyll content (SPAD) and investigate the feasibility of smartphone image analysis technology in predicting chlorophyll content in greenhouse tomatoes. Using greenhouse tomato leaves as the experimental material, this study analyzes the correlation between chlorophyll content and image color features. First, leaf images are captured using a smartphone, and 42 color features based on the red, green, and blue (R, G, B) color channels are constructed to assess their correlation with chlorophyll content. The eight color features most sensitive to chlorophyll content are then selected, namely B, (2G − R − B)/(2G + R + B), GLA, RGBVI, g, g − b, ExG, and CIVE. On this basis, the study constructs and evaluates several predictive models, including multiple linear regression (MLR), ridge regression (RR), support vector regression (SVR), random forest (RF), and a Stacking ensemble learning model. The experimental results indicate that the Stacking ensemble learning model performs best in terms of prediction accuracy and stability (R2 = 0.8359, RMSE = 0.8748). The study confirms the feasibility of using smartphone image analysis for estimating chlorophyll content, providing a convenient, cost-effective, and efficient technological approach for crop health monitoring and precision agriculture management. This method helps agricultural workers monitor crop growth in real time and optimize management decisions.
(This article belongs to the Section Vegetable Production Systems)
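
As an illustration of the workflow described above (not the authors' code), the sketch below computes a few RGB colour indices using commonly cited definitions, which may differ in detail from the paper's 42 features, and fits a scikit-learn Stacking ensemble over MLR, ridge, SVR, and random forest base learners. The file leaves.csv and its columns are hypothetical placeholders for per-leaf mean channel values and measured SPAD readings.

```python
# Sketch: RGB colour indices + Stacking ensemble for SPAD prediction (assumed data layout).
import pandas as pd
from sklearn.ensemble import RandomForestRegressor, StackingRegressor
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

df = pd.read_csv("leaves.csv")          # hypothetical columns: R, G, B, SPAD
R, G, B = df["R"], df["G"], df["B"]
s = R + G + B
feats = pd.DataFrame({
    "B": B,
    "GLI": (2 * G - R - B) / (2 * G + R + B),        # (2G - R - B)/(2G + R + B)
    "RGBVI": (G**2 - R * B) / (G**2 + R * B),
    "g": G / s,
    "g_minus_b": (G - B) / s,
    "ExG": 2 * (G / s) - (R / s) - (B / s),          # excess green index
    "CIVE": 0.441 * R - 0.811 * G + 0.385 * B + 18.787,
})

X_tr, X_te, y_tr, y_te = train_test_split(feats, df["SPAD"], test_size=0.2, random_state=0)
stack = StackingRegressor(
    estimators=[("mlr", LinearRegression()), ("ridge", Ridge()),
                ("svr", SVR()), ("rf", RandomForestRegressor(n_estimators=200))],
    final_estimator=Ridge(),
)
stack.fit(X_tr, y_tr)
pred = stack.predict(X_te)
print("R2:", r2_score(y_te, pred), "RMSE:", mean_squared_error(y_te, pred) ** 0.5)
```

Stacking lets a simple meta-learner combine the complementary base regressors, which is consistent with the stability gain the paper reports over any single model.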

31 pages, 3016 KiB  
Review
Image Recognition Technology in Smart Agriculture: A Review of Current Applications, Challenges, and Future Prospects
by Chunxia Jiang, Kangshu Miao, Zhichao Hu, Fengwei Gu and Kechuan Yi
Processes 2025, 13(5), 1402; https://doi.org/10.3390/pr13051402 - 4 May 2025
Viewed by 2494
Abstract
The implementation of image recognition technology can significantly enhance the levels of automation and intelligence in smart agriculture. However, most research has focused on its applications in medical imaging, industry, and transportation, with comparatively little attention paid to smart agriculture. This study therefore aims to contribute to a comprehensive understanding of the application of image recognition technology in smart agriculture by reviewing the scientific literature of the last few years. We discussed and analyzed its applications in plant disease and pest detection, crop species identification, crop yield prediction, and quality assessment. We then briefly introduced its applications in soil testing and nutrient management, as well as in agricultural machinery operation quality assessment and agricultural product grading. Finally, the challenges and emerging trends of image recognition technology were summarized. The results indicated that the models used in image recognition technology face challenges such as limited generalization, real-time processing, and insufficient dataset diversity. Transfer learning and green Artificial Intelligence (AI) offer promising solutions to these issues by reducing the reliance on large datasets and minimizing computational resource consumption. Advanced technologies like transformers further enhance the adaptability and accuracy of image recognition in smart agriculture. This comprehensive review provides valuable information on the current state of image recognition technology in smart agriculture and on promising future opportunities.

25 pages, 13043 KiB  
Article
Coffee-Leaf Diseases and Pests Detection Based on YOLO Models
by Jonatan Fragoso, Clécio Silva, Thuanne Paixão, Ana Beatriz Alvarez, Olacir Castro Júnior, Ruben Florez, Facundo Palomino-Quispe, Lucas Graciolli Savian and Paulo André Trazzi
Appl. Sci. 2025, 15(9), 5040; https://doi.org/10.3390/app15095040 - 1 May 2025
Viewed by 1503
Abstract
Coffee cultivation is vital to the global economy, but it faces significant challenges from diseases and pests such as rust, leaf miner, phoma, and cercospora, which impact production and sustainable crop management. In this scenario, deep learning techniques have shown promise for the early identification of these problems, enabling more efficient monitoring. This paper proposes an approach for detecting diseases and pests on coffee leaves using an efficient single-shot object-detection algorithm. The experiments were conducted using the YOLOv8, YOLOv9, YOLOv10, and YOLOv11 versions, including their variations. The BRACOL dataset, annotated by an expert, was used in the experiments to guarantee the quality of the annotations and the reliability of the trained models. The evaluation of the models included quantitative and qualitative analyses, considering the mAP, F1-Score, and recall metrics. In the analyses, YOLOv8s stands out as the most effective, with a mAP of 54.5%, an inference time of 11.4 ms, and the best qualitative predictions, making it ideal for real-time applications.
(This article belongs to the Special Issue Applied Computer Vision in Industry and Agriculture)
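
For context, a minimal sketch of how a single-shot detector from this family is commonly fine-tuned, assuming the Ultralytics Python package rather than the authors' own training setup; the dataset file bracol.yaml and the test image name are hypothetical placeholders.

```python
# Sketch: fine-tuning and evaluating a YOLO-family detector on a leaf disease/pest dataset.
from ultralytics import YOLO

model = YOLO("yolov8s.pt")                               # pretrained small variant, as highlighted above
model.train(data="bracol.yaml", epochs=100, imgsz=640)   # bracol.yaml is a hypothetical dataset config
metrics = model.val()                                    # validation metrics on the held-out split
print(metrics.box.map50)                                 # mAP@0.5, the headline metric reported above

results = model("coffee_leaf.jpg")                       # single-image inference (placeholder path)
results[0].show()                                        # visualize predicted boxes and classes
```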

16 pages, 1415 KiB  
Review
Advancing Crop Resilience Through High-Throughput Phenotyping for Crop Improvement in the Face of Climate Change
by Hoa Thi Nguyen, Md Arifur Rahman Khan, Thuong Thi Nguyen, Nhi Thi Pham, Thu Thi Bich Nguyen, Touhidur Rahman Anik, Mai Dao Nguyen, Mao Li, Kien Huu Nguyen, Uttam Kumar Ghosh, Lam-Son Phan Tran and Chien Van Ha
Plants 2025, 14(6), 907; https://doi.org/10.3390/plants14060907 - 14 Mar 2025
Cited by 1 | Viewed by 1842
Abstract
Climate change intensifies biotic and abiotic stresses, threatening global crop productivity. High-throughput phenotyping (HTP) technologies provide a non-destructive approach to monitor plant responses to environmental stresses, offering new opportunities for both crop stress resilience and breeding research. Innovations, such as hyperspectral imaging, unmanned aerial vehicles, and machine learning, enhance our ability to assess plant traits under various environmental stresses, including drought, salinity, extreme temperatures, and pest and disease infestations. These tools facilitate the identification of stress-tolerant genotypes within large segregating populations, improving selection efficiency for breeding programs. HTP can also play a vital role by accelerating genetic gain through precise trait evaluation for hybridization and genetic enhancement. However, challenges such as data standardization, phenotyping data management, high costs of HTP equipment, and the complexity of linking phenotypic observations to genetic improvements limit its broader application. Additionally, environmental variability and genotype-by-environment interactions complicate reliable trait selection. Despite these challenges, advancements in robotics, artificial intelligence, and automation are improving the precision and scalability of phenotypic data analyses. This review critically examines the dual role of HTP in assessment of plant stress tolerance and crop performance, highlighting both its transformative potential and existing limitations. By addressing key challenges and leveraging technological advancements, HTP can significantly enhance genetic research, including trait discovery, parental selection, and hybridization scheme optimization. While current methodologies still face constraints in fully translating phenotypic insights into practical breeding applications, continuous innovation in high-throughput precision phenotyping holds promise for revolutionizing crop resilience and ensuring sustainable agricultural production in a changing climate.
(This article belongs to the Section Crop Physiology and Crop Production)

20 pages, 3488 KiB  
Article
Indigenous Knowledge on Edible Wild Yams (Kumbu) in the Mount Cameroon Region: Towards Domestication for Enhanced Food Security
by Frederick Tilili Moleye, Mercy Dione Abwe Ngone, Solange Dzekewong Ndzeshala Takwi, Jean-Pierre Mvodo and Christopher Ngosong
Crops 2025, 5(2), 9; https://doi.org/10.3390/crops5020009 - 7 Mar 2025
Viewed by 841
Abstract
Growing food insecurity can in part be attributed to a lack of diversity in arable crops, with most African countries now focused on the production of a few “green revolution crops”. Indigenous knowledge of traditional food types could hold the key to the genetic diversification of crop production systems. Wild yams are indigenous crops that have been relegated to the background. This study aimed to assess the state of knowledge in, and cultivation of, wild yams collectively called “Kumbu” by the Bakweris of the Mount Cameroon Region. Following reconnaissance surveys, semi-structured questionnaires were administered to 583 interviewees across 41 villages in this region. Data were analysed in the SPSS version 21 statistical package with significance at α = 0.05 where necessary. Results showed that the study population was fairly balanced in terms of gender (SD = 0.534), with males representing 56.8% of the sample. A majority of the interviewees (53.3%) were married, and most had received at least primary education (85.2%). Most (61.6%) of the interviewees do not cultivate Kumbu due to a lack of available seeds (69.3%) and a preference for other yams (30.7%). Of those who cultivate Kumbu (38.4% of the interviewees), a majority (89.6%) have less than five stands of Kumbu. The different names (10) and types (13) of Kumbu could represent linguistic polymorphism, requiring further studies for proper identification. A majority (68.1%) of the interviewees had no idea of the differences between Kumbu types. Agronomic practices, pests, and disease management reported for Kumbu are similar to those of other mainstream yam types. We conclude that the state of knowledge on Kumbu in the Mt Cameroon Region is limited and on the decline. Bringing Kumbu production to the mainstream requires research on molecular taxonomy, propagation techniques, and agronomic practices for better yields.

20 pages, 3296 KiB  
Article
Presence of Soybean Vein Necrosis Orthotospovirus (Tospoviridae: Orthotospovirus) in Pakistan, Pakistani Scientists’ and Farmers’ Perception of Disease Dynamics and Management, and Policy Recommendations to Improve Soybean Production
by Asifa Hameed, Cristina Rosa, Paige Castillanos and Edwin G. Rajotte
Viruses 2025, 17(3), 315; https://doi.org/10.3390/v17030315 - 25 Feb 2025
Viewed by 684
Abstract
Soybean vein necrosis orthotospovirus (SVNV: Tospoviridae: Orthotospovirus) is a well-recognized thrips-vectored and seed-borne virus common in the United States (U.S.), Canada, and Egypt. Pakistan started the commercial cultivation of soybeans in the 1970s, when some soybean cultivars were imported from the U.S. to meet the country’s domestic requirement of oil, poultry, animal feed, and forage. A survey of farmers and scientists was conducted in the Punjab and Khyber Pakhtunkhwa provinces of Pakistan to understand perceptions of SVNV in the indigenous Pakistani community. Concurrently, soybean fields were sampled for SVNV presence at the National Agricultural Research Institute in Islamabad, Pakistan. Based upon survey and SVNV detection results through ELISA and qRT-PCR, a policy was developed. Overall, we found that SVNV was present in Islamabad, Pakistan in USDA-approved soybean cultivars. Although scientists knew about general thrips biology and insecticides, knowledge about identification of vectors (Thrips species) was not significantly different between the scientists and the farmers. Scientists at the Islamabad location were more aware of crop production technology and pests. This study reports that Pakistan needs to strengthen its research institutes, scientists’ and farmers’ capacity building, and extension programs to understand the disease complex in soybean crops.
(This article belongs to the Special Issue Plant Viruses and Their Vectors: Epidemiology and Control)

30 pages, 5329 KiB  
Review
Advances in Deep Learning Applications for Plant Disease and Pest Detection: A Review
by Shaohua Wang, Dachuan Xu, Haojian Liang, Yongqing Bai, Xiao Li, Junyuan Zhou, Cheng Su and Wenyu Wei
Remote Sens. 2025, 17(4), 698; https://doi.org/10.3390/rs17040698 - 18 Feb 2025
Cited by 16 | Viewed by 7023
Abstract
Traditional methods for detecting plant diseases and pests are time-consuming, labor-intensive, and require specialized skills and resources, making them insufficient to meet the demands of modern agricultural development. To address these challenges, deep learning technologies have emerged as a promising solution for the accurate and timely identification of plant diseases and pests, thereby reducing crop losses and optimizing agricultural resource allocation. By leveraging its advantages in image processing, deep learning technology has significantly enhanced the accuracy of plant disease and pest detection and identification. This review provides a comprehensive overview of recent advancements in applying deep learning algorithms to plant disease and pest detection. It begins by outlining the limitations of traditional methods in this domain, followed by a systematic discussion of the latest developments in applying various deep learning techniques—including image classification, object detection, semantic segmentation, and change detection—to plant disease and pest identification. Additionally, this study highlights the role of large-scale pre-trained models and transfer learning in improving detection accuracy and scalability across diverse crop types and environmental conditions. Key challenges, such as enhancing model generalization, addressing small lesion detection, and ensuring the availability of high-quality, diverse training datasets, are critically examined. Emerging opportunities for optimizing pest and disease monitoring through advanced algorithms are also emphasized. Deep learning technology, with its powerful capabilities in data processing and pattern recognition, has become a pivotal tool for promoting sustainable agricultural practices, enhancing productivity, and advancing precision agriculture.
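
As a generic illustration of the transfer-learning pattern the review credits with improving accuracy and scalability, and not code from any surveyed study, the sketch below fine-tunes only the classification head of a pretrained ResNet-18 on a folder of leaf images; the dataset path and epoch count are placeholders.

```python
# Sketch: transfer learning for plant disease image classification with a frozen pretrained backbone.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

tf = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])
train_set = datasets.ImageFolder("leaf_dataset/train", transform=tf)   # hypothetical one-folder-per-class layout
loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for p in model.parameters():                     # freeze the pretrained feature extractor
    p.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, len(train_set.classes))   # new task-specific head

opt = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
for epoch in range(5):                           # a few epochs often suffice with a frozen backbone
    for x, y in loader:
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()
```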

37 pages, 3785 KiB  
Review
Key Intelligent Pesticide Prescription Spraying Technologies for the Control of Pests, Diseases, and Weeds: A Review
by Kaiqiang Ye, Gang Hu, Zijie Tong, Youlin Xu and Jiaqiang Zheng
Agriculture 2025, 15(1), 81; https://doi.org/10.3390/agriculture15010081 - 1 Jan 2025
Cited by 5 | Viewed by 3355
Abstract
In modern agriculture, plant protection is the key to ensuring crop health and improving yields. Intelligent pesticide prescription spraying (IPPS) technologies monitor, diagnose, and make scientific decisions about pests, diseases, and weeds; formulate personalized and precision control plans; and prevent and control pests through the use of intelligent equipment. This study discusses key IPPS technologies from four perspectives: target information acquisition, information processing, pesticide prescription spraying, and implementation and control. In the target information acquisition section, target identification technologies based on images, remote sensing, acoustic waves, and electronic noses are introduced. In the information processing section, methods such as information pre-processing, feature extraction, pest and disease identification, bioinformatics analysis, and time series data analysis are addressed. In the pesticide prescription spraying section, the effects of pesticide selection, dose calculation, spraying time, and spraying method on the control outcome, as well as the formulation of prescription spraying for a given area, are explored. In the implementation and control section, vehicle automatic control technology, precision spraying technology, and droplet characteristic control technology, together with their applications, are studied. In addition, this study discusses the future development prospects of IPPS technologies, including multifunctional target information acquisition systems, decision-support systems based on generative AI, and the development of precision intelligent sprayers. The advancement of these technologies will enhance agricultural productivity in a more efficient, environmentally sustainable manner.
(This article belongs to the Section Agricultural Technology)

24 pages, 3377 KiB  
Article
A Hybrid Model for Soybean Yield Prediction Integrating Convolutional Neural Networks, Recurrent Neural Networks, and Graph Convolutional Networks
by Vikram S. Ingole, Ujwala A. Kshirsagar, Vikash Singh, Manish Varun Yadav, Bipin Krishna and Roshan Kumar
Computation 2025, 13(1), 4; https://doi.org/10.3390/computation13010004 - 27 Dec 2024
Cited by 3 | Viewed by 1120
Abstract
Soybean yield prediction is one of the most critical activities for increasing agricultural productivity and ensuring food security. Traditional models often underestimate yields because they rely on single data sources and simplistic model architectures, which prevent the complex, multifaceted factors influencing crop growth and yield from being captured. To address this, this work fuses multi-source data—satellite imagery, weather data, and soil properties—through multi-modal fusion using Convolutional Neural Networks and Recurrent Neural Networks. Satellite imagery provides spatial information on crop health, weather data provides temporal insights, and soil properties provide important fertility information. Fusing these heterogeneous data sources gives the model an overall understanding of yield-determining factors, decreasing the RMSE by 15% and improving R2 by 20% over single-source models. We further push the frontier of feature engineering by using Temporal Convolutional Networks (TCNs) and Graph Convolutional Networks (GCNs) to capture time series trends, geographic and topological information, and pest/disease incidence. TCNs capture long-range temporal dependencies well, while GCNs model complex spatial relationships and enrich the features used for yield prediction. This increases the prediction accuracy by 10% and boosts the F1 score for low-yield area identification by 5%. Additionally, we introduce further improved model architectures: a custom UNet with attention mechanisms, Heterogeneous Graph Neural Networks (HGNNs), and Variational Auto-encoders (VAEs). The attention mechanism enables more effective spatial feature encoding by focusing on critical image regions, the HGNN captures complex interaction patterns between diverse data types, and the VAEs generate robust feature representations. These architectures achieve an MAE improvement of 12%, while R2 for yield prediction improves by 25%. By combining multi-source data fusion, sophisticated feature engineering, and advanced neural network architectures, this work advances the state of the art in yield prediction and provides a more accurate and reliable soybean yield forecast, with the fusion of Convolutional Neural Networks, Recurrent Neural Networks, and Graph Networks also improving the efficiency of the prediction process.
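
A minimal PyTorch sketch of the multi-source fusion idea described above, assuming one convolutional branch for satellite patches, one recurrent branch for a weather time series, and one dense branch for soil properties; it is not the authors' architecture, and all dimensions are arbitrary.

```python
# Sketch: multi-modal fusion of imagery, weather, and soil features for yield regression.
import torch
import torch.nn as nn

class FusionYieldModel(nn.Module):
    def __init__(self, soil_dim=10, weather_dim=5):
        super().__init__()
        self.cnn = nn.Sequential(                                # satellite patch encoder
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten())
        self.rnn = nn.GRU(weather_dim, 32, batch_first=True)     # weather time-series encoder
        self.soil = nn.Sequential(nn.Linear(soil_dim, 32), nn.ReLU())  # soil property encoder
        self.head = nn.Sequential(nn.Linear(16 + 32 + 32, 64), nn.ReLU(), nn.Linear(64, 1))

    def forward(self, img, weather, soil):
        f_img = self.cnn(img)                   # (B, 16)
        _, h = self.rnn(weather)                # h: (1, B, 32), last hidden state
        f_soil = self.soil(soil)                # (B, 32)
        return self.head(torch.cat([f_img, h[-1], f_soil], dim=1))

model = FusionYieldModel()
yhat = model(torch.randn(4, 3, 64, 64), torch.randn(4, 30, 5), torch.randn(4, 10))
print(yhat.shape)   # torch.Size([4, 1]) -> one yield estimate per sample
```

Concatenating the three embeddings before a regression head is the simplest fusion choice; the attention-, graph-, and autoencoder-based variants the abstract mentions build on the same skeleton.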

27 pages, 1920 KiB  
Review
Recent Technological Advancements for Identifying and Exploiting Novel Sources of Pest and Disease Resistance for Peanut Improvement
by Akshaya Kumar Biswal, Peggy Ozias-Akins and Carl Corley Holbrook
Agronomy 2024, 14(12), 3071; https://doi.org/10.3390/agronomy14123071 - 23 Dec 2024
Cited by 1 | Viewed by 1791
Abstract
Peanut, also known as groundnut (Arachis hypogaea L.), is an important oilseed and food crop globally, contributing significantly to the economy and food security. However, its productivity is often hampered by pests and diseases. Traditional breeding methods have been used to develop resistant cultivars, but these are often time-consuming and labor-intensive. Recent technological advancements have revolutionized the identification of novel resistance sources and the development of resistant peanut cultivars. This review explores the latest techniques and approaches used in peanut breeding for pest and disease resistance, focusing on the identification of resistance loci and their incorporation into peanut using marker-assisted selection (MAS) and genomic tools. Next-generation sequencing (NGS) technologies, bioinformatics pipelines, comparative genomics, and transcriptomics have helped identify a plethora of candidate genes involved in pest resistance. However, peanut lags behind cereal crops in terms of phenomics and precision genetic techniques for the functional validation of these candidate genes. In conclusion, recent technological advancements have significantly improved the efficiency and precision of peanut breeding for pest and disease resistance and hold great promise for developing durable and sustainable resistance in peanut cultivars, ultimately benefiting peanut farmers and consumers globally.
(This article belongs to the Special Issue Pest Control Technologies Applied in Peanut Production Systems)

12 pages, 2178 KiB  
Article
Detection and Classification of Agave angustifolia Haw Using Deep Learning Models
by Idarh Matadamas, Erik Zamora and Teodulfo Aquino-Bolaños
Agriculture 2024, 14(12), 2199; https://doi.org/10.3390/agriculture14122199 - 2 Dec 2024
Cited by 1 | Viewed by 1374
Abstract
In Oaxaca, Mexico, there are more than 30 species of the Agave genus, and its cultivation is of great economic and social importance. The incidence of pests, diseases, and environmental stress causes significant losses to the crop. The identification of damage through non-invasive tools based on visual information is important for reducing economic losses. The objective of this study was to evaluate and compare five deep learning models: YOLO versions 7, 7-tiny, and 8, and two from the Detectron2 library, Faster-RCNN and RetinaNet, for the detection and classification of Agave angustifolia plants in digital images. In the town of Santiago Matatlán, Oaxaca, 333 images were taken in an open-air plantation, and 1317 plants were labeled into five classes: sick, yellow, healthy, small, and spotted. Models were trained with a 70% random partition, validated with 10%, and tested with the remaining 20%. The results indicate that YOLOv7 is the best-performing model on the test set, with a mAP of 0.616, outperforming YOLOv7-tiny and YOLOv8, both of which reach a mAP of 0.606 on the same set. This demonstrates that using artificial intelligence to detect and classify Agave angustifolia plants under planting conditions from digital images is feasible.
(This article belongs to the Special Issue Computational, AI and IT Solutions Helping Agriculture)

19 pages, 30871 KiB  
Article
Comparative Analysis of YOLO Models for Bean Leaf Disease Detection in Natural Environments
by Diana-Carmen Rodríguez-Lira, Diana-Margarita Córdova-Esparza, José M. Álvarez-Alvarado, Julio-Alejandro Romero-González, Juan Terven and Juvenal Rodríguez-Reséndiz
AgriEngineering 2024, 6(4), 4585-4603; https://doi.org/10.3390/agriengineering6040262 - 30 Nov 2024
Cited by 4 | Viewed by 2197
Abstract
This study presents a comparative analysis of YOLO detection models for the accurate identification of bean leaf diseases caused by Coleoptera pests in natural environments. Using a manually collected dataset of healthy and infected bean leaves photographed in natural conditions and labeled at the leaf level, we evaluated the performance of the YOLOv5, YOLOv8, YOLOv9, YOLOv10, and YOLOv11 models. Mean average precision (mAP) was used to assess the performance of the models. Among these, YOLOv9e exhibited the best performance, effectively balancing precision and recall for datasets with limited size and variability. In addition, we enhanced YOLOv9e by integrating the Sophia optimizer and the PolyLoss function, providing even more accurate detection results. This paper highlights the potential of advanced deep learning models, optimized with second-order optimizers and custom loss functions, in improving pest detection, crop management, and overall agricultural yield.
(This article belongs to the Special Issue Application of Artificial Neural Network in Agriculture)
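
For reference, a generic Poly-1 form of the PolyLoss idea mentioned above, i.e. cross-entropy plus an epsilon-weighted (1 − p_t) term; how the authors wire it into YOLOv9e alongside the Sophia optimizer is not shown here, and the class count below is arbitrary.

```python
# Sketch: Poly-1 loss (cross-entropy + epsilon * (1 - p_t)), a generic formulation.
import torch
import torch.nn.functional as F

def poly1_cross_entropy(logits, targets, epsilon=1.0):
    ce = F.cross_entropy(logits, targets, reduction="none")
    pt = torch.softmax(logits, dim=-1).gather(1, targets.unsqueeze(1)).squeeze(1)  # prob of true class
    return (ce + epsilon * (1.0 - pt)).mean()

logits = torch.randn(8, 3, requires_grad=True)     # 8 samples, 3 hypothetical classes
targets = torch.randint(0, 3, (8,))
loss = poly1_cross_entropy(logits, targets)
loss.backward()
print(float(loss))
```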

26 pages, 2867 KiB  
Review
A Review of the Application of Hyperspectral Imaging Technology in Agricultural Crop Economics
by Jinxing Wu, Yi Zhang, Pengfei Hu and Yanying Wu
Coatings 2024, 14(10), 1285; https://doi.org/10.3390/coatings14101285 - 9 Oct 2024
Cited by 1 | Viewed by 2501
Abstract
China is a large agricultural country, and the crop economy holds an important place in the national economy. The identification of crop diseases and pests, as well as the non-destructive classification of crops, has always been a challenge in agricultural development, hindering the rapid growth of the agricultural economy. Hyperspectral imaging technology combines imaging and spectral techniques, using hyperspectral cameras to acquire raw image data of crops. After correcting and preprocessing the raw image data to obtain the required spectral features, it becomes possible to achieve the rapid non-destructive detection of crop diseases and pests, as well as the non-destructive classification and identification of agricultural products. This paper first provides an overview of the current applications of hyperspectral imaging technology in crops both domestically and internationally. It then summarizes the methods of hyperspectral data acquisition and application scenarios. Subsequently, it organizes the processing of hyperspectral data for crop disease and pest detection and classification, deriving relevant preprocessing and analysis methods for hyperspectral data. Finally, it conducts a detailed analysis of classic cases using hyperspectral imaging technology for detecting crop diseases and pests and non-destructive classification, while also analyzing and summarizing the future development trends of hyperspectral imaging technology in agricultural production. The non-destructive rapid detection and classification technology of hyperspectral imaging can effectively select qualified crops and classify crops of different qualities, ensuring the quality of agricultural products. In conclusion, hyperspectral imaging technology can effectively serve the agricultural economy, making agricultural production more intelligent and holding significant importance for the development of agriculture in China.
(This article belongs to the Special Issue Machine Learning-Driven Advancements in Coatings)
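
As a small illustration of the preprocessing stage the review summarizes, and not code from any cited study, the sketch below applies standard normal variate (SNV) correction and Savitzky-Golay smoothing to a synthetic hyperspectral cube; the cube shape and band count are arbitrary.

```python
# Sketch: per-pixel SNV correction and Savitzky-Golay smoothing of hyperspectral spectra.
import numpy as np
from scipy.signal import savgol_filter

cube = np.random.rand(100, 100, 224)            # (rows, cols, bands) synthetic stand-in for camera output
spectra = cube.reshape(-1, cube.shape[-1])      # one spectrum per pixel

# SNV: centre and scale each spectrum to remove multiplicative scatter effects.
snv = (spectra - spectra.mean(axis=1, keepdims=True)) / spectra.std(axis=1, keepdims=True)
# Savitzky-Golay: smooth along the band axis while preserving peak shape.
smoothed = savgol_filter(snv, window_length=11, polyorder=2, axis=1)

print(smoothed.shape)    # (10000, 224), ready for feature extraction or a classifier
```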

24 pages, 8312 KiB  
Article
Real-Time Identification of Strawberry Pests and Diseases Using an Improved YOLOv8 Algorithm
by Danyan Xie, Wenyi Yao, Wenbo Sun and Zhenyu Song
Symmetry 2024, 16(10), 1280; https://doi.org/10.3390/sym16101280 - 29 Sep 2024
Cited by 5 | Viewed by 2432
Abstract
Strawberry crops are susceptible to a wide range of pests and diseases, some of which are insidious and diverse owing to the low stature of strawberry plants, posing significant challenges to accurate detection. Although deep learning-based techniques for detecting crop pests and diseases are effective in addressing these challenges, finding the optimal balance between accuracy, speed, and computational cost remains a key issue for real-time detection. In this paper, we propose a series of improved algorithms based on the YOLOv8 model for strawberry disease detection. These incorporate the Convolutional Block Attention Module (CBAM), the Super-Lightweight Dynamic Upsampling Operator (DySample), and Omni-Dimensional Dynamic Convolution (ODConv). In experiments, the accuracy of these methods reached 97.519%, 98.028%, and 95.363%, respectively, and the F1 scores reached 96.852%, 97.086%, and 95.181%, demonstrating significant improvement over the original YOLOv8 model. Among the three improvements, the CBAM-based model shows the best training stability and convergence, with each metric changing relatively smoothly. The model is accelerated with TensorRT, which achieves fast inference through highly optimized GPU computation, improving the real-time identification of strawberry diseases. The model has been deployed in the cloud, and the developed client can access it by calling the API. The feasibility and effectiveness of the system have been verified, providing an important reference for research on, and application of, intelligent strawberry disease identification.
(This article belongs to the Section Computer)
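
A textbook-style sketch of the Convolutional Block Attention Module that the study integrates into YOLOv8: channel attention from pooled descriptors through a shared MLP, followed by spatial attention from a 7×7 convolution. This is a generic formulation, not the authors' exact implementation.

```python
# Sketch: a generic CBAM block (channel attention, then spatial attention).
import torch
import torch.nn as nn

class CBAM(nn.Module):
    def __init__(self, channels, reduction=16, kernel_size=7):
        super().__init__()
        self.mlp = nn.Sequential(                               # shared MLP for channel attention
            nn.Linear(channels, channels // reduction), nn.ReLU(),
            nn.Linear(channels // reduction, channels))
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)  # spatial attention conv

    def forward(self, x):
        b, c, _, _ = x.shape
        avg = self.mlp(x.mean(dim=(2, 3)))                      # average-pooled channel descriptor
        mx = self.mlp(x.amax(dim=(2, 3)))                       # max-pooled channel descriptor
        x = x * torch.sigmoid(avg + mx).view(b, c, 1, 1)        # reweight channels
        s = torch.cat([x.mean(dim=1, keepdim=True),             # per-location average and max
                       x.amax(dim=1, keepdim=True)], dim=1)
        return x * torch.sigmoid(self.conv(s))                  # reweight spatial locations

feat = torch.randn(2, 64, 40, 40)
print(CBAM(64)(feat).shape)   # torch.Size([2, 64, 40, 40]), same shape as the input feature map
```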
