A Survey of Crop Disease Recognition Methods Based on Spectral and RGB Images
Abstract
1. Introduction
1.1. Overview
1.2. Methodology and Contributions
2. Public Datasets and Evaluation Metrics
2.1. Public Datasets
- PlantVillage—The PlantVillage dataset is widely used for crop disease classification tasks. It covers 38 crop disease categories across 14 plant species, with 54,305 images [16]. All images were obtained in controlled laboratory environments with simple backgrounds.
- XDB—Barbedo et al. [17] at Embrapa (the Brazilian Agricultural Research Corporation) introduced the Plant Disease Detection Database (PDDB), covering 171 diseases and other disorders across 21 plant species, with 2326 images as of October 2016. To better support data-hungry techniques such as deep learning, each image was subdivided according to specific criteria, expanding the collection into the XDB dataset of 46,513 images. However, as Table 2 indicates, most public datasets are collected in controlled environments. To address the scarcity of diverse field data, recent studies have increasingly explored generative models for data augmentation: Han et al. [26] demonstrated that GAN-based synthesis can significantly improve the robustness of disease detection, and recent reviews highlight diffusion-based augmentation as a promising direction for improving generalization in real-world agricultural scenarios [27].
- SWD—Nie et al. [18] constructed a dataset of 3531 images to evaluate and test their proposed method. The dataset includes four categories: Healthy leaf, Healthy petiole, Verticillium leaf, and Verticillium petiole, so each image indicates whether the strawberry plant is infected with Verticillium wilt disease.
- NLB—NLB is an object detection dataset created by Sun et al. [19] specifically for northern corn leaf blight. It consists of 18,222 images collected in actual fields using smartphones, fixed cameras, and drones, annotated with 105,735 bounding boxes around northern corn leaf blight lesions.
- FGVC8—Thapa et al. [20] manually captured 3651 high-quality images of actual symptoms, showing various apple leaf diseases under differing lighting, angles, surfaces, and noise. From these they created a pilot dataset of apple scab, cedar rust, and healthy leaves, released to the Kaggle community as FGVC7 for the Fine-Grained Visual Categorization challenge at CVPR 2020. FGVC8 significantly increased the number of apple leaf disease images over FGVC7 and added more finely divided disease categories, totaling 18,632 images covering 12 categories.
- LWDCD2020—LWDCD2020 comprises approximately 12,000 images covering nine wheat disease categories (loose smut, spot blotch, powdery mildew, leaf rust, Fusarium head blight, crown rot, black point, Karnal bunt, and wheat streak mosaic virus) and one healthy category [21]. The images have been preprocessed to a uniform size. Almost all images in LWDCD2020 contain only one disease type, and the dataset exhibits complex backgrounds, varying acquisition conditions, features from different disease development stages, and similar features across different wheat diseases.
- PlantDoc—The PlantDoc dataset consists of 2598 images covering 13 plant species and 17 disease categories [22]. It is a publicly available dataset used for object detection.
- BARI-Sunflower—The BARI-Sunflower dataset was constructed from collections at the Bangladesh Agricultural Research Institute (BARI) Gazipur Demonstration Farm and comprises 467 original images of healthy and diseased sunflower leaves and flowers [23]. To meet the data demands of deep learning, augmentation techniques such as random rotation, scaling, and cropping were applied, yielding 470 images of downy mildew, 509 of leaf scars, 398 of gray mold, and 515 of fresh leaves.
- Katra-Twelve—Katra-Twelve is a public leaf image dataset provided by Shri Mata Vaishno Devi University, encompassing images of healthy and diseased leaves [24]. The dataset features 12 economically and environmentally beneficial plant species, covering 22 leaf types. There are 4503 images, with 2278 images of healthy leaves and 2225 images of diseased leaves.
- AFB—The dataset by Gaci et al. [25] comprises three sets of hyperspectral images of apple trees. The first set monitors seven apple trees infected with fire blight and six control plants over a 15-day period. The second set provides time-series monitoring of three infected plants, seven water-stressed plants, and seven control plants. The third set was collected in an orchard and covers nine trees showing fire blight symptoms and six control trees. All images of symptomatic plants include pixel locations of the infected areas.
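Several of the datasets above were expanded with simple geometric augmentation (e.g., the random rotation, scaling, and cropping applied to BARI-Sunflower). The sketch below illustrates such a pipeline in a minimal form; the image size, the crop ratio, and the restriction to 90-degree rotations are illustrative choices, not details taken from the cited works:

```python
import numpy as np

rng = np.random.default_rng(0)

def augment(img: np.ndarray) -> np.ndarray:
    """Apply one random geometric augmentation to an H x W x 3 image array."""
    # Random rotation by a multiple of 90 degrees (keeps pixel values exact).
    img = np.rot90(img, k=int(rng.integers(0, 4)))
    # Random horizontal flip.
    if rng.random() < 0.5:
        img = img[:, ::-1]
    # Random crop to 7/8 of the (possibly rotated) height and width.
    h, w = img.shape[:2]
    ch, cw = h * 7 // 8, w * 7 // 8
    top = int(rng.integers(0, h - ch + 1))
    left = int(rng.integers(0, w - cw + 1))
    return img[top:top + ch, left:left + cw]

# Example: expand a single 64x64 image into 8 augmented variants.
image = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)
variants = [augment(image) for _ in range(8)]
```

In practice, each variant inherits the class label of its source image, which is how a few hundred originals become a few thousand training samples.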
2.2. Evaluation Metrics
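As a brief illustration of the metrics most commonly reported in this literature, the following sketch computes accuracy, precision, recall, and F1 score from per-class confusion counts; the counts themselves are invented for the example:

```python
def metrics_from_counts(tp: int, fp: int, fn: int, tn: int):
    """Compute accuracy, precision, recall, and F1 for one disease class
    from true/false positive and negative counts."""
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return accuracy, precision, recall, f1

# Hypothetical counts for a binary healthy/diseased classifier.
acc, p, r, f1 = metrics_from_counts(tp=80, fp=10, fn=20, tn=90)
```

For multi-class disease recognition, these per-class values are typically macro-averaged across all disease categories.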
3. Crop Disease Recognition Methods Based on Spectral Images
3.1. Spectral Image-Based Traditional Machine Learning Crop Disease Recognition Methods
3.2. Limitations of Spectral Image-Based Traditional Machine Learning Crop Disease Recognition Methods
3.2.1. Limitations of Data Processing and Traditional Machine Learning Methods
3.2.2. Limitations of Datasets and Single Spectral Modality
3.2.3. Limitations of Experimental Analysis and External Influences
4. Disease Recognition Methods Based on Red–Green–Blue Images
4.1. Deep Learning Methods for Crop Disease Recognition Based on Red–Green–Blue Images
4.2. The Limitations of Red–Green–Blue Image-Based Deep Learning Crop Disease Recognition Methods
4.2.1. Lack of Crop Disease Data
4.2.2. Lack of Publicly Available Standard Field Datasets
4.2.3. The Problems of Interference from Complex Scenes and Difficulty in Identifying Small Lesion Sizes
5. Discussion
6. Summary and Prospects
6.1. Integrating Multi-Source Data for Crop Disease Identification
6.1.1. Space–Ground Integration
6.1.2. Multimodal Fusion
6.2. Constructing a High-Quality Dataset
6.3. Conduct Research on Disease Identification Using Video Modality
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Nomenclature
| NN | Neural Network | | |
| GBRT | Gradient Boosting Regression Tree | LSVC | Linear Support Vector Classifier |
| NB | Naive Bayes | PLSR | Partial Least Squares Regression |
| GWO-ELM | Grey Wolf Optimization-Extreme Learning Machine | BPNN | Backpropagation Neural Network |
| PSO-BP | Particle Swarm Optimization-Backpropagation Neural Network | DT | Decision Trees |
| SVM | Support Vector Machine | PLS-DA | Partial Least Squares-Discriminant Analysis |
| KNN | K-nearest Neighbors | LDA | Linear Discriminant Analysis |
| RF | Random Forest | LS-SVM | Least Squares-Support Vector Machine |
| GPR | Gaussian Process Regression | AdaBoost | Adaptive Boosting |
| LR | Logistic Regression | XGBoost | Extreme Gradient Boosting |
References
- Legrand, N. War in Ukraine: The rational “wait-and-see” mode of global food markets. Appl. Econ. Perspect. Policy 2023, 45, 626–644. [Google Scholar] [CrossRef]
- da Silveira, F.; Barbedo, J.G.A.; da Silva, S.L.C.; Amaral, F.G. Proposal for a framework to manage the barriers that hinder the development of agriculture 4.0 in the agricultural production chain. Comput. Electron. Agric. 2023, 214, 108281. [Google Scholar] [CrossRef]
- Shao, Y.; Guan, X.; Xuan, G.; Gao, F.; Feng, W.; Gao, G.; Wang, Q.; Huang, X.; Li, J. GTCBS-YOLOv5s: A lightweight model for weed species identification in paddy fields. Comput. Electron. Agric. 2023, 215, 108461. [Google Scholar] [CrossRef]
- Yang, R.; Zhou, J.; Lu, X.; Shen, J.; Chen, H.; Chen, M.; He, Y.; Liu, F. A robust rice yield estimation framework developed by grading modeling and normalized weight decision-making strategy using UAV imaging technology. Comput. Electron. Agric. 2023, 215, 108417. [Google Scholar] [CrossRef]
- Dai, G.; Tian, Z.; Fan, J.; Sunil, C.; Dewi, C. DFN-PSAN: Multi-level deep information feature fusion extraction network for interpretable plant disease classification. Comput. Electron. Agric. 2024, 216, 108481. [Google Scholar] [CrossRef]
- Mishra, S.; Volety, D.R.; Bohra, N.; Alfarhood, S.; Safran, M. A smart and sustainable framework for millet crop monitoring equipped with disease detection using enhanced predictive intelligence. Alex. Eng. J. 2023, 83, 298–306. [Google Scholar] [CrossRef]
- Margni, M.; Rossier, D.; Crettaz, P.; Jolliet, O. Life cycle impact assessment of pesticides on human health and ecosystems. Agric. Ecosyst. Environ. 2002, 93, 379–392. [Google Scholar] [CrossRef]
- Gao, C.; Guo, W.; Yang, C.; Gong, Z.; Yue, J.; Fu, Y.; Feng, H. A fast and lightweight detection model for wheat fusarium head blight spikes in natural environments. Comput. Electron. Agric. 2024, 216, 108484. [Google Scholar] [CrossRef]
- Iqbal, Z.; Khan, M.A.; Sharif, M.; Shah, J.H.; ur Rehman, M.H.; Javed, K. An automated detection and classification of citrus plant diseases using image processing techniques: A review. Comput. Electron. Agric. 2018, 153, 12–32. [Google Scholar] [CrossRef]
- Upadhyay, S.K.; Kumar, A. Deep learning and computer vision in plant disease detection: A comprehensive review of techniques, models, and trends in precision agriculture. Artif. Intell. Rev. 2025, 58, 1–55. [Google Scholar] [CrossRef]
- Noon, S.K.; Amjad, M.; Qureshi, M.A.; Mannan, A. Use of deep learning techniques for identification of plant leaf stresses: A review. Sustain. Comput. Inform. Syst. 2020, 28, 100443. [Google Scholar] [CrossRef]
- Abade, A.; Ferreira, P.A.; de Barros Vidal, F. Plant diseases recognition on images using convolutional neural networks: A systematic review. Comput. Electron. Agric. 2021, 185, 106125. [Google Scholar] [CrossRef]
- Terentev, A.; Dolzhenko, V.; Fedotov, A.; Eremenko, D. Current state of hyperspectral remote sensing for early plant disease detection: A review. Sensors 2022, 22, 757. [Google Scholar] [CrossRef] [PubMed]
- Thakur, P.S.; Khanna, P.; Sheorey, T.; Ojha, A. Trends in vision-based machine learning techniques for plant disease identification: A systematic review. Expert Syst. Appl. 2022, 208, 118117. [Google Scholar] [CrossRef]
- Jafar, A.; Bibi, N.; Naqvi, R.A.; Sadeghi-Niaraki, A.; Jeong, D. Revolutionizing agriculture with artificial intelligence: Plant disease detection methods, applications, and their limitations. Front. Plant Sci. 2024, 15, 1356260. [Google Scholar] [CrossRef]
- Hughes, D.; Salathé, M. An open access repository of images on plant health to enable the development of mobile disease diagnostics. arXiv 2015, arXiv:1511.08060. [Google Scholar] [CrossRef]
- Barbedo, J.G.A.; Koenigkan, L.V.; Halfeld-Vieira, B.A.; Costa, R.V.; Nechet, K.L.; Godoy, C.V.; Junior, M.L.; Patricio, F.R.A.; Talamini, V.; Chitarra, L.G.; et al. Annotated plant pathology databases for image-based detection and recognition of diseases. IEEE Lat. Am. Trans. 2018, 16, 1749–1757. [Google Scholar] [CrossRef]
- Nie, X.; Wang, L.; Ding, H.; Xu, M. Strawberry verticillium wilt detection network based on multi-task learning and attention. IEEE Access 2019, 7, 170003–170011. [Google Scholar] [CrossRef]
- Sun, J.; Yang, Y.; He, X.; Wu, X. Northern maize leaf blight detection under complex field environment based on deep learning. IEEE Access 2020, 8, 33679–33688. [Google Scholar] [CrossRef]
- Thapa, R.; Snavely, N.; Belongie, S.; Khan, A. The plant pathology 2020 challenge dataset to classify foliar disease of apples. arXiv 2020, arXiv:2004.11958. [Google Scholar] [CrossRef]
- Goyal, L.; Sharma, C.M.; Singh, A.; Singh, P.K. Leaf and spike wheat disease detection & classification using an improved deep convolutional architecture. Inform. Med. Unlocked 2021, 25, 100642. [Google Scholar] [CrossRef]
- Singh, D.; Jain, N.; Jain, P.; Kayal, P.; Kumawat, S.; Batra, N. PlantDoc: A dataset for visual plant disease detection. In Proceedings of the 7th ACM IKDD CoDS and 25th COMAD; Association for Computing Machinery: New York, NY, USA, 2020; pp. 249–253. [Google Scholar] [CrossRef]
- Sara, U.; Rajbongshi, A.; Shakil, R.; Akter, B.; Sazzad, S.; Uddin, M.S. An extensive sunflower dataset representation for successful identification and classification of sunflower diseases. Data Brief 2022, 42, 108043. [Google Scholar] [CrossRef] [PubMed]
- Chouhan, S.S.; Singh, U.P.; Kaul, A.; Jain, S. A data repository of leaf images: Practice towards plant conservation with plant pathology. In Proceedings of the 2019 4th International Conference on Information Systems and Computer Networks (ISCON); IEEE: New York, NY, USA, 2019; pp. 700–707. [Google Scholar] [CrossRef]
- Gaci, B.; Abdelghafour, F.; Ryckewaert, M.; Mas-Garcia, S.; Louargant, M.; Verpont, F.; Laloum, Y.; Moronvalle, A.; Bendoula, R.; Roger, J.M. Visible–Near infrared hyperspectral dataset of healthy and infected apple tree leaves images for the monitoring of apple fire blight. Data Brief 2023, 50, 109532. [Google Scholar] [CrossRef] [PubMed]
- Han, G.; Asiedu, D.K.P.; Bennin, K.E. Plant disease detection with generative adversarial networks. Heliyon 2025, 11, e43002. [Google Scholar] [CrossRef]
- Shafay, M.; Hassan, T.; Owais, M.; Hussain, I.; Khawaja, S.G.; Seneviratne, L.; Werghi, N. Recent advances in plant disease detection: Challenges and opportunities. Plant Methods 2025, 21, 140. [Google Scholar] [CrossRef]
- Zhang, X.; Han, L.; Dong, Y.; Shi, Y.; Huang, W.; Han, L.; González-Moreno, P.; Ma, H.; Ye, H.; Sobeih, T. A deep learning-based approach for automated yellow rust disease detection from high-resolution hyperspectral UAV images. Remote Sens. 2019, 11, 1554. [Google Scholar] [CrossRef]
- Abbott, J.A. Quality measurement of fruits and vegetables. Postharvest Biol. Technol. 1999, 15, 207–225. [Google Scholar] [CrossRef]
- Goyal, S.B.; Malik, V.; Rajawat, A.S.; Khan, M.; Ikram, A.; Alabdullah, B.; Almjally, A. Smart intercropping system to detect leaf disease using hyperspectral imaging and hybrid deep learning for precision agriculture. Front. Plant Sci. 2025, 16, 1662251. [Google Scholar] [CrossRef]
- Zhang, H.; Ye, N.; Gong, J.; Xue, H.; Wang, P.; Jiao, B.; Qiao, X. Hyperspectral imaging-based deep learning method for apple quarantine disease detection. Foods 2025, 14, 3246. [Google Scholar] [CrossRef]
- Chossegros, M.; Hubbard, A.; Burt, M.; Harrison, R.J.; Nellist, C.F.; Grinberg, N.F. Hyperspectral image analysis for classification of multiple infections in wheat. Plant Methods 2025, 21, 144. [Google Scholar] [CrossRef]
- Bi, Q.; Goodman, K.E.; Kaminsky, J.; Lessler, J. What is machine learning? A primer for the epidemiologist. Am. J. Epidemiol. 2019, 188, 2222–2239. [Google Scholar] [CrossRef]
- Meshram, V.; Patil, K.; Meshram, V.; Hanchate, D.; Ramkteke, S. Machine learning in agriculture domain: A state-of-art survey. Artif. Intell. Life Sci. 2021, 1, 100010. [Google Scholar] [CrossRef]
- Ma, R.; Zhang, N.; Zhang, X.; Bai, T.; Yuan, X.; Bao, H.; He, D.; Sun, W.; He, Y. Cotton Verticillium wilt monitoring based on UAV multispectral-visible multi-source feature fusion. Comput. Electron. Agric. 2024, 217, 108628. [Google Scholar] [CrossRef]
- Xie, Y.; Plett, D.; Evans, M.; Garrard, T.; Butt, M.; Clarke, K.; Liu, H. Hyperspectral imaging detects biological stress of wheat for early diagnosis of crown rot disease. Comput. Electron. Agric. 2024, 217, 108571. [Google Scholar] [CrossRef]
- Mustafa, G.; Zheng, H.; Khan, I.H.; Zhu, J.; Yang, T.; Wang, A.; Xue, B.; He, C.; Jia, H.; Li, G.; et al. Enhancing fusarium head blight detection in wheat crops using hyperspectral indices and machine learning classifiers. Comput. Electron. Agric. 2024, 218, 108663. [Google Scholar] [CrossRef]
- Yang, M.; Kang, X.; Qiu, X.; Ma, L.; Ren, H.; Huang, C.; Zhang, Z.; Lv, X. Method for early diagnosis of verticillium wilt in cotton based on chlorophyll fluorescence and hyperspectral technology. Comput. Electron. Agric. 2024, 216, 108497. [Google Scholar] [CrossRef]
- Mustafa, G.; Zheng, H.; Li, W.; Yin, Y.; Wang, Y.; Zhou, M.; Liu, P.; Bilal, M.; Jia, H.; Li, G.; et al. Fusarium head blight monitoring in wheat ears using machine learning and multimodal data from asymptomatic to symptomatic periods. Front. Plant Sci. 2023, 13, 1102341. [Google Scholar] [CrossRef]
- Ren, K.; Dong, Y.; Huang, W.; Guo, A.; Jing, X. Monitoring of winter wheat stripe rust by collaborating canopy SIF with wavelet energy coefficients. Comput. Electron. Agric. 2023, 215, 108366. [Google Scholar] [CrossRef]
- Sun, H.; Song, X.; Guo, W.; Guo, M.; Mao, Y.; Yang, G.; Feng, H.; Zhang, J.; Feng, Z.; Wang, J.; et al. Potato late blight severity monitoring based on the relief-mRmR algorithm with dual-drone cooperation. Comput. Electron. Agric. 2023, 215, 108438. [Google Scholar] [CrossRef]
- Tang, Y.; Yang, J.; Zhuang, J.; Hou, C.; Miao, A.; Ren, J.; Huang, H.; Tan, Z.; Paliwal, J. Early detection of citrus anthracnose caused by Colletotrichum gloeosporioides using hyperspectral imaging. Comput. Electron. Agric. 2023, 214, 108348. [Google Scholar] [CrossRef]
- Zhang, J.; Jing, X.; Song, X.; Zhang, T.; Duan, W.; Su, J. Hyperspectral estimation of wheat stripe rust using fractional order differential equations and Gaussian process methods. Comput. Electron. Agric. 2023, 206, 107671. [Google Scholar] [CrossRef]
- Zhang, S.; Li, X.; Ba, Y.; Lyu, X.; Zhang, M.; Li, M. Banana fusarium wilt disease detection by supervised and unsupervised methods from UAV-based multispectral imagery. Remote Sens. 2022, 14, 1231. [Google Scholar] [CrossRef]
- Almoujahed, M.B.; Rangarajan, A.K.; Whetton, R.L.; Vincke, D.; Eylenbosch, D.; Vermeulen, P.; Mouazen, A.M. Detection of fusarium head blight in wheat under field conditions using a hyperspectral camera and machine learning. Comput. Electron. Agric. 2022, 203, 107456. [Google Scholar] [CrossRef]
- Jing, X.; Zou, Q.; Yan, J.; Dong, Y.; Li, B. Remote sensing monitoring of winter wheat stripe rust based on mRMR-XGBoost algorithm. Remote Sens. 2022, 14, 756. [Google Scholar] [CrossRef]
- Xiao, D.; Pan, Y.; Feng, J.; Yin, J.; Liu, Y.; He, L. Remote sensing detection algorithm for apple fire blight based on UAV multispectral image. Comput. Electron. Agric. 2022, 199, 107137. [Google Scholar] [CrossRef]
- Mustafa, G.; Zheng, H.; Khan, I.H.; Tian, L.; Jia, H.; Li, G.; Cheng, T.; Tian, Y.; Cao, W.; Zhu, Y.; et al. Hyperspectral reflectance proxies to diagnose in-field fusarium head blight in wheat with machine learning. Remote Sens. 2022, 14, 2784. [Google Scholar] [CrossRef]
- Xuan, G.; Li, Q.; Shao, Y.; Shi, Y. Early diagnosis and pathogenesis monitoring of wheat powdery mildew caused by blumeria graminis using hyperspectral imaging. Comput. Electron. Agric. 2022, 197, 106921. [Google Scholar] [CrossRef]
- Narmilan, A.; Gonzalez, F.; Salgadoe, A.S.A.; Powell, K. Detection of white leaf disease in sugarcane using machine learning techniques over UAV multispectral images. Drones 2022, 6, 230. [Google Scholar] [CrossRef]
- Pérez-Roncal, C.; Arazuri, S.; Lopez-Molina, C.; Jarén, C.; Santesteban, L.G.; López-Maestresalas, A. Exploring the potential of hyperspectral imaging to detect Esca disease complex in asymptomatic grapevine leaves. Comput. Electron. Agric. 2022, 196, 106863. [Google Scholar] [CrossRef]
- Tian, L.; Xue, B.; Wang, Z.; Li, D.; Yao, X.; Cao, Q.; Zhu, Y.; Cao, W.; Cheng, T. Spectroscopic detection of rice leaf blast infection from asymptomatic to mild stages with integrated machine learning and feature selection. Remote Sens. Environ. 2021, 257, 112350. [Google Scholar] [CrossRef]
- Khan, I.H.; Liu, H.; Li, W.; Cao, A.; Wang, X.; Liu, H.; Cheng, T.; Tian, Y.; Zhu, Y.; Cao, W.; et al. Early detection of powdery mildew disease and accurate quantification of its severity using hyperspectral images in wheat. Remote Sens. 2021, 13, 3612. [Google Scholar] [CrossRef]
- Rodriguez, J.; Lizarazo, I.; Prieto, F.; Angulo-Morales, V. Assessment of potato late blight from UAV-based multispectral imagery. Comput. Electron. Agric. 2021, 184, 106061. [Google Scholar] [CrossRef]
- Xiao, Y.; Dong, Y.; Huang, W.; Liu, L.; Ma, H. Wheat fusarium head blight detection using UAV-based spectral and texture features in optimal window size. Remote Sens. 2021, 13, 2437. [Google Scholar] [CrossRef]
- Gao, Z.; Khot, L.R.; Naidu, R.A.; Zhang, Q. Early detection of grapevine leafroll disease in a red-berried wine grape cultivar using hyperspectral imaging. Comput. Electron. Agric. 2020, 179, 105807. [Google Scholar] [CrossRef]
- Li, X.; Yang, C.; Huang, W.; Tang, J.; Tian, Y.; Zhang, Q. Identification of cotton root rot by multifeature selection from sentinel-2 images using random forest. Remote Sens. 2020, 12, 3504. [Google Scholar] [CrossRef]
- Ardila, C.E.C.; Ramirez, L.A.; Ortiz, F.A.P. Spectral analysis for the early detection of anthracnose in fruits of Sugar Mango (Mangifera indica). Comput. Electron. Agric. 2020, 173, 105357. [Google Scholar] [CrossRef]
- Lan, Y.; Huang, Z.; Deng, X.; Zhu, Z.; Huang, H.; Zheng, Z.; Lian, B.; Zeng, G.; Tong, Z. Comparison of machine learning methods for citrus greening detection on UAV multispectral images. Comput. Electron. Agric. 2020, 171, 105234. [Google Scholar] [CrossRef]
- Ye, H.; Huang, W.; Huang, S.; Cui, B.; Dong, Y.; Guo, A.; Ren, Y.; Jin, Y. Recognition of banana fusarium wilt based on UAV remote sensing. Remote Sens. 2020, 12, 938. [Google Scholar] [CrossRef]
- Van De Vijver, R.; Mertens, K.; Heungens, K.; Somers, B.; Nuyttens, D.; Borra-Serrano, I.; Lootens, P.; Roldán-Ruiz, I.; Vangeyte, J.; Saeys, W. In-field detection of Alternaria solani in potato crops using hyperspectral imaging. Comput. Electron. Agric. 2020, 168, 105106. [Google Scholar] [CrossRef]
- Khan, A.; Vibhute, A.D.; Mali, S.; Patil, C.H. A systematic review on hyperspectral imaging technology with a machine and deep learning methodology for agricultural applications. Ecol. Inform. 2022, 69, 101678. [Google Scholar] [CrossRef]
- Krizhevsky, A.; Sutskever, I.; Hinton, G.E. Imagenet classification with deep convolutional neural networks. Adv. Neural Inf. Process. Syst. 2012, 25. [Google Scholar] [CrossRef]
- Chen, F.; Zhang, Y.; Zhang, J.; Liu, L.; Wu, K. Rice false smut detection and prescription map generation in a complex planting environment, with mixed methods, based on near earth remote sensing. Remote Sens. 2022, 14, 945. [Google Scholar] [CrossRef]
- Shi, Y.; Han, L.; Kleerekoper, A.; Chang, S.; Hu, T. Novel cropdocnet model for automated potato late blight disease detection from unmanned aerial vehicle-based hyperspectral imagery. Remote Sens. 2022, 14, 396. [Google Scholar] [CrossRef]
- Kuswidiyanto, L.W.; Wang, P.; Noh, H.H.; Jung, H.Y.; Jung, D.H.; Han, X. Airborne hyperspectral imaging for early diagnosis of kimchi cabbage downy mildew using 3D-ResNet and leaf segmentation. Comput. Electron. Agric. 2023, 214, 108312. [Google Scholar] [CrossRef]
- Deng, J.; Zhang, X.; Yang, Z.; Zhou, C.; Wang, R.; Zhang, K.; Lv, X.; Yang, L.; Wang, Z.; Li, P.; et al. Pixel-level regression for UAV hyperspectral images: Deep learning-based quantitative inverse of wheat stripe rust disease index. Comput. Electron. Agric. 2023, 215, 108434. [Google Scholar] [CrossRef]
- Liu, Y.; Su, J.; Zheng, Z.; Liu, D.; Song, Y.; Fang, Y.; Yang, P.; Su, B. GLDCNet: A novel convolutional neural network for grapevine leafroll disease recognition using UAV-based imagery. Comput. Electron. Agric. 2024, 218, 108668. [Google Scholar] [CrossRef]
- Liu, Z.; Feng, Y.; Li, R.; Zhang, S.; Zhang, L.; Cui, G.; Ahmad, A.M.; Fu, L.; Cui, Y. Improved kiwifruit detection using VGG16 with RGB and NIR information fusion. In Proceedings of the 2019 ASABE Annual International Meeting, American Society of Agricultural and Biological Engineers, Boston, MA, USA, 7–10 July 2019; p. 1. [Google Scholar]
- Bu, Y.; Hu, J.; Chen, C.; Bai, S.; Chen, Z.; Hu, T.; Zhang, G.; Liu, N.; Cai, C.; Li, Y.; et al. ResNet incorporating the fusion data of RGB & hyperspectral images improves classification accuracy of vegetable soybean freshness. Sci. Rep. 2024, 14, 2568. [Google Scholar] [CrossRef]
- Qin, S.; Ding, Y.; Zhou, T.; Zhai, M.; Zhang, Z.; Fan, M.; Lv, X.; Zhang, Z.; Zhang, L. “Image-Spectral” fusion monitoring of small cotton samples nitrogen content based on improved deep forest. Comput. Electron. Agric. 2024, 221, 109002. [Google Scholar] [CrossRef]
- Tian, X.; Yao, J.; Yu, H.; Wang, W.; Huang, W. Early contamination warning of Aflatoxin B1 in stored maize based on the dynamic change of catalase activity and data fusion of hyperspectral images. Comput. Electron. Agric. 2024, 217, 108615. [Google Scholar] [CrossRef]
- Heidarian Dehkordi, R.; El Jarroudi, M.; Kouadio, L.; Meersmans, J.; Beyer, M. Monitoring wheat leaf rust and stripe rust in winter wheat using high-resolution UAV-based red-green-blue imagery. Remote Sens. 2020, 12, 3696. [Google Scholar] [CrossRef]
- Takakura, T.; Shimomachi, T.; Takemassa, T. Non-destructive detection of plant health. In Proceedings of the International Symposium on Design and Environmental Control of Tropical and Subtropical Greenhouses 578, Taichung, Taiwan, 15–18 April 2001; pp. 303–306. [Google Scholar]
- Bao, W.; Huang, C.; Hu, G.; Su, B.; Yang, X. Detection of Fusarium head blight in wheat using UAV remote sensing based on parallel channel space attention. Comput. Electron. Agric. 2024, 217, 108630. [Google Scholar] [CrossRef]
- Uddin, M.S.; Mazumder, M.K.A.; Prity, A.J.; Mridha, M.; Alfarhood, S.; Safran, M.; Che, D. Cauli-Det: Enhancing cauliflower disease detection with modified YOLOv8. Front. Plant Sci. 2024, 15, 1373590. [Google Scholar] [CrossRef] [PubMed]
- Liu, Y.; Yu, Q.; Geng, S. Real-time and lightweight detection of grape diseases based on Fusion Transformer YOLO. Front. Plant Sci. 2024, 15, 1269423. [Google Scholar] [CrossRef] [PubMed]
- Lu, J.; Lu, B.; Ma, W.; Sun, Y. EAIS-Former: An efficient and accurate image segmentation method for fruit leaf diseases. Comput. Electron. Agric. 2024, 218, 108739. [Google Scholar] [CrossRef]
- Mao, R.; Zhang, Y.; Wang, Z.; Hao, X.; Zhu, T.; Gao, S.; Hu, X. DAE-Mask: A novel deep-learning-based automatic detection model for in-field wheat diseases. Precis. Agric. 2024, 25, 785–810. [Google Scholar] [CrossRef]
- Li, W.; Yu, X.; Chen, C.; Gong, Q. Identification and localization of grape diseased leaf images captured by UAV based on CNN. Comput. Electron. Agric. 2023, 214, 108277. [Google Scholar] [CrossRef]
- Zheng, J.; Li, K.; Wu, W.; Ruan, H. RepDI: A light-weight CPU network for apple leaf disease identification. Comput. Electron. Agric. 2023, 212, 108122. [Google Scholar] [CrossRef]
- Li, G.; Jiao, L.; Chen, P.; Liu, K.; Wang, R.; Dong, S.; Kang, C. Spatial convolutional self-attention-based transformer module for strawberry disease identification under complex background. Comput. Electron. Agric. 2023, 212, 108121. [Google Scholar] [CrossRef]
- Zhang, D.; Huang, Y.; Wu, C.; Ma, M. Detecting tomato disease types and degrees using multi-branch and destruction learning. Comput. Electron. Agric. 2023, 213, 108244. [Google Scholar] [CrossRef]
- Zhang, Y.; Zhou, G.; Chen, A.; He, M.; Li, J.; Hu, Y. A precise apple leaf diseases detection using BCTNet under unconstrained environments. Comput. Electron. Agric. 2023, 212, 108132. [Google Scholar] [CrossRef]
- Bao, W.; Zhu, Z.; Hu, G.; Zhou, X.; Zhang, D.; Yang, X. UAV remote sensing detection of tea leaf blight based on DDMA-YOLO. Comput. Electron. Agric. 2023, 205, 107637. [Google Scholar] [CrossRef]
- He, J.; Liu, T.; Li, L.; Hu, Y.; Zhou, G. MFaster r-CNN for maize leaf diseases detection based on machine vision. Arab. J. Sci. Eng. 2023, 48, 1437–1449. [Google Scholar] [CrossRef]
- Zhu, S.; Ma, W.; Wang, J.; Yang, M.; Wang, Y.; Wang, C. EADD-YOLO: An efficient and accurate disease detector for apple leaf using improved lightweight YOLOv5. Front. Plant Sci. 2023, 14, 1120724. [Google Scholar] [CrossRef] [PubMed]
- Zhao, Y.; Yang, Y.; Xu, X.; Sun, C. Precision detection of crop diseases based on improved YOLOv5 model. Front. Plant Sci. 2023, 13, 1066835. [Google Scholar] [CrossRef] [PubMed]
- Lin, J.; Yu, D.; Pan, R.; Cai, J.; Liu, J.; Zhang, L.; Wen, X.; Peng, X.; Cernava, T.; Oufensou, S.; et al. Improved YOLOX-Tiny network for detection of tobacco brown spot disease. Front. Plant Sci. 2023, 14, 1135105. [Google Scholar] [CrossRef] [PubMed]
- Khan, F.; Zafar, N.; Tahir, M.N.; Aqib, M.; Waheed, H.; Haroon, Z. A mobile-based system for maize plant leaf disease detection and classification using deep learning. Front. Plant Sci. 2023, 14, 1079366. [Google Scholar] [CrossRef]
- Xue, Z.; Xu, R.; Bai, D.; Lin, H. YOLO-tea: A tea disease detection model improved by YOLOv5. Forests 2023, 14, 415. [Google Scholar] [CrossRef]
- Zhao, S.; Liu, J.; Wu, S. Multiple disease detection method for greenhouse-cultivated strawberry based on multiscale feature fusion Faster R_CNN. Comput. Electron. Agric. 2022, 199, 107176. [Google Scholar] [CrossRef]
- Ji, M.; Wu, Z. Automatic detection and severity analysis of grape black measles disease based on deep learning and fuzzy logic. Comput. Electron. Agric. 2022, 193, 106718. [Google Scholar] [CrossRef]
- Li, J.; Qiao, Y.; Liu, S.; Zhang, J.; Yang, Z.; Wang, M. An improved YOLOv5-based vegetable disease detection method. Comput. Electron. Agric. 2022, 202, 107345. [Google Scholar] [CrossRef]
- Li, D.; Ahmed, F.; Wu, N.; Sethi, A.I. Yolo-JD: A Deep Learning Network for jute diseases and pests detection from images. Plants 2022, 11, 937. [Google Scholar] [CrossRef]
- Zhang, Y.; Ma, B.; Hu, Y.; Li, C.; Li, Y. Accurate cotton diseases and pests detection in complex background based on an improved YOLOX model. Comput. Electron. Agric. 2022, 203, 107484. [Google Scholar] [CrossRef]
- Qiu, R.Z.; Chen, S.P.; Chi, M.X.; Wang, R.B.; Huang, T.; Fan, G.C.; Zhao, J.; Weng, Q.Y. An automatic identification system for citrus greening disease (Huanglongbing) using a YOLO convolutional neural network. Front. Plant Sci. 2022, 13, 1002606. [Google Scholar] [CrossRef] [PubMed]
- Qi, J.; Liu, X.; Liu, K.; Xu, F.; Guo, H.; Tian, X.; Li, M.; Bao, Z.; Li, Y. An improved YOLOv5 model based on visual attention mechanism: Application to recognition of tomato virus disease. Comput. Electron. Agric. 2022, 194, 106780. [Google Scholar] [CrossRef]
- Liu, S.; Qiao, Y.; Li, J.; Zhang, H.; Zhang, M.; Wang, M. An improved lightweight network for real-time detection of apple leaf diseases in natural scenes. Agronomy 2022, 12, 2363. [Google Scholar] [CrossRef]
- Kaur, P.; Harnal, S.; Gautam, V.; Singh, M.P.; Singh, S.P. An approach for characterization of infected area in tomato leaf disease based on deep learning and object detection technique. Eng. Appl. Artif. Intell. 2022, 115, 105210. [Google Scholar] [CrossRef]
- Roy, A.M.; Bose, R.; Bhaduri, J. A fast accurate fine-grain object detection model based on YOLOv4 deep neural network. Neural Comput. Appl. 2022, 34, 3895–3921. [Google Scholar] [CrossRef]
- Zhang, Z.; Qiao, Y.; Guo, Y.; He, D. Deep learning based automatic grape downy mildew detection. Front. Plant Sci. 2022, 13, 872107. [Google Scholar] [CrossRef]
- Xu, Y.; Chen, Q.; Kong, S.; Xing, L.; Wang, Q.; Cong, X.; Zhou, Y. Real-time object detection method of melon leaf diseases under complex background in greenhouse. J. Real-Time Image Process. 2022, 19, 985–995. [Google Scholar] [CrossRef]
- Li, Y.; Wang, J.; Wu, H.; Yu, Y.; Sun, H.; Zhang, H. Detection of powdery mildew on strawberry leaves based on DAC-YOLOv4 model. Comput. Electron. Agric. 2022, 202, 107418. [Google Scholar] [CrossRef]
- Li, S.; Li, K.; Qiao, Y.; Zhang, L. A multi-scale cucumber disease detection method in natural scenes based on YOLOv5. Comput. Electron. Agric. 2022, 202, 107363. [Google Scholar] [CrossRef]
- Fang, S.; Wang, Y.; Zhou, G.; Chen, A.; Cai, W.; Wang, Q.; Hu, Y.; Li, L. Multi-channel feature fusion networks with hard coordinate attention mechanism for maize disease identification under complex backgrounds. Comput. Electron. Agric. 2022, 203, 107486. [Google Scholar] [CrossRef]
- Wang, J.; Yu, L.; Yang, J.; Dong, H. DBA_SSD: A novel end-to-end object detection algorithm applied to plant disease detection. Information 2021, 12, 474. [Google Scholar] [CrossRef]
- Liu, C.; Zhu, H.; Guo, W.; Han, X.; Chen, C.; Wu, H. EFDet: An efficient detection method for cucumber disease under natural complex environments. Comput. Electron. Agric. 2021, 189, 106378. [Google Scholar] [CrossRef]
- Bao, W.; Yang, X.; Liang, D.; Hu, G.; Yang, X. Lightweight convolutional neural network model for field wheat ear disease identification. Comput. Electron. Agric. 2021, 189, 106367. [Google Scholar] [CrossRef]
- Zhang, K.; Wu, Q.; Chen, Y. Detecting soybean leaf disease from synthetic image using multi-feature fusion faster R-CNN. Comput. Electron. Agric. 2021, 183, 106064. [Google Scholar] [CrossRef]
- Sun, H.; Xu, H.; Liu, B.; He, D.; He, J.; Zhang, H.; Geng, N. MEAN-SSD: A novel real-time detector for apple leaf diseases using improved light-weight convolutional neural networks. Comput. Electron. Agric. 2021, 189, 106379. [Google Scholar] [CrossRef]
- Zhu, J.; Cheng, M.; Wang, Q.; Yuan, H.; Cai, Z. Grape leaf black rot detection based on super-resolution image enhancement and deep learning. Front. Plant Sci. 2021, 12, 695749. [Google Scholar] [CrossRef]
- Liu, J.; Wang, X. Tomato diseases and pests detection based on improved Yolo V3 convolutional neural network. Front. Plant Sci. 2020, 11, 521544. [Google Scholar] [CrossRef]
- Mi, Z.; Zhang, X.; Su, J.; Han, D.; Su, B. Wheat stripe rust grading by deep learning with attention mechanism and images from mobile devices. Front. Plant Sci. 2020, 11, 558126. [Google Scholar] [CrossRef]
- Chen, X.; Zhou, G.; Chen, A.; Yi, J.; Zhang, W.; Hu, Y. Identification of tomato leaf diseases based on combination of ABCK-BWTR and B-ARNet. Comput. Electron. Agric. 2020, 178, 105730. [Google Scholar] [CrossRef]
- Xie, X.; Ma, Y.; Liu, B.; He, J.; Li, S.; Wang, H. A deep-learning-based real-time detector for grape leaf diseases using improved convolutional neural networks. Front. Plant Sci. 2020, 11, 751. [Google Scholar] [CrossRef] [PubMed]
- Liu, J.; Wang, X. Early recognition of tomato gray leaf spot disease based on MobileNetv2-YOLOv3 model. Plant Methods 2020, 16, 1–16. [Google Scholar] [CrossRef] [PubMed]
- Zhang, Y.; Song, C.; Zhang, D. Deep learning-based object detection improvement for tomato disease. IEEE Access 2020, 8, 56607–56614. [Google Scholar] [CrossRef]
- Wang, A.; Chen, H.; Liu, L.; Chen, K.; Lin, Z.; Han, J.; Ding, G. YOLOv10: Real-Time End-to-End Object Detection. arXiv 2024, arXiv:2405.14458. [Google Scholar]
- Alanazi, R. A YOLOv10-based Approach for Banana Leaf Disease Detection. Eng. Technol. Appl. Sci. Res. 2025, 15, 23522–23526. [Google Scholar] [CrossRef]
- Murugavalli, S.; Gopi, R. Plant leaf disease detection using vision transformers for precision agriculture. Sci. Rep. 2025, 15, 22361. [Google Scholar] [CrossRef]
- Zhang, Z.; Li, X.; Xie, Y.; Chen, J. VMamba for Plant Leaf Disease Identification: Design and experiment. Front. Plant Sci. 2025, 16, 1515021. [Google Scholar] [CrossRef]
- Cap, Q.; Suwa, K.; Fujita, E.; Uga, H.; Kagiwada, S.; Iyatomi, H. An end-to-end practical plant disease diagnosis system for wide-angle cucumber images. Int. J. Eng. Technol. 2018, 7, 106–111. [Google Scholar] [CrossRef]
- Zhang, D.Y.; Luo, H.S.; Wang, D.Y.; Zhou, X.G.; Li, W.F.; Gu, C.Y.; Zhang, G.; He, F.M. Assessment of the levels of damage caused by Fusarium head blight in wheat using an improved YoloV5 method. Comput. Electron. Agric. 2022, 198, 107086. [Google Scholar] [CrossRef]
- Barbedo, J.G. Deep learning applied to plant pathology: The problem of data representativeness. Trop. Plant Pathol. 2022, 47, 85–94. [Google Scholar] [CrossRef]
- Pan, S.J.; Yang, Q. A survey on transfer learning. IEEE Trans. Knowl. Data Eng. 2009, 22, 1345–1359. [Google Scholar] [CrossRef]
- Too, E.C.; Yujian, L.; Njuki, S.; Yingchun, L. A comparative study of fine-tuning deep learning models for plant disease identification. Comput. Electron. Agric. 2019, 161, 272–279. [Google Scholar] [CrossRef]
- Feng, L.; Wu, B.; He, Y.; Zhang, C. Hyperspectral imaging combined with deep transfer learning for rice disease detection. Front. Plant Sci. 2021, 12, 693521. [Google Scholar] [CrossRef] [PubMed]
- Liu, G.; Peng, J.; El-Latif, A.A.A. Sk-mobilenet: A lightweight adaptive network based on complex deep transfer learning for plant disease recognition. Arab. J. Sci. Eng. 2023, 48, 1661–1675. [Google Scholar] [CrossRef]
- Jin, X.; Xiong, J.; Rao, Y.; Zhang, T.; Ba, W.; Gu, S.; Zhang, X.; Lu, J. TranNas-NirCR: A method for improving the diagnosis of asymptomatic wheat scab with transfer learning and neural architecture search. Comput. Electron. Agric. 2023, 213, 108271. [Google Scholar] [CrossRef]
- Wang, X.; Pan, T.; Qu, J.; Sun, Y.; Miao, L.; Zhao, Z.; Li, Y.; Zhang, Z.; Zhao, H.; Hu, Z.; et al. Diagnosis of soybean bacterial blight progress stage based on deep learning in the context of data-deficient. Comput. Electron. Agric. 2023, 212, 108170. [Google Scholar] [CrossRef]
- Jung, Y.; Byun, S.; Kim, B.; Amin, S.U.; Seo, S. Harnessing synthetic data for enhanced detection of Pine Wilt Disease: An image classification approach. Comput. Electron. Agric. 2024, 218, 108690. [Google Scholar] [CrossRef]
- Li, H.; Huang, L.; Ruan, C.; Huang, W.; Wang, C.; Zhao, J. A dual-branch neural network for crop disease recognition by integrating frequency domain and spatial domain information. Comput. Electron. Agric. 2024, 219, 108843. [Google Scholar] [CrossRef]
- Shrivastava, A.; Gupta, A.; Girshick, R. Training region-based object detectors with online hard example mining. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 761–769. [Google Scholar] [CrossRef]
- Chen, K.; Chen, Y.; Han, C.; Sang, N.; Gao, C. Hard sample mining makes person re-identification more efficient and accurate. Neurocomputing 2020, 382, 259–267. [Google Scholar] [CrossRef]
- Zhang, M.; Chen, Y.; Zhang, B.; Pang, K.; Lv, B. Recognition of pest based on faster rcnn. In Proceedings of the 6th International Conference on Signal and Information Processing, Networking and Computers (ICSINC); Springer: Berlin/Heidelberg, Germany, 2020; pp. 62–69. [Google Scholar] [CrossRef]
- Kim, B.; Han, Y.K.; Park, J.H.; Lee, J. Improved vision-based detection of strawberry diseases using a deep neural network. Front. Plant Sci. 2021, 11, 559172. [Google Scholar] [CrossRef]
- Lin, P.; Zhang, H.; Zhao, F.; Wang, X.; Liu, H.; Chen, Y. Boosted Mask R-CNN algorithm for accurately detecting strawberry plant canopies in the fields from low-altitude drone images. Food Sci. Technol. 2022, 42, e95922. [Google Scholar] [CrossRef]
- Dai, F.; Wang, F.; Yang, D.; Lin, S.; Chen, X.; Lan, Y.; Deng, X. Detection method of citrus psyllids with field high-definition camera based on improved cascade region-based convolution neural networks. Front. Plant Sci. 2022, 12, 816272. [Google Scholar] [CrossRef] [PubMed]
- Wang, K.; Peng, Y.; Huang, H.; Hu, Y.; Li, S. Mining hard samples locally and globally for improved speech separation. In Proceedings of the ICASSP 2022-2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP); IEEE: New York, NY, USA, 2022; pp. 6037–6041. [Google Scholar] [CrossRef]
- Cap, Q.H.; Fukuda, A.; Kagiwada, S.; Uga, H.; Iwasaki, N.; Iyatomi, H. Towards robust plant disease diagnosis with hard-sample re-mining strategy. Comput. Electron. Agric. 2023, 215, 108375. [Google Scholar] [CrossRef]
- Tian, Y.; Sun, C.; Poole, B.; Krishnan, D.; Schmid, C.; Isola, P. What makes for good views for contrastive learning? Adv. Neural Inf. Process. Syst. 2020, 33, 6827–6839. [Google Scholar] [CrossRef]
- Figueroa-Mata, G.; Mata-Montero, E. Using a convolutional siamese network for image-based plant species identification with small datasets. Biomimetics 2020, 5, 8. [Google Scholar] [CrossRef]
- Shang, Z.; He, J.; Wang, D. Image recognition of plant diseases based on Siamese Networks. In Proceedings of the 2022 China Automation Congress (CAC); IEEE: New York, NY, USA, 2022; pp. 5328–5332. [Google Scholar] [CrossRef]
- Janarthan, S.; Thuseethan, S.; Rajasegarar, S.; Yearwood, J. P2OP—Plant Pathology on Palms: A deep learning-based mobile solution for in-field plant disease detection. Comput. Electron. Agric. 2022, 202, 107371. [Google Scholar] [CrossRef]
- Zhang, S.; Wang, D.; Yu, C. Apple leaf disease recognition method based on Siamese dilated Inception network with less training samples. Comput. Electron. Agric. 2023, 213, 108188. [Google Scholar] [CrossRef]
- Fang, U.; Li, J.; Lu, X.; Gao, L.; Ali, M.; Xiang, Y. Self-supervised cross-iterative clustering for unlabeled plant disease images. Neurocomputing 2021, 456, 36–48. [Google Scholar] [CrossRef]
- Liu, H.; Zhan, Y.; Xia, H.; Mao, Q.; Tan, Y. Self-supervised transformer-based pre-training method using latent semantic masking auto-encoder for pest and disease classification. Comput. Electron. Agric. 2022, 203, 107448. [Google Scholar] [CrossRef]
- Zhao, R.; Zhu, Y.; Li, Y. CLA: A self-supervised contrastive learning method for leaf disease identification with domain adaptation. Comput. Electron. Agric. 2023, 211, 107967. [Google Scholar] [CrossRef]
- Dai, Y.; Gieseke, F.; Oehmcke, S.; Wu, Y.; Barnard, K. Attentional feature fusion. In Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision, Virtual, 5–9 January 2021; pp. 3560–3569. [Google Scholar] [CrossRef]
- Zhang, M.; Xu, S.; Song, W.; He, Q.; Wei, Q. Lightweight underwater object detection based on yolo v4 and multi-scale attentional feature fusion. Remote Sens. 2021, 13, 4706. [Google Scholar] [CrossRef]
- Salman, Z.; Muhammad, A.; Han, D. Plant disease classification in the wild using vision transformer-based models. Front. Plant Sci. 2025, 16, 1522985. [Google Scholar] [CrossRef] [PubMed]
- Aboelenin, S.; Elbasheer, F.A.; Eltoukhy, M.M.; El-Hady, W.M.; Hosny, K.M. A hybrid framework for plant leaf disease detection and classification using convolutional neural networks and vision transformer. Complex Intell. Syst. 2025, 11, 142. [Google Scholar] [CrossRef]
- Rashid, M.R.A.; Korim, M.A.E.; Hasan, M.; Ali, M.S.; Islam, M.M.; Jabid, T.; Islam, M. An ensemble learning framework with explainable AI for interpretable leaf disease detection. Array 2025, 26, 100386. [Google Scholar] [CrossRef]
- Van de Ven, G.M.; Tuytelaars, T.; Tolias, A.S. Three types of incremental learning. Nat. Mach. Intell. 2022, 4, 1185–1197. [Google Scholar] [CrossRef] [PubMed]
- Zhang, Z.; Hoai, M. Object detection with self-supervised scene adaptation. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Vancouver, BC, Canada, 17–24 June 2023; pp. 21589–21599. [Google Scholar]
- Zhang, N.; Wu, H.; Zhu, H.; Deng, Y.; Han, X. Tomato disease classification and identification method based on multimodal fusion deep learning. Agriculture 2022, 12, 2014. [Google Scholar] [CrossRef]
- Cao, Y.; Chen, L.; Yuan, Y.; Sun, G. Cucumber disease recognition with small samples using image-text-label-based multi-modal language model. Comput. Electron. Agric. 2023, 211, 107993. [Google Scholar] [CrossRef]
- Lu, Y.; Chen, D.; Olaniyi, E.; Huang, Y. Generative adversarial networks (GANs) for image augmentation in agriculture: A systematic review. Comput. Electron. Agric. 2022, 200, 107208. [Google Scholar] [CrossRef]
- Qi, H.; Huang, Z.; Jin, B.; Tang, Q.; Jia, L.; Zhao, G.; Cao, D.; Sun, Z.; Zhang, C. SAM-GAN: An improved DCGAN for rice seed viability determination using near-infrared hyperspectral imaging. Comput. Electron. Agric. 2024, 216, 108473. [Google Scholar] [CrossRef]
- Abbas, A.; Jain, S.; Gour, M.; Vankudothu, S. Tomato plant disease detection using transfer learning with C-GAN synthetic images. Comput. Electron. Agric. 2021, 187, 106279. [Google Scholar] [CrossRef]
- Li, X.; Li, X.; Zhang, M.; Dong, Q.; Zhang, G.; Wang, Z.; Wei, P. SugarcaneGAN: A novel dataset generating approach for sugarcane leaf diseases based on lightweight hybrid CNN-Transformer network. Comput. Electron. Agric. 2024, 219, 108762. [Google Scholar] [CrossRef]
- Moreno, H.; Gómez, A.; Altares-López, S.; Ribeiro, A.; Andújar, D. Analysis of Stable Diffusion-derived fake weeds performance for training Convolutional Neural Networks. Comput. Electron. Agric. 2023, 214, 108324. [Google Scholar] [CrossRef]
- Chen, D.; Qi, X.; Zheng, Y.; Lu, Y.; Huang, Y.; Li, Z. Synthetic data augmentation by diffusion probabilistic models to enhance weed recognition. Comput. Electron. Agric. 2024, 216, 108517. [Google Scholar] [CrossRef]
- Unknown, A. Revolutionizing crop disease detection with computational deep learning: A comprehensive review. Environ. Monit. Assess. 2024, 196, 302. [Google Scholar] [CrossRef]
- Jahin, M.A.; Shahriar, S.; Mridha, M.F.; Hossen, M.J.; Dey, N. Soybean disease detection via interpretable hybrid CNN-GNN with cross-modal attention. arXiv 2025, arXiv:2503.01284. [Google Scholar]

| Work | Citations | Focus |
|---|---|---|
| [9] | 445 | Surveyed methods for the detection and classification of leaf diseases in citrus plants. |
| [11] | 82 | Reviewed convolutional neural networks (CNNs) for diseases in vegetables, fruits, and miscellaneous plants. |
| [12] | 220 | Reviewed the current status of using CNNs for identifying and diagnosing plant diseases. |
| [13] | 136 | Reviewed modern advancements in early plant disease detection based on hyperspectral remote sensing. |
| [14] | 124 | Reviewed the application of vision-based machine learning techniques in plant disease detection. |
| [15] | 32 | Reviewed the application of machine learning and deep learning methods in the identification of diseases in four crops: tomatoes, peppers, potatoes, and cucumbers. |
| [10] | 200 | Provided a comprehensive review of deep learning approaches for plant disease detection, with particular emphasis on Generative AI, Foundation Models, and emerging real-time detection trends up to 2025. |
| Present review | - | Reviewed studies on crop disease identification based on spectral images and RGB images. |

| Dataset | Classes | Image Type | Sizes | Numbers | URL (Accessed on 20 December 2025) |
|---|---|---|---|---|---|
| PlantVillage [16] | 38 | RGB | 256 × 256 | 54,305 | https://data.mendeley.com/datasets/tywbtsjrjv/1 |
| XDB [17] | 171 | RGB | 4128 × 3096; 3096 × 4128 | 46,513 | https://www.digipathos-rep.cnptia.embrapa.br/ |
| SWD [18] | 4 | RGB | 1024 × 768 | 3531 | https://github.com/boxyao/Strawberry_wilt_dataset |
| NLB [19] | 1 | RGB | 3353 × 2235; 5919 × 3952; 6000 × 4000 | 18,222 | https://osf.io/p67rz/ |
| FGVC8 [20] | 12 | RGB | 4000 × 2672 | 18,632 | https://www.kaggle.com/c/plant-pathology-2021-fgvc8 |
| LWDCD2020 [21] | 4 | RGB | Unfixed size | 12,000 | https://drive.google.com/drive/folders/1XlwoD8xes1punAzlGSz8mluQXm-CXcOJ |
| Plantdoc [22] | 17 | RGB | Unfixed size | 2598 | https://github.com/pratikkayal/PlantDoc-Dataset |
| BARI-Sunflower [23] | 3 | RGB | 512 × 512 | 467 | https://data.mendeley.com/datasets/b83hmrzth8/1 |
| Katra-Twelve [24] | 12 | RGB | 6000 × 4000 | 4503 | https://data.mendeley.com/datasets/hb74ynkjcn |
| AFB [25] | 1 | Hyperspectral | 512 × 512 | 346 | https://entrepot.recherche.data.gouv.fr/dataset.xhtml?persistentId=https://doi.org/10.57745/R6AMN3 |
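Most of the datasets above ship as class-per-folder image collections, and class counts are often heavily imbalanced, so a stratified train/validation split is a sensible first step before model training. A minimal stdlib sketch (the file names and class labels below are synthetic placeholders, not taken from any listed dataset):

```python
import random
from collections import defaultdict

def stratified_split(samples, val_fraction=0.2, seed=42):
    """Split (path, label) pairs so every class contributes the same
    train/validation proportion. Returns (train, val) lists."""
    by_class = defaultdict(list)
    for path, label in samples:
        by_class[label].append((path, label))
    rng = random.Random(seed)
    train, val = [], []
    for label, items in by_class.items():
        rng.shuffle(items)
        n_val = max(1, round(len(items) * val_fraction))
        val.extend(items[:n_val])
        train.extend(items[n_val:])
    return train, val

# Toy example: three classes with imbalanced counts, as is typical
# for field-collected disease datasets.
samples = [(f"img_{c}_{i}.jpg", c)
           for c, n in [("healthy", 50), ("rust", 20), ("blight", 10)]
           for i in range(n)]
train, val = stratified_split(samples, val_fraction=0.2)
```

Random splitting without stratification can leave a rare class absent from the validation set entirely, which silently inflates reported accuracy.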
| Metric | Formula | Description |
|---|---|---|
| Accuracy | (TP + TN) / (TP + TN + FP + FN) | Overall correctness of classification. |
| MCC | (TP·TN − FP·FN) / √((TP + FP)(TP + FN)(TN + FP)(TN + FN)) | MCC reflects the correlation between true disease categories and predicted disease categories by calculating their correlation coefficient; the higher the correlation, the better the prediction performance. |
| Kappa | (p_o − p_e) / (1 − p_e) | Measures inter-rater agreement adjusted for chance, where p_o is the observed agreement and p_e the agreement expected by chance. |
| Precision | TP / (TP + FP) | Proportion of predicted positives that are actually positive. |
| Recall | TP / (TP + FN) | Proportion of actual positives correctly identified by the model. |
| F1 Score | 2 · Precision · Recall / (Precision + Recall) | Harmonic mean of precision and recall. |
| AP | ∫₀¹ P(R) dR | Area under the precision–recall curve; measures detection performance for one class. |
| mAP | (1/N) Σᵢ APᵢ | Mean of AP across all N classes; evaluates overall detection performance. |
| RMSE | √((1/n) Σᵢ (yᵢ − ŷᵢ)²) | Root mean squared error; measures prediction deviation from actual values. |
| R² | 1 − Σᵢ (yᵢ − ŷᵢ)² / Σᵢ (yᵢ − ȳ)² | Indicates the proportion of variance in the dependent variable explained by the model. |
| MAE | (1/n) Σᵢ \|yᵢ − ŷᵢ\| | Mean absolute error; average of absolute prediction errors. |
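The classification metrics in the table reduce to simple arithmetic on a confusion matrix. A minimal pure-Python sketch for the binary case (the counts are illustrative; multi-class studies typically macro-average these values per class):

```python
import math

def binary_metrics(tp, fp, fn, tn):
    """Compute accuracy, precision, recall, F1, MCC, and Kappa
    from a binary confusion matrix."""
    n = tp + fp + fn + tn
    accuracy = (tp + tn) / n
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    mcc = (tp * tn - fp * fn) / math.sqrt(
        (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    p_o = accuracy  # observed agreement
    # Chance agreement: product of marginal totals, summed over classes.
    p_e = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
    kappa = (p_o - p_e) / (1 - p_e)
    return dict(accuracy=accuracy, precision=precision, recall=recall,
                f1=f1, mcc=mcc, kappa=kappa)

m = binary_metrics(tp=40, fp=10, fn=5, tn=45)
```

Note that MCC and Kappa stay informative under class imbalance, where raw accuracy can be misleadingly high.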
| Reference | Crop | Disease | Camera Model | Wavelength (nm) | ML | Best Results |
|---|---|---|---|---|---|---|
| [35] | Cotton | Wilt | MCA Snap Zenmuse X5R | Near infrared 1 800 ± 10 Blue 490 ± 10 Green 550 ± 10 Red 680 ± 10 Red edge 720 ± 10 Near infrared 2 900 ± 2 | GWO-ELM PSO-BP LR | The R² values for the flowering stage and the boll stage are 0.65 and 0.88, respectively. |
| [36] | Wheat | Crown rot | FX10 SWIR | 400–1000 | SVM | F1-score is above 0.75. |
| [37] | Wheat | Fusarium head blight | GaiaField-V10E | 361–1011 | XGBoost KNN RF SVM NN | XGBoost achieved average accuracies of 95.46%, 91.36%, and 96.74% on WFSI1, WFSI2, and the combination of WFSI1 and WFSI2, respectively. |
| [38] | Cotton | Wilt | SR-3500 | 350–2500 | LR SVM KNN | The average accuracy is 90.62%. |
| [39] | Wheat | Fusarium head blight | ASD FieldSpec 4 Hi-Res Malvern Panalytical | 350–2500 | KNN RF SVM | The R² and RMSE are 0.92 and 10.21, respectively. |
| [40] | Wheat | Stripe rust | ASD FieldSpec QE Pro | 350–2500 644–806 | XGBoost PLSR BPNN RF | The R² and RMSE are 0.867 and 0.104, respectively. |
| [41] | Potato | Late blight | CMOS | Blue 450 ± 16 Green 560 ± 16 Red 650 Red edge 730 ± 16 Near infrared 840 ± 26 | RF SVM KNN | The overall accuracy and Kappa are 97.50 and 0.96, respectively. |
| [42] | Citrus | Anthracnose | GaiaField-V10E | 388–1025 | SVM KNN RF | The average accuracy is 91.04%. |
| [43] | Wheat | Stripe rust | QE Pro | 644–806 | GPR | The R², RMSE, and MAE are 0.912, 0.097, and 0.071, respectively. |
| [44] | Banana | Wilt | RedEdge MX Zenmuse X7 | Blue 475 ± 10 Green 560 ± 10 Red 668 ± 5 Red edge 717 ± 5 Near infrared 840 ± 20 | RF SVM BPNN | The overall accuracy is 97.28%. |
| [45] | Wheat | Fusarium head blight | FX10 | 400–1000 | SVM NN LR | The classification accuracy is 95.6%. |
| [46] | Wheat | Stripe rust | ASD FieldSpec 4 | 350–2500 | XGBoost GBRT | The R² and RMSE are 0.8894 and 0.1135, respectively. |
| [47] | Apple | Fire blight | MicaSense RedEdge-M | Blue 475 ± 20 Green 560 ± 20 Red 668 ± 10 Red edge 717 ± 10 Near infrared 840 ± 40 | RF DT SVM | The overall accuracy and Kappa are 94.0% and 0.904, respectively. |
| [48] | Wheat | Fusarium head blight | ASD FieldSpec 4 | 350–2500 | RF KNN SVM NN XGBoost | When the severity of the disease is 10.78%, the classification accuracy is 100%. |
| [49] | Wheat | Powdery mildew | GaiaField-V10E | 400–1000 | PLSR | The classification accuracy is 91.4%. |
| [50] | Sugarcane | White leaf | P4 Multispectral | Blue 450 ± 16 Green 560 ± 16 Red 650 ± 16 Red edge 730 ± 16 Near infrared 840 ± 26 | XGBoost KNN RF DT | The overall accuracy is 94%. |
| [51] | Grape | Esca disease | Xeva 1.7-320 InGaAs | 900–1700 | PLS-DA | The accuracy is in the range of 82.78–95.53%. |
| [52] | Rice | Blast disease | FieldSpec 4 Hi-Res | 350–2500 | SVM KNN LDA | The classification accuracies for the asymptomatic, early infection, and mild infection stages are 65%, 80%, and 95%, respectively. |
| [53] | Wheat | Powdery mildew | GaiaField-V10E | 400–1000 | PLSR | The overall accuracy and Kappa are 82.35% and 0.56, respectively. |
| [54] | Potato | Late blight | MicaSense RedEdge | Blue 475 ± 20 Green 560 ± 20 Red 668 ± 10 Red edge 717 ± 10 Near infrared 840 ± 40 | LSVC RF SVM KNN | The average overall accuracy and average Kappa are 0.837 and 0.622, respectively. |
| [55] | Wheat | Fusarium head blight | Cubert S185 FireflEYE SE | 459–950 | LR | The accuracy, F1-score, and AUC-ROC are 0.90, 0.83, and 0.82, respectively. |
| [56] | Grape | Leafroll disease | Micro-Hyperspec NIR X-Series | 517–1729 | LS-SVM | The classification accuracy is in the range of 66.67–89.93%. |
| [57] | Cotton | Root rot | CMOS AF Nikkor | - | LR | The classification accuracies of the spectral model, texture model, and spectral-texture model are 92.95%, 84.81%, and 91.87%, respectively. |
| [58] | Mango | Anthracnose | Spectral Evolution SM-1900 | 350–1900 | SVM RF | The 29 significant bands identified through LDA outperform the bands obtained by other methods for both SVM and RF. |
| [59] | Citrus | Greening | ADC-lite | Green 520–600 Red 630–690 Near infrared 760–900 | AdaBoost XGBoost RF SVM KNN LR NB | AdaBoost using a threshold strategy achieved 100% accuracy. |
| [60] | Banana | Wilt | MicaSense RedEdge MTM | Blue 465–485 Green 550–570 Red 653–673 Red edge 712–722 Near infrared 800–880 | LR | The average overall accuracy and average Kappa are 85.85% and 0.71, respectively. |
| [61] | Potato | Early blight | Imspector V9 | 430–900 | SVM PLS-DA | The accuracy is 0.92. |
| [30] | Wheat | Stripe rust | Cubert S185 | 450–950 | Hybrid DL (3D-CNN based) | Deep spectral–spatial features achieved superior performance compared with traditional machine learning models. |
| [31] | Apple | Quarantine diseases | Hyperspectral camera | Full spectral range | 3D-CNN | The proposed model achieved high accuracy in early disease detection. |
| [32] | Wheat | Multiple infections | Hyperspectral system | VIS–NIR | Deep learning | The method effectively distinguished overlapping disease symptoms. |
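Several of the multispectral studies above (e.g., [35,41,44,54,60]) first compress the listed bands into vegetation indices before classification; the normalized difference vegetation index (NDVI) is the most common. A minimal sketch with synthetic reflectance values (the numbers are assumptions for illustration, not taken from any cited study):

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).
    Healthy canopies reflect strongly in the near infrared and absorb
    red light, so lower NDVI can flag stressed or diseased tissue."""
    return (nir - red) / (nir + red)

# Synthetic per-pixel reflectances in [0, 1].
healthy = ndvi(nir=0.45, red=0.08)   # strong NIR reflectance
stressed = ndvi(nir=0.30, red=0.15)  # reduced NIR, elevated red
```

In practice, per-pixel index maps (NDVI, red-edge variants, etc.) are fed to the classifiers in the ML column, or thresholded directly for coarse severity mapping.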
| Reference | Crop | Disease | Baseline | Improvements | Results |
|---|---|---|---|---|---|
| [75] | Wheat | Fusarium head blight | YOLOv5 | Integrated a parallel channel-spatial attention module into the feature fusion module. | Accuracy, recall and mAP are 80.6%, 74.5% and 83.2%, respectively. |
| [76] | Cauliflower | Bacterial Soft Rot Downy Mildew Black Rot | YOLOv8 | Added an additional Conv block in the head section. Replaced the original activation function with Hard Swish. | Accuracy, recall and mAP are 93.2%, 82.6% and 91.1%, respectively. |
| [77] | Grape | Powdery mildew Anthracnose Gray mold White rot | YOLOv5 | Integrated a VoVNet-based GC-RE-OSA module in the backbone. Used an improved real-time Transformer in the neck. Added 2D positional embeddings and SSTE to the final feature map. | Parameters, mAP, and FPS are 24.5 MB, 90.67%, and 44, respectively. |
| [8] | Wheat | Fusarium head blight | YOLOv5 | Used MobileNetV3 as the backbone. Replaced a portion of the feature fusion module’s C3 module with C3Ghost. | The mAP is 97.15%, and the FLOPs decrease by 71.32% compared to YOLOv5. |
| [78] | Apple Pomegranate Mango Jamun | Rust Alternaria blotch Gray spot Cercospora spot Anthracnose Fungal disease | Mask R-CNN | Designed Dual Scale Overlap (DSO) patch embedding to effectively extract multi-scale disease features through dual paths, reducing the omission of lesions. Customized an Ultra Large Convolution (ULC) Transformer block for positional encoding and global modeling, efficiently extracting global and positional features of leaves and diseases. Proposed the Skip Convolutional Local Optimization (SCLO) module to optimize local details and edge information, enhancing the model’s pixel classification ability, resulting in finer segmentation of leaves and spots, and enabling the extraction of smaller lesions. Built a Double Layer Upsampling (DLU) decoder to efficiently fuse detailed information with semantic information and output accurate segmentation results of leaves and spots. | The IoUs of lesion segmentation reach 94.47%, 94.54%, 83.83%, 86.60%, 89.59%, and 88.76%, respectively. |
| [79] | Wheat | Stripe rust Powdery mildew Scab | Mask R-CNN | Used Densely Connected Convolutional Networks (DenseNet) for preliminary feature extraction. Designed a backbone feature extraction network combining Feature Pyramid Network (FPN) and attention mechanism to extract diversification-augmented features. Developed an Edge Agreement Head module based on Sobel filters to compare edge features during training, thereby improving the model’s mask generation efficiency. | The mAP is 96.02% |
| [80] | Grape | Black rot Esca black measles Leaf blight Powdery mildew | VGG-19 | The training set for the recognition model is pre-extracted using grayscale technology and grid-point positioning methods, allowing for the early formation of filters and the construction of convolution kernels. | The parameters of the improved VGG-19 were reduced to 38.5 MB, with a training time of only 49 min and 41 s, and an accuracy rate of 96.25% on the validation set. |
| [81] | Apple | Healthy Spotted leaf disease Mosaic disease Leaf rust Yellowing Leaflet disease Leaf brown spot Glomerella leaf spot Powdery mildew | - | A new lightweight network, RepDI, was proposed, and it became CPU-friendly due to structural reparameterization. Channel and spatial compression, combined with dilated convolution, facilitated the extraction of global information, improving the model’s recognition ability in complex backgrounds. | Achieved the fastest inference speed on desktop CPUs and reached a Top-1 accuracy of 98.92% |
| [82] | Strawberry | Normal strawberry fruit Normal strawberry leaves Gray mould Gnomonia fructicola fall Fertilizer damage Blight Pestalotiopsis leaf spot Calcium deficiency Fragaria ananassa Duch. Common leaf spot Anthracnose High-temperature damage of strawberries | Vision transformer | Employed Multi-Head Self-Attention (MSA) to capture long-range feature dependencies in strawberry disease images. Proposed a Transformer based on Spatial Convolutional Self-Attention (SCSA-Transformer) to reduce the parameters of the Transformer network and enhance recognition efficiency. | Recognition accuracy is 99.10%. |
| [83] | Tomato | Healthy Mosaic virus Yellow leaf curl virus Early blight Late blight Leaf mold Septoria leaf spot Target leaf spot Bacterial leaf spot Spider mite damage | ResNet50 | Without adding parameters or further boundary annotations, the addition of the target localization module enables reliable prediction of lesion positions. By erasing contextual semantic data, the proposed destruction and reassembly module is able to learn more granular features. The addition of the attention area division module can locate lesion positions, thereby improving recognition accuracy. | The accuracy rates for identifying the severity of tomato diseases and for recognizing the types of tomato diseases have reached 95.03% and 98.25% respectively. |
| [84] | Apple | Frog eye spot Powdery mildew Rust Scab | YOLOv5 | Incorporated the BCM module into the backbone. Integrated cross-attention mechanism and bidirectional transposed feature pyramid network into the feature fusion module. | Accuracy and FPS are 85.23% and 33, respectively. |
| [85] | Tea | Leaf blight | YOLOv5 | Added multi-scale RFB modules in the backbone. Added dual-dimensional mixed attention in the Neck section. | Accuracy, recall, F1-score, AP@0.5, and AP are 72.2%, 71.1%, 71.6%, 76.8% and 44.5%, respectively. |
| [86] | Maize | Anthracnose leaf blight Tropical rust Southern maize rust Common rust Southern leaf blight Phaeosphaeria leaf spot Diplodia leaf streak Physoderma brown spot Northern leaf blight | Faster R-CNN | Added batch normalization layers in the convolutional layers. Proposed a central cost function to construct a hybrid loss function. | Average recall, accuracy, and F1-score are 0.9719, 97.23%, and 0.9718, respectively. |
| [87] | Apple | Alternaria blotch Brown spot Grey spot Mosaic Rust | YOLOv5 | Reconstructed the backbone using lightweight Shufflenet reverse residual blocks (SNIR). Introduced an efficient feature learning module (DWC3) with a depthwise separable convolution design in the neck part. Integrated lightweight coordinate attention modules (CA) into the Backbone and neck parts. Replaced CIOU with SIOU as the bounding box regression loss function. | The mAP and detection speed are 95.5% and 625 frames per second, respectively. |
| [88] | Grape Peach Potato Apple Corn | Black measles Leaf blight Black rot Bacterial spot Late blight Black rot Scab Northern leaf blight | YOLOv5 | Improved the feature fusion module using lightweight structures and multi-branch structures. Proposed CAM to extract global and local features from each network layer. Added an additional grid to predict targets and modified the formula for the centroid offset of the prediction box. Used an improved DIoU loss function to replace GIoU. | The mAP, F1-score, and recall are 95.92%, 0.91, and 87.89%, respectively. |
| [89] | Tobacco | Brown spot | YOLOX | Introduced hierarchical mixed-scale units (HMUs) in the neck part. Introduced Convolutional Block Attention Module (CBAM) into the network. | Average accuracy is 80.56%. |
| [90] | Maize | Blight Leaf spot Mosaic | YOLOv8 | - | Accuracy is 99.04%. |
| [91] | Tea | Leaf blight | YOLOv5 | Integrated self-attention and convolution (ACmix) as well as CBAM into YOLOv5. Replaced the SPPF module with the RFB module. Introduced the Global Context Network (GCNet). | AP is 73.7%. |
| [92] | Strawberry | Angular leafspot Anthracnose Blossom blight Gray mold Leaf spot Powdery mildew | Faster R-CNN | Constructed a multi-scale feature fusion network using ResNet, FPN, and CBAM. | The mAP is 92.18%. |
| [93] | Grape | Black measles | DeepLabV3+ | Used DeepLabV3+ to extract features of ROI and POI. Developed a fuzzy rule-based system for each feature to predict the severity of the disease. Considered appropriate membership functions for the inputs and outputs for the fuzzification and defuzzification processes. | The accuracy of the overall disease severity classification is 97.75%. |
| [94] | Tomato Cucumber Cabbage | Umbilical rot Gray mould Target spot Anthracnose | YOLOv5 | Embedded the Transformer encoder into the CSP structure. Integrated the improved InceptionA module into the FPN structure. Improved NMS with the Confluence module. | The mAP is 93.1%. |
| [95] | Jute | Stem rot Anthracnose Black band Soft rot Tip blight Dieback Mosaic Chlorosis | YOLOv5 | Utilized SCFEM, SPPM, and DSCFEM in the backbone. Collected cross-stage features from three different levels of the backbone in the Neck. Merged the anchor box results at different scales to create aggregated detection boxes. | The mAP and F1-score are 96.63% and 95.83%, respectively. |
| [96] | Cotton | Brown spot Red leaf blight Verticillium wilt | YOLOX | Introduced the ECA module into YOLOX. Used the Focal Loss loss function. Added the Hard-Swish activation function. | The mAP is 94.60%. |
| [97] | Citrus | Greening disease | YOLOv5 | - | YOLOv5l-HLB2’s F1-score is 85.19%. |
| [98] | Tomato | Virus | YOLOv5 | Combined the SE module with each Res unit of the backbone network to form SE-Res modules. | Accuracy and mAP@0.5 are 91.07% and 94.10%, respectively. |
| [99] | Apple | Rust Scab Blotch | YOLOX | Designed asymmetric ShuffleBlock, CSP-SA modules, and separable convolutions to improve the backbone network. | The mAP on MSALDD and PlantDoc are 91.08% and 58.85%, respectively. |
| [100] | Tomato | Bacterial spot Leaf mold Septoria spot Target spot Mosaic Early blight | Mask R-CNN | Added a lightweight head. Modified the aspect ratios of the RPN network’s anchor boxes and the feature extraction topology. | Accuracy, mAP, and F1-score are 0.98, 0.88, and 0.912, respectively. |
| [101] | Tomato | Early blight Late blight Leaf mold Septoria leaf spot | YOLOv4 | Incorporated DenseNet into the backbone. Added two new residual blocks to both the backbone and the neck. Added CSP2-n modules to PANet. Used the Hard-Swish function as the primary activation function. | Precision, F1-score, and mAP are 90.33%, 93.64%, and 96.29%, respectively. |
| [102] | Grape | Downy mildew | YOLOv5 | Integrated Coordinate Attention (CA) into YOLOv5. | Precision, mAP@0.5, and recall are 85.59%, 83.70%, and 89.55%, respectively. |
| [103] | Cucumis melo | Powdery mildew | YOLOv5 | Reconstructed the backbone of YOLOv5 using ShuffleNetv2 reverse residual blocks. Trimmed and fine-tuned the model by applying a channel pruning method. | The mAP@0.5 is 93.2%. |
| [104] | Strawberry | Powdery mildew | YOLOv4 | Replaced the backbone and neck with versions that incorporate depthwise convolution and a hybrid attention mechanism. | The mAP is 72.7%. |
| [105] | Cucumber | Powdery mildew Downy mildew Anthrax | YOLOv5 | Integrated Coordinate Attention (CA) and Transformer. Combined multi-scale training strategy (MS) with a feature fusion network. | Model size, FLOPS, mAP, and FPS are 4.7 MB, 6.1G, 84.9%, and 143, respectively. |
| [106] | Maize | Healthy Gray leaf spot Common rust Northern leaf blight Fall armyworm Herbicide burn Zinc deficiency | - | Used Hard Coordinated Attention (HCA) assigned at different spatial scales to extract features from maize leaf disease images, reducing the influence of complex backgrounds. Constructed a multi-feature fusion network that extracts weight information in two spatial directions, maximizing the retention of maize leaf disease features during the sampling process. Replaced the traditional convolutional layers with depthwise separable convolutional layers, reducing the number of parameters in the network. | The average recognition accuracy and F1-score were 97.75% and 97.03%, respectively. |
| [107] | Apple Tomato Potato Strawberry Chili | The severity of leaf disease | SSD | Se_SSD integrated the SSD feature extraction network with a channel attention mechanism. DB_SS improved the VGG feature extraction network. DBA_SSD integrated the improved VGG network with a channel attention mechanism. | DBA_SSD achieves the accuracy of 92.20%. |
| [108] | Cucumber | Downy mildew Bacterial angular spot | SSD | Selected EfficientNet-b1 as the backbone network. Selected the ASFF feature fusion method. | Model size, GFLOPS, and mAP are 7.72 MB, 0.14, and 85.52%. |
| [109] | Wheat | Healthy Glume blotch Scab | - | Designed a lightweight convolutional neural network (CNN) model called SimpleNet. Introduced the Convolutional Block Attention Module (CBAM) to enhance the model’s ability to represent disease features. Designed a feature fusion module to concatenate the down-sampled feature maps output by inverted residual blocks with the average pooled features of the input feature maps, achieving fusion between features of different depths. | The recognition accuracy on the test dataset reached 94.1%. |
| [110] | Soybean | Bacterial spot Virus disease Frogeye leaf spot | Faster R-CNN | Connected different layers in the feature extraction network in a skipping manner. | AP is 83.34%. |
| [111] | Apple | Alternaria blotch Brown spot Gray spot Rust Mosaic | SSD | Proposed a basic module called MEAN. Utilized GoogLeNet’s Inception module. Replaced all convolutional kernels in Inception with MEAN modules. | The mAP and FPS are 83.12% and 12.53, respectively. |
| [112] | Grapevine | Leaf black rot | YOLOv3 | Replaced the loss function with GIoU. Added SPP (Spatial Pyramid Pooling). Enhanced the training images through the BL super-resolution method. | The PlantVillage dataset’s accuracy and recall are 95.79% and 94.52%, respectively. The orchard dataset’s accuracy and recall are 86.69% and 82.27%, respectively. On images without background clutter, accuracy and recall are 94.05% and 93.26%, respectively. |
| [113] | Tomato | Early blight Gray mold Yellow leaf curl Brown spot Leaf mold Blossom-end rot Leaf curl Mosaic | YOLOv3 | Optimized the feature layer of the YOLOv3 model using an image pyramid to achieve multi-scale feature detection. | The accuracy is 92.39%, and the detection time is only 20.39 ms. |
| [114] | Wheat | Stripe rust | DenseNet | Embedded the Convolutional Block Attention Module (CBAM) into the Densely Connected Convolutional Network (DenseNet). | The recognition accuracy on the test dataset reached 97.99%. |
| [115] | Tomato | Early blight Late blight Yellow leaf curl Leaf mold Bacterial leaf spot | ResNet50 | Denoised and enhanced the image by combining Wavelet Transform with Retinex (BWTR), removing noise points and edge points while retaining important texture information. Separated the tomato leaves from the background using the KSW method optimized by the Artificial Bee Colony algorithm (ABCK). Recognized the image using the Both-channel Residual Attention Network model (B-ARNet). | The overall detection accuracy is approximately 89%. |
| [116] | Grape | Black measles Black rot Leaf blight Mites of grape | Faster R-CNN | Introduced Inception-v1 modules, Inception-ResNet-v2 modules, and SE (Squeeze-and-Excitation) modules. | The mAP and FPS are 81.1% and 15.01, respectively. |
| [117] | Tomato | Gray leaf spot | YOLOv3 | Used MobileNetv2 as the backbone network. Introduced the GIoU bounding box regression loss function. | F1-score, AP, and average IOU are 94.13%, 92.53%, and 89.92%. |
| [118] | Tomato | Leaf mold fungus Powdery mildew Blight ToMV | Faster R-CNN | Replaced VGG16 with a deep residual network. Utilized the K-means clustering algorithm to obtain anchor boxes. | The mAP is 97.18%. |
| [119] | General | Leaf diseases | YOLOv10 | Integrated the YOLOv10 architecture with an NMS-free training strategy to improve detection efficiency. | The mAP on the COCO dataset is 52.5%, with superior inference speed in terms of FPS. |
| [120] | Banana | Sigatoka leaf spot | YOLOv8 | Designed a lightweight YOLOv10 detection head to enable efficient real-time disease detection. | The classification accuracy reached 98.5%, with an inference speed of 120 FPS. |
| [121] | General | Leaf diseases | Vision Transformer (ViT) | Proposed a Vision Transformer-based framework for leaf disease detection in precision agriculture, enhancing model generalization capability. | The proposed method achieved an accuracy of 97.8% and an F1-score of 0.96. |
| [122] | General | Leaf diseases | VMamba | Introduced VMamba, a visual state space model with linear computational complexity for efficient feature learning. | The model achieved an accuracy of 99.1% and demonstrated faster inference speed compared with DeiT. |
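Several entries above alter the detection post-processing stage: [94] replaces standard NMS with the Confluence module, and [119] adopts an NMS-free training strategy. For context, the following is a minimal pure-Python sketch of the greedy non-maximum suppression these works modify or remove; the corner-coordinate box format `(x1, y1, x2, y2)` and the 0.5 threshold are illustrative assumptions, not any cited paper's settings.

```python
def iou(a, b):
    """IoU of two boxes given as (x1, y1, x2, y2) corner coordinates."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def nms(boxes, scores, iou_thresh=0.5):
    """Greedy NMS: repeatedly keep the highest-scoring remaining box
    and discard any box overlapping it above iou_thresh."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    while order:
        best = order.pop(0)
        keep.append(best)
        order = [i for i in order if iou(boxes[best], boxes[i]) < iou_thresh]
    return keep  # indices of retained boxes, highest score first
```

Greedy NMS keeps only the locally highest-scoring box per object; scenes with heavily overlapping targets, such as clustered lesions on one leaf, are exactly where alternatives like Confluence or NMS-free heads aim to do better.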
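Both [112] and [117] swap the bounding-box regression loss for GIoU, which, unlike plain IoU, still yields a useful gradient when the predicted and ground-truth boxes do not overlap. A self-contained sketch of the GIoU computation follows; the corner-coordinate box format is an illustrative assumption, and the training loss is `1 - giou`.

```python
def giou(a, b):
    """Generalized IoU of two boxes (x1, y1, x2, y2); range (-1, 1].
    Assumes both boxes have positive area."""
    # Intersection and union, as in plain IoU.
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    iou_val = inter / union
    # Smallest box enclosing both; its "wasted" area penalizes
    # non-overlapping predictions, which plain IoU cannot.
    cx1, cy1 = min(a[0], b[0]), min(a[1], b[1])
    cx2, cy2 = max(a[2], b[2]), max(a[3], b[3])
    c = (cx2 - cx1) * (cy2 - cy1)
    return iou_val - (c - union) / c
```

For identical boxes GIoU equals 1; as two boxes move apart it decreases below 0, so minimizing `1 - giou` pulls disjoint predictions toward the target where an IoU loss would be flat at zero.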
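[118] uses K-means clustering to derive anchor boxes, a practice popularized by the YOLO family: ground-truth box sizes are clustered with 1 − IoU as the distance so the anchors match the dataset's typical lesion shapes. Below is a pure-Python sketch under stated assumptions; the median update rule, the random initialization, and the function names are illustrative choices, not the cited paper's exact procedure.

```python
import random

def wh_iou(wh, anchor):
    """IoU of two boxes sharing a corner, each given as a (w, h) pair."""
    inter = min(wh[0], anchor[0]) * min(wh[1], anchor[1])
    union = wh[0] * wh[1] + anchor[0] * anchor[1] - inter
    return inter / union

def kmeans_anchors(whs, k, iters=100, seed=0):
    """Cluster (w, h) pairs into k anchors using 1 - IoU as distance."""
    rng = random.Random(seed)
    anchors = rng.sample(whs, k)
    for _ in range(iters):
        # Assign each box to the anchor with the highest IoU.
        clusters = [[] for _ in range(k)]
        for wh in whs:
            j = max(range(k), key=lambda a: wh_iou(wh, anchors[a]))
            clusters[j].append(wh)
        # Move each anchor to the per-dimension median of its cluster.
        new = []
        for j, c in enumerate(clusters):
            if not c:
                new.append(anchors[j])
                continue
            ws = sorted(w for w, _ in c)
            hs = sorted(h for _, h in c)
            new.append((ws[len(ws) // 2], hs[len(hs) // 2]))
        if new == anchors:  # converged
            break
        anchors = new
    return sorted(anchors)
```

Run on a dataset's ground-truth (w, h) pairs, the resulting k anchors are typically split across the detector's output scales by area, smallest anchors to the highest-resolution head.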
© 2026 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license.
Share and Cite
Zheng, H.; Wang, H.; Dong, H.; Qian, Y. A Survey of Crop Disease Recognition Methods Based on Spectral and RGB Images. J. Imaging 2026, 12, 66. https://doi.org/10.3390/jimaging12020066

