Trends and Prospect of Machine Vision Technology for Stresses and Diseases Detection in Precision Agriculture
Abstract
1. Introduction
2. Machine Vision Techniques in Agriculture
2.1. Detection of Stresses
2.1.1. Water Stress
2.1.2. Nutrient Deficiency
2.1.3. Pest Stress
2.2. Detection of Diseases
2.2.1. Disease Detection on Leaves
Target | Techniques | Results | References |
---|---|---|---|
Apple | Color and texture analysis using GLCM and wavelet transformation | Support vector machine (SVM) with radial basis function kernel showed a better relationship with disease (R2 = 0.963) than ANN | Omrani et al., 2014 [56] |
Banana, corn, cotton, soya, and alfalfa | Color transformation from RGB and Image segmentation | I3 channel achieved good pixel matching of 69.9% with the lowest level of misclassification among others | Camargo & Smith, 2009 [57] |
Banana, lemon, and bean | CCM texture analysis with SVM | Yellow spots, brown spots, scorch, and late scorch were detected with an accuracy of 94% | Arivazhagan et al., 2013 [53] |
Bean and Bitter gourd | Color and shape analysis | Accuracies of 92.1% and 89.1% were found for bean and bitter gourd disease recognition with a feed-forward neural network | Muthukannan et al., 2015 [61] |
Cherry, pine, and others | Image processing using morphological features analysis with probabilistic neural network | An average accuracy of 90.312% for recognition of more than 30 different plants | Wu et al., 2007 [62] |
Cotton and soybean | Color transformation and segmentation with ANN for classification | Highest classification accuracy of 83% for angular leaf spot of cotton and 80% for soybean leaf diseases | Kanjalkar & Lokhande, 2013 [59] |
Cucumber | Color and shape features analysis using sparse representation (SR) | The overall recognition rate of 85.7% and the highest accuracy of 91.25% for grey mold diseases using SR | Zhang et al., 2017 [65] |
Grape | Various deep neural networks including a Faster R-CNN | Deep learning-based Faster DR-IACNN model showed a precision of 81.1% mAP on grape leaf disease | Xie et al., 2020 [68] |
Maize, cotton, blueberry, soybean, and others | Image segmentation using RGB, YCbCr, HSI, and CIELAB color space | The ‘a*’ component of CIELAB detected disease spots accurately in all cases, outperforming the other color models | Chaudhary et al., 2012 [55] |
No specific plant leaves | HSI color system and segmentation by using Otsu’s method | A fast and accurate method to grade plant leaf spot diseases was developed by using the Sobel operator | Weizheng et al., 2008 [51] |
No specific plant leaves | CCM texture analysis with ANN classifier | Classification of diseases with a precision between 83% and 94% | Naikwadi & Amoda, 2013 [60] |
No specific plant | CCM for texture and shape analysis | The radial basis function of ANN showed 100% accuracy in scorch and spot classification of leaves | Choudhary & Gulati, 2015 [58] |
Rice | 13 CNN models for the identification of four kinds of rice leaf diseases | ResNet-50 combined with SVM outperformed the other 11 CNN models, reporting an F1 score of 0.9838 | Sethy et al., 2020 [66] |
Rose, bean, lemon, and banana | Image segmentation and soft computing techniques | SVM showed a higher overall accuracy of 95.71% than the Minimum Distance Criterion | Singh & Misra, 2017 [63] |
Tomato | Residual learning and deep network for detection of three types of tomato leaf diseases | The proposed residual network achieved an accuracy of 98% in detecting infection automatically | Karthik et al., 2020 [67] |
Tomato and others | CCM-based texture analysis with K-mean clustering and ANN classifier | The HS model performed better than the HSI model; early scorch, cottony mold, ashen mold, late scorch, and tiny whiteness were recognized with a precision of around 93% | Al Bashish et al., 2011 [54] |
Tomato and others | CCM-based texture analysis with ANN and K-mean clustering | ANN with HS model provided overall accuracy of 99.66% for early scorch, cottony mold, and normal | Al-Hiary et al., 2011 [52] |
Watermelon | Color feature analysis from region-of-interest images | ANN achieved a classification accuracy of 75.9% | Kutty et al., 2013 [64] |
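
Most of the leaf studies above follow the same broad pipeline: convert the RGB image to a color space such as HSI, compute color co-occurrence (CCM/GLCM) texture features, and feed those features to an SVM or ANN classifier (e.g., [52–54,56,60]). The Python sketch below illustrates that generic pipeline under stated assumptions; the hue-channel quantization, the four GLCM properties, and all function and variable names are illustrative choices, not details taken from any cited paper.

```python
# Minimal sketch of the CCM/GLCM texture + SVM pipeline summarized in the table above.
# Assumptions: hue-channel co-occurrence, four standard texture properties, RBF-kernel SVM.
import numpy as np
from skimage.color import rgb2hsv
from skimage.feature import graycomatrix, graycoprops
from sklearn.svm import SVC

def ccm_texture_features(rgb_image, levels=32):
    """Color co-occurrence texture features from the hue channel of an RGB leaf image."""
    hue = rgb2hsv(rgb_image)[:, :, 0]                  # hue in [0, 1]
    hue_q = (hue * (levels - 1)).astype(np.uint8)      # quantize to keep the matrix small
    glcm = graycomatrix(hue_q, distances=[1],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=levels, symmetric=True, normed=True)
    props = ("contrast", "correlation", "energy", "homogeneity")
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

def train_leaf_classifier(images, labels):
    """Fit an RBF-kernel SVM on texture features; images are H x W x 3 RGB arrays."""
    features = np.array([ccm_texture_features(img) for img in images])
    return SVC(kernel="rbf", gamma="scale").fit(features, labels)
```

In practice, the cited works also add color and shape statistics and tune the co-occurrence distance and quantization level, which this sketch omits for brevity.
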
2.2.2. Disease Detection on Fruits and Vegetables
Crop/Plant | Techniques | Results | References |
---|---|---|---|
Apple | Image segmentation with fractal features and ANN | Defect detection was effective with over 93% of accuracy | Li et al., 2002 [78] |
Apple | X-ray imaging with ANN classifier | Accuracies of 90% and 83% for Red and Golden Delicious apples, rising to 93% after threshold adjustment | Shahin et al., 2002 [79] |
Apple | Image processing with multi-spectral optics | The success rate was greater than 97.6% and wavebands were 740 nm, 950 nm and visible for surface defect detection | Throop et al., 2005 [80] |
Apple | An image-based multi-spectral vision system | Accuracy of 100% for rejected apples, and 98.2%, 94.5%, and 55% for recently bruised, seriously defective, and slightly defective apples | Kleynen et al., 2005 [82] |
Apple | Image processing with K-mean clustering and multi-class SVM | Classification accuracy was achieved up to 93% | Dubey & Jalal, 2012 [81] |
Citrus fruits | Dense CNN algorithm used for the detection of citrus defects without pre-processing and data augmentation techniques | The proposed model initially achieved an accuracy of 67%, which increased to 89.1% after applying data augmentation | Kukreja & Dhiman, 2020 [89] |
Cucumber | Near-infrared hyperspectral imaging with the spectral region of 900-1700 nm | The best band ratio of 988 nm and 1085 nm had identification accuracies between 82% and 93% | Ariana et al., 2006 [86] |
Grape | A machine learning-based approach for the identification of grape types | The trained model reached an accuracy of 100% on 1146 test images | El-Mashharawi et al., 2020 [90] |
Grapefruit | CCM for color texture analysis with a discriminant function for classification | The average classification accuracy was 96.0% and the best accuracy of 96.7% with 14 selected HSI features | Kim et al., 2009 [71] |
Grapefruit | Hyperspectral reflectance imaging with spectral information divergence | Overall classification accuracy was 96.2% | Qin et al., 2009 [72] |
Olives | Image segmentation and ANN, Mahalanobis and least square discriminant classifiers | An ANN with a hidden layer classified olive defects and bruises with an accuracy of over 90% | Diaz et al., 2004 [85] |
Onion | Liquid crystal tunable filter based hyperspectral imaging system | The spectral region of 1150-1280 nm was effective when onion was stored 3 days after being inoculated | Wang et al., 2009 [87] |
Orange | Hyperspectral reflectance imaging with principal component analysis. | The highest identification accuracy was 93.7% achieved for surface defects with no false positives | Li et al., 2011 [75] |
Orange | Computer vision-based combined lighting transform and image ratio algorithm | The overall detection rate was 98.9% in differentiating normal and defective oranges | Li et al., 2013 [76] |
Orange | Image processing-based window local segmentation algorithm | The image segmentation algorithm correctly detected 97% of the defective oranges | Rong et al., 2017 [77] |
Orange and Mandarin | Computer vision-based oriented region segmentation algorithm | Detection results showed 100% accuracy for medfly, sooty mold, green mold, stem end injury, phytotoxicity, anthracnose and stem | Blasco et al., 2007 [73] |
Orange and Mandarin | Multispectral imaging with morphological features | The overall success rate was about 86%, with the highest success achieved for serious defects | Blasco et al., 2009 [74] |
Orange and Mandarin | Multivariate image analysis | An accuracy of 91.5% for defect detection and 94.2% for damaged or sound fruits | López-García et al., 2010 [70] |
Potato | Machine vision-based minimalist adaptive boosting algorithm (AdaBoost) | Accuracy of 89.6% for white and 89.5% for red potato blemishes detection | Barnes et al., 2010 [83] |
Potato | A deep convolutional neural network was used to identify four types of potatoes (red, red-washed, sweet, and white) | The trained CNN model achieved an accuracy of 99.5% on the test set drawn from a 2400-image dataset | Elsharif et al., 2020 [88] |
Tomato | Image processing algorithm with color transformation | Highest accuracy achieved of 100% for stem detection and 90% for both bruise and color homogeneity detection | Laykin et al., 2002 [84] |
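
A recurring pattern in the fruit and vegetable studies above is to segment candidate defect regions first and then classify them, for example by K-means clustering followed by a multi-class SVM as in [81]. The sketch below is a simplified illustration of that idea; the "smallest cluster is the defect" heuristic and the per-region color statistics are assumptions made for demonstration, not the authors' exact method.

```python
# Hedged sketch of K-means defect segmentation followed by multi-class SVM classification.
import numpy as np
from skimage.color import rgb2lab
from sklearn.cluster import KMeans
from sklearn.svm import SVC

def defect_mask(rgb_image, n_clusters=3):
    """Cluster a*b* chromaticity and return a mask for the smallest (assumed defect) cluster."""
    ab = rgb2lab(rgb_image)[:, :, 1:].reshape(-1, 2)
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(ab)
    labels = labels.reshape(rgb_image.shape[:2])
    counts = np.bincount(labels.ravel(), minlength=n_clusters)
    return labels == np.argmin(counts)            # assumption: defect occupies the least area

def region_color_features(rgb_image, mask):
    """Mean and standard deviation of each RGB channel inside the candidate defect region."""
    pixels = rgb_image[mask].astype(float)
    if pixels.size == 0:                          # no candidate region found
        return np.zeros(6)
    return np.hstack([pixels.mean(axis=0), pixels.std(axis=0)])

def train_defect_classifier(images, defect_classes):
    """Fit a multi-class SVM (one-vs-one internally) on per-region color features."""
    feats = np.array([region_color_features(img, defect_mask(img)) for img in images])
    return SVC(kernel="rbf").fit(feats, defect_classes)
```
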
3. Trends in Machine Vision Research
4. Conclusions
- Despite some success in crop species identification using RGB imaging with shape feature analysis, newly developed deep CNNs with RGB images would be a more appropriate solution for crop species identification (a minimal sketch follows this list).
- Crop stress detection can be carried out through RGB and hyperspectral imaging.
- External crop defects can be detected successfully with RGB imaging, but this technique cannot detect internal defects.
- Texture analysis of RGB images is a suitable tool for weed detection in crops at the field scale.
- Spectral imaging is a promising tool for crop internal quality inspection because the other imaging techniques have not yet been successful in this area of application. Magnetic resonance imaging is also a potential analytical technique for crop tissue damage detection; however, this technique is not yet fully mature for agricultural applications.
- More optimized, targeted application of the currently available imaging equipment and established machine vision techniques to specific crop cultivation tasks.
- Interconnected investigation of spatial, spectral, and temporal domains and incorporation of expert knowledge into machine vision application in agriculture.
- Application of simplified multispectral imaging technology instead of hyperspectral imaging, to reduce cost and enable convenient real-time use at the field scale.
- Fast real-time and in-field image processing using hardware-based image processing devices (e.g., graphic processing unit, digital signal processor, and field-programmable gate array).
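
To make the first conclusion concrete, the PyTorch sketch below shows the skeleton of a small CNN that classifies RGB crop images; the architecture, input size, and number of classes are illustrative assumptions only. In practice, fine-tuning a pretrained backbone such as ResNet-50 [66] on crop images would usually outperform a network trained from scratch.

```python
# Illustrative RGB-image CNN classifier; layer sizes and class count are assumptions.
import torch
import torch.nn as nn

class SmallCropCNN(nn.Module):
    """Tiny CNN mapping a 3 x 128 x 128 RGB patch to crop-species/stress class scores."""
    def __init__(self, num_classes=4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

# Forward pass on a random batch standing in for normalized RGB leaf images.
model = SmallCropCNN(num_classes=4)
scores = model(torch.rand(8, 3, 128, 128))        # shape: (8, 4)
```
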
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Foley, J.A.; Ramankutty, N.; Brauman, K.A.; Cassidy, E.S.; Gerber, J.S.; Johnston, M.; Mueller, N.D.; O’Connell, C.; Ray, D.K.; West, P.C.; et al. Solutions for a cultivated planet. Nature 2011, 478, 337–342. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Baudron, F.; Giller, K.E. Agriculture and nature: Trouble and strife? Biol. Conserv. 2014, 170, 232–245. [Google Scholar] [CrossRef]
- Mavridou, E.; Vrochidou, E.; Papakostas, G.A.; Pachidis, T.; Kaburlasos, V.G. Machine vision systems in precision agriculture for crop farming. J. Imaging 2019, 5, 89. [Google Scholar] [CrossRef] [Green Version]
- Tian, H.; Wang, T.; Liu, Y.; Qiao, X.; Li, Y. Computer vision technology in agricultural automation—A review. Inf. Process. Agric. 2020, 7, 1–19. [Google Scholar] [CrossRef]
- Chattha, H.S.; Zaman, Q.U.; Chang, Y.K.; Read, S.; Schumann, A.W.; Brewster, G.R.; Farooque, A.A. Variable rate spreader for real-time spot-application of granular fertilizer in wild blueberry. Comput. Electron. Agric. 2014, 100, 70–78. [Google Scholar] [CrossRef]
- Chang, Y.K.; Zaman, Q.U.; Farooque, A.; Chattha, H.; Read, S.; Schumann, A. Sensing and control system for spot-application of granular fertilizer in wild blueberry field. Precis. Agric. 2017, 18, 210–223. [Google Scholar] [CrossRef]
- Rehman, T. Development of a Machine Vision Based Weed (Goldenrod) Detection System for Spot-Application of Herbicides in Wild Blueberry Cropping System. Master’s Thesis, Dalhousie University, Halifax, NS, Canada, 2017. [Google Scholar]
- Rehman, T.U.; Zaman, Q.U.; Chang, Y.K.; Schumann, A.W.; Corscadden, K.W.; Esau, T.J. Optimising the parameters influencing performance and weed (goldenrod) identification accuracy of colour co-occurrence matrices. Biosyst. Eng. 2018, 170, 85–95. [Google Scholar] [CrossRef] [Green Version]
- Farooque, A.A.; Chang, Y.K.; Zaman, Q.U.; Groulx, D.; Schumann, A.W.; Esau, T.J. Performance evaluation of multiple ground based sensors mounted on a commercial wild blueberry harvester to sense plant height, fruit yield and topographic features in real-time. Comput. Electron. Agric. 2013, 91, 135–144. [Google Scholar] [CrossRef]
- Cubero, S.; Aleixos, N.; Moltó, E.; Gómez-Sanchis, J.; Blasco, J. Advances in machine vision applications for automatic inspection and quality evaluation of fruits and vegetables. Food Bioprocess Technol. 2011, 4, 487–504. [Google Scholar] [CrossRef]
- Chang, Y.K.; Rehman, T.U. Current and future applications of cost-effective smart cameras in agriculture. In Robotics and Mechatronics for Agriculture, 1st ed.; Zhang, D., Wei, B., Eds.; CRC Press: Boca Raton, FL, USA, 2017; pp. 83–128. [Google Scholar]
- Lin, K.; Chen, J.; Si, H.; Wu, J. A review on computer vision technologies applied in greenhouse plant stress detection. In Proceedings of the Chinese Conference on Image and Graphics Technologies, Beijing, China, 2–3 April 2013. [Google Scholar]
- Li, M.; Imou, K.; Wakabayashi, K.; Yokoyama, S. Review of research on agricultural vehicle autonomous guidance. Int. J. Agric. Biol. Eng. 2009, 2, 1–16. [Google Scholar]
- Nilsson, H. Remote sensing and image analysis in plant pathology. Annu. Rev. Phytopathol. 1995, 33, 489–528. [Google Scholar] [CrossRef] [PubMed]
- Osakabe, Y.; Osakabe, K.; Shinozaki, K.; Tran, L.S.P. Response of plants to water stress. Front. Plant Sci. 2014, 5, 86. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Kumari, A.; Sharma, B.; Singh, B.N.; Hidangmayum, A.; Jatav, H.S.; Chandra, K.; Singhal, R.K.; Sathyanarayana, E.; Patra, A.; Mohapatra, K.K. Physiological mechanisms and adaptation strategies of plants under nutrient deficiency and toxicity conditions. In Plant Perspectives to Global Climate Changes; Academic Press: Cambridge, MA, USA, 2022. [Google Scholar]
- Szczepaniec, A.; Finke, D. Plant-vector-pathogen interactions in the context of drought stress. Front. Ecol. Evol. 2019, 7, 262. [Google Scholar] [CrossRef] [Green Version]
- Kacira, M.; Ling, P.P.; Short, T.H. Machine vision extracted plant movement for early detection of plant water stress. Trans. ASAE 2002, 45, 1147. [Google Scholar] [CrossRef]
- Ondimu, S.N.; Murase, H. Comparison of plant water stress detection ability of color and gray-level texture in Sunagoke moss. Trans. ASABE 2008, 51, 1111–1120. [Google Scholar] [CrossRef]
- Kim, Y.; Glenn, D.M.; Park, J.; Ngugi, H.K.; Lehman, B.L. Hyperspectral image analysis for water stress detection of apple trees. Comput. Electron. Agric. 2011, 77, 155–160. [Google Scholar] [CrossRef]
- Xu, G.; Zhang, F.; Shah, S.G.; Ye, Y.; Mao, H. Use of leaf color images to identify nitrogen and potassium deficient tomatoes. Pattern Recognit. Lett. 2011, 32, 1584–1590. [Google Scholar] [CrossRef]
- Sena, D.G., Jr.; Pinto, F.A.C.; Queiroz, D.M.; Viana, P.A. Fall armyworm damaged maize plant identification using digital images. Biosyst. Eng. 2003, 85, 449–454. [Google Scholar] [CrossRef]
- Bauch, C.; Rath, T. Prototype of a vision based system for measurements of white fly infestation. In Proceedings of the International Conference on Sustainable Greenhouse Systems-Greensys, Leuven, Belgium, 12–16 September 2004. [Google Scholar]
- Foucher, P.; Revollon, P.; Vigouroux, B.; Chasseriaux, G. Morphological image analysis for the detection of water stress in potted forsythia. Biosyst. Eng. 2004, 89, 131–138. [Google Scholar] [CrossRef]
- Chung, S.; Breshears, L.E.; Yoon, J.Y. Smartphone near infrared monitoring of plant stress. Comput. Electron. Agric. 2018, 154, 93–98. [Google Scholar] [CrossRef]
- Ghosal, S.; Blystone, D.; Singh, A.K.; Ganapathysubramanian, B.; Singh, A.; Sarkar, S. An explainable deep machine vision framework for plant stress phenotyping. Proc. Natl. Acad. Sci. USA 2018, 115, 4613–4618. [Google Scholar] [CrossRef] [PubMed]
- Elvanidi, A.; Katsoulas, N.; Kittas, C. Automation for Water and Nitrogen Deficit Stress Detection in Soilless Tomato Crops Based on Spectral Indices. Horticulturae 2018, 4, 47. [Google Scholar] [CrossRef] [Green Version]
- Ramos-Giraldo, P.; Reberg-Horton, C.; Locke, A.M.; Mirsky, S.; Lobaton, E. Drought Stress Detection Using Low-Cost Computer Vision Systems and Machine Learning Techniques. IT Prof. 2020, 22, 27–29. [Google Scholar] [CrossRef]
- Story, D.; Kacira, M.; Kubota, C.; Akoglu, A.; An, L. Lettuce calcium deficiency detection with machine vision computed plant features in controlled environments. Comput. Electron. Agric. 2010, 74, 238–243. [Google Scholar] [CrossRef]
- Mao, H.; Gao, H.; Zhang, X.; Kumi, F. Nondestructive measurement of total nitrogen in lettuce by integrating spectroscopy and computer vision. Sci. Hortic. 2015, 184, 1–7. [Google Scholar] [CrossRef]
- Ahmad, I.S.; Reid, J.F. Evaluation of colour representations for maize images. J. Agric. Eng. Res. 1996, 63, 185–195. [Google Scholar] [CrossRef]
- An, J.; Li, W.; Li, M.; Cui, S.; Yue, H. Identification and Classification of Maize Drought Stress Using Deep Convolutional Neural Network. Symmetry 2019, 11, 256. [Google Scholar] [CrossRef] [Green Version]
- Chandel, N.S.; Chakraborty, S.K.; Rajwade, Y.A.; Dubey, K.; Tiwari, M.K.; Jat, D. Identifying crop water stress using deep learning models. Neural Comput. Appl. 2021, 33, 5353–5367. [Google Scholar] [CrossRef]
- Rubanga, D.P.; Loyani, L.K.; Richard, M.; Shimada, S. A Deep Learning Approach for Determining Effects of Tuta Absoluta in Tomato Plants. arXiv 2020, arXiv:2004.04023. [Google Scholar]
- Han, K.A.M.; Watchareeruetai, U. Black Gram Plant Nutrient Deficiency Classification in Combined Images Using Convolutional Neural Network. In Proceedings of the 2020 8th International Electrical Engineering Congress (iEECON), Chiang Mai, Thailand, 4 March 2020. [Google Scholar]
- Wakamori, K.; Mizuno, R.; Nakanishi, G.; Mineno, H. Multimodal neural network with clustering-based drop for estimating plant water stress. Comput. Electron. Agric. 2020, 168, 105118. [Google Scholar] [CrossRef]
- Shariff, A.R.M.; Aik, Y.Y.; Hong, W.T.; Mansor, S.; Mispan, R. Automated identification and counting of pests in the paddy fields using image analysis. In Computers in Agriculture and Natural Resources, Proceedings of the 4th World Congress Conference, Orlando, FL, USA, 23–25 July 2006; American Society of Agricultural and Biological Engineers: St. Joseph, MI, USA, 2006; p. 759. [Google Scholar]
- Li, H.; Yin, J.; Zhang, M.; Sigrimis, N.; Gao, Y.; Zheng, W. Automatic diagnosis of strawberry water stress status based on machine vision. Int. J. Agric. Biol. Eng. 2019, 12, 159–164. [Google Scholar]
- Khanna, R.; Schmid, L.; Walter, A.; Nieto, J.; Siegwart, R.; Liebisch, F. A spatio temporal spectral framework for plant stress phenotyping. Plant Methods 2019, 15, 1–18. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Hendrawan, Y.; Murase, H. Bio-inspired feature selection to select informative image features for determining water content of cultured Sunagoke moss. Expert Syst. Appl. 2011, 38, 14321–14335. [Google Scholar] [CrossRef]
- Seginer, I.; Elster, R.T.; Goodrum, J.W.; Rieger, M.W. Plant wilt detection by computer-vision tracking of leaf tips. Trans. ASAE 1992, 35, 1563–1567. [Google Scholar] [CrossRef]
- Cevallos, C.; Ponce, H.; Moya-Albor, E.; Brieva, J. Vision-Based Analysis on Leaves of Tomato Crops for Classifying Nutrient Deficiency Using Convolutional Neural Networks. In Proceedings of the 2020 International Joint Conference on Neural Networks (IJCNN), Glasgow, UK, 19 July 2020. [Google Scholar]
- Nhamo, L.; Magidi, J.; Nyamugama, A.; Clulow, A.D.; Sibanda, M.; Chimonyo, V.G.; Mabhaudhi, T. Prospects of Improving Agricultural and Water Productivity through Unmanned Aerial Vehicles. Agriculture 2020, 10, 256. [Google Scholar] [CrossRef]
- Koley, S. Machine learning for soil fertility and plant nutrient management using back propagation neural networks. Int. J. Recent Innov. Trends Comput. Commun. 2014, 2, 292–297. [Google Scholar]
- Hetzroni, A.; Miles, G.E.; Engel, B.A.; Hammer, P.A.; Latin, R.X. Machine vision monitoring of plant health. Adv. Space Res. 1994, 14, 203–212. [Google Scholar] [CrossRef]
- Mao, H.P.; Xu, G.; Li, P. Diagnosis of nutrient deficiency of tomato based on computer vision. Trans. Chin. Soc. Agric. Mach. 2003, 34, 73–75. [Google Scholar]
- Rangel, B.M.S.; Fernández, M.A.A.; Murillo, J.C.; Ortega, J.C.P.; Arreguín, J.M.R. KNN-based image segmentation for grapevine potassium deficiency diagnosis. In Proceedings of the 2016 International conference on Electronics Communications and Computers (CONIELECOMP), Cholula, Mexico, 24 February 2016. [Google Scholar]
- Li, D.; Zhang, P.; Chen, T.; Qin, W. Recent development and challenges in spectroscopy and machine vision technologies for crop nitrogen diagnosis: A review. Remote Sens. 2020, 12, 2578. [Google Scholar] [CrossRef]
- Boissard, P.; Martin, V.; Moisan, S. A cognitive vision approach to early pest detection in greenhouse crops. Comput. Electron. Agric. 2008, 62, 81–93. [Google Scholar] [CrossRef] [Green Version]
- Muppala, C.; Guruviah, V. Machine vision detection of pests, diseases and weeds: A review. J. Phytol. 2020, 12, 9–19. [Google Scholar] [CrossRef] [Green Version]
- Weizheng, S.; Yachun, W.; Zhanliang, C.; Hongda, W. Grading method of leaf spot disease based on image processing. In Proceedings of the 2008 International Conference on Computer Science and Software Engineering, Wuhan, China, 12 December 2008. [Google Scholar]
- Al-Hiary, H.; Bani-Ahmad, S.; Reyalat, M.; Braik, M.; Alrahamneh, Z. Fast and accurate detection and classification of plant diseases. Int. J. Comput. Appl. 2011, 17, 31–38. [Google Scholar] [CrossRef]
- Arivazhagan, S.; Shebiah, R.N.; Ananthi, S.; Varthini, S.V. Detection of unhealthy region of plant leaves and classification of plant leaf diseases using texture features. Agric. Eng. Int. CIGR J. 2013, 15, 211–217. [Google Scholar]
- Al Bashish, D.; Braik, M.; Bani-Ahmad, S. Detection and classification of leaf diseases using K-means-based segmentation and Neural-networks-based classification. Inf. Technol. J. 2011, 10, 267–275. [Google Scholar] [CrossRef] [Green Version]
- Chaudhary, P.; Chaudhari, A.K.; Cheeran, A.N.; Godara, S. Color transform based approach for disease spot detection on plant leaf. Int. J. Comput. Sci. Telecommun. 2012, 3, 65–70. [Google Scholar]
- Omrani, E.; Khoshnevisan, B.; Shamshirband, S.; Saboohi, H.; Anuar, N.B.; Nasir, M.H.N.M. Potential of radial basis function-based support vector regression for apple disease detection. Measurement 2014, 55, 512–519. [Google Scholar] [CrossRef]
- Camargo, A.; Smith, J.S. An image-processing based algorithm to automatically identify plant disease visual symptoms. Biosyst. Eng. 2009, 102, 9–21. [Google Scholar] [CrossRef]
- Choudhary, G.M.; Gulati, V. Advance in Image Processing for Detection of Plant Diseases. Int. J. Adv. Res. Comput. Sci. Softw. Eng. 2015, 5, 1090–1093. [Google Scholar]
- Kanjalkar, H.P.; Lokhande, S.S. Detection and classification of plant leaf diseases using ANN. Int. J. Sci. Eng. Res. 2013, 4, 1777–1780. [Google Scholar]
- Naikwadi, S.; Amoda, N. Advances in image processing for detection of plant diseases. Int. J. Appl. Or Innov. Eng. Manag. 2013, 2, 11. [Google Scholar]
- Muthukannan, K.; Latha, P.; Selvi, R.P.; Nisha, P. Classification of diseased plant leaves using neural Network algorithms. ARPN J. Eng. Appl. Sci. 2015, 10, 1913–1919. [Google Scholar]
- Wu, S.G.; Bao, F.S.; Xu, E.Y.; Wang, Y.X.; Chang, Y.F.; Xiang, Q.L. A leaf recognition algorithm for plant classification using probabilistic neural network. In Proceedings of the 2007 IEEE International Symposium on Signal Processing and Information Technology, Giza, Egypt, 15 December 2007. [Google Scholar]
- Singh, V.; Misra, A.K. Detection of plant leaf diseases using image segmentation and soft computing techniques. Inf. Process. Agric. 2017, 4, 41–49. [Google Scholar] [CrossRef] [Green Version]
- Kutty, S.B.; Abdullah, N.E.; Hashim, H.; Kusim, A.S.; Yaakub, T.N.T.; Yunus, P.N.A.M.; Abd Rahman, M.F. Classification of watermelon leaf diseases using neural network analysis. In Proceedings of the Business Engineering and Industrial Applications Colloquium (Beiac), Langkawi, Malaysia, 7 April 2013. [Google Scholar]
- Zhang, S.; Wu, X.; You, Z.; Zhang, L. Leaf image based cucumber disease recognition using sparse representation classification. Comput. Electron. Agric. 2017, 134, 135–141. [Google Scholar] [CrossRef]
- Sethy, P.K.; Barpanda, N.K.; Rath, A.K.; Behera, S.K. Deep feature based rice leaf disease identification using support vector machine. Comput. Electron. Agric. 2020, 175, 105527. [Google Scholar] [CrossRef]
- Karthik, R.; Hariharan, M.; Anand, S.; Mathikshara, P.; Johnson, A.; Menaka, R. Attention embedded residual CNN for disease detection in tomato leaves. Appl. Soft Comput. 2020, 86, 105933. [Google Scholar]
- Xie, X.; Ma, Y.; Liu, B.; He, J.; Li, S.; Wang, H. A deep-learning-based real-time detector for grape leaf diseases using improved convolutional neural networks. Front. Plant Sci. 2020, 11, 751. [Google Scholar] [CrossRef]
- Jogekar, R.N.; Tiwari, N. A review of deep learning techniques for identification and diagnosis of plant leaf disease. Trends Comput. Commun. Proc. SmartCom 2020, 182, 435–441. [Google Scholar]
- López-García, F.; Andreu-García, G.; Blasco, J.; Aleixos, N.; Valiente, J.M. Automatic detection of skin defects in citrus fruits using a multivariate image analysis approach. Comput. Electron. Agric. 2010, 71, 189–197. [Google Scholar] [CrossRef]
- Kim, D.G.; Burks, T.F.; Qin, J.; Bulanon, D.M. Classification of grapefruit peel diseases using color texture feature analysis. Int. J. Agric. Biol. Eng. 2009, 2, 41–50. [Google Scholar]
- Qin, J.; Burks, T.F.; Ritenour, M.A.; Bonn, W.G. Detection of citrus canker using hyperspectral reflectance imaging with spectral information divergence. J. Food Eng. 2009, 93, 183–191. [Google Scholar] [CrossRef]
- Blasco, J.; Aleixos, N.; Moltó, E. Computer vision detection of peel defects in citrus by means of a region oriented segmentation algorithm. J. Food Eng. 2007, 81, 535–543. [Google Scholar] [CrossRef]
- Blasco, J.; Aleixos, N.; Gómez-Sanchís, J.; Moltó, E. Recognition and classification of external skin damage in citrus fruits using multispectral data and morphological features. Biosyst. Eng. 2009, 103, 137–145. [Google Scholar] [CrossRef]
- Li, J.; Rao, X.; Ying, Y. Detection of common defects on oranges using hyperspectral reflectance imaging. Comput. Electron. Agric. 2011, 78, 38–48. [Google Scholar] [CrossRef]
- Li, J.; Rao, X.; Wang, F.; Wu, W.; Ying, Y. Automatic detection of common surface defects on oranges using combined lighting transform and image ratio methods. Postharvest Biol. Technol. 2013, 82, 59–69. [Google Scholar] [CrossRef]
- Rong, D.; Rao, X.; Ying, Y. Computer vision detection of surface defect on oranges by means of a sliding comparison window local segmentation algorithm. Comput. Electron. Agric. 2017, 137, 59–68. [Google Scholar] [CrossRef]
- Li, Q.; Wang, M.; Gu, W. Computer vision based system for apple surface defect detection. Comput. Electron. Agric. 2002, 36, 215–223. [Google Scholar] [CrossRef]
- Shahin, M.A.; Tollner, E.W.; McClendon, R.W.; Arabnia, H.R. Apple classification based on surface bruises using image processing and neural networks. Trans. ASAE 2002, 45, 1619. [Google Scholar]
- Throop, J.A.; Aneshansley, D.J.; Anger, W.C.; Peterson, D.L. Quality evaluation of apples based on surface defects: Development of an automated inspection system. Postharvest Biol. Technol. 2005, 36, 281–290. [Google Scholar] [CrossRef]
- Dubey, S.R.; Jalal, A.S. Detection and classification of apple fruit diseases using complete local binary patterns. In Proceedings of the 2012 Third International Conference on Computer and Communication Technology, Allahabad, India, 23 November 2012. [Google Scholar]
- Kleynen, O.; Leemans, V.; Destain, M.F. Development of a multi-spectral vision system for the detection of defects on apples. J. Food Eng. 2005, 69, 41–49. [Google Scholar] [CrossRef] [Green Version]
- Barnes, M.; Duckett, T.; Cielniak, G.; Stroud, G.; Harper, G. Visual detection of blemishes in potatoes using minimalist boosted classifiers. J. Food Eng. 2010, 98, 339–346. [Google Scholar] [CrossRef] [Green Version]
- Laykin, S.; Alchanatis, V.; Fallik, E.; Edan, Y. Image–processing algorithms for tomato classification. Trans. ASAE 2002, 45, 851. [Google Scholar] [CrossRef]
- Diaz, R.; Gil, L.; Serrano, C.; Blasco, M.; Moltó, E.; Blasco, J. Comparison of three algorithms in the classification of table olives by means of computer vision. J. Food Eng. 2004, 61, 101–107. [Google Scholar] [CrossRef]
- Ariana, D.P.; Lu, R.; Guyer, D.E. Near-infrared hyperspectral reflectance imaging for detection of bruises on pickling cucumbers. Comput. Electron. Agric. 2006, 53, 60–70. [Google Scholar] [CrossRef]
- Wang, W.; Thai, C.; Li, C.; Gitaitis, R.; Tollner, E.W.; Yoon, S.C. Detection of sour skin diseases in vidalia sweet onions using near-Infrared hyperspectral imaging. In Proceedings of the 2009 American Society of Agricultural and Biological Engineers AIM, Reno, Nevada, 21–24 June 2009. [Google Scholar]
- Elsharif, A.A.; Dheir, I.M.; Mettleq, A.S.A.; Abu-Naser, S.S. Potato Classification Using Deep Learning. Int. J. Acad. Pedagog. Res. (IJAPR) 2020, 3, 1–8. [Google Scholar]
- Kukreja, V.; Dhiman, P. A Deep Neural Network based disease detection scheme for Citrus fruits. In Proceedings of the 2020 International Conference on Smart Electronics and Communication (ICOSEC), Trichy, India, 10 September 2020. [Google Scholar]
- El-Mashharawi, H.Q.; Abu-Naser, S.S.; Alshawwa, I.A.; Elkahlout, M. Grape type classification using deep learning. Int. J. Acad. Eng. Res. (IJAER) 2020, 3, 12. [Google Scholar]
- Rasti, P.; Ahmad, A.; Samiei, S.; Belin, E.; Rousseau, D. Supervised Image Classification by Scattering Transform with Application to Weed Detection in Culture Crops of High Density Sensing. Remote Sens. 2019, 11, 249. [Google Scholar] [CrossRef] [Green Version]
- Shin, J.; Chang, Y.K.; Heung, B.; Nguyen-Quang, T.; Price, G.W.; Al-Mallahi, A. A deep learning approach for RGB image-based powdery mildew disease detection on strawberry leaves. Comput. Electron. Agric. 2021, 183, 106042. [Google Scholar] [CrossRef]
- Orlov, S.P.; Susarev, S.V.; Morev, A.S. Machine Vision System for Autonomous Agricultural Vehicle. In Proceedings of the 2020 International Conference on Industrial Engineering Applications and Manufacturing (ICIEAM), Sochi, Russia, 18–22 May 2020; pp. 1–5. [Google Scholar]
- Tian, Z.; Ma, W.; Yang, Q.; Duan, F. Application status and challenges of machine vision in plant factory—A review. Inf. Process. Agric. 2022, 9, 195–211. [Google Scholar] [CrossRef]
- Qiu, Z.; Zhao, N.; Zhou, L.; Wang, M.; Yang, L.; Fang, H.; He, Y.; Liu, Y. Vision-based moving obstacle detection and tracking in paddy field using improved yolov3 and deep SORT. Sensors 2020, 20, 4082. [Google Scholar] [CrossRef]
- Sarker, I.H. Machine learning: Algorithms, real-world applications and research directions. SN Comput. Sci. 2021, 2, 1–21. [Google Scholar] [CrossRef]
- Bermeo-Almeida, O.; Cardenas-Rodriguez, M.; Samaniego-Cobo, T.; Ferruzola-Gómez, E.; Cabezas-Cabezas, R.; Bazán-Vera, W. Blockchain in agriculture: A systematic literature review. In International Conference on Technologies and Innovation; Springer: Cham, Switzerland, 2018; pp. 44–56. [Google Scholar]
- Lucena, P.; Binotto, A.P.; Momo, F.D.S.; Kim, H. A case study for grain quality assurance tracking based on a Blockchain business network. arXiv 2018, arXiv:1803.07877. [Google Scholar]
- Mohi-Alden, K.; Omid, M.; Firouz, M.S.; Nasiri, A. A Machine Vision-Intelligent Modelling Based Technique for In-Line Bell Pepper Sorting. Inf. Process. Agric. 2022; in press. [Google Scholar]
Target | Techniques | Results | References |
---|---|---|---|
Apple | Hyperspectral imaging with spectral signature measurement of leaf surfaces | Red edge normalized difference vegetation index (NDVI) at 705–750 nm and NDVI at 680–800 nm showed the highest correlation with water stress | Kim et al., 2011 [20] |
Corn and soybeans | Drought stress using a pretrained DenseNet-12 model | Predicted drought/water responses with an accuracy of 88% | Ramos-Giraldo et al., 2020 [28] |
Forsythia | Shape analysis with moment invariant, fractal dimension, and skeleton measures | Measured water stress and other stresses, such as thermal, mechanical, and mineral deficiency stress | Foucher et al., 2004 [24] |
Lettuce | grey-level co-occurrence matrix (GLCM) texture, hue-saturation-luminance color, and top-projected canopy area (TPCA) morphological features analysis | TPCA and three textural parameters showed a good relationship with calcium deficiency (R2 = 0.97) | Story et al., 2010 [29] |
Lettuce | Color, texture, and morphological features analysis | Combined computer vision (CV) and spectroscopy techniques showed a higher accuracy of 88.64% for nitrogen | Mao et al., 2015 [30] |
Maize | Features extraction using red, green, and blue (RGB), hue-saturation-intensity (HSI), and chromaticity coordinate transformations | HSI detected color variations in both water and nitrogen levels ahead of RGB and chromaticity coordinates | Ahmad & Reid, 1996 [31] |
Maize | Image segmentation using the excess green index E_G = (2G - R - B)/(R + G + B) | The algorithm had a 94.72% accuracy for detecting the presence of fall armyworm on plants | Sena Jr. et al., 2003 [22] |
Maize | Deep convolutional neural network for classifying drought stress | Accuracies of the identification and classification of drought stress were 98.14% and 95.95%, respectively | An et al., 2019 [32] |
Maize, okra, and soybean | AlexNet, GoogLeNet, and Inception V3 were used to identify water stress | GoogLeNet achieved accuracies of 98.3%, 97.5%, and 94.1% for maize, okra, and soybean, respectively | Chandel et al., 2020 [33] |
New Guinea impatiens | Image segmentation and feature extraction using TPCA with grey images | Plant water stress detection was feasible with a 100% success rate using plant movement | Kacira et al., 2002 [18] |
Pest | VGG 16, VGG 19, ResNet, and Inception-V3 were used to detect Tuta absoluta | Inception-V3 reported the highest accuracy of 87.2% in estimating the severity of the pest in tomato | Rubanga et al., 2020 [34] |
Plants leaves | Near-infrared spectroscopy reflectance and the red reflectance using an inexpensive high pass filter at 800 nm to a smartphone camera | A strong linear correlation with R2 = 0.948, corroborating the smartphone’s ability in evaluating NDVI associated with water stress | Chung et al., 2018 [25] |
Plant leaves | ResNet-50, logistic regression, support vector machine, and multilayer perceptron | Multilayer perceptron showed the best performance with accuracy of 88.33% for nutrient deficiency | Han & Watchareeruetai, 2020 [35] |
Plant stress | Top-K high-resolution feature maps that isolate the visual symptoms | The deep convolutional neural network model performed well on visual symptoms of chemical injury and nutrient deficiency | Ghosal et al., 2018 [26] |
Plant stress | C-Drop neural networks were used to predict water stress. | C-Drop neural networks increased the accuracy of water stress prediction by 21% | Wakamori et al., 2020 [36] |
Rice | Color, shape, and texture features analysis | An accuracy of 100% was achieved for the extraction and classification of six pests | Shariff et al., 2006 [37] |
Strawberry | RGB and infrared image information was fused and used for auto-detection of the crop area temperature | R2 values between the area Crop Water Stress Index and the three strawberry groups were 0.8834, 0.8730, and 0.8851, respectively | Li et al., 2019 [38] |
Sugar beet | Canopy cover, height, hyperspectral reflectance, and vegetation indices were used as the benchmark | Mean cross-validation accuracies were 93%, 76%, and 83% for drought, nitrogen, and weed stress severity classification, respectively. | Khanna et al., 2019 [39] |
Sunagoke moss | CCM and GLCM texture analysis for 90 textural features with 3 morphological features | The n-ACO algorithm performed better with the lowest mean squared error of 0.00175 for water content determination. | Hendrawan & Murase, 2011 [40] |
Tomato | CV-based image analysis tracking the vertical movement of the leaf tips of plants | Tomato plants showed linear vertical motions in response to both water stress levels and carbon dioxide | Seginer et al., 1992 [41] |
Tomato | Color and shape analysis of images from a charge-coupled device camera | 83.1% of the white flies were correctly classified | Bauch & Rath, 2004 [23] |
Tomato | A classification tree model was used to group complex hyperspectral datasets to provide remote visual results | The combination of the modified soil-adjusted vegetation index, mrNDVI, and the photochemical reflectance index determined water and nitrogen deficit stress with 91.4% classification accuracy on the testing samples | Elvanidi et al., 2018 [27] |
Tomato | Convolutional neural networks were used to detect nutrient deficiencies based on nitrogen, phosphorus, and potassium | Developed automated nutrition monitoring system for tomato crops achieved an accuracy of 86.57% | Cevallos et al., 2020 [42] |
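
Two index computations recur throughout the table above: NDVI from near-infrared and red reflectance [20,25] and the normalized excess-green index used for segmentation in [22]. The NumPy sketch below shows both; the example threshold used to build a mask is an illustrative assumption.

```python
# Vegetation-index sketches: NDVI from NIR/red bands and normalized excess green from RGB.
import numpy as np

def ndvi(nir, red, eps=1e-6):
    """Normalized difference vegetation index from co-registered NIR and red band images."""
    nir, red = nir.astype(float), red.astype(float)
    return (nir - red) / (nir + red + eps)

def excess_green(rgb, eps=1e-6):
    """Normalized excess-green index E_G = (2G - R - B) / (R + G + B) per pixel."""
    r, g, b = (rgb[:, :, i].astype(float) for i in range(3))
    return (2 * g - r - b) / (r + g + b + eps)

# Example: flag likely vegetation pixels with an illustrative threshold of 0.1.
rgb_frame = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)  # stand-in RGB frame
vegetation_mask = excess_green(rgb_frame) > 0.1
```
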