AM-MSFF: A Pest Recognition Network Based on Attention Mechanism and Multi-Scale Feature Fusion
Abstract
1. Introduction
- The introduction of relation-aware global attention (RGA) helps the model focus on pest regions and suppress interference from complex backgrounds, enhancing the model’s attention to pests;
- We propose the multi-scale feature fusion (MSFF) module, which extracts features at different scales and fuses them to capture both the characteristics and contextual information of pests across scales. This enables the model to better adapt to variations in pest morphology and appearance. Additionally, we introduce generalized-mean pooling (GeMP) to better preserve important features and increase sensitivity to fine-grained detail;
- We propose the balanced focal loss (BFL), an improved cross-entropy loss built on the focal loss (FL). BFL takes the number of samples in each class into account and adjusts the per-class weights accordingly, so the model pays more attention to minority and hard-to-classify samples and thereby better handles class imbalance.
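The two losses/pooling ideas above can be sketched concisely. The snippet below is a minimal NumPy illustration, not the paper's implementation: GeMP follows the standard generalized-mean formula (p = 1 recovers average pooling, large p approaches max pooling), while the inverse-frequency class weighting in `balanced_focal_loss` is one plausible choice of the per-class weights; the paper's exact weighting scheme may differ.

```python
import numpy as np

def gem_pool(feature_map, p=3.0, eps=1e-6):
    """Generalized-mean pooling over the spatial dims of a (C, H, W) map.
    p = 1 recovers average pooling; large p approaches max pooling."""
    x = np.clip(feature_map, eps, None)          # keep values positive for x**p
    return np.mean(x ** p, axis=(1, 2)) ** (1.0 / p)

def balanced_focal_loss(probs, labels, class_counts, gamma=2.0):
    """Focal loss with per-class weights derived from inverse class frequency.
    The weighting here is illustrative, not necessarily the paper's BFL."""
    counts = np.asarray(class_counts, dtype=float)
    alpha = counts.sum() / (len(counts) * counts)   # rarer class -> larger weight
    p_t = probs[np.arange(len(labels)), labels]     # probability of the true class
    # (1 - p_t)**gamma down-weights easy, well-classified samples (focal term)
    return float(np.mean(-alpha[labels] * (1.0 - p_t) ** gamma * np.log(p_t)))
```

For a balanced two-class batch the alpha weights reduce to 1 and the loss falls back to plain focal loss; for imbalanced `class_counts`, minority-class samples receive proportionally larger gradients.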
2. Related Work
2.1. Handcrafted Features
2.2. Deep Features
3. Proposed Method
3.1. Relation-Aware Global Attention
3.1.1. Spatial Relation-Aware Global Attention
3.1.2. Channel Relation-Aware Global Attention
3.2. Multi-Scale Feature Fusion
3.3. Generalized-Mean Pooling
3.4. Balanced Focal Loss
4. Experiments
4.1. Datasets
4.2. Evaluation Metrics
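The result tables report ACC, MPre, MRec, MF1, and GM. Assuming these denote accuracy, macro-averaged precision/recall/F1, and the geometric mean of per-class recall (a common choice for imbalanced classification; the paper's exact GM definition may differ), they can be computed from a confusion matrix as follows:

```python
import numpy as np

def macro_metrics(cm):
    """Classification metrics from a confusion matrix where cm[i, j] is the
    count of class-i samples predicted as class j. GM is taken here as the
    geometric mean of per-class recall (an assumption)."""
    cm = np.asarray(cm, dtype=float)
    tp = np.diag(cm)
    prec = tp / np.maximum(cm.sum(axis=0), 1e-12)   # per-class precision
    rec = tp / np.maximum(cm.sum(axis=1), 1e-12)    # per-class recall
    f1 = 2 * prec * rec / np.maximum(prec + rec, 1e-12)
    return {
        "ACC": float(tp.sum() / cm.sum()),
        "MPre": float(prec.mean()),
        "MRec": float(rec.mean()),
        "MF1": float(f1.mean()),
        "GM": float(np.prod(rec) ** (1.0 / len(rec))),
    }
```

Macro averaging weights every class equally regardless of its sample count, which is why MPre/MRec/MF1 can sit well below ACC on a long-tailed dataset such as IP102.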
4.3. Experiment Settings
4.4. Experimental Results
4.5. Ablation Study
4.5.1. The Impact of Relation-Aware Global Attention
4.5.2. The Impact of Multi-Scale Feature Fusion
4.5.3. The Impact of Generalized-Mean Pooling
4.5.4. The Impact of Balanced Focal Loss
4.5.5. Discussion of Results
4.6. Visualization
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Data Availability Statement
Conflicts of Interest
References
- Kang, S.; Hao, X.; Du, T.; Tong, L.; Su, X.; Lu, H.; Li, X.; Huo, Z.; Li, S.; Ding, R. Improving agricultural water productivity to ensure food security in China under changing environment: From research to practice. Agric. Water Manag. 2017, 179, 5–17. [Google Scholar] [CrossRef]
- Waddington, S.R.; Li, X.; Dixon, J.; Hyman, G.; De Vicente, M.C. Getting the focus right: Production constraints for six major food crops in Asian and African farming systems. Food Secur. 2010, 2, 27–48. [Google Scholar] [CrossRef]
- Li, W.; Zheng, T.; Yang, Z.; Li, M.; Sun, C.; Yang, X. Classification and detection of insects from field images using deep learning for smart pest management: A systematic review. Ecol. Inform. 2021, 66, 101460. [Google Scholar] [CrossRef]
- Damos, P. Modular structure of web-based decision support systems for integrated pest management. A review. Agron. Sustain. Dev. 2015, 35, 1347–1372. [Google Scholar] [CrossRef]
- Oliva, A.; Torralba, A. Modeling the shape of the scene: A holistic representation of the spatial envelope. Int. J. Comput. Vis. 2001, 42, 145–175. [Google Scholar] [CrossRef]
- Dalal, N.; Triggs, B. Histograms of oriented gradients for human detection. In Proceedings of the 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR’05), San Diego, CA, USA, 20–26 June 2005; Volume 1, pp. 886–893. [Google Scholar]
- Lowe, D.G. Distinctive image features from scale-invariant keypoints. Int. J. Comput. Vis. 2004, 60, 91–110. [Google Scholar] [CrossRef]
- Bay, H.; Tuytelaars, T.; Van Gool, L. Surf: Speeded up robust features. In Proceedings of the Computer Vision–ECCV 2006: 9th European Conference on Computer Vision, Graz, Austria, 7–13 May 2006; Springer: Berlin/Heidelberg, Germany, 2006; pp. 404–417. [Google Scholar]
- Thenmozhi, K.; Reddy, U.S. Crop pest classification based on deep convolutional neural network and transfer learning. Comput. Electron. Agric. 2019, 164, 104906. [Google Scholar] [CrossRef]
- Truman, J.W.; Riddiford, L.M. The origins of insect metamorphosis. Nature 1999, 401, 447–452. [Google Scholar] [CrossRef] [PubMed]
- Gilbert, L.I.; Schneiderman, H.A. Some biochemical aspects of insect metamorphosis. Am. Zool. 1961, 11–51. [Google Scholar]
- Mayo, M.; Watson, A.T. Automatic species identification of live moths. Knowl. Based Syst. 2007, 20, 195–202. [Google Scholar] [CrossRef]
- Rasband, W. ImageJ: Image Processing and Analysis in Java. Astrophysics Source Code Library. 2012, p. ascl-1206. Available online: https://ui.adsabs.harvard.edu/abs/2012ascl.soft06013R/abstract (accessed on 18 March 2024).
- Yalcin, H. Vision based automatic inspection of insects in pheromone traps. In Proceedings of the 2015 Fourth International Conference on Agro-Geoinformatics (Agro-Geoinformatics), Istanbul, Turkey, 20–24 July 2015; pp. 333–338. [Google Scholar]
- Venugoban, K.; Ramanan, A. Image classification of paddy field insect pests using gradient-based features. Int. J. Mach. Learn. Comput. 2014, 4, 1. [Google Scholar] [CrossRef]
- Xie, C.; Zhang, J.; Li, R.; Li, J.; Hong, P.; Xia, J.; Chen, P. Automatic classification for field crop insects via multiple-task sparse representation and multiple-kernel learning. Comput. Electron. Agric. 2015, 119, 123–132. [Google Scholar] [CrossRef]
- He, K.; Zhang, X.; Ren, S.; Sun, J. Deep residual learning for image recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 770–778. [Google Scholar]
- Szegedy, C.; Liu, W.; Jia, Y.; Sermanet, P.; Reed, S.; Anguelov, D.; Erhan, D.; Vanhoucke, V.; Rabinovich, A. Going deeper with convolutions. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Boston, MA, USA, 7–12 June 2015; pp. 1–9. [Google Scholar]
- Li, Y.; Yang, J. Few-shot cotton pest recognition and terminal realization. Comput. Electron. Agric. 2020, 169, 105240. [Google Scholar] [CrossRef]
- Cheng, X.; Zhang, Y.; Chen, Y.; Wu, Y.; Yue, Y. Pest identification via deep residual learning in complex background. Comput. Electron. Agric. 2017, 141, 351–356. [Google Scholar] [CrossRef]
- Krizhevsky, A.; Sutskever, I.; Hinton, G.E. Imagenet classification with deep convolutional neural networks. Adv. Neural Inf. Process. Syst. 2012, 25, 1097–1105. [Google Scholar] [CrossRef]
- Liu, W.; Wu, G.; Ren, F.; Kang, X. DFF-ResNet: An insect pest recognition model based on residual networks. Big Data Min. Anal. 2020, 3, 300–310. [Google Scholar] [CrossRef]
- Coulibaly, S.; Kamsu-Foguem, B.; Kamissoko, D.; Traore, D. Explainable deep convolutional neural networks for insect pest recognition. J. Clean. Prod. 2022, 371, 133638. [Google Scholar] [CrossRef]
- Hu, K.; Liu, Y.; Nie, J.; Zheng, X. Rice pest identification based on multi-scale double-branch GAN-ResNet. Front. Plant Sci. 2023, 14, 1167121. [Google Scholar] [CrossRef] [PubMed]
- Lin, T.Y.; Goyal, P.; Girshick, R.; He, K.; Dollár, P. Focal loss for dense object detection. In Proceedings of the IEEE International Conference on Computer Vision, Venice, Italy, 22–29 October 2017; pp. 2980–2988. [Google Scholar]
- Zhang, Z.; Lan, C.; Zeng, W.; Jin, X.; Chen, Z. Relation-aware global attention for person re-identification. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Seattle, WA, USA, 13–19 June 2020; pp. 3186–3195. [Google Scholar]
- Liu, S.; Huang, D.; Wang, Y. Learning spatial fusion for single-shot object detection. arXiv 2019, arXiv:1911.09516. [Google Scholar]
- Radenović, F.; Tolias, G.; Chum, O. Fine-tuning CNN image retrieval with no human annotation. IEEE Trans. Pattern Anal. Mach. Intell. 2018, 41, 1655–1668. [Google Scholar] [CrossRef] [PubMed]
- Wu, X.; Zhan, C.; Lai, Y.K.; Cheng, M.M.; Yang, J. Ip102: A large-scale benchmark dataset for insect pest recognition. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Long Beach, CA, USA, 15–20 June 2019; pp. 8787–8796. [Google Scholar]
- Xie, C.; Wang, R.; Zhang, J.; Chen, P.; Dong, W.; Li, R.; Chen, T.; Chen, H. Multi-level learning features for automatic classification of field crop pests. Comput. Electron. Agric. 2018, 152, 233–241. [Google Scholar] [CrossRef]
- Ren, F.; Liu, W.; Wu, G. Feature reuse residual networks for insect pest recognition. IEEE Access 2019, 7, 122758–122768. [Google Scholar] [CrossRef]
- Liu, W.; Wu, G.; Ren, F. Deep multibranch fusion residual network for insect pest recognition. IEEE Trans. Cogn. Dev. Syst. 2020, 13, 705–716. [Google Scholar] [CrossRef]
- Nanni, L.; Maguolo, G.; Pancino, F. Insect pest image detection and recognition based on bio-inspired methods. Ecol. Inform. 2020, 57, 101089. [Google Scholar] [CrossRef]
- Ung, H.T.; Ung, H.Q.; Nguyen, B.T. An efficient insect pest classification using multiple convolutional neural network based models. arXiv 2021, arXiv:2107.12189. [Google Scholar]
- Yang, X.; Luo, Y.; Li, M.; Yang, Z.; Sun, C.; Li, W. Recognizing pests in field-based images by combining spatial and channel attention mechanism. IEEE Access 2021, 9, 162448–162458. [Google Scholar] [CrossRef]
- Setiawan, A.; Yudistira, N.; Wihandika, R.C. Large scale pest classification using efficient Convolutional Neural Network with augmentation and regularizers. Comput. Electron. Agric. 2022, 200, 107204. [Google Scholar] [CrossRef]
- An, J.; Du, Y.; Hong, P.; Zhang, L.; Weng, X. Insect recognition based on complementary features from multiple views. Sci. Rep. 2023, 13, 2966. [Google Scholar] [CrossRef]
- Lin, S.; Xiu, Y.; Kong, J.; Yang, C.; Zhao, C. An effective pyramid neural network based on graph-related attentions structure for fine-grained disease and pest identification in intelligent agriculture. Agriculture 2023, 13, 567. [Google Scholar] [CrossRef]
- Li, Y.; Sun, M.; Qi, Y. Common pests classification based on asymmetric convolution enhance depthwise separable neural network. J. Ambient. Intell. Humaniz. Comput. 2023, 14, 8449–8457. [Google Scholar] [CrossRef]
- Yu, J.; Shen, Y.; Liu, N.; Pan, Q. Frequency-enhanced channel-spatial attention module for grain pests classification. Agriculture 2022, 12, 2046. [Google Scholar] [CrossRef]
- Su, Z.; Luo, J.; Wang, Y.; Kong, Q.; Dai, B. Comparative study of ensemble models of deep convolutional neural networks for crop pests classification. Multimed. Tools Appl. 2023, 82, 29567–29586. [Google Scholar] [CrossRef]
- Selvaraju, R.R.; Cogswell, M.; Das, A.; Vedantam, R.; Parikh, D.; Batra, D. Grad-cam: Visual explanations from deep networks via gradient-based localization. In Proceedings of the IEEE International Conference on Computer Vision, Venice, Italy, 22–29 October 2017; pp. 618–626. [Google Scholar]
Crop Type | Rice | Corn | Wheat | Beet | Alfalfa | Vitis | Citrus | Mango |
---|---|---|---|---|---|---|---|---|
Number of Categories | 14 | 13 | 9 | 8 | 13 | 16 | 19 | 10 |
Number of Images | 8417 | 14,004 | 3418 | 4420 | 10,390 | 17,551 | 7273 | 9738 |
No. | Name | Quantity | No. | Name | Quantity |
---|---|---|---|---|---|
1 | Dolycoris baccarum (Linnaeus) | 87 | 21 | Stollia ventralis (Westwood) | 72 |
2 | Lycorma delicatula (White) | 92 | 22 | Nilaparvata lugens (Stål) | 62 |
3 | Eurydema dominulus (Scopoli) | 150 | 23 | Diostrombus politus Uhler | 238 |
4 | Pieris rapae (Linnaeus) | 71 | 24 | Phyllotreta striolata (Fabricius) | 187 |
5 | Halyomorpha halys (Stål) | 101 | 25 | Aulacophora indica (Gmelin) | 78 |
6 | Spilosoma obliqua (Walker) | 66 | 26 | Laodelphax striatellus (Fallén) | 61 |
7 | Graphosoma rubrolineata (Westwood) | 116 | 27 | Ceroplastes ceriferus (Anderson) | 100 |
8 | Luperomorpha suturalis Chen | 101 | 28 | Corythucha marmorata (Uhler) | 98 |
9 | Leptocorisa acuta (Thunberg) | 133 | 29 | Dryocosmus kuriphilus Yasumatsu | 50 |
10 | Sesamia inferens (Walker) | 126 | 30 | Porthesia taiwana Shiraki | 141 |
11 | Cicadella viridis (Linnaeus) | 138 | 31 | Chromatomyia horticola (Goureau) | 114 |
12 | Callitettix versicolor (Fabricius) | 156 | 32 | Iscadia inexacta (Walker) | 79 |
13 | Scotinophara lurida (Burmeister) | 117 | 33 | Plutella xylostella (Linnaeus) | 69 |
14 | Cletus punctiger (Dallas) | 169 | 34 | Empoasca flavescens (Fabricius) | 133 |
15 | Nezara viridula (Linnaeus) | 175 | 35 | Dolerus tritici Chu | 88 |
16 | Dicladispa armigera (Olivier) | 150 | 36 | Spodoptera litura (Fabricius) | 130 |
17 | Riptortus pedestris (Fabricius) | 110 | 37 | Corythucha ciliata (Say) | 90 |
18 | Maruca testulalis Geyer | 73 | 38 | Bemisia tabaci (Gennadius) | 147 |
19 | Chauliops fallax Scott | 68 | 39 | Ceutorhynchus asper Roelofs | 146 |
20 | Chilo suppressalis (Walker) | 93 | 40 | Strongyloides variegatus (Fairmaire) | 135 |
Model | ACC | MPre | MRec | MF1 | GM |
---|---|---|---|---|---|
ResNet-50 [29] (2019) | 49.4 | 43.7 | 39.1 | 40.5 | 30.7 |
FR-ResNet [31] (2019) | 55.24 | - | - | 54.18 | - |
DMF-ResNet [32] (2020) | 59.22 | - | - | 58.37 | - |
GAEnsemble [33] (2020) | 67.13 | 67.17 | 67.13 | 65.76 | - |
MMAL [34] (2021) | 72.15 | 62.63 | 69.13 | 64.53 | 58.43 |
STN-SE-ResNet50 [35] (2021) | 69.84 | - | - | - | - |
MobileNetV2 + Sparse + CutMix + DynamicLR [36] (2022) | 71.32 | - | - | - | - |
ResNet152 + Vision-Transformer + Swin-Transformer [37] (2023) | 65.6 | 60.9 | 59.7 | 60.3 | - |
GPA-Net [38] (2023) | 56.9 | 45.9 | 43.8 | 45.0 | - |
AM-MSFF | 72.64 | 64.54 | 67.37 | 65.62 | 61.48 |
Model | ACC | MPre | MRec | MF1 |
---|---|---|---|---|
MLLF + MKB [30] (2018) | 89.3 | - | - | - |
CNNs [9] (2019) | 95.97 | - | - | - |
GAEnsemble [33] (2020) | 98.81 | 98.88 | 98.81 | 98.81 |
ResNet-50 [33] (2020) | 92.18 | 92.74 | 92.18 | 92.07 |
ACEDSNet [39] (2022) | 96.15 | - | - | - |
FcsNet [40] (2022) | 98.33 | 98.49 | 98.33 | 98.34 |
SBPEnsemble [41] (2023) | 96.18 | 96.45 | 95.37 | - |
AM-MSFF | 99.05 | 98.92 | 98.86 | 98.84 |
Model | ACC | MPre | MRec | MF1 | GM |
---|---|---|---|---|---|
ResNet-50 (baseline) | 71.30 | 63.46 | 65.24 | 64.12 | 60.64 |
baseline + RGA | 71.76 | 63.91 | 65.82 | 64.59 | 60.77 |
baseline + GeMP | 71.79 | 63.93 | 66.11 | 64.78 | 61.13 |
baseline + MSFF | 71.63 | 63.82 | 65.07 | 64.24 | 61.00 |
AM-MSFF without MSFF | 72.01 | 63.94 | 66.39 | 64.85 | 60.84 |
AM-MSFF without GeMP | 71.64 | 63.95 | 65.99 | 64.71 | 61.01 |
AM-MSFF without RGA | 72.01 | 63.93 | 66.39 | 64.85 | 60.84 |
AM-MSFF | 72.40 | 64.45 | 66.51 | 65.23 | 61.35 |
Model | ACC | MPre | MRec | MF1 | GM |
---|---|---|---|---|---|
Baseline with GAP | 71.30 | 63.46 | 65.24 | 64.12 | 60.64 |
Baseline with GMP | 71.61 | 63.88 | 65.65 | 64.48 | 60.80 |
Baseline with GeMP | 71.79 | 63.93 | 66.11 | 64.78 | 61.13 |
AM-MSFF with GAP | 71.64 | 63.95 | 65.99 | 64.71 | 61.01 |
AM-MSFF with GMP | 71.74 | 63.24 | 66.65 | 64.53 | 61.00 |
AM-MSFF with GeMP | 72.40 | 64.45 | 66.51 | 65.23 | 61.35 |
Model | ACC | MPre | MRec | MF1 | GM |
---|---|---|---|---|---|
Baseline with CEL | 71.30 | 63.46 | 65.24 | 64.12 | 60.64 |
Baseline with FL | 71.84 | 63.73 | 66.33 | 64.74 | 60.85 |
Baseline with BFL | 71.95 | 63.82 | 65.61 | 64.43 | 60.61 |
AM-MSFF with CEL | 72.40 | 64.45 | 66.51 | 65.23 | 61.35 |
AM-MSFF with FL | 72.61 | 64.54 | 67.73 | 65.73 | 61.39 |
AM-MSFF with BFL | 72.64 | 64.54 | 67.37 | 65.62 | 61.48 |
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Citation: Zhang, M.; Yang, W.; Chen, D.; Fu, C.; Wei, F. AM-MSFF: A Pest Recognition Network Based on Attention Mechanism and Multi-Scale Feature Fusion. Entropy 2024, 26, 431. https://doi.org/10.3390/e26050431