Moth Detection from Pheromone Trap Images Using Deep Learning Object Detectors
Abstract
1. Introduction
2. Materials and Methods
2.1. Data Collection
2.2. Detector Training and Evaluation
3. Results and Discussion
4. Conclusions
Author Contributions
Funding
Conflicts of Interest
|  | Training | Validation | Testing |
|---|---|---|---|
| Images | 714 | 211 | 217 |
| *Spodoptera litura* moth | 838 | 277 | 254 |
| *Helicoverpa assulta* moth | 356 | 137 | 128 |
| *Spodoptera exigua* moth | 772 | 252 | 199 |
| Other insects | 687 | 154 | 145 |
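As a quick sanity check on the split above, the per-class totals and the training-set fractions can be derived directly from the table. A minimal sketch (the dictionary layout is illustrative, not from the paper):

```python
# Counts from the dataset table: (training, validation, testing).
counts = {
    "Images": (714, 211, 217),
    "Spodoptera litura moth": (838, 277, 254),
    "Helicoverpa assulta moth": (356, 137, 128),
    "Spodoptera exigua moth": (772, 252, 199),
    "Other insects": (687, 154, 145),
}

for name, (train, val, test) in counts.items():
    total = train + val + test
    # e.g. the image split is 1142 images in total, ~62.5% used for training
    print(f"{name}: total={total}, train fraction={train / total:.3f}")
```

This confirms an approximate 62.5% / 18.5% / 19.0% image split across training, validation, and testing.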
| Meta Architecture | Feature Extractor | Inference Time (ms/image) | mAP | AP(SL) * | AP(HA) * | AP(SE) * |
|---|---|---|---|---|---|---|
| Faster R-CNN | ResNet 101 | 103 | 90.25 | 98.06 | 77.59 | 95.11 |
| Faster R-CNN | Inception v2 | 72 | 90.05 | 98.43 | 77.35 | 94.37 |
| Faster R-CNN | ResNet 50 | 95 | 88.62 | 98.50 | 75.28 | 92.09 |
| R-FCN | ResNet 101 | 67 | 86.91 | 98.02 | 70.21 | 92.51 |
| RetinaNet | ResNet 50 | 61 | 88.99 | 98.28 | 74.13 | 94.56 |
| RetinaNet | MobileNet v2 | 34 | 85.21 | 97.76 | 66.53 | 91.34 |
| SSD | Inception v2 | 23 | 76.86 | 93.41 | 55.19 | 81.97 |

\* AP for *Spodoptera litura* (SL), *Helicoverpa assulta* (HA), and *Spodoptera exigua* (SE).
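The reported mAP in each row matches the unweighted mean of the three per-class moth APs (SL, HA, SE), which suggests the "other insects" class is excluded from the average. A minimal sketch verifying this for the Faster R-CNN / ResNet 101 row (the `ap` dictionary simply restates that row's values):

```python
# Per-class APs for the Faster R-CNN + ResNet 101 row.
ap = {
    "SL": 98.06,  # Spodoptera litura
    "HA": 77.59,  # Helicoverpa assulta
    "SE": 95.11,  # Spodoptera exigua
}

# mAP as the unweighted mean over the three moth classes.
map_value = sum(ap.values()) / len(ap)
print(round(map_value, 2))  # → 90.25, matching the reported mAP
```

The same check holds for the other rows, e.g. RetinaNet + ResNet 50: (98.28 + 74.13 + 94.56) / 3 = 88.99.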
| Number of Classes | Meta Architecture | Feature Extractor | mAP | AP(SL) * | AP(HA) * | AP(SE) * |
|---|---|---|---|---|---|---|
| 3 | Faster R-CNN | ResNet 101 | 89.64 | 98.08 | 76.65 | 94.21 |
| 4 | Faster R-CNN | ResNet 101 | 90.25 | 98.06 | 77.59 | 95.11 |

\* AP for *Spodoptera litura* (SL), *Helicoverpa assulta* (HA), and *Spodoptera exigua* (SE).
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Hong, S.-J.; Kim, S.-Y.; Kim, E.; Lee, C.-H.; Lee, J.-S.; Lee, D.-S.; Bang, J.; Kim, G. Moth Detection from Pheromone Trap Images Using Deep Learning Object Detectors. Agriculture 2020, 10, 170. https://doi.org/10.3390/agriculture10050170