KRS-Net: A Classification Approach Based on Deep Learning for Koi with High Similarity
Simple Summary
Abstract
1. Introduction
2. Materials and Methods
2.1. Image Acquisition and Data Augmentation
2.2. KRS-Net Classification Approach
3. Experimental Results and Analysis
3.1. Setup of Experiment and Performance Indexes
3.2. Visualization of Features
3.3. Comparative Analysis with Other Classification Networks
4. Discussion
4.1. Factors Influencing Test Accuracy
4.2. Influence of Batch Size on Classification Performance
4.3. Advantages of KRS-Net in Structure
4.4. Influence of Structure on Training Time and Parameters
4.5. Future Work
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
Koi Variety | Original Images | Generated Images | Total after Augmentation |
---|---|---|---|
Tancho | 37 | 74 | 111 |
Hikariu | 42 | 84 | 126 |
Utsurimono | 25 | 100 | 125 |
Bekko | 22 | 88 | 110 |
Kawarimono | 21 | 84 | 105 |
Taisho | 75 | 39 | 114 |
Showa | 104 | 0 | 104 |
Asagi | 25 | 100 | 125 |
Kohaku | 101 | 0 | 101 |
Hikarim | 22 | 88 | 110 |
Koromo | 23 | 92 | 115 |
Kinginrin | 49 | 64 | 113 |
Ogon | 21 | 84 | 105 |
Total | 567 | 897 | 1464 |
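The counts above show each variety expanded to roughly 100–130 images by data augmentation. As an illustrative sketch only (the paper's exact augmentation pipeline is not specified in this excerpt), simple geometric transforms such as flips and 90° rotations could generate the extra copies:

```python
import numpy as np

def augment(image: np.ndarray, n_copies: int) -> list:
    """Generate n_copies variants of an H x W x C image using flips and
    90-degree rotations (illustrative; the transforms actually used for
    the KRS-Net dataset are an assumption here)."""
    transforms = [
        lambda im: np.fliplr(im),            # horizontal flip
        lambda im: np.rot90(im, k=1),        # rotate 90 degrees
        lambda im: np.rot90(im, k=2),        # rotate 180 degrees
        lambda im: np.rot90(np.fliplr(im)),  # flip, then rotate 90 degrees
    ]
    return [transforms[i % len(transforms)](image) for i in range(n_copies)]

# Example: a Bekko image expanded with 4 generated variants (22 originals x 4 = 88)
img = np.zeros((224, 224, 3), dtype=np.uint8)
variants = augment(img, 4)
print(len(variants))  # 4
```

Note that varieties already well represented (Showa, Kohaku) received no generated images, which matches the zeros in the table.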
Test Accuracy | Epoch = 25 | Epoch = 50 | Epoch = 75 | Epoch = 100 |
---|---|---|---|---|
Batch Size = 4 | 93.71% | 96.50% | 97.20% | 96.50% |
Batch Size = 8 | 97.90% | 97.20% | 96.24% | 95.80% |
Batch Size = 16 | 93.01% | 93.01% | 96.50% | 95.10% |
Batch Size = 32 | 92.31% | 94.41% | 94.41% | 93.01% |
Batch Size = 64 | 83.92% | 90.91% | 93.71% | 91.61% |
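Batch size also fixes how many weight updates occur per epoch, which helps explain why the largest batches need more epochs to reach comparable accuracy. A minimal sketch, assuming the full 1464-image augmented set is iterated (the actual train/test split is not given in this excerpt):

```python
import math

def iterations_per_epoch(n_images: int, batch_size: int) -> int:
    """Number of gradient updates per epoch, keeping the final smaller batch."""
    return math.ceil(n_images / batch_size)

# Larger batches mean far fewer updates per epoch:
for bs in (4, 8, 16, 32, 64):
    print(bs, iterations_per_epoch(1464, bs))
```

At batch size 64 each epoch performs only 23 updates versus 366 at batch size 4, consistent with the slower convergence seen in the bottom row of the table.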
Evaluation Index (%) | AlexNet | VGG16 | GoogLeNet | ResNet101 | DenseNet201 | KRS-Net |
---|---|---|---|---|---|---|
Epoch = 25 | | | | | |
Accuracy | 98.17 | 98.82 | 98.82 | 98.71 | 99.25 | 99.68 |
Precision | 90.51 | 92.37 | 92.72 | 92.98 | 95.46 | 97.90 |
Recall | 87.55 | 91.96 | 91.89 | 91.19 | 94.83 | 97.76 |
F1 | 86.97 | 91.58 | 91.00 | 90.53 | 94.78 | 97.80 |
Epoch = 50 | | | | | |
Accuracy | 97.63 | 99.19 | 98.92 | 99.14 | 99.57 | 99.57 |
Precision | 90.69 | 94.97 | 93.41 | 95.30 | 97.80 | 97.58 |
Recall | 88.25 | 94.69 | 92.59 | 93.99 | 96.92 | 97.13 |
F1 | 87.10 | 94.29 | 91.79 | 93.47 | 96.87 | 97.12 |
Epoch = 75 | | | | | |
Accuracy | 98.60 | 98.87 | 98.39 | 98.92 | 98.92 | 99.35 |
Precision | 91.98 | 95.05 | 94.24 | 94.54 | 94.68 | 96.49 |
Recall | 90.49 | 93.91 | 93.43 | 92.52 | 92.52 | 95.84 |
F1 | 89.93 | 92.87 | 93.01 | 91.67 | 91.88 | 95.90 |
Epoch = 100 | | | | | |
Accuracy | 98.81 | 98.82 | 98.71 | 95.91 | 99.16 | 99.41 |
Precision | 93.51 | 95.07 | 92.24 | 88.00 | 96.03 | 96.40 |
Recall | 91.89 | 90.71 | 91.26 | 72.38 | 93.92 | 96.54 |
F1 | 91.57 | 91.36 | 90.82 | 74.06 | 93.23 | 96.17 |
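The precision, recall, and F1 figures above are per-class metrics averaged over the 13 koi varieties (macro-averaging). As an illustrative sketch of how such indexes are computed from a confusion matrix (not the authors' evaluation code):

```python
import numpy as np

def macro_metrics(cm: np.ndarray):
    """Macro-averaged precision, recall, and F1 from a square confusion
    matrix cm, where cm[i, j] counts class-i samples predicted as class j."""
    tp = np.diag(cm).astype(float)
    precision = tp / np.maximum(cm.sum(axis=0), 1)  # column sums = predicted counts
    recall = tp / np.maximum(cm.sum(axis=1), 1)     # row sums = true counts
    f1 = 2 * precision * recall / np.maximum(precision + recall, 1e-12)
    return precision.mean(), recall.mean(), f1.mean()

# Toy 3-class example (not koi data):
cm = np.array([[8, 1, 1],
               [0, 9, 1],
               [1, 0, 9]])
p, r, f = macro_metrics(cm)
print(p, r, f)
```

Macro-averaging weights every class equally, so rare varieties (e.g., Ogon with 21 original images) count as much as common ones, which is why precision/recall/F1 sit several points below overall accuracy in the table.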
Networks | Training Time (s) | Number of Layers | Number of Connections | Size of Network (MB) | Parameters (M) |
---|---|---|---|---|---|
AlexNet | 1079 | 25 | 24 | 227.00 | 61.00 |
VGG16 | 1112 | 41 | 40 | 515.00 | 138.00 |
GoogLeNet | 1403 | 144 | 170 | 27.00 | 7.00 |
ResNet101 | 3320 | 347 | 379 | 167.00 | 44.60 |
DenseNet201 | 8864 | 708 | 805 | 77.00 | 20.00 |
KRS-Net | 1338 | 71 | 78 | 49.70 | 10.89 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Zheng, Y.; Deng, L.; Lin, Q.; Xu, W.; Wang, F.; Li, J. KRS-Net: A Classification Approach Based on Deep Learning for Koi with High Similarity. Biology 2022, 11, 1727. https://doi.org/10.3390/biology11121727