
Intelligent Diagnosis of Fish Behavior Using Deep Learning Method

Usama Iqbal, Daoliang Li 1,2,3,4,5,* and Muhammad Akhter

1 College of Information and Electrical Engineering, China Agricultural University, Beijing 100083, China
2 National Innovation Center for Digital Fishery, China Agricultural University, Beijing 100083, China
3 Key Laboratory of Smart Farming Technologies for Aquatic Animal and Livestock, Ministry of Agriculture and Rural Affairs, Beijing 100083, China
4 Beijing Engineering and Technology Research Center for Internet of Things in Agriculture, Beijing 100083, China
5 Yantai Institute of China Agricultural University, Yantai 264670, China
Author to whom correspondence should be addressed.
Fishes 2022, 7(4), 201;
Submission received: 8 July 2022 / Revised: 3 August 2022 / Accepted: 8 August 2022 / Published: 11 August 2022
(This article belongs to the Special Issue Emerging Technologies for Sustainable Aquaculture)


Scientific methods are used to monitor fish growth and behavior and to reduce losses caused by stress and other adverse circumstances. Conventional techniques are time-consuming, labor-intensive, and error-prone. Deep learning (DL) technology is rapidly gaining popularity in various fields, including aquaculture. Moving towards smart fish farming requires the precise and accurate identification of fish biodiversity, and observing fish behavior in real time is imperative for making better feeding decisions. This study presents an efficient end-to-end convolutional neural network (CNN) that classifies fish behavior into normal and starvation categories. The performance of the CNN was evaluated by varying the number of fully connected (FC) layers, with and without the max-pooling operation. Incorporating three FC layers and the max-pooling operation increased the accuracy of the detection algorithm by 10%. The results demonstrate that a shallow CNN architecture that employs max pooling with more FC layers performs well, achieving 98% accuracy. The presented system is a novel step in laying the foundation for an automated behavior identification system in modern fish farming.

1. Introduction

Global fisheries and aquaculture production hit a peak of 178 million tons in 2020 [1]. Humans consumed 88% of this output, which is critical to the aim of the Food and Agriculture Organization (FAO) of the United Nations (UN) of creating a world free of hunger and malnutrition [2]. Nonetheless, as the world's population continues to expand, so will the strain on the global fisheries market [2,3].
Smart fish farming is a novel scientific area that aims to develop efficient and sustainable aquaculture [4]. In modern fish farming, the integration of emerging technologies, such as the Internet of Things (IoT), big data, cloud computing, and artificial intelligence (AI), can create sustainable aquaculture and reduce reliance on conventional techniques [5,6]. Rapid aquaculture growth has also led to several problems, including water contamination, fish malnutrition, and disease [7]. Smart aquaculture can help solve the problems of fishery development and increase aquaculture productivity as part of the third green revolution [8]. It monitors fish at various stages, reduces the risk of failure, and increases profitability and productivity [9]. The feeding stage of fish affects production efficiency and breeding cost in intensive aquaculture [10]; for some fish species, feed accounts for more than 60% of total production costs [11,12]. Overfeeding reduces production efficiency, whereas underfeeding negatively impacts fish growth. Traditional feeding decisions, which rely on manually assessing the hunger of fish, are often impeded by high fish density and water turbidity [13]. Therefore, an affordable and highly reliable monitoring system is required to observe fish behavior [14].
In recent years, advancements in computer vision have provided a fast and non-destructive approach to identifying fish behavior [15]. Machine learning (ML) is a breakthrough in artificial intelligence (AI) that is applied in various domains, namely robotics, medicine, information security, and especially aquaculture [16]. Machine learning technology is widely used in a variety of applications of aquaculture, including classification, fish counting, size measurement, and behavior identification [17]. Deep learning (DL) is a subset of machine learning and is increasingly applied in aquaculture because of its effective ability to express features. DL is a multi-layer learning network that can extract semantic information from the pixel level, which is suitable for fish behavior detection through images [18].
DL-based feeding decision-making research has made outstanding progress in recent years. Accurate recognition of fish behavior enables optimal feed control, lower feeding costs, and increased economic efficiency [19]. A significant amount of research has been conducted on fish behavior. Zhou et al. evaluated fish feeding behavior using near-infrared imaging technology and measured a feeding behavior index using a support vector machine (SVM) and a gray gradient co-occurrence matrix; the experimental results correlated with expert manual assessment at a value of 0.945 [20,21]. Zhou continued this research in 2018 and improved the results by using an adaptive neuro-fuzzy inference system (ANFIS) to assess and analyze fish feeding behavior, achieving an accuracy of 98% [22]. In 2019, the same group proposed a deep convolutional neural network (DCNN) to categorize fish feeding behavior into four levels, namely, "none", "weak", "medium", and "strong", with 90% classification accuracy [23]. Several other studies have also used neural network models to assess fish behavior. The authors of [24] used a convolutional neural network (CNN) and a long short-term memory (LSTM) network to predict feeding and non-feeding behavior of salmon with an accuracy of 80%. In [25], the authors evaluated fish escape, swimming, and feeding behavior using the discrete Fourier transform and Fourier descriptors; the Fourier-descriptor model distributed fish feed precisely without any intervention, with an accuracy of 100%.
A detailed analysis of the existing literature on smart aquaculture shows that applications of deep learning fall into four categories: live fish identification, species classification, behavioral analysis, and biomass estimation [4]. The literature also reveals that fish identification and species classification are the most popular research areas, whereas behavior identification has received less attention [4]. The proposed model analyzes the starvation and normal conditions of fish through their behavior. The aim of this study was to provide a quick decision-support system for feeding decisions in smart fish farming.

2. Materials and Methods

2.1. Fish Samples and Experimental Environment Creation

Thamnaconus modestus (black scraper) is one of the approved ISO organisms for experimental purposes [26]. Black scrapers 27 ± 2 cm in length and 275 ± 25 g in weight were chosen as the research subjects for this experimental study. During regular feeding, the water temperature was kept at 23 ± 1 °C, and the dissolved oxygen concentration was 6.3 ± 0.3 mg/L. The fish were well acclimated to the environment and received one feeding a day for 2 weeks before the experiment. Data were collected in the presence of light. The total duration of the experiment was 12 days, during which the fish were fed on alternate days. The following two groups describe the states of the fish, shown in Figure 1.
Group 1: Normal behavior: the fish were fed at a fixed time. After the feeding stage, an excited state of the fish was also observed.
Group 2: Starvation behavior: feeding was stopped for 6 days to create a starvation environment. The fish swam in areas they usually did not visit, probably looking for food, and sometimes came to the water surface, which they usually would not do. During starvation, the fish showed aggressive behavior and quickly gobbled down food when it was offered.

2.2. Dataset Description

The experimental data for the proposed study were collected in a laboratory environment at China Agricultural University, Beijing. For this study, 100 black scrapers (Thamnaconus modestus) were selected, and their behavior was recorded. The resolution of the captured videos was 3840 × 2160 at a frame rate of 25 fps, and image sequences were retrieved from the video data. The experimental setup was installed in the open air with a water depth of approximately 2–3 m, and the underwater images were illuminated by natural light. One camera was placed face down above the water, and the other was inside the water, focusing on the fish in a shallow pool. The dataset of 2000 images was divided into two categories: 1265 images exhibited normal fish behavior, and the remaining 735 images showed starvation (hunger) behavior. Figure 2 shows the dataset acquisition schematics.

2.3. Training Methodology

In the proposed method, the CNN models classified 2000 images into two categories (normal and starvation). Table 1 shows the distribution of the dataset for training and testing purposes. The maximum training accuracy and minimum training loss depend on the adjustment of the learning rate in the training phase; an optimal learning rate produces a quick drop in the training loss to its minimum level. The network training parameters were a learning rate of 0.00001, a drop rate of 0.3, and a batch size of 1. To synchronize the desired and calculated outputs, the stochastic gradient descent (SGD) optimizer was applied using the gradient derivative [27]. The proposed CNN was trained for 100 epochs with the cross-entropy loss function.
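As a minimal illustration of the update rule behind SGD with the learning rate stated above, consider the following pure-Python sketch; the weight and gradient values are invented for demonstration and are not taken from the trained network:

```python
# Minimal sketch of one stochastic gradient descent (SGD) update with the
# learning rate of 0.00001 used in the study. Weights and gradients are
# toy values chosen only to make the update visible.
LEARNING_RATE = 1e-5

def sgd_step(weights, grads, lr=LEARNING_RATE):
    """Return updated weights: w <- w - lr * dL/dw."""
    return [w - lr * g for w, g in zip(weights, grads)]

weights = [0.5, -0.2, 0.1]
grads = [200.0, -100.0, 0.0]  # hypothetical gradients of the loss
print(sgd_step(weights, grads))
```

With a batch size of 1, one such update is applied per training image, which is why a small learning rate is needed to keep the loss from oscillating.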

2.4. Software and Hardware System Description

The implementation used the Anaconda platform with TensorFlow 1.14 and Keras 2.0.0 on Python 3.7.4. The training phase was completed on a machine with an Intel Core i7 processor and 16 GB of DDR4 RAM. The graphics card was an NVIDIA RTX 2080 Ti with 24 GB of memory.

2.5. Proposed Convolutional Neural Network (CNN) Model

The proposed research methodology used a convolutional neural network (CNN) built from scratch in our implementation. The CNN model consisted of six convolutional layers, two pooling layers, and two fully connected (FC) layers in model-1 or three FC layers in model-2. In the first part of the experiment, the CNN models were implemented without the max-pooling function and observed in the testing phase; in the second phase, the max-pooling function was applied to both CNN models. Figure 3 depicts the general architecture of both models. Using the proposed framework, the starvation and normal stages of fish were quickly identified. Detailed information about the architectures of models 1 and 2, including their convolutional layers, numbers of filters, and kernel and stride sizes, can be found in the Supplementary Materials.
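The exact layer configuration is given in the Supplementary Materials; as a rough sketch only, a model-2-style network (six convolutional layers, two max-pooling layers, three FC layers, a dropout rate of 0.3, and SGD with the stated learning rate) might be assembled as follows. All filter counts, kernel sizes, input resolution, and FC widths here are assumptions, and the modern tf.keras API is used rather than the Keras 2.0.0 of the study:

```python
# Rough sketch of a model-2-style CNN: six convolutional layers, two
# max-pooling layers, and three fully connected (FC) layers. Filter
# counts, kernel sizes, input size, and FC widths are illustrative
# assumptions, not the study's exact architecture.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_model_2(input_shape=(64, 64, 3), num_classes=2):
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, padding="same", activation="relu"),
        layers.Conv2D(32, 3, padding="same", activation="relu"),
        layers.Conv2D(64, 3, padding="same", activation="relu"),
        layers.MaxPooling2D(2),
        layers.Conv2D(64, 3, padding="same", activation="relu"),
        layers.Conv2D(128, 3, padding="same", activation="relu"),
        layers.Conv2D(128, 3, padding="same", activation="relu"),
        layers.MaxPooling2D(2),
        layers.Flatten(),
        layers.Dense(1024, activation="relu"),  # third FC layer (Section 4)
        layers.Dropout(0.3),                    # drop rate from Section 2.3
        layers.Dense(256, activation="relu"),
        layers.Dense(num_classes, activation="softmax"),  # normal / starvation
    ])
    model.compile(
        optimizer=tf.keras.optimizers.SGD(learning_rate=1e-5),
        loss="sparse_categorical_crossentropy",
        metrics=["accuracy"],
    )
    return model

model = build_model_2()
```

Removing the two MaxPooling2D layers yields the "without max-pooling" variant, and dropping one Dense layer yields a model-1-style network.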

3. Results

This section details the testing and validation performance of the proposed CNN models based on the classification metrics.

3.1. Evaluation Metrics

To evaluate the effectiveness of the proposed framework and to compare it with other applicable techniques, standard classification metrics such as accuracy, sensitivity (recall, i.e., the true positive rate, TPR), specificity, F1-score, and others are used, as follows:
$$\mathrm{Accuracy\ (Acc)} = \frac{TP + TN}{TP + TN + FP + FN} \times 100\%$$
$$\mathrm{Sensitivity\ or\ Recall\ (TPR)} = \frac{TP}{TP + FN} \times 100\% = (1 - FNR) \times 100\%$$
$$\mathrm{Specificity,\ Selectivity\ (TNR)} = \frac{TN}{TN + FP} \times 100\% = (1 - FPR) \times 100\%$$
$$\mathrm{Fall\text{-}out\ (FPR)} = \frac{FP}{FP + TN} \times 100\% = (1 - TNR) \times 100\%$$
$$\mathrm{Miss\ rate\ (FNR)} = \frac{FN}{FN + TP} \times 100\% = (1 - TPR) \times 100\%$$
$$F1\text{-}score = \frac{2 \times Precision \times Recall}{Precision + Recall} \times 100\% = \frac{2TP}{2TP + FP + FN} \times 100\%$$
$$\mathrm{Error\ rate} = (1 - Acc) \times 100\% = \frac{FP + FN}{TP + TN + FP + FN} \times 100\%$$
$$\mathrm{Matthews\ Correlation\ Coefficient\ (MCC)} = \frac{TP \times TN - FP \times FN}{\sqrt{(TP + FP)(TP + FN)(TN + FP)(TN + FN)}} \times 100\%$$
Here, TP is true positive and TN is true negative, whereas FP and FN are false positive and false negative, respectively.
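These definitions can be computed directly from the four confusion-matrix counts; the sketch below uses invented counts for illustration, not the study's results:

```python
# The metrics above computed directly from confusion-matrix counts
# (TP, TN, FP, FN). The counts below are invented example values.
import math

def classification_metrics(tp, tn, fp, fn):
    total = tp + tn + fp + fn
    acc = (tp + tn) / total
    tpr = tp / (tp + fn)              # sensitivity / recall
    tnr = tn / (tn + fp)              # specificity
    fpr = fp / (fp + tn)              # fall-out = 1 - TNR
    fnr = fn / (fn + tp)              # miss rate = 1 - TPR
    f1 = 2 * tp / (2 * tp + fp + fn)  # F1-score
    err = 1 - acc                     # error rate
    mcc = (tp * tn - fp * fn) / math.sqrt(
        (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return {"Acc": acc, "TPR": tpr, "TNR": tnr, "FPR": fpr,
            "FNR": fnr, "F1": f1, "Error": err, "MCC": mcc}

print(classification_metrics(tp=49, tn=49, fp=1, fn=1))
```

The example counts were chosen so that accuracy, sensitivity, and specificity all come out at 0.98, mirroring the balance of results reported for model-2.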

3.2. Quantitative Evaluation with Statistical Analysis

The objective of the study was to provide an alternative pathway for the initial screening of the fish hunger stage using deep learning. Performance evaluation parameters such as accuracy, precision, and recall were derived from the confusion matrix [28]. The dataset was divided into 80% for training and 20% for testing. In this study, the validation and test sets were the same.
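A per-class (stratified) 80/20 split consistent with the dataset sizes in Section 2.2 can be sketched as follows; note that the paper does not state whether its split was stratified, so that detail is an assumption here:

```python
# Sketch of a stratified 80/20 train/test split over the two behavior
# classes (1265 "normal" and 735 "starvation" images, as in Section 2.2).
# Whether the study's own split was stratified is an assumption.
import random

def stratified_split(items_by_class, test_frac=0.2, seed=0):
    rng = random.Random(seed)
    train, test = [], []
    for label, items in items_by_class.items():
        shuffled = items[:]
        rng.shuffle(shuffled)
        n_test = int(len(shuffled) * test_frac)
        test += [(label, x) for x in shuffled[:n_test]]
        train += [(label, x) for x in shuffled[n_test:]]
    return train, test

data = {"normal": list(range(1265)), "starvation": list(range(735))}
train_set, test_set = stratified_split(data)
print(len(train_set), len(test_set))  # 1600 400
```

Splitting per class keeps the normal-to-starvation ratio identical in the training and test sets, which matters when the classes are imbalanced as they are here.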

3.2.1. Experiment 1: CNN Model-1

The testing results showed that the CNN model-1 attained an overall accuracy of 88.9%. Table 2 outlines the performance parameters. The performance parameters also showed that model-1 could distinguish between two classification categories with reasonable accuracy in the test dataset.

3.2.2. Experiment 2: CNN Model-2

The CNN model-2 achieved an overall diagnostic accuracy of 98%. The experimental results showed that model-2 outperformed model-1 and significantly improved all performance factors. Table 3 outlines the performance parameters. The accuracy improved after increasing the number of FC layers. The sensitivity value of 98% indicated that the total number of false negatives was low, while the specificity value of 98% indicated that the total number of true negatives was high.
The receiver operating characteristic (ROC) curve with the area under the curve (AUC) is another crucial factor to consider when analyzing a model's behavior. This graph depicts the relationship between the true positive rate (TPR) and the false positive rate (FPR); the closer a model's AUC is to one, the better the model performs. This study used OriginPro 8.5 to create the ROC curves, and the AUC was calculated using the trapezoidal rule. Figure 4 displays the ROC curves with the AUC values of the proposed models.
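The trapezoidal computation of AUC from a set of ROC points can be sketched as follows; the ROC points below are illustrative, not the study's measured curves:

```python
# AUC via the trapezoidal rule over sorted (FPR, TPR) points, as used for
# the ROC analysis above. The points below are illustrative only.
def trapezoid_auc(fpr, tpr):
    """Area under the ROC curve, given points sorted by increasing FPR."""
    area = 0.0
    for i in range(1, len(fpr)):
        area += (fpr[i] - fpr[i - 1]) * (tpr[i] + tpr[i - 1]) / 2.0
    return area

# A perfect classifier hugs the top-left corner (AUC = 1.0), while the
# diagonal corresponds to random guessing (AUC = 0.5).
print(trapezoid_auc([0.0, 0.0, 1.0], [0.0, 1.0, 1.0]))  # 1.0
print(trapezoid_auc([0.0, 1.0], [0.0, 1.0]))            # 0.5
```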

4. Discussion

Intelligent aquaculture requires accurately determining the behavior of fish, and correctly classifying fish behavior as normal or hungry is essential for productivity and profitability. The primary goal of this study was to obtain satisfactory behavior-recognition results, and the analysis shows that the proposed approach substantially improves the identification and classification of fish feeding behavior. To establish the relationship between fully connected (FC) layers and the architecture of a convolutional neural network (CNN), this study evaluated the effects of varying the number of FC layers on deeper and shallower CNN architectures in the context of image classification. The CNN model was initially trained with a single FC (output) layer. Another FC layer was then inserted before the output layer to track any performance gains or losses brought on by the new layer, with the number of neurons in the new layer varied over 10, 16, 32, 64, … 4096. The model's performance was then evaluated after adding another FC layer, again varying the number of neurons up to 4096. Finally, the addition of a third FC layer with 1024 neurons improved the model's performance. If two datasets have roughly the same total number of images, the one with more images per class in the training set is called the deeper dataset, and the one with fewer images per class is called the wider dataset. According to Bansal et al. [29] and Basha et al. [30], deeper architectures are preferable to shallow ones when training CNN models on deeper datasets, whereas the opposite holds for wider datasets; furthermore, a shallow CNN architecture requires more FC layers than a deeper architecture to perform well on any dataset.
The requirement for more FC layers in a shallow architecture is related to the features learned by the convolutional layers. This study showed that the final convolutional layer of the proposed shallow CNN produced less abstract features than a deeper architecture would; thus, more FC layers were needed for better performance.

5. Conclusions

Early detection of fish behavior is essential in increasing the efficiency of modern aquaculture. This research examined and evaluated fish feeding behavior to optimize real-time feed control. By using deep learning-based frameworks, accurate and precise behavior identification can become more accessible. This paper presented a novel technique for detecting the fish’s normal and starvation stages. The key contributions of our work are as follows:
  • We proposed a convolutional neural network with an additional fully connected layer for a high-performance detection and classification system.
  • Because environmental complexity is correlated with the uncertainty of fish behavior, behavior recognition accuracy is generally low; the proposed method achieved excellent accuracy despite these substantial challenges.
  • The proposed model addressed the poor generalization ability of shallow neural networks and classified the fish images into two categories with an accuracy of 98%.
This proposed methodology can be adapted for large datasets because it can provide scalable performance. The future steps are to perform classification, detection, and recognition tasks with more challenging datasets. In addition, a combination of deep learning and data fusion techniques can be employed for behavior detection, which can aid in developing an intelligent feeder system and its application in modern aquaculture.

Supplementary Materials

The following supporting information can be downloaded at: Supplementary Convolutional Neural Network Architecture Description.

Author Contributions

Conceptualization, U.I. and D.L.; methodology, U.I., M.A. and D.L.; software, U.I. and M.A.; resources, D.L.; data curation, D.L.; writing—original draft preparation, U.I. and M.A.; writing—review and editing, D.L.; visualization, U.I.; supervision, D.L.; project administration, D.L. All authors have read and agreed to the published version of the manuscript.


Funding

This work was supported by Key Technology Research and Creation of Digital Fishery Intelligent Equipment (Grant No. 2021TZXD006) and the Yellow & Bohai Digital Fishery Innovation Center (Grant No. 2020XDRHXMPT10).

Institutional Review Board Statement

The dataset used in this study was approved by the research ethics committee of China Agricultural University under approval code AW02702202-5-1.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available upon request from the corresponding author.

Conflicts of Interest

The authors declare no conflict of interest.

References


  1. Food and Agriculture Organization of the United Nations. The State of World Fisheries and Aquaculture 2022; FAO: Rome, Italy, 2022. [Google Scholar]
  2. Food and Agriculture Organization of the United Nations. The State of World Fisheries and Aquaculture 2018—Meeting the Sustainable Development Goals; FAO: Rome, Italy, 2018. [Google Scholar]
  3. Shi, C.; Wang, Q.; He, X.; Zhang, X.; Li, D. An Automatic Method of Fish Length Estimation Using Underwater Stereo System Based on LabVIEW. Comput. Electron. Agric. 2020, 173, 105419. [Google Scholar] [CrossRef]
  4. Yang, X.; Zhang, S.; Liu, J.; Gao, Q.; Dong, S.; Zhou, C. Deep Learning for Smart Fish Farming: Applications, Opportunities and Challenges. Rev. Aquac. 2021, 13, 66–90. [Google Scholar] [CrossRef]
  5. Wang, T.; Xu, X.; Wang, C.; Li, Z.; Li, D. From Smart Farming towards Unmanned Farms: A New Mode of Agricultural Production. Agriculture 2021, 11, 145. [Google Scholar] [CrossRef]
  6. Akbar, M.O.; Shahbaz Khan, M.S.; Ali, M.J.; Hussain, A.; Qaiser, G.; Pasha, M.; Pasha, U.; Missen, M.S.; Akhtar, N. IoT for Development of Smart Dairy Farming. J. Food Qual. 2020, 2020, 1212805. [Google Scholar] [CrossRef]
  7. O’Neill, E.A.; Stejskal, V.; Clifford, E.; Rowan, N.J. Novel Use of Peatlands as Future Locations for the Sustainable Intensification of Freshwater Aquaculture Production—A Case Study from the Republic of Ireland. Sci. Total Environ. 2020, 706, 136044. [Google Scholar] [CrossRef]
  8. Yang, L.; Liu, Y.; Yu, H.; Fang, X.; Song, L.; Li, D.; Chen, Y. Computer Vision Models in Intelligent Aquaculture with Emphasis on Fish Detection and Behavior Analysis: A Review. Arch. Comput. Methods Eng. 2020, 28, 2785–2816. [Google Scholar] [CrossRef]
  9. Siddiqui, S.A.; Salman, A.; Malik, M.I.; Shafait, F.; Mian, A.; Shortis, M.R.; Harvey, E.S. Automatic Fish Species Classification in Underwater Videos: Exploiting Pre-Trained Deep Neural Network Models to Compensate for Limited Labelled Data. ICES J. Mar. Sci. 2018, 75, 374–389. [Google Scholar] [CrossRef]
  10. Chen, L.; Yang, X.; Sun, C.; Wang, Y.; Xu, D.; Zhou, C. Feed Intake Prediction Model for Group Fish Using the MEA-BP Neural Network in Intensive Aquaculture. Inf. Process. Agric. 2020, 7, 261–271. [Google Scholar] [CrossRef]
  11. De Verdal, H.; Komen, H.; Quillet, E.; Chatain, B.; Allal, F.; Benzie, J.A.H.; Vandeputte, M. Improving Feed Efficiency in Fish Using Selective Breeding: A Review. Rev. Aquac. 2018, 10, 833–851. [Google Scholar] [CrossRef]
  12. Hu, W.-C.; Wu, H.-T.; Zhang, Y.-F.; Zhang, S.-H.; Lo, C.-H. Shrimp Recognition Using ShrimpNet Based on Convolutional Neural Network. J. Ambient Intell. Humaniz. Comput. 2020. [Google Scholar] [CrossRef]
  13. Liu, Z.; Li, X.; Fan, L.; Lu, H.; Liu, L.; Liu, Y. Measuring Feeding Activity of Fish in RAS Using Computer Vision. Aquac. Eng. 2014, 60, 20–27. [Google Scholar] [CrossRef]
  14. Zhang, S.; Yang, X.; Wang, Y.; Zhao, Z.; Liu, J.; Liu, Y.; Sun, C.; Zhou, C. Automatic Fish Population Counting by Machine Vision and a Hybrid Deep Neural Network Model. Animals 2020, 10, 364. [Google Scholar] [CrossRef] [Green Version]
  15. Wang, H.; Zhang, S.; Zhao, S.; Wang, Q.; Li, D.; Zhao, R. Real-Time Detection and Tracking of Fish Abnormal Behavior Based on Improved YOLOV5 and SiamRPN++. Comput. Electron. Agric. 2022, 192, 106512. [Google Scholar] [CrossRef]
  16. Vo, T.T.E.; Ko, H.; Huh, J.H.; Kim, Y. Overview of Smart Aquaculture System: Focusing on Applications of Machine Learning and Computer Vision. Electronics 2021, 10, 2882. [Google Scholar] [CrossRef]
  17. Bradley, D.; Merrifield, M.; Miller, K.M.; Lomonico, S.; Wilson, J.R.; Gleason, M.G. Opportunities to Improve Fisheries Management through Innovative Technology and Advanced Data Systems. Fish Fish. 2019, 20, 564–583. [Google Scholar] [CrossRef]
  18. Schneider, S.; Taylor, G.W.; Linquist, S.; Kremer, S.C. Past, Present and Future Approaches Using Computer Vision for Animal Re-Identification from Camera Trap Data. Methods Ecol. Evol. 2019, 10, 461–470. [Google Scholar] [CrossRef] [Green Version]
  19. Rauf, H.T.; Lali, M.I.U.; Zahoor, S.; Shah, S.Z.H.; Rehman, A.U.; Bukhari, S.A.C. Visual Features Based Automated Identification of Fish Species Using Deep Convolutional Neural Networks. Comput. Electron. Agric. 2019, 167, 105075. [Google Scholar] [CrossRef]
  20. Zhou, C.; Yang, X.; Zhang, B.; Lin, K.; Xu, D.; Guo, Q.; Sun, C. An Adaptive Image Enhancement Method for a Recirculating Aquaculture System. Sci. Rep. 2017, 7, 6243. [Google Scholar] [CrossRef] [Green Version]
  21. Zhou, C.; Zhang, B.; Lin, K.; Xu, D.; Chen, C.; Yang, X.; Sun, C. Near-Infrared Imaging to Quantify the Feeding Behavior of Fish in Aquaculture. Comput. Electron. Agric. 2017, 135, 233–241. [Google Scholar] [CrossRef]
  22. Zhou, C.; Lin, K.; Xu, D.; Chen, L.; Guo, Q.; Sun, C.; Yang, X. Near Infrared Computer Vision and Neuro-Fuzzy Model-Based Feeding Decision System for Fish in Aquaculture. Comput. Electron. Agric. 2018, 146, 114–124. [Google Scholar] [CrossRef]
  23. Zhou, C.; Xu, D.; Chen, L.; Zhang, S.; Sun, C.; Yang, X.; Wang, Y. Evaluation of Fish Feeding Intensity in Aquaculture Using a Convolutional Neural Network and Machine Vision. Aquaculture 2019, 507, 457–465. [Google Scholar] [CrossRef]
  24. Måløy, H.; Aamodt, A.; Misimi, E. A Spatio-Temporal Recurrent Network for Salmon Feeding Action Recognition from Underwater Videos in Aquaculture. Comput. Electron. Agric. 2019, 167, 105087. [Google Scholar] [CrossRef]
  25. Adegboye, M.A.; Aibinu, A.M.; Kolo, J.G.; Aliyu, I.; Folorunso, T.A.; Lee, S.H. Incorporating Intelligence in Fish Feeding System for Dispensing Feed Based on Fish Feeding Intensity. IEEE Access 2020, 8, 91948–91960. [Google Scholar] [CrossRef]
  26. Han, F.; Zhu, J.; Liu, B.; Zhang, B.; Xie, F. Fish Shoals Behavior Detection Based on Convolutional Neural Network and Spatiotemporal Information. IEEE Access 2020, 8, 126907–126926. [Google Scholar] [CrossRef]
  27. Stochastic Gradient Descent—Wikipedia. Available online: (accessed on 24 July 2022).
  28. Rehman, H.A.U.; Lin, C.Y.; Su, S.F. Deep Learning Based Fast Screening Approach on Ultrasound Images for Thyroid Nodules Diagnosis. Diagnostics 2021, 11, 2209. [Google Scholar] [CrossRef]
  29. Bansal, A.; Castillo, C.; Ranjan, R.; Chellappa, R. The Do’s and Don’ts for CNN-Based Face Verification. In Proceedings of the IEEE International Conference on Computer Vision Workshops, Venice, Italy, 22–29 October 2017; pp. 2545–2554. [Google Scholar] [CrossRef]
  30. Basha, S.H.S.; Dubey, S.R.; Pulabaigari, V.; Mukherjee, S. Impact of Fully Connected Layers on Performance of Convolutional Neural Networks for Image Classification. Neurocomputing 2019, 378, 112–119. [Google Scholar] [CrossRef] [Green Version]
Figure 1. Thamnaconus modestus (black scraper) normal and starvation behavior during the experiment (the date in the right image is 30 June 2021).
Figure 2. Fish data acquisition experimental setup.
Figure 3. Architecture of the proposed fish identification and classification models.
Figure 4. ROC curves of the proposed behavior detection models.
Table 1. Distribution of dataset images into training and testing for detection purposes.

| Behavior Analysis | Test | Train | Total |
|---|---|---|---|
| Normal | | | 1265 |
| Starvation | | | 735 |
| Total | 400 | 1600 | 2000 |
Table 2. Performance evaluation of the proposed CNN model-1.

| Parameters (%) | Model-1 (Without Max-Pooling) | Model-1 (With Max-Pooling) |
|---|---|---|
| Error rate | 11.11 | 3.98 |
| Fall out | 14.28 | 8.5 |
| Miss rate | 9.92 | 1 |
Table 3. Performance evaluation of the proposed CNN model-2.

| Parameters (%) | Model-2 (Without Max-Pooling) | Model-2 (With Max-Pooling) |
|---|---|---|
| Error rate | 2 | 7 |
| Fall out | 2 | 9.09 |
| Miss rate | 2 | 9.06 |