Article

LatentResNet: An Optimized Underwater Fish Classification Model with a Low Computational Cost

1 Electrical and Electronics Engineering Department, Çukurova University, 01330 Adana, Turkey
2 Section for Fisheries Technology, National Institute of Aquatic Resources (DTU Aqua), Technical University of Denmark, 9850 Hirtshals, Denmark
3 Biomedical Engineering Department, Çukurova University, 01330 Adana, Turkey
* Author to whom correspondence should be addressed.
J. Mar. Sci. Eng. 2025, 13(6), 1019; https://doi.org/10.3390/jmse13061019
Submission received: 14 April 2025 / Revised: 19 May 2025 / Accepted: 21 May 2025 / Published: 23 May 2025
(This article belongs to the Section Ocean Engineering)

Abstract

Efficient deep learning models are crucial in resource-constrained environments, especially for marine image classification in underwater monitoring and biodiversity assessment. This paper presents LatentResNet, a computationally lightweight deep learning model featuring two key innovations: (i) using the encoder of the proposed LiteAE, a lightweight autoencoder for image reconstruction, as the model's input stage to reduce the spatial dimensions of the data and (ii) integrating a DeepResNet architecture with lightweight feature extraction components to refine the encoder-extracted features. LiteAE demonstrated high-quality image reconstruction within a single training epoch. LatentResNet variants (large, medium, and small) are evaluated on ImageNet-1K to assess their efficiency against state-of-the-art models and on Fish4Knowledge for domain-specific performance. On ImageNet-1K, the large variant achieves 66.3% top-1 accuracy (1.7M parameters, 0.2 GFLOPs). The medium and small variants reach 60.8% (1M, 0.1 GFLOPs) and 54.8% (0.7M, 0.06 GFLOPs), respectively. After fine-tuning on Fish4Knowledge, the large, medium, and small variants achieve 99.7%, 99.8%, and 99.7% accuracy, respectively, outperforming the classification metrics of benchmark models trained on the same dataset while reducing parameters and FLOPs by up to 97.4% and 92.8%, respectively. The results demonstrate LatentResNet's effectiveness as a lightweight solution for real-world marine applications, offering accurate underwater image classification at a low computational cost.
Keywords: lightweight neural networks; deep learning; underwater image classification; computational efficiency; autoencoder; real-time processing; fish species detection
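
The abstract describes a two-stage design: a lightweight autoencoder (LiteAE) whose encoder shrinks the spatial dimensions of the input, followed by a residual classifier operating on that latent representation. The following PyTorch sketch illustrates only this high-level idea; the layer counts, channel widths, block design, and class names (LiteEncoder, ResidualBlock, LatentResNetSketch) are illustrative assumptions, not the authors' implementation, which is detailed in the full paper.

```python
# Minimal sketch of the LatentResNet idea from the abstract (assumed structure).
import torch
import torch.nn as nn


class LiteEncoder(nn.Module):
    """Hypothetical lightweight autoencoder encoder that downsamples the input."""
    def __init__(self, in_ch=3, latent_ch=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 16, kernel_size=3, stride=2, padding=1),        # H/2 x W/2
            nn.ReLU(inplace=True),
            nn.Conv2d(16, latent_ch, kernel_size=3, stride=2, padding=1),    # H/4 x W/4
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.net(x)


class ResidualBlock(nn.Module):
    """Depthwise-separable residual block as a stand-in for the lightweight feature extractors."""
    def __init__(self, ch):
        super().__init__()
        self.dw = nn.Conv2d(ch, ch, kernel_size=3, padding=1, groups=ch)  # depthwise
        self.pw = nn.Conv2d(ch, ch, kernel_size=1)                        # pointwise
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(x + self.pw(self.act(self.dw(x))))


class LatentResNetSketch(nn.Module):
    """Encoder-reduced input followed by a small residual classification head."""
    def __init__(self, num_classes=1000, latent_ch=32, depth=4):
        super().__init__()
        self.encoder = LiteEncoder(latent_ch=latent_ch)
        self.blocks = nn.Sequential(*[ResidualBlock(latent_ch) for _ in range(depth)])
        self.head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                                  nn.Linear(latent_ch, num_classes))

    def forward(self, x):
        return self.head(self.blocks(self.encoder(x)))


if __name__ == "__main__":
    model = LatentResNetSketch(num_classes=23)  # e.g., the 23 species in Fish4Knowledge
    logits = model(torch.randn(1, 3, 224, 224))
    print(logits.shape)  # torch.Size([1, 23])
```

In this sketch, the pretrained encoder would be frozen or fine-tuned while the residual head is trained on the classification task; the actual training procedure used by the authors is described in the article itself.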

Share and Cite

MDPI and ACS Style

Hariri, M.; Avsar, E.; Aydın, A. LatentResNet: An Optimized Underwater Fish Classification Model with a Low Computational Cost. J. Mar. Sci. Eng. 2025, 13, 1019. https://doi.org/10.3390/jmse13061019

AMA Style

Hariri M, Avsar E, Aydın A. LatentResNet: An Optimized Underwater Fish Classification Model with a Low Computational Cost. Journal of Marine Science and Engineering. 2025; 13(6):1019. https://doi.org/10.3390/jmse13061019

Chicago/Turabian Style

Hariri, Muhab, Ercan Avsar, and Ahmet Aydın. 2025. "LatentResNet: An Optimized Underwater Fish Classification Model with a Low Computational Cost" Journal of Marine Science and Engineering 13, no. 6: 1019. https://doi.org/10.3390/jmse13061019

APA Style

Hariri, M., Avsar, E., & Aydın, A. (2025). LatentResNet: An Optimized Underwater Fish Classification Model with a Low Computational Cost. Journal of Marine Science and Engineering, 13(6), 1019. https://doi.org/10.3390/jmse13061019

Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
