Review

Fish Farming 5.0: Advanced Tools for a Smart Aquaculture Management

Department of Agricultural, Food, Environmental and Animal Sciences (Di4A), University of Udine, Via delle Scienze 206, 33100 Udine, Italy
Appl. Sci. 2025, 15(23), 12638; https://doi.org/10.3390/app152312638
Submission received: 8 September 2025 / Revised: 20 November 2025 / Accepted: 21 November 2025 / Published: 28 November 2025
(This article belongs to the Section Agricultural Science and Technology)

Abstract

The principal goal of precision fish farming (PFF) is to use data and new technologies such as sensors, cameras, and internet connections to optimise fish-aquaculture operations. PFF improves fish farming operations, making them data-driven, accurate, and repeatable, and reducing the effects of subjective choices by farmers. Thus, daily management based on operators’ manual practices and experience shifts to knowledge-based automated processes. Modern sensors and animal biomarkers can be used to monitor environmental conditions, fish behaviour, growth performance, and key health indicators in real time, generating large datasets at low cost. The use of artificial intelligence provides useful insights from big data. Machine learning and modelling algorithms predict future outcomes such as fish growth, feed requirements, or disease risk. The Internet of Things sets up networks of connected devices on the farm for communication. Smart management systems can automatically adjust instruments such as aerators or feeders in response to sensor inputs. This integration of sensors, internet connectivity, and automated controls enables real-time precision management.

1. Introduction

Precision livestock production, introduced in the 1990s, is rapidly spreading in the animal production sector, owing to the development of innovative technologies such as (i) sensors capable of providing, in real time and at an affordable cost, large amounts of data related to environmental variables and animal production, (ii) reduction in processing costs, (iii) use of artificial intelligence (AI) algorithms capable of extracting useful information and building predictive models from the data, and (iv) the evolution of smart management systems based on the Internet of Things (IoT) [1].
According to Føre et al. [2], the main objectives of precision fish farming (PFF) are to (i) increase the accuracy, precision, and repeatability of fish farming operations; (ii) foster autonomous and continuous monitoring of variables related to fish production; and (iii) provide reliable decision-support tools to reduce the dependence on manual labour and subjective farmer assessments.
The achievement of these objectives is based on the adoption of innovative technologies in sensors, computer vision, and AI integrated in an inter-connected cloud system [3]. At the core of this model lies an IoT platform capable of collecting data from different sources on the farm, analysing the data, and returning useful operational information [4].
Currently, most activities in the different stages of fish production are manually performed by experienced operators. Traditional fish farming systems are characterised by labour-intensive manual activities, fragmented data availability, and limited real-time monitoring and optimisation capabilities [1]. These limitations, during the production process, lead to inefficiencies and greater difficulties in objectively assessing fish health and welfare. In contrast, smart fish farming systems use digital technologies such as AI, computer vision, and IoT to collect, integrate, and analyse large volumes of data. In this way, each animal or group can become a source of data, enabling more precise management of feeding and health, while promoting greater transparency and continuous improvement in farming activities [1]. Overall, the transition from traditional to smart aquaculture offers opportunities to improve production efficiency, product quality, and traceability. The table below (Table 1) shows the advantages and disadvantages of traditional and smart systems.
Farmers directly inspect fish visually or using data acquisition tools such as video cameras and interpret this information on the basis of their experience. During the farming phase, fish farmers make daily decisions that can affect production and final yield, and therefore the company’s profit margins. In this context, data collection using sensors and the use of data mining and AI tools provide intelligent solutions for better management of aquaculture facilities. In particular, all farming processes are analysed, such as water quality control, feeding, biomass monitoring, fish welfare, and disease monitoring. Various IT tools are used, such as sensor networks, microelectronic devices, cloud computing, and IoT systems, to collect and analyse data and provide information to the decision-support systems [2]. Databases are used in an integrated and intelligent way and on different scales, from supercomputers to mobile phones. Data mining and machine learning activities make it possible to identify cause-and-effect associations, identify potential problems, and provide solutions. Intelligent management systems are used in developed and mature sectors such as Atlantic salmon, rainbow trout, sea bass, and sea bream farming [5].
A bibliographic search protocol was used, utilising the Scopus® and Web of Science databases to identify scientific articles. The search, carried out in September 2025, was refined based on the following criteria: year of publication (2005–2025); scientific area; and type of article, and using various strings on the topic of PFF. At the end of the search, 230 scientific articles were selected.

2. PFF Methods in the Aquaculture Sector

2.1. Computer Vision Methods

With the increasing use of optical cameras and computer technology in aquaculture, machine vision systems provide an automated and non-invasive method for analysing fish characteristics [6]. Computer vision allows images to be acquired and processed, mimicking human visual perception [7]. In the fish production phase, the increasing use of underwater cameras has driven the development of various computer vision systems [8]. Computerised image analysis enables non-invasive remote monitoring of the detailed size, shape, conformation, and movement patterns of fish [7]. These technologies are increasingly used for biomass estimation, sex determination, quality assessment, species and stock identification, and welfare monitoring. The workflow typically begins with image acquisition from a video frame or image, followed by pre-processing to ensure consistent colour space and resolution [9]. Feature extraction techniques are then applied, allowing the system to compare the extracted features with labelled datasets to recognise objects or patterns. Finally, the processed information is interpreted to support operational decisions.

2.1.1. Artificial Vision Based on Visible Light

Monocular video cameras and stereo vision systems are used for artificial vision [9]. A typical system based on visible light continuously acquires fish images for a set period, allowing constant monitoring. These tools enable automatic fish detection and recognition, analysing image information at the pixel level. An artificial vision system is based on the following steps: acquisition of image sequences, identification of fish characteristics or behaviours, and data analysis [10]. The optical flow method makes it possible to avoid problems caused by occlusion in the video.

2.1.2. Artificial Vision Based on Infrared Light

This technology involves the acquisition of images using a colour filter during the day and a black-and-white filter at night [7]. Paustina et al. [11] developed a three-dimensional (3D) near-infrared (NIR) vision system with an accuracy of 98%. The main advantage of infrared light-based systems is that the light is less absorbed and dispersed in water.

2.1.3. Stereo Vision Systems

These systems use two cameras placed at a fixed distance with a slight difference in perspective [6,9]. This configuration allows a precise detection and measurement of depth by calculating the position of points in two- and three-dimensional space using trigonometric formulas [6]. Fish are simultaneously analysed, providing detailed data on their movement and position, with an estimated margin of error of 3–5% [7].
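The triangulation behind stereo vision can be sketched in a few lines: the depth of a point follows from the disparity (horizontal shift) between the two camera views via Z = f · B / d. The focal length, baseline, and pixel coordinates below are hypothetical illustrative values, not figures from any cited system.

```python
# Depth estimation from stereo disparity: a minimal sketch.
# Focal length, baseline, and pixel coordinates are hypothetical values.

def depth_from_disparity(x_left, x_right, focal_px, baseline_m):
    """Return the distance (m) of a point seen by two parallel cameras.

    x_left, x_right: horizontal pixel coordinates of the same point
    focal_px:        focal length expressed in pixels
    baseline_m:      distance between the two camera centres (m)
    """
    disparity = x_left - x_right          # shift between the two views (pixels)
    if disparity <= 0:
        raise ValueError("point must appear shifted between the two views")
    return focal_px * baseline_m / disparity

# Example: a fish snout seen at x=412 px (left) and x=380 px (right),
# with cameras of 800 px focal length mounted 0.20 m apart.
z = depth_from_disparity(412, 380, focal_px=800, baseline_m=0.20)
print(f"estimated distance: {z:.2f} m")   # 800 * 0.20 / 32 = 5.00 m
```

In a real system, the same computation is applied to many matched points per frame, which is how fish position and movement are reconstructed in 3D.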

2.1.4. Light Detection and Ranging (LiDAR) Technology

LiDAR is a remote sensing system that uses laser beams to measure distances and movements in the aquatic environment with extreme precision.
Image processing methods allow one to measure fish morphological and dimensional characteristics. The background is removed from the acquired image and numerical data are extracted. The process comprises five main steps: acquisition, digitalisation, enhancement, segmentation, and measurement [12]. A machine vision system generally comprises a lighting system, a camera, hardware, and software [6].
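The five-step chain (acquisition, digitalisation, enhancement, segmentation, measurement) can be sketched on a toy greyscale image; the pixel values and threshold below are illustrative only, not parameters from the cited works.

```python
# Five-step image-processing chain sketched on a toy 5x6 greyscale frame.
# Pixel values (0-255) and the threshold are illustrative assumptions.

image = [  # acquisition/digitalisation: bright pixels belong to the fish
    [ 10,  12,  11,  10,  13,  11],
    [ 10, 200, 210, 205,  12,  10],
    [ 11, 198, 220, 215, 202,  12],
    [ 10,  11, 208, 204,  13,  11],
    [ 12,  10,  11,  12,  10,  13],
]

# enhancement: simple contrast stretch to the full 0-255 range
lo = min(min(row) for row in image)
hi = max(max(row) for row in image)
enhanced = [[(p - lo) * 255 // (hi - lo) for p in row] for row in image]

# segmentation: a threshold separates fish pixels from the background
THRESHOLD = 128
mask = [[1 if p > THRESHOLD else 0 for p in row] for row in enhanced]

# measurement: pixel area of the segmented object
area_px = sum(sum(row) for row in mask)
print(f"segmented area: {area_px} pixels")
```

Production systems replace the fixed threshold with adaptive or learned segmentation, but the chain of steps is the same.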
In the aquaculture sector, these technologies have numerous applications, ranging from feeding monitoring and management to growth assessment and animal welfare management.

2.2. Acoustic Methods

Acoustic sensors can be active or passive.

2.2.1. Active Acoustics

Active acoustics is used to estimate fish biomass, analyse spatial distribution, and track and monitor behaviour. The principle is based on the use of transmitters that emit sound waves of a certain frequency and capture the reflected echo [2]. In the aquaculture sector, the main tools for acoustic telemetry are sonars (sonar, split-beam sonar, and multi-beam sonar), echo sounders, and underwater microphones.
Sonars
Sonars are used for the detection, classification, positioning, and tracking of fish. A sonar comprises a transmitter/receiver that detects the reflected signal (echo), even in 3D environments, and converts it into an analysable digital image [4]. Sonar is the main method used for tracking and detecting fish on a large scale. The advantage of this technique is that it provides high-resolution video even in murky waters.
Split-beam sonars are capable of tracking fish in three dimensions. A horizontal scanning sonar can be used to increase the sampling volume near the surface [4].
Multi-beam sonars improve the accuracy of fish measurements compared to split-beam sonars. However, in the case of crowded groups of fish, problems with occlusion may arise [4].
In recent years, split-beam sonars and dual-frequency sonars have been used to analyse the vertical distribution of fish [4]. Low-frequency sonars can be affected by underwater noise, while high-frequency sonars provide high-resolution images. Images obtained with sonars are similar to those obtained with optical systems. However, this technique provides a greater amount of data and allows the 3D position of fish to be obtained.
Echo Sounder
The echo sounder technique is based on the use of a transducer that transmits sound waves. If these waves encounter a fish, with a density different from that of water, echoes are generated and converted into an electrical signal [2]. In this way, the movement of fish groups is analysed. The split-beam broadband echo sounder makes it possible to identify the position of fish groups and to analyse the behaviour of individual fish.
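The ranging principle behind an echo sounder is simple: the distance to the target is half the round-trip distance travelled by the pulse. A minimal sketch, assuming a typical seawater sound speed of ~1500 m/s and an illustrative echo delay:

```python
# Echo-sounder ranging sketch: the transducer emits a pulse and measures
# the two-way travel time of the echo. The 40 ms delay is illustrative.

SOUND_SPEED_WATER = 1500.0  # m/s, approximate value for seawater

def target_range(echo_delay_s):
    """Distance to the target: the pulse travels out and back,
    so the one-way range is half the round-trip distance."""
    return SOUND_SPEED_WATER * echo_delay_s / 2.0

# An echo returning after 40 ms corresponds to a target ~30 m away.
print(f"range: {target_range(0.040):.1f} m")  # 1500 * 0.040 / 2 = 30.0 m
```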

2.2.2. Passive Acoustics

Underwater Microphones
Passive acoustic methods use underwater microphones to record sound waves produced by fish. The effectiveness of sound detection depends on signal intensity, distance, and methods used to reduce the background noise [9]. These instruments are designed to listen and record underwater sounds, useful for analysing fish behaviour or detecting environmental changes. These methods can also be used to estimate fish growth rate and biomass over the long term. The main limitation of methods based on sound recording is that these signals are not emitted continuously but only at night or during the feeding phase.

2.3. Sensor-Based Methods

In recent years, various sensors have been used to monitor water quality, fish behaviour, and physiology [2]. Sensor-based methods use less data than image-based methods. Sensors can be used in situations where visual observations are difficult to obtain. Currently, various sensors are used to monitor water quality variables, fish respiratory rate, swimming direction and speed, and physiological stress.

2.3.1. Environmental Sensors

In fish farms, specific sensors are used to monitor water quality [11]. The most common sensors are multi-parameter optical immersion probes [7]. These probes are used for the simultaneous measurement of different chemical and physical water parameters, including pH, temperature, dissolved oxygen (DO), turbidity, ammonia, nitrite, and nitrate concentrations [5].

2.3.2. Acoustic Transmitters

Acoustic telemetry can be used to monitor fish spatial distribution and characteristics in real time [12]. These instruments may contain various sensors for measuring pressure, temperature, or acceleration. Accelerometers are based on a transmitter inserted into the fish’s body that emits specific acoustic pulses that are recorded and analysed [13,14,15]. Accelerometers can also record the tiniest movements of fish. The data obtained, such as respiratory rate, acceleration, or heart rate measurement, can be associated with behavioural and stress-related information [2,10,16,17,18]. One limitation of this technique is that the accelerometer is difficult to install in small fish and can cause fish health problems [19,20,21]. The combined use of different sensors can be useful for monitoring fish stress [13]. For example, a gyroscope and an accelerometer can be used in combination to monitor fish behaviour [13,22].

2.3.3. Biosensors

Biosensors are used to monitor fish behaviour and health [23], as demonstrated by Gesto et al. [24]. Biosensors are used to measure biologically active substances such as antibodies, enzymes, and microorganisms in fish [5]. Biosensors are able to detect small variations in the resistance and current of biologically active substances and convert them into electrical impulses. These measurements are essential for maintaining optimal conditions for fish growth and health.

3. Automatic Monitoring and Data Analysis

A PFF system comprises three main phases: (i) real-time monitoring of environmental and production parameters (sensors and cameras continuously collect data on the farm); (ii) predictive modelling and process control (data are processed to generate future analyses and estimates, and the system is checked against set objectives); and (iii) decision-making (Figure 1) [25].
Each stage of the production process in the PFF system is managed and guided using data and algorithms. It is an automatic control system with input and output data flows. To achieve precise control, it is essential to develop a mathematical model that can integrate all the information from the different sub-systems [11,26,27]. A practical example is the estimation of certain parameters such as fish feeding, effluent release into the environment, or swimming speed, using input data obtained from sonar on the vertical distribution of fish biomass [28]. After design and development, the IoT system needs to be tested to verify its reliability in the field [29]. The functionalities of a process-control system depend on the operating system, data mining algorithms, machine learning (ML), and integrated modules, which might include input/output drivers, process database generators, a human–machine interface, scanning programmes, alarm systems, tag group editors, dynamic data exchange servers, trend analysers, report and messaging generators, and remote diallers [30]. The optimal physical conditions of the rearing environment are defined, and the process-control system constantly monitors target parameters [11]. It automatically activates actuators, sending an alert to farmers in the event of deviation from expected values. A stable, high-quality internet connection is required to ensure the thorough monitoring of the system [31]. All information collected on the farm including meteorological information is sent via the internet to data processing centres. All data, acquired from sensors or external sources (e.g., the web), can be included in the feedback loop and used to make targeted supply and farm management decisions (Figure 2). Data relating to intensive fish farming in some cases can be inaccurate, partial, or difficult to use as they are characterised by noise, interference, poor lighting, and occasional occlusions. 
For this reason, the data must be pre-selected and standardised before analysis.
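The feedback loop described above, where target parameters are monitored, actuators are activated, and farmers are alerted on deviations, can be sketched with a simple rule-based controller. The dissolved-oxygen thresholds and the sensor readings below are hypothetical, not values from the cited systems.

```python
# Feedback-loop sketch for a process-control system: sensor readings are
# compared against target bounds and an actuator (here an aerator) is
# switched accordingly, with an alert on deviation. Thresholds and the
# reading sequence are hypothetical.

DO_MIN, DO_MAX = 5.0, 9.0   # acceptable dissolved-oxygen window (mg/L)

def control_step(do_mg_l, aerator_on):
    """One loop iteration: returns (new aerator state, alert or None)."""
    if do_mg_l < DO_MIN:
        return True, f"ALERT: DO {do_mg_l} mg/L below minimum, aerator ON"
    if do_mg_l > DO_MAX:
        return False, None           # enough oxygen, save energy
    return aerator_on, None          # within range: keep current state

aerator = False
for reading in [7.2, 6.1, 4.8, 5.6, 9.4]:   # simulated sensor stream
    aerator, alert = control_step(reading, aerator)
    if alert:
        print(alert)
print(f"final aerator state: {'ON' if aerator else 'OFF'}")
```

Real installations replace the fixed thresholds with set points tuned per species and stage, and route alerts through the farm's IoT platform rather than printing them.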

4. Tasks and Models of AI

ML and deep learning (DL) are fundamental tools for optimising the management of big data [32]. AI, through the simulation of human cognitive functions, uses an automated and adaptive decision-making process. ML, a fundamental branch of AI, uses algorithmic models trained to recognise patterns in observed data. These technologies provide effective solutions to many of the limitations of traditional farming practices and contribute to more efficient and transparent farm management. The analysis process starts with the collection of a training dataset, which is used to build an initial model. ML is based on learning non-linear relationships between input and output. Instead of explicitly programming these relationships, the ML model automatically learns patterns by observing a large number of features with their labels. Once trained, the model is able to predict the correct output for new inputs. DL, a branch of ML, is based on artificial neural networks (ANNs) and is particularly effective when the volume of data is substantially large. In ML, the statistical method K-means is used for clustering analysis [33]. This makes it possible to separate the outline of the fish’s body from the background.
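The K-means figure-ground separation mentioned above can be sketched as 1-D clustering of pixel intensities into "background" and "fish" groups. The intensity values and the two-cluster initialisation are illustrative assumptions.

```python
# K-means sketch for figure-ground separation: scalar clustering of
# pixel intensities into background vs fish groups. Values are toy data.

def kmeans_1d(values, centroids, iterations=10):
    """Simple k-means on scalar values; returns final centroids and labels."""
    labels = []
    for _ in range(iterations):
        # assignment step: each value joins its nearest centroid
        labels = [min(range(len(centroids)),
                      key=lambda k: abs(v - centroids[k])) for v in values]
        # update step: each centroid moves to the mean of its members
        for k in range(len(centroids)):
            members = [v for v, lab in zip(values, labels) if lab == k]
            if members:
                centroids[k] = sum(members) / len(members)
    return centroids, labels

pixels = [12, 9, 15, 11, 204, 199, 210, 8, 215, 13]   # toy intensities
centroids, labels = kmeans_1d(pixels, centroids=[0.0, 255.0])
print("centroids:", centroids)   # background mean vs fish mean
print("labels:   ", labels)      # 0 = background, 1 = fish pixel
```

Applied to a real frame, the same idea runs on all pixels (often on colour or texture features rather than raw intensity) to extract the fish outline.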
The main ML types are mentioned below.

4.1. Supervised Learning

This method uses labelled data (matching input–output) to acquire underlying rules. It is mainly used for the classification and regression of data. In image analysis, the complex structure is disintegrated into several progressive steps: the first level can identify edges, the second level identifies corners and contours, and the third level identifies complex shapes, until the complete object is recognised [34,35]. Examples of models include convolutional neural networks (CNNs) for image analysis and recurrent neural networks (RNNs) for sequential data.

4.2. Unsupervised Learning

This method uses unlabelled data to identify patterns and groupings, classifying input resources based on their characteristics.

4.3. Semi-Supervised Learning

This method uses a small amount of labelled data with a large set of unlabelled data to reduce annotation costs.

4.4. Reinforcement Learning

The model learns by interacting with the environment and receiving rewards or penalties based on actions taken.
ML can be used in four main tasks: classification, regression, clustering, and dimensionality reduction [36]. The main ML/DL models used in aquaculture are decision trees, support vector machines (SVMs), ANNs, k-nearest neighbours (k-NNs), CNNs (specialised in image and computer vision processing, using convolutional and pooling layers to extract feature hierarchies [37]), RNNs (suited to sequential data, with loop connections that allow previous information to be remembered), region-based CNNs (R-CNNs), and long short-term memory (LSTM) networks [36].
The training of a neural network comprises the following steps: estimation of weights leading to correct predictions; initialisation with random weights; calculation of the output on the training set; measurement of the error with a loss function; updating the weights via gradient descent and backpropagation; and repetition of the cycle until convergence is reached [33,36]. DL integrates feature extraction and model building into a single process (end-to-end), unlike traditional ML, where these steps are separate. Deep hierarchical structures simplify the modelling of complex non-linear relationships. DL is particularly effective with large volumes of data and in managing complex big data. Several DL prediction models might lack robustness in certain applications, but they offer excellent self-learning, generalisability, and non-linear approximation capabilities [36,38].
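The training cycle just described (random initialisation, forward pass, loss measurement, gradient-based weight update, repeat until convergence) can be sketched with a single linear neuron fitting y = 2x + 1. The data and learning rate are illustrative; a real network stacks many such units and propagates gradients through them via backpropagation.

```python
# Minimal gradient-descent training loop for one linear neuron,
# following the cycle described in the text. Toy data: y = 2x + 1.

import random

random.seed(0)
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [2 * x + 1 for x in xs]              # target relation to be learned

w, b = random.random(), random.random()   # initialisation with random weights
lr = 0.02                                 # learning rate

for epoch in range(2000):
    # forward pass: calculate the output on the training set
    preds = [w * x + b for x in xs]
    # measure the error: gradients of the mean-squared-error loss
    grad_w = sum(2 * (p - y) * x for p, y, x in zip(preds, ys, xs)) / len(xs)
    grad_b = sum(2 * (p - y) for p, y in zip(preds, ys)) / len(xs)
    # gradient-descent update of the weights
    w -= lr * grad_w
    b -= lr * grad_b

print(f"learned w={w:.3f}, b={b:.3f}")    # approaches w=2, b=1
```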
A hybrid ML method is made up of the combined use of two aspects: supervised and unsupervised learning [36]. A large amount of unlabelled input data is used with a small amount of labelled input data. These models are used in various production systems such as floating cages, ponds, hatcheries, and intensive aquaculture facilities. Their applications include visual fish recognition, biomass estimation, behavioural monitoring, feeding optimisation, and environmental condition prediction.
AI techniques, particularly computer vision, can automate the processing of images and video streams, reducing labour requirements and minimising human error [39]. These systems can detect stress behaviours, identify disease symptoms, and monitor feeding activity. However, high accuracy requires integration with high-resolution images and field validation. Several challenges such as object occlusion, fluctuating lighting, underwater distortions, and limited generalisability across species underline the need for continued research. Advances in sensor development, data pre-processing, model robustness, and real-world implementation strategies will be essential to realising the full potential of AI-based tools in precision aquaculture.

5. Using AI in Water Quality Monitoring

Water quality monitoring is critical to the success of fish production activities, as fish growth and health depend closely on the conditions of the aquatic environment. In intensive aquaculture farms, the first important control point is therefore the continuous monitoring of the farm’s physical environment, which includes water quality variables, water distribution, pumping indices, and effluent and waste management.
The water quality monitoring process involves several steps: (i) collection of environmental data via sensors for temperature, DO, light, pH, and other parameters; (ii) transmission of the collected data to a control centre; (iii) analysis of the data on a cloud platform; (iv) sending decisions to the control centre; and (v) transmission of feedback to field instruments [40]. Four essential process-control models are used in intensive fish production, in ascending order of complexity: data recording systems or closed-loop controllers, programmable logic controllers, supervisory control and data acquisition, and distributed control systems [40]. By analysing the data from the sensors, AI algorithms can detect patterns and anomalies in the system; generate timely alerts and send them to the farmer; use predictive models to estimate changes in water quality; and analyse historical data by correlating water quality, weather conditions, and fish feeding cycles [39]. A key component of AI-based smart aquaculture is the integration of various advanced sensing technologies. Sensors continuously and automatically collect key environmental data, including temperature, dissolved oxygen, light intensity, and pH, which are transmitted to centralised control platforms. After the data are processed and interpreted, often via cloud-based systems, decisions are transmitted to actuators that adjust equipment and maintain optimal farming conditions [39]. Sensors also play a crucial role in the early detection of water quality deterioration events, enabling timely corrective action. The operational effectiveness of AI systems depends on their ability to interpret environmental signals and translate decisions into concrete actions. Sensors transform physical variables into measurable electrical signals, while actuators and electromechanical devices perform the corresponding control actions.
The main variables in fish farming are temperature and DO [41,42,43,44]. Temperature directly affects the growth and health of fish [45]. AI can detect abnormal temperature changes and send real-time alerts to farmers [46], allowing rapid interventions and maintenance of optimal conditions [2]. DO is crucial for optimal fish growth and welfare. Ta and Wei [47] proposed a CNN model to solve the problem of DO estimation in intensive fish farms. The advanced algorithms, based on environmental parameters, optimise farm production variables according to the needs of different species [47,48,49,50,51].
Thanks to recent technological advances such as artificial neural networks, it is possible to use intelligent water quality control systems and predict future trends [48]. Currently, several AI models focus on short-term forecasts [52,53]. AI and ML have proven to be highly effective for monitoring and predicting water quality in fish farming. Predictive models can generate real-time estimates of key water quality parameters [52]. These systems enable timely interventions, reducing overall management costs. As research progresses, the use of advanced neural networks is expected to further improve the accuracy of predictions [48]. Several ML models have been evaluated for water quality prediction; among these, the support vector machine (SVM) approach has demonstrated the highest performance. In intensive fish farms, SVM-based models have achieved up to 99% accuracy [49]. These results highlight the strong potential of AI-based modelling tools to improve environmental monitoring, optimise farm management, and promote more sustainable aquaculture practices.
Lu et al. [54] developed a water quality monitoring system integrated with AI. For long-term predictions, the future challenge is to exploit spatio-temporal relationships between water quality characteristics and external factors [55,56]. A few models such as LSTM and RNNs were reported [55]. RNN models exhibit better performance in estimating DO in the short and long term than traditional methods [57]. An increasingly popular alternative is the use of unmanned underwater vehicles (UUVs) as mobile data collection platforms. UUVs include autonomous underwater vehicles (AUVs), which operate independently of a surface vessel, and remotely operated vehicles (ROVs), which remain connected to an operator via a cable.
The critical issues in the implementation of IoT systems in PFF systems are the lack of (i) standardisation of the different sensors and devices used, (ii) interoperability between the different systems, and (iii) the excessively high installation and maintenance costs [57].
The main sensor platforms measure DO, pH, temperature, salinity, turbidity, ammonia, and CO2. The main IoT sensor networks are wireless nodes, gateways, and cloud dashboards. The main commercial solutions/software are YSI AquaManager (water quality monitoring, multi-parameter sondes); Xylem/Aanderaa SmartGuard (oceanographic and aquaculture sensors); AKVAconnect (integrated cage/pen monitoring, environmental sensors); Innovasea aquaEnvironment (real-time water quality telemetry); Steinsvik/ScaleAQ (environmental monitoring integrated with feeding); and Hach WIMS (water information management system).

6. Use of AI in Fish Biomass Estimation

In fish production, biomass estimation is an important parameter for assessing the growth rate and health status of fish during the different rearing stages [9].
Traditionally, biomass estimation is performed manually by sampling and weighing fish. However, this method is slow and laborious with a considerable margin of error [9]. In addition, handling fish during the weighing phase can result in stress, with negative consequences such as reducing fish growth. To overcome these limitations, alternative biomass estimation techniques have been developed and studied, including the use of AI [58]. In particular, the combination of computer vision and ML allows more accurate estimation of the size, weight, number, and other fish biological parameters [58].
Image processing using a CNN [59] has demonstrated the effectiveness of DL in estimating fish weights and its ability to capture complex patterns and distinctive characteristics between different species. Lopez-Tejeida et al. [60] developed a system that integrated hardware and software with infrared cameras to automatically detect fish and calculate their weights and lengths. Mittún et al. [61] used a system with synchronised converging cameras, which could perform 3D segmentation of fish images. Weight was estimated from the length using the weight–length relation.
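The weight–length relation mentioned above is commonly expressed as W = a · L^b, where b is close to 3 for isometric growth. A minimal sketch, with illustrative coefficients that do not correspond to any particular species:

```python
# Weight-length relation sketch: W = a * L^b with species-specific
# coefficients. The a and b values below are illustrative assumptions.

def weight_from_length(length_cm, a=0.01, b=3.0):
    """Estimated weight (g) from body length (cm) via W = a * L^b."""
    return a * length_cm ** b

# A 30 cm fish under these illustrative coefficients:
print(f"estimated weight: {weight_from_length(30):.0f} g")
```

In a vision pipeline, the length comes from the 3D segmentation step, and a and b come from fitting the relation to previously sampled fish of the same species.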
Fish counting can be performed with good accuracy using the image segmentation method [9].
The main commercial systems used for biomass estimation and growth modelling are AquaSim (AKVA) (biomass modelling and growth prediction); InnovaSea (stereo-camera and AI-based biomass measurements); Aquabyte (machine learning for biomass, lice counting, and growth tracking); AquaManager (stock, feed, production); Maritech (purchase and sales/processing); Fishtalk (AKVA group) (production planning, reporting, traceability); and Ace Aquatec (biomass estimation, welfare tools).

7. Use of AI in Fish Feeding Activities

Fish feed is a crucial item of expenditure in the rearing phase, accounting for 40–50% of total maintenance costs [48]. In addition, it is estimated that ~60% of the feed provided is dispersed into the water as particulate matter, causing pollution, decreasing DO, and releasing harmful substances (ammonia, nitrogen, etc.), which can reduce fish growth. Optimisation of fish rations is a crucial factor in maximising farm productivity [37]. Unbalanced diets can compromise fish health and reduce product quality, while inefficient feeding practices contribute to increased environmental impact [1]. In this context, AI offers promising opportunities to improve feeding strategies. Although some applications are still conceptual, the use of AI allows for the optimisation of feed quantities by analysing complex data from sensors and monitoring systems. Decisions regarding fish feeding are very important to ensure that each animal receives an optimal amount of feed to achieve desired growth rates with minimal feed waste. Traditionally, fish feeding varies based on the farmer’s visual observation, water temperature, fish size, and total biomass. The use of intelligent feeding systems changes the decision-making process from one based on experience and intuition to an automatic method [2]. A complete monitoring feeding system must include the following components: (i) an image/video/acoustic/biosensor acquisition unit; (ii) a hardware processing unit; and (iii) an automated feeding system [62,63]. The main challenge of this feeding method is the seamless integration between the different modules to achieve an accurate and precise feeding system. The current trend is to supplement feeding table data with continuous monitoring of environmental variables (temperature, DO, etc.) and fish behaviour data [2]. Signals from the sensors allow the feeding of fish to be automatically interrupted or adjusted, improving fish production efficiency.
In sea cages, the feedback system is easier to implement than in flow-through and recirculation systems [47]. In remote marine cage-farming sites or those exposed to strong currents, wind, and waves, daily manual feeding is not possible, necessitating automatic or remote control.
In recent years, several studies have analysed various aspects of the subject in detail, such as feed administration control, monitoring of feeding behaviour using acoustic/optical systems, and monitoring of uneaten feed using acoustic/optical/infrared systems [2]. Compared to early methods based solely on image analysis, current AI-based methods use more complex algorithms that involve analytical, predictive, and prescriptive phases [39]. These AI models use various biological variables, such as the level of hunger indicated by individual swimming behaviour and the vertical distribution of fish groups during the non-feeding and feeding phases. The following technologies are used for automatic feeding control: (i) artificial vision (single, stereo, 3D, and NIR cameras in low light) to monitor fish feeding; (ii) acoustic systems (sonar and underwater microphones) to detect pellet consumption and fish behaviour; and (iii) acoustic telemetry to track fish position and activity levels [9,63,64]. The aim is to provide the optimal amount of feed to meet fish nutritional requirements. However, it should be noted that the optimal amount of feed is also influenced by both physiological factors and external environmental conditions. The feeding behaviour of fish varies depending on their appetite [65]. Fish swimming behaviour also varies before and after feeding [66]. Hungry fish are more active, swimming with greater frequency and speed [67]. Conversely, after feeding, groups of fish tend to reduce their activity. In addition, it is important to consider variables such as swimming acceleration, turning angles, and tail-beat frequency [67]. The sounds produced by fish during feeding are due to the movement of their bodies in the water and the chewing and swallowing of food. Underwater microphones can be used to quantify the duration, frequency, and intensity of these sounds.
Background noise caused by the environment or by fish physiological activities may affect the accuracy of acoustic data. One possible solution to this problem is to combine visual and acoustic data [2]. The amount, frequency, and timing of feeding depend on an accurate assessment of the fish’s hunger and satiety levels. The artificial vision-based method uses images or videos of fish feeding behaviour to build a model that identifies feeding status [7]. Depending on how the features are extracted, either the traditional method or the deep learning-based method is used. With the traditional method, images are segmented and features are extracted manually, whereas deep learning makes this step unnecessary [7]. The data can be processed using AI algorithms to determine the feeding state: continue, reduce, or stop feeding [4]. The first phase consists of extracting image features and creating a model with a non-linear mapping between input data and target results through continuous iterative training. The amount of feed given to the fish is then adjusted according to their feeding activity [67]. Images or videos of fish feeding behaviour are used in ML methods to build a model that objectively identifies the feeding status (Table 2) [11].
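The traditional pipeline, hand-crafted features mapped to one of the three feeding states, can be illustrated with a minimal nearest-centroid classifier. The two features and the centroid values below are hypothetical stand-ins for quantities a trained model would learn, not figures from the literature:

```python
import math

# Hand-crafted features per video clip: (motion_energy, surface_agitation),
# both normalised to [0, 1]. Centroids are hypothetical training outcomes.
CENTROIDS = {
    "continue": (0.9, 0.8),   # vigorous feeding activity
    "reduce":   (0.5, 0.4),   # activity tapering off
    "stop":     (0.1, 0.1),   # fish largely ignoring the feed
}

def classify_feeding_state(features):
    """Assign the feeding state whose centroid is closest in feature space."""
    return min(CENTROIDS, key=lambda s: math.dist(features, CENTROIDS[s]))

print(classify_feeding_state((0.85, 0.75)))  # continue
```

A deep learning-based system would replace the manual feature extraction with learned convolutional features, but the final decision step (mapping features to continue/reduce/stop) is conceptually the same.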
Uneaten feed, present in the water column or at the bottom of a fish farm, can also be used indirectly to identify the feeding status of fish [95]. The most commonly used method for identifying uneaten feed is artificial vision; acoustic telemetry can also be used but is more expensive [96]. Recently, the MobileNet algorithm has been adopted by numerous researchers, as its lightweight classification method allows neural networks with fewer parameters to be used [6]. YOLO, CNN, and R-CNN models are generally used in fish feeding applications [77,89,97,98]. The first is a single-stage detection algorithm, while the others are two-stage approaches. DL methods allow high- and low-level features to be extracted from fish images. Zhou et al. [99] used a CNN model and computer vision to study the feeding intensity of fish, achieving superior performance over traditional methods. Cai et al. [100] developed an innovative two-stage approach to fish feeding, using, in the first stage, a YOLOv8 model with a multi-scale feature extraction module. According to Måløy et al. [64], 3D-CNN and RNN models enable optimal spatial and temporal analyses of fish feeding data, improving the recognition of behaviour attributes (feeding/non-feeding patterns). Feng et al. [69] used machine vision and a lightweight 3D ResNet-GloRe method to study fish feeding behaviour and competition for food. Gu et al. [101] used a multimodal fusion network to study the feeding intensity of fish, and Fini et al. [102] and Wu et al. [73] used methods based on a video transformer. Ma et al. [103] used a fusion model (time and frequency) based on a six-axis inertial sensor to detect changes on the water surface caused by fish feeding, while Du et al. [104] used a lightweight LC-GhostNet and a multi-feature fusion strategy.
AI can be used to optimise fish feeding schedules based on temperature, DO, feed nutritional values, and species-specific biological parameters [105]. Zhao et al. [3] used a machine learning model and two variables (temperature and DO levels) to calculate fish feed requirements. It is also possible to develop customised feeding plans for individual fish, considering genetic characteristics, age, and weight [106]. The calculation can be based on a single factor or on multiple factors [2].
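A two-factor calculation of this kind can be sketched as a feeding-table lookup scaled by dissolved oxygen. The table values, the DO scaling rule, and the 6 mg/L reference point below are illustrative assumptions for a generic temperate species, not figures from [3]:

```python
# Illustrative feeding table: daily ration as % of biomass vs. temperature.
FEED_TABLE = [(8, 0.8), (12, 1.2), (16, 1.8), (20, 2.2), (24, 1.5)]

def daily_ration_kg(biomass_kg, temp_c, do_mg_l):
    """Daily ration from temperature (table lookup) and DO (linear scaling)."""
    # Pick the table entry with the nearest temperature.
    pct = min(FEED_TABLE, key=lambda row: abs(row[0] - temp_c))[1]
    # Scale the ration down linearly when DO falls below 6 mg/L.
    do_factor = min(1.0, do_mg_l / 6.0)
    return biomass_kg * pct / 100 * do_factor

print(round(daily_ration_kg(1000, 16, 7.0), 1))  # 18.0 kg for 1 t of fish
```

A multi-factor model would add further inputs (fish size, photoperiod, feed composition), but the structure, environmental readings in, ration out, is the same.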
The main equipment used includes acoustic-, camera-, and sensor-based systems for appetite detection, feeding rings, blowers, and robotic delivery systems. The main software and platforms include AKVA Fishtalk Feed/AKVAconnect (feed control and optimisation); ScaleAQ FeedStation (camera-based feed monitoring); Maritech Eye/Maritech Digital Feeding (cloud-based AI feed management); eFishery (IoT smart feeders with mobile control); Pellet Observer (underwater vision for uneaten pellet detection); and InnovaFeed software (integrating feed production with farm needs).
Despite these advantages, several challenges remain. Effective implementation of AI requires significant investment in hardware, software, and operator training. The complexity of maintaining AI systems requires specialist skills, and model performance is heavily dependent on the availability of high-quality data. Increased automation could also reduce opportunities for manual labour, with a potential impact on rural communities, while cybersecurity risks raise concerns about data protection.

8. Stock Assessment of Farmed Fish

To improve fish management during the farming phase, certain information about the fish is required, such as weight, length, and sex at different stages. To assess fish biomass, acoustic and artificial vision-based methods provide an excellent alternative to traditional sampling and manual weighing. In addition to feed management, there are other potential applications of data mining and ML algorithms at all stages of fish production, from hatchery to harvest. These applications include image processing and pattern recognition for assessing the quality of eggs and of the end product. In hatcheries, the separation of diseased or dead eggs and larvae is traditionally performed via manual or semi-automated methods, which are laborious and error-prone. To improve the efficiency of these activities, a method based on image analysis and an SVM model was used in a rainbow trout hatchery [1]. The integration of these tools made it possible to count, calibrate, and sort eggs rapidly and accurately, making the process highly suitable for large-scale management [107]. Similarly, counting fish at different production stages, from rearing to marketing, is crucial for the optimisation of farm management. At present, only a few image processing methods are used in commercial fish farms [107]. For optimal stock management, data on individual fish characteristics such as length, weight, skin colour, and sex are useful. During the various growth phases, computer vision systems and acoustic technologies can offer a practical, real-time alternative to invasive physical sampling and weighing methods, which involve fish stress, labour, and time [33].
Costa et al. [108] developed a system that used optical telemetry with dual underwater cameras to capture fish images and analysed their sizes and shapes. Processing was performed using neural networks, geometric algorithms, and Fourier analysis, enabling remote monitoring of the growth rate.
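As a simplified illustration of how dual-camera setups recover real-world sizes, stereo disparity gives depth, which converts a head-to-tail pixel span into metres. The geometry below is the standard pinhole stereo model; the camera parameters and pixel coordinates are hypothetical, not taken from [108]:

```python
def fish_length_m(head_px, tail_px, disparity_px, focal_px, baseline_m):
    """Estimate fish length from a matched stereo pair.

    head_px, tail_px : (x, y) pixel coordinates of the fish extremities
    disparity_px     : horizontal shift of the fish between the two views
    focal_px         : camera focal length in pixels
    baseline_m       : distance between the two cameras in metres
    """
    # Depth from stereo disparity: Z = f * B / d.
    depth_m = focal_px * baseline_m / disparity_px
    # Head-to-tail distance in pixels.
    span_px = ((head_px[0] - tail_px[0]) ** 2
               + (head_px[1] - tail_px[1]) ** 2) ** 0.5
    # At depth Z, one pixel spans Z / f metres in the scene.
    return span_px * depth_m / focal_px

# A fish spanning 300 px at 1 m depth with f = 1000 px and a 5 cm baseline.
print(round(fish_length_m((100, 200), (400, 200), 50, 1000.0, 0.05), 3))  # 0.3
```

Length estimates of this kind can then be converted to weight via species-specific length-weight relationships, enabling remote growth monitoring without handling the fish.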

9. Integration of Feeding Practices with Behaviour and Welfare Monitoring

Aquaculture faces significant economic challenges due to epidemics, with viral, bacterial, and fungal infections causing an estimated $6 billion in losses worldwide each year.
Traditional approaches, such as underwater cameras or manual collection of live fish for welfare assessment, are limited by small sample sizes, poor scalability, and operational constraints. These methods can also induce stress or damage, thereby affecting fish health and compromising the reliability of observations.
Continuous monitoring of fish appearance and behaviour allows early identification of stressful situations and the onset of disease [109]. In fact, swimming behaviour (fin-beat frequency, swimming speed, depth/position preference, trajectories) varies in stressful situations and can be monitored automatically using artificial vision systems [109]. The behaviour of farmed fish is analysed using various technologies such as artificial vision, acoustics, and biosensors. Images and videos are the most commonly used data sources, although they are limited under poor lighting and high background noise. Fish behaviour includes normal behaviours such as feeding, swimming, and aggregation, as well as abnormal behaviours such as cannibalism, stress, and disease. Poor farm management can create stressful situations for fish, so ensuring fish welfare and minimising stress is essential [110,111].
Video cameras can be used to detect behavioural changes in fish related to stress or disease (e.g., decreased swimming activity or abnormal movements) [112,113]. Acoustic telemetry can provide continuous data on individuals, monitoring their physiology and behaviour (heart rate, blood composition, 3D position, swimming rates, and food intake) [14,114,115]. The system involves electronic transmitters equipped with sensors, implanted in or fixed on the fish. Data are sent via sound signals to underwater receivers. Although this technique requires fish manipulation and occasional surgery, it is the only method for continuous physiological monitoring of a single fish. ML algorithms allow the analysis of complex fish behaviour patterns, such as swimming trajectories and spatial distribution, providing useful information on optimal density and environmental preferences (Table 3) [1,77,88,116,117].
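The simplest form of trajectory-based anomaly detection flags time steps whose swimming speed deviates strongly from the recent average. The z-score rule below is a deliberately minimal stand-in for the ML trajectory analyses cited in the text, and the threshold is an illustrative assumption:

```python
from statistics import mean, stdev

def flag_anomalies(speeds, z_thresh=2.0):
    """Return indices of speed samples deviating strongly from the mean.

    speeds   : time series of swimming speeds (e.g., body lengths/s)
    z_thresh : illustrative z-score cut-off for flagging a sample
    """
    mu, sigma = mean(speeds), stdev(speeds)
    return [i for i, s in enumerate(speeds)
            if sigma > 0 and abs(s - mu) / sigma > z_thresh]

# Cruising speeds with one burst, e.g. a startle response at index 5.
track = [1.1, 1.0, 1.2, 0.9, 1.1, 4.8, 1.0, 1.1]
print(flag_anomalies(track))  # [5]
```

Production systems would operate on richer features (turning angles, depth, acceleration) and use learned models rather than a fixed threshold, but the principle, detecting departures from a behavioural baseline, is the same.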
Recently, various techniques have been used to analyse the unusual behaviour of individuals [124,125,126,127,128,129,130] or small groups of fish, such as stereoscopic video analysis [6], 3D neural networks [131], and ML analysis [6,66,80,81,119,132]. Data obtained from different sources are integrated and analysed through multimodal data fusion [66,71,72,85,105]. The integration of AI and DL is opening up new opportunities for non-invasive and scalable monitoring, including the use of acoustic observation systems. Acoustic methods offer substantial advantages, as they can monitor areas far beyond the limited detection range of underwater cameras, which typically operate effectively only between 0.5 and 25 metres [88]. AI plays a crucial role in managing the enormous volume of data generated, enabling efficient processing, pattern recognition, and real-time interpretation. Together, these technologies represent a promising path towards more comprehensive, sustainable, and fish-friendly monitoring practices. In particular, AI-based models offer a powerful new way of analysing behavioural data, including movement patterns, social interactions, and responses to environmental stimuli, with much greater speed and accuracy than traditional observation methods [98]. For example, AI systems can generate predictive simulations and interactive visualisations to assess how animals may respond to changes in key environmental parameters such as temperature, dissolved oxygen, or stocking density. This capability enables early identification of stress indicators and facilitates data-driven management decisions that maintain high welfare standards. Artificial neural networks (ANNs) have been used to evaluate stereoscopic vision systems employing synchronised dual underwater cameras connected to waterproof laptops, demonstrating improvements over traditional single-camera counts performed by divers [97].
Similarly, the Aqua3DNet model was developed to determine the 3D orientation of fish using monocular vision, offering an innovative and cost-effective approach by eliminating calibration requirements [98]. However, accurate head-to-tail orientation resolution remains a limitation, highlighting the need for AI-based image enhancement to expand training datasets and improve model accuracy. DL-based object detection also enables real-time monitoring of movement and spatial distribution in tanks, supporting early detection of stress or disease. Graph-based classification frameworks can encode fish posture, while time series of positions are analysed using self-recovery algorithms to characterise behaviour [112].
AI can analyse water quality data (temperature, pH, DO, etc.) to identify correlations with specific diseases [102]. Some models predict disease outbreaks using environmental and meteorological variables [133], enabling targeted preventive interventions [7]. RNNs are particularly useful for analysing sequential data such as videos, capturing temporal variations [97]. Zhao et al. [85] developed an RNN-based method for detecting anomalous behaviour in fish groups. Image processing and computer vision techniques are creating new possibilities for the automatic detection of fish diseases. The body surface of fish can provide crucial information on the occurrence of new diseases [134]. Lesions, colour changes, or unusual behaviour can be detected by analysing images captured by video cameras on the farm. A further advantage is disease prevention: predictive models can detect early signs of stress or infection in farmed species by analysing environmental data, thus enabling preventive intervention [135].
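A disease-risk model built on water-quality correlations can be sketched as a logistic regression over a few environmental readings. The weights, bias, and variable set below are illustrative assumptions, not fitted coefficients from the cited studies:

```python
import math

# Hypothetical logistic-regression coefficients: warmer water and higher
# ammonia raise risk, higher dissolved oxygen lowers it.
WEIGHTS = {"temp_c": 0.15, "ammonia_mg_l": 2.0, "do_mg_l": -0.5}
BIAS = -2.0

def outbreak_risk(temp_c, ammonia_mg_l, do_mg_l):
    """Probability-like risk score in (0, 1) from water-quality readings."""
    z = (BIAS
         + WEIGHTS["temp_c"] * temp_c
         + WEIGHTS["ammonia_mg_l"] * ammonia_mg_l
         + WEIGHTS["do_mg_l"] * do_mg_l)
    return 1 / (1 + math.exp(-z))

low = outbreak_risk(14, 0.02, 8.0)   # cool, clean, well-oxygenated water
high = outbreak_risk(26, 0.8, 3.5)   # warm, ammonia-rich, hypoxic water
print(round(low, 2), round(high, 2))
```

In practice, such coefficients would be fitted on historical outbreak records, and a risk score crossing a chosen threshold would trigger the targeted preventive interventions mentioned above.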
The main equipment used for behavioural monitoring comprises underwater cameras coupled with AI systems and acoustic telemetry tags to monitor movement and activity. The main software includes OptoScale (real-time fish condition and welfare metrics); Aquabyte Lice Detection (machine vision for lice monitoring in salmon); Ace Aquatec Humane Stunner (welfare technology for harvesting); and the Observe AI platform (detection of abnormal fish behaviours).
Despite their advantages, computer vision methods in aquaculture have some limitations. Fish are sensitive and easily stressed organisms that move in tanks or cages often characterised by variable lighting and turbidity, conditions that complicate image acquisition and reduce accuracy. Low-resolution images can cause small fish to occupy only a few pixels, compromising DL-based classification [98]. Algorithms such as YOLO (You Only Look Once) offer promising solutions by dividing images into grids, classifying objects within each cell, and combining bounding-box regression with category detection to improve accuracy [98].

10. Challenges and Future Prospects

Despite the potential of precision fish farming (PFF), several critical issues must still be addressed before it can be fully implemented.

10.1. Quality and Availability of Real-Time Data

Many fish farms lack simple and reliable tools to continuously monitor important parameters such as weight gain or health status, although research is developing new, non-invasive, and reliable sensors. The availability of accurate, real-time data on fish variables is essential. Furthermore, predictive models need to accurately represent complex biological responses, and their development requires extensive multi-disciplinary research based on large datasets for reliable calibration. Despite its numerous applications, DL has limitations, as it requires a large amount of data during the training process [7]. Acquiring large amounts of data is difficult, especially in the aquaculture sector. The underwater environment is so complex that the data currently available do not always allow normal and abnormal behaviour to be distinguished or disease to be detected in fish [88].
Precision aquaculture operations vary depending on environmental conditions, species farmed, and farming practices [11]. This diversity in data sources is beneficial for training models that reflect the unique characteristics of different facilities, resulting in more accurate and adaptable solutions tailored to specific needs. However, relying on a single data source or a centralised approach creates a single point of failure and limits the model’s ability to generalise across different conditions. Collaborative learning, in which multiple facilities contribute their own models, offers greater insight into precision aquaculture operations, but this approach is hampered by recurring challenges related to data privacy and confidentiality. The cost of acquisition is often cited as the main barrier to AI/ML adoption [4]. A farmer must decide whether the benefit of adopting AI technologies, relative to the total cost of investment, outweighs that of current conventional methods. A large number of sensors must be deployed for data collection, making the initial capital investment substantial, and training DL models requires significant processing power. Poor data quality leads to inaccurate AI models [11]. Insufficient data lead to simplistic models that cannot accurately predict real-world outcomes, and a lack of data diversity can produce biased models that do not represent the target population. AI models may also require access to sensitive data, raising privacy and data security concerns. In aquaculture, researchers have access to few publicly available datasets; therefore, in many cases, they must develop custom image sets, which can take hours or days of work. Open databases on intensive fish farming are critical for researchers to access a wider variety of sample data more easily [11].
The scarcity or inconsistency of data can compromise the accuracy and reliability of AI models [4]. Infrastructure limitations such as poor internet connectivity can significantly hamper the effectiveness of the system. Although many IoT devices are optimised for low power consumption and run on rechargeable batteries or solar power, high-bandwidth data transmission, such as video, remains difficult due to network constraints. Ensuring the accuracy and reliability of sensor data, particularly in offshore or isolated environments, presents additional complications [11]. Furthermore, the enormous volume of data generated by such systems requires sophisticated data management techniques to maintain performance and scalability.
A critical issue in precision aquaculture is ensuring data privacy and security [98]. Sensitive and potentially proprietary information collected from various sources (different farms) can be valuable, raising concerns about confidentiality. Traditional centralised ML approaches have accentuated these privacy concerns, as farmers are often reluctant to share their data due to issues of ownership and trust [4].
Ultimately, AI should be seen as a complement to human labour rather than a replacement for it. However, training professional and qualified personnel to manage AI systems can be costly and represent a new challenge for the wider implementation and development of AI technologies [12]. The combination of AI and IoT could make a significant contribution to the aquaculture sector, meeting the principles of efficiency, sustainability, and productivity. In fact, by collecting and analysing data in real time, the IoT makes it possible to reduce manual effort and automate various production processes, ensuring accuracy and timeliness. Precision aquaculture systems rely heavily on the continuous collection and analysis of large volumes of data from sensors, cameras, and environmental monitoring platforms.
The integration of artificial intelligence into aquaculture also raises ethical considerations. With technological advances, there is a potential risk that over-reliance on artificial intelligence could reduce human decision-making and creativity in farm management. To mitigate this risk, it is essential to position artificial intelligence as a collaborative tool that complements human skills rather than replacing them. By promoting a balanced relationship between artificial intelligence systems and human operators, the aquaculture industry can fully exploit the potential of this technology while maintaining the fundamental role of human oversight and innovation. Looking ahead, the future of artificial intelligence in aquaculture is very promising. Advances in machine learning algorithms, coupled with the proliferation of affordable IoT devices and sensors, are likely to drive further innovation. Furthermore, with growing awareness of sustainable practices, artificial intelligence will play a key role in ensuring that aquaculture is in line with global environmental goals and consumer expectations.

10.2. Integration and Standardisation

The aquaculture sector is highly fragmented, with numerous small farms and a wide variety of sensors and proprietary software. This creates difficulties in integrating data flows between different systems: for example, a plant might run one system for water quality monitoring and another for fish feeding that are not mutually compatible. The absence of common standards for formats, metrics, and fish welfare indicators hinders widespread adoption. Therefore, technologies should be harmonised using defined operational guidelines, and standardised measurement methods for key variables (e.g., stress indicators) should be validated for application across all species and farming systems [40].

10.3. Implementation Costs

The initial investment in sensors, infrastructure, and training can be high. However, falling prices and increased efficiency often allow costs to be amortised in the medium term. In summary, a PFF system offers many economic benefits. By monitoring water quality in real time (particularly dissolved oxygen, temperature, and pH), it prevents episodes of hypoxia and disease outbreaks, reducing mortality rates by 10–30%. Precision feeding (sensors, cameras, AI) typically reduces the FCR (Feed Conversion Ratio) by 5–15%, shortening production cycles and improving fish size uniformity. In addition, automation of feeding, sampling, and monitoring reduces manual labour. Many PFF technologies show a return on investment (ROI) within 1–3 years. The future of PFF is geared towards high automation, with the goal of evolving towards digital aquaculture, in which the entire farm is monitored and controlled via AI and robotics with minimal human intervention. Examples of applications currently in the experimental phase include: (i) digital twins (virtual models running in parallel with the real farm) that simulate scenarios, optimise decisions, and predict outcomes under different operating conditions [136]; (ii) advanced AI based on predictive systems that can anticipate health problems days in advance or adapt feeding regimes on an hourly basis to maximise efficiency [30]; and (iii) underwater robotics, such as drones that can inspect nets, remove waste, or automatically remove diseased fish [137].
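The economics sketched above can be made concrete with a simple payback calculation. All input figures below are hypothetical; only the 5–15% FCR reduction and 1–3 year ROI ranges come from the text:

```python
def payback_years(capex, annual_feed_cost, fcr_saving,
                  annual_mortality_loss, mortality_saving):
    """Years to recoup a PFF investment from feed and mortality savings.

    Savings are given as fractions, e.g. 0.10 for a 10% FCR reduction
    (within the 5-15% range cited in the text).
    """
    annual_saving = (annual_feed_cost * fcr_saving
                     + annual_mortality_loss * mortality_saving)
    return capex / annual_saving

# Hypothetical farm: EUR 120k system, EUR 400k/yr feed bill, 10% FCR
# saving, EUR 150k/yr mortality losses cut by 20%.
print(round(payback_years(120_000, 400_000, 0.10, 150_000, 0.20), 1))  # 1.7
```

Under these assumed figures the payback lands at about 1.7 years, consistent with the 1–3 year ROI range reported for PFF technologies.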

11. Conclusions

Traditional intensive fish farming systems rely primarily on manual operations performed by experienced and qualified personnel, with high workloads and limited data integration between different activities. The limited availability of real-time monitoring methods leads to fragmented information and reduced ability for farmers to make timely decisions.
PFF enables fish farms to reduce operating costs, increase fish production with less waste, and at the same time improve welfare and reduce the environmental impact. PFF represents a real paradigm shift, moving the industry towards an intelligent, data-driven production model. The development of next-generation sensors makes it possible to collect production, physiological, and behavioural data from fish at the individual and group levels.
The full potential of AI applied to smart aquaculture becomes most apparent when different applications such as DL, computer vision, robotics, IoT, and cloud computing are used in integrated systems. The synergy between these technologies broadens the applicability of AI and enables the automation of key operational tasks, resulting in significant gains in efficiency and decision-making accuracy. By combining data from sensors, satellite observations, and historical data, AI-based models can support optimal management strategies that improve fish growth and health. A key advantage of AI lies in its ability to implement precision aquaculture, in which production processes are constantly optimised, and to detect early warning signs of disease in real time by analysing behavioural signals. AI-based monitoring systems track water quality parameters such as temperature, pH, dissolved oxygen, and other critical variables, processing large volumes of data to identify patterns and anomalies. This enables farmers to maintain optimal living conditions, reduce stress, and improve the growth performance of farmed species. Traditional feeding methods often cause overfeeding, inefficiency, and increased environmental impact. Artificial intelligence-based feeding systems use cameras and sensors to assess fish behaviour, appetite, and biomass in real time. Based on this information, automatic feeders dispense precise amounts of feed at the most appropriate time, reducing waste and operating costs and minimising environmental impact. Overall, artificial intelligence is reshaping modern aquaculture by increasing productivity, improving resource management, and promoting more sustainable production practices.
The integration of sensors, AI, and IoT automation enables PFF to deliver highly controlled and optimised management, with significant benefits in production yield, sustainability, and animal welfare. However, several issues remain regarding data quality, standardisation, and the integration of PFF systems, even though rapidly evolving technologies and ongoing research are expanding the possibilities for application. An ever-wider adoption of these precision aquaculture techniques can be expected in the future.

Funding

This work was supported by a research grant from the University of Udine (Italy).

Data Availability Statement

No data were produced.

Conflicts of Interest

The author declares no conflicts of interest.

References

  1. Huang, M.; Zhou, Y.G.; Yang, X.G. Optimizing feeding frequencies in fish: A meta-analysis and machine learning approach. Aquaculture 2025, 595, 741678. [Google Scholar] [CrossRef]
  2. Føre, M.; Frank, K.; Norton, T.; Svendsen, E.; Alfredsen, J.A.; Dempster, T.; Eguiraun, H.; Watson, W.; Stahl, A.; Sunde, L.M.; et al. Precision fish farming: A new framework to improve production in aquaculture. Biosyst. Eng. 2018, 173, 176–193. [Google Scholar] [CrossRef]
  3. Zhao, S.; Zhang, S.; Liu, J.; Wang, H.; Zhu, J.; Li, D.; Zhao, R. Application of machine learning in intelligent fish aquaculture: A review. Aquaculture 2021, 540, 736724. [Google Scholar] [CrossRef]
  4. Vo, T.T.E.; Ko, H.; Huh, J.H.; Kim, Y. Overview of smart aquaculture system: Focusing on applications of machine learning and computer vision. Electronics 2021, 10, 2882. [Google Scholar] [CrossRef]
  5. Brijs, J.; Føre, M.; Gräns, A.; Clark, T.D.; Axelsson, M.; Johansen, J.L. Biosensing technologies in aquaculture: How remote monitoring can bring us closer to our farm animals. Philos. Trans. R. Soc. B 2021, 376, 20200218. [Google Scholar] [CrossRef] [PubMed]
  6. Zhang, Y.; Xu, C.; Du, R.; Kong, Q.; Li, D.; Liu, C. MSIF-MobileNetV3: An improved MobileNetV3 based on multi-scale information fusion for fish feeding behavior analysis. Aquac. Eng. 2023, 102, 102338. [Google Scholar] [CrossRef]
  7. Saberioon, M.; Gholizadeh, A.; Cisar, P.; Pautsina, A.; Urban, J. Application of machine vision systems in aquaculture with emphasis on fish: State-of-the-art and key issues. Rev. Aquac. 2017, 9, 369–387. [Google Scholar] [CrossRef]
  8. Boudhane, M.; Nsiri, B. Underwater image processing method for fish localization and detection in submarine environment. J. Vis. Commun. Image Represent. 2016, 39, 226–238. [Google Scholar] [CrossRef]
  9. Li, D.; Wang, Z.; Wu, S.; Miao, Z.; Du, L.; Duan, Y. Automatic recognition methods of fish feeding behavior in aquaculture: A review. Aquaculture 2020, 528, 735508. [Google Scholar] [CrossRef]
  10. Desai, N.P.; Balucha, M.F.; Makrariyab, A.; MusheerAziz, R. Image processing model with deep learning approach for fish species classification. Turk. J. Comput. Math. Educ. 2022, 13, 85–89. [Google Scholar]
  11. Biazi, V.; Marques, C. Industry 4.0-based smart systems in aquaculture: A comprehensive review. Aquac. Eng. 2023, 103, 102360. [Google Scholar] [CrossRef]
  12. Føre, M.; Alfredsen, J.A.; Gronningsater, A. Development of two telemetry-based systems for monitoring the feeding behaviour of Atlantic salmon (Salmo salar L.) in aquaculture sea-cages. Comput. Electron. Agric. 2011, 76, 240–251. [Google Scholar] [CrossRef]
  13. Alfonso, S.; Zupa, W.; Spedicato, M.T.; Lembo, G.; Carbonara, P. Using telemetry sensors mapping the energetic costs in European sea bass (Dicentrarchus labrax) as a tool for welfare remote monitoring in aquaculture. Front. Anim. Sci. 2022, 3, 885850. [Google Scholar] [CrossRef]
  14. Carbonara, P.; Alfonso, S.; Dioguardi, M.; Zupa, W.; Vazzana, M.; Dara, M.; Spedicato, M.T.; Lembo, G.; Cammarata, M. Calibrating accelerometer data as a promising tool for health and welfare monitoring in aquaculture: Case study in European sea bass (Dicentrarchus labrax) in conventional or organic aquaculture. Aquac. Rep. 2021, 21, 100–113. [Google Scholar] [CrossRef]
  15. Føre, M.; Svendsen, E.; Alfredsen, J.A. Using acoustic telemetry to monitor the effects of crowding and delousing procedures on farmed Atlantic salmon (Salmo salar). Aquaculture 2017, 495, 757–765. [Google Scholar] [CrossRef]
  16. Gesto, M.; Zupa, W.; Alfonso, S.; Spedicato, M.T.; Lembo, G.; Carbonara, P. Using acoustic telemetry to assess behavioral responses to acute hypoxia and ammonia exposure in farmed rainbow trout of different competitive ability. Appl. Anim. Behav. Sci. 2020, 230, 105084. [Google Scholar] [CrossRef]
  17. Morgenroth, D.K.; Vaestad, B.; Økland, F.; Finstad, B.; Olsen, R.E.; Svendsen, E.; Rosten, C.; Axelsson, M.; Bloecher, N.; Føre, M.; et al. Under the sea: How can we use heart rate and accelerometers to remotely assess fish welfare in salmon aquaculture? Aquaculture 2024, 579, 740144. [Google Scholar] [CrossRef]
  18. Rosell-Moll, E.; Piazzon, M.C.; Sosa, J.; Ferrer, M.; Cabruja, E.; Vega, A.; Calduch-Giner, J.A.; Sitja-Bobadilla, A.; Lozano, M.; Montiel-Nelson, J.A.; et al. Use of accelerometer technology for individual tracking of activity patterns, metabolic rates and welfare in farmed gilthead sea bream (Sparus aurata) facing a wide range of stressors. Aquaculture 2021, 539, 736609. [Google Scholar] [CrossRef]
  19. Zupa, W.; Alfonso, S.; Gai, F.; Gasco, L.; Spedicato, M.T.; Lembo, G.; Carbonara, P. Calibrating accelerometer tags with oxygen consumption rate of rainbow trout (Oncorhynchus mykiss) and their use in aquaculture facility: A case study. Animals 2021, 11, 1496. [Google Scholar] [CrossRef]
  20. Macaulay, G.; Warren-Myers, F.; Barrett, L.; Oppedal, F.; Føre, M.; Dempster, T. Tag use to monitor fish behaviour in aquaculture: A review of benefits, problems and solutions. Rev. Aquac. 2021, 15, 1565–1582. [Google Scholar] [CrossRef]
  21. Munoz, L.; Aspillaga, E.; Palmer, M.; Saraiva, J.L.; Arechavala-Lopez, P. Acoustic telemetry: A tool to monitor fish swimming behavior in sea-cage aquaculture. Front. Mar. Sci. 2020, 7, 545896. [Google Scholar] [CrossRef]
  22. Palstra, A.P.; Arechavala-Lopez, P.; Xue, Y.; Roque, A. Accelerometry of seabream in a sea-cage: Is acceleration a good proxy for activity? Front. Mar. Sci. 2021, 8, 639608. [Google Scholar] [CrossRef]
  23. Andrewartha, S.J.; Elliott, N.G.; McCulloch, J.W.; Frappell, P.B. Aquaculture sentinels: Smart-farming with biosensor equipped stock. J. Aquac. Res. Dev. 2015, 7, 100393. [Google Scholar]
  24. Gesto, M.; Hernández, J.; López-Patiño, M.A.; Soengas, J.L.; Míguez, J.M. Is gill cortisol concentration a good acute stress indicator in fish? A study in rainbow trout and zebrafish. Comp. Biochem. Physiol. A Mol. Integr. Physiol. 2015, 188, 65–72. [Google Scholar] [CrossRef]
  25. Chiu, M.C.; Yan, W.M.; Bhat, S.A.; Huang, N.F. Development of smart aquaculture farm management system using IoT and AI-based surrogate models. J. Agric. Food Res. 2022, 9, 00357. [Google Scholar] [CrossRef]
  26. Zhao, J.; Xu, D.; Zhou, C.; Sun, C.; Yang, X. Simulation of collective swimming behavior of fish schools using a modified social force and kinetic energy model. Ecol. Model. 2017, 360, 200–210. [Google Scholar]
  27. Schraml, R.; Hofbauer, H.; Jalilian, E.; Bekkozhayeva, D.; Saberioon, M.; Cisar, P.; Uhl, A. Towards fish individuality-based aquaculture. IEEE Trans. Ind. Inform. 2021, 17, 4356–4366. [Google Scholar] [CrossRef]
28. Føre, M.; Alver, M.; Alfredsen, J.A.; Marafioti, G.; Senneset, G.; Birkevold, J.; Willumsen, F.V.; Lange, G.; Espmark, A.; Terjesen, B.F. Modelling growth performance and feeding behaviour of Atlantic salmon (Salmo salar L.) in commercial-size aquaculture net pens: Model details and validation through full-scale experiments. Aquaculture 2016, 464, 268–278. [Google Scholar] [CrossRef]
  29. Islam, M.M. Real-time IoT dataset of pond water for fish farming (multi-pond). Data Brief 2023, 49, 10911. [Google Scholar]
  30. Prapti, D.R.; Mohamed Shariff, A.R.; Che Man, H.; Ramli, N.M.; Perumal, T.; Shariff, M. Internet of Things (IoT)-based aquaculture: An overview of IoT application on water quality monitoring. Rev. Aquac. 2022, 14, 979–992. [Google Scholar] [CrossRef]
  31. Ma, F.; Fan, Z.; Nikolaeva, A.; Bao, H. Redefining aquaculture safety with artificial intelligence: Design innovations, trends and future perspectives. Fishes 2025, 10, 88. [Google Scholar] [CrossRef]
  32. Rastegari, H.; Nadi, F.; Lam, S.S. Internet of Things in aquaculture: A review of the challenges and potential solutions based on current and future trends. Smart Agric. Technol. 2023, 4, 100187. [Google Scholar] [CrossRef]
  33. Mustapha, U.F.; Alhassan, A.W.; Jiang, D.N.; Li, G.L. Sustainable aquaculture development: A review on the roles of cloud computing, internet of things and artificial intelligence (CIA). Rev. Aquac. 2021, 13, 2076–2091. [Google Scholar] [CrossRef]
  34. Sun, M.; Yang, X.F.; Xie, Y.G. Deep learning in aquaculture: A review. J. Comput. 2020, 31, 294–310. [Google Scholar]
  35. Chen, C.; Li, X.; Huang, Y.; Xu, D.; Zhou, C.; Sun, C. Fish behavior classification using image texture features and support vector machines. Comput. Electron. Agric. 2018, 155, 131–138. [Google Scholar]
  36. Qiao, F.; Zhou, C.; Xu, D.; Sun, C.; Yang, X. Automatic analysis of fish location and quantity in aquaculture ponds using image preprocessing and edge detection. Comput. Electron. Agric. 2015, 119, 42–49. [Google Scholar]
  37. Aung, T.; Abdul Razak, R.; Rahiman, M.D.; Nor, A. Artificial intelligence methods used in various aquaculture applications: A systematic literature review. J. World Aquac. Soc. 2025, 56, e13107. [Google Scholar] [CrossRef]
  38. Iqbal, M.A.; Wang, Z.J.; Ali, Z.A. Automatic fish species classification using deep convolutional neural networks. Wirel. Pers. Commun. 2021, 116, 1043–1053. [Google Scholar] [CrossRef]
  39. Huang, Y.P.; Khabusi, S.P. Artificial intelligence of things (AIoT) advances in aquaculture: A review. Processes 2025, 13, 73. [Google Scholar] [CrossRef]
  40. Arepalli, P.G. IoT-based DSTCNN for aquaculture water-quality monitoring. Aquac. Eng. 2024, 108, 102369. [Google Scholar]
  41. Shete, R.P. IoT-enabled real-time WQ monitoring for aquafarming using Arduino measurement. Sensors 2024, 27, 10064. [Google Scholar]
  42. Khan, P.W.; Byun, Y.C. Optimized dissolved oxygen prediction using genetic algorithm and bagging ensemble learning for smart fish farm. IEEE Sens. J. 2023, 23, 15153–15164. [Google Scholar] [CrossRef]
  43. Liu, J.; Zhang, T.; Han, G.J. TD-LSTM: Temporal dependence-based LSTM networks for marine temperature prediction. Sensors 2018, 18, 3797. [Google Scholar] [CrossRef]
  44. Ren, H.; Wang, X.; Li, W.; Wei, Y.; An, D. Research of dissolved oxygen prediction in recirculating aquaculture systems based on deep belief network. Aquac. Eng. 2020, 90, 102085. [Google Scholar] [CrossRef]
  45. Claireaux, G.; Couturier, C.; Groison, A.L. Effect of temperature on maximum swimming speed and cost of transport in juvenile European sea bass (Dicentrarchus labrax). J. Exp. Biol. 2006, 209, 3420–3428. [Google Scholar] [CrossRef]
  46. Koumoundouros, G.; Sfakianakis, D.G.; Divanach, P.; Kentouri, M. Effect of temperature on swimming performance of sea bass juveniles. J. Fish Biol. 2002, 60, 923–932. [Google Scholar] [CrossRef]
  47. Hu, W.C.; Chen, L.B.; Huang, B.K.; Lin, H.M. A computer vision-based intelligent fish feeding system using deep learning techniques for aquaculture. IEEE Sens. J. 2023, 22, 7185–7194. [Google Scholar] [CrossRef]
  48. Hu, W.C.; Chen, L.B.; Wang, B.H. Design and implementation of a full-time artificial intelligence of things-based water quality inspection and prediction system for intelligent aquaculture. IEEE Sens. J. 2024, 24, 3811–3821. [Google Scholar] [CrossRef]
  49. Kumar, D.S.; Prabhaker, L.C.; Shanmugapriya, T. Water quality evaluation and monitoring model (WQEM) using machine learning techniques with IoT. Water Resour. 2024, 51, 1094–1110. [Google Scholar] [CrossRef]
  50. Baena-Navarro, R.; Carriazo-Regino, Y.; Torres-Hoyos, F.; Pinedo-López, J. Intelligent prediction & continuous monitoring of pond water quality with ML + quantum optimization. Water 2025, 17, 82. [Google Scholar]
  51. Eneh, A.H.; Udanor, C.N.; Ossai, N.I.; Aneke, S.O.; Ugwoke, P.O.; Obayi, A.A.; Ugwuishiwu, C.H.; Okereke, G.E. Improving IoT sensor data quality in aquaculture WQ systems (LoRa/Arduino cases). Sensors 2023, 26, 100625. [Google Scholar]
  52. Arepalli, P.G.; Khetavath, J.N. An IoT framework for quality analysis of aquatic water data using time-series convolutional neural network. Environ. Sci. Pollut. Res. 2023, 30, 125275–125294. [Google Scholar] [CrossRef] [PubMed]
  53. Nayoun, M.N.I.; Hossain, S.A.; Rezaul, K.M.; Siddiquee, K.N.E.A.; Islam, M.S.; Jannat, T. Internet of Things-driven precision in fish farming: A deep dive into automated temperature, oxygen, and pH regulation. Computers 2024, 13, 267. [Google Scholar] [CrossRef]
  54. Lu, H.Y.; Cheng, C.Y.; Cheng, S.C. A low-cost AI buoy system for monitoring water quality at offshore aquaculture cages. Sensors 2022, 22, 4078. [Google Scholar] [CrossRef]
  55. Chen, C.H.; Wu, Y.C.; Zhang, J.X.; Chen, Y.H. IoT-based fish farm water quality monitoring system. Sensors 2022, 22, 6700. [Google Scholar] [CrossRef]
  56. Lin, J.Y.; Tsai, H.; Lyu, W.H. An integrated wireless multi-sensor system for monitoring the water quality of aquaculture. Sensors 2021, 21, 8179. [Google Scholar] [CrossRef]
  57. Flores-Iwasaki, M.; Guadalupe, G.A.; Pachas-Caycho, M.; Chapa-Gonza, S.; Mori-Zabarburú, R.C.; Guerrero-Abad, J.C. IoT sensors for water-quality monitoring in aquaculture: Systematic review & bibliometrics (2020–2024). AgriEngineering 2025, 7, 78. [Google Scholar]
  58. Zhang, T.; Yang, Y.; Liu, Y.; Liu, C.; Zhao, R.; Li, D.; Shi, C. Fully automatic system for fish biomass estimation based on deep neural network. Ecol. Inform. 2024, 79, 102399. [Google Scholar] [CrossRef]
  59. Bravata, N.; Kelly, D.; Eickholt, J.; Bryan, J.; Miehls, S. Applications of deep convolutional neural networks to predict length, circumference, and weight from mostly dewatered images of fish. Ecol. Evol. 2020, 10, 9313–9325. [Google Scholar] [CrossRef] [PubMed]
  60. Lopez-Tejeida, S.; Soto-Zarazua, G.M.; Toledano-Ayala, M.; Contreras-Medina, L.M.; Rivas-Araiza, E.A.; Flores-Aguilar, P.S. An improved method to obtain fish weight using machine learning and NIR camera with haar cascade classifier. Appl. Sci. 2023, 13, 69. [Google Scholar] [CrossRef]
  61. Mittún, Ó.F.; Andersen, L.E.J.; Svendsen, M.B.S.; Steffensen, J.F. An inexpensive 3D camera system based on a completely synchronized stereo camera, open-source software, and a Raspberry Pi for accurate fish size, position, and swimming speed. Fishes 2025, 10, 139. [Google Scholar] [CrossRef]
  62. Fernandes, S.; DMello, A. Artificial intelligence in the aquaculture industry: Current state, challenges and future directions. Aquaculture 2025, 598, 742048. [Google Scholar] [CrossRef]
  63. Xiao, Y. Review: Computer vision for fish feeding-behaviour analysis & practice. Appl. Anim. Behav. Sci. 2025, 271, 105880. [Google Scholar]
  64. Måløy, H.; Aamodt, A.; Misimi, E. A spatio-temporal recurrent network for salmon feeding action recognition from underwater videos in aquaculture. Comput. Electron. Agric. 2019, 167, 105084. [Google Scholar] [CrossRef]
  65. An, D.; Huang, J.; Wei, Y. A survey of fish behaviour quantification indexes and methods in aquaculture. Rev. Aquac. 2021, 13, 2169–2189. [Google Scholar] [CrossRef]
  66. Chen, I.H.; Georgopoulou, D.G.; Ebbesson, L.O.E.; Voskakis, D.; Lal, P.; Papandroulakis, N. Food anticipatory behaviour on European seabass in sea cages: Activity-, positioning- and density-based approaches. Front. Mar. Sci. 2023, 10, 1168953. [Google Scholar] [CrossRef]
  67. Wei, X.; Zhang, Y.; Liu, J.; Zhang, Y.; Li, D. A customized recurrent neural network for fish behavior analysis. Aquaculture 2021, 544, 737140. [Google Scholar]
  68. Zhang, Z.; Zou, B.; Hu, Q.; Li, W. Multimodal knowledge distillation framework for fish feeding behaviour recognition in industrial aquaculture. Biosyst. Eng. 2025, 255, 104170. [Google Scholar] [CrossRef]
  69. Feng, M.; Jiang, P.; Wang, Y.; Hu, S.; Chen, S.; Li, R.; Huang, H.; Li, N.; Zhang, B.; Ke, Q.; et al. YOLO-feed: An advanced lightweight network enabling real-time, high-precision detection of feed pellets on CPU devices and its applications in quantifying individual fish feed intake. Aquaculture 2025, 608, 742700. [Google Scholar] [CrossRef]
  70. Georgopoulou, D.G.; Vouidaskis, C.; Papandroulakis, N. Swimming behavior as a potential metric to detect satiation levels of European seabass in marine cages. Front. Mar. Sci. 2024, 11, 135038. [Google Scholar] [CrossRef]
  71. Cai, Y.; Li, J.; Zhou, X.; Wang, L. A two-stage framework for fish behavior recognition: Modified YOLOv8 and ResNet-like model. Aquaculture 2024, 575, 112345. [Google Scholar]
  72. Yang, Y.; Yu, H.; Zhang, X.; Zhang, P.; Tu, W.; Gu, L. Fish behavior recognition based on an audio-visual multimodal interactive fusion network. Aquac. Eng. 2024, 107, 102471. [Google Scholar] [CrossRef]
  73. Wu, S.; Yang, T.; Lin, J.; Li, M.; Chen, X.; Li, D. DeformAtt-ViT: A largemouth bass feeding intensity assessment method based on Vision Transformer with deformable attention. J. Mar. Sci. Eng. 2024, 12, 726. [Google Scholar] [CrossRef]
  74. Ni, W.; Wei, D.; Peng, Z.; Ma, Z.; Zhu, S.; Tang, R.; Tian, X.; Zhao, J.; Ye, Z. An appetite assessment method for fish in outdoor ponds with anti-shadow disturbance. Comput. Electron. Agric. 2024, 221, 108940. [Google Scholar] [CrossRef]
  75. Zhao, H.X.; Cui, H.W.; Qu, K.M. A fish appetite assessment method based on improved ByteTrack and spatiotemporal graph convolutional network. Biosyst. Eng. 2024, 240, 46–55. [Google Scholar] [CrossRef]
  76. Yang, H.; Shi, Y.; Wang, X.; Wang, J.; Jia, B.; Zhou, C.; Ye, H. Detection method of fry feeding status based on YOLO lightweight network by shallow underwater images. Electronics 2022, 11, 3856. [Google Scholar] [CrossRef]
  77. Zeng, Q.; Liu, H.; Sun, Y.; Zhao, W.; Chen, D.; Li, D. Fish behavior recognition using audio spectrum swin transformer network. Aquac. Eng. 2023, 101, 102320. [Google Scholar]
  78. Zheng, K.; Wang, H.; Yang, T.; Liu, M.; Chen, L.; Xu, D. Spatio-temporal attention network for swimming and spatial features of pompano. Sensors 2023, 23, 3124. [Google Scholar]
  79. Du, Y.; Zhang, H.; Chen, X.; Li, Y. Fish broodstock behavior recognition using ResNet50-LSTM. Comput. Electron. Agric. 2022, 198, 106987. [Google Scholar]
  80. Du, Y.; Zhang, H.; Chen, X.; Li, Y. LC-GhostNet: Lightweight multimodal neural network for fish behavior recognition. Comput. Electron. Agric. 2023, 208, 107780. [Google Scholar]
  81. Feng, S.; Yang, X.; Liu, Y.; Zhao, Z.; Liu, J.; Yan, Y.; Zhou, C. Fish feeding intensity quantification using machine vision and a lightweight 3D ResNet-GloRe network. Aquac. Eng. 2022, 98, 102240. [Google Scholar] [CrossRef]
  82. Zhang, L.; Wang, J.; Li, B.; Liu, Y.; Zhang, H.; Duan, Q. A MobileNetV2-SENet-based method for identifying fish school feeding behavior. Aquac. Eng. 2022, 99, 102288. [Google Scholar] [CrossRef]
  83. Wang, H.; Zhang, S.; Zhao, S.L. Real-time detection and tracking of fish abnormal behavior based on improved YOLOv5 and SiamRPN++. Comput. Electron. Agric. 2022, 192, 106512. [Google Scholar] [CrossRef]
  84. Liu, J.; Chen, X.; Zhang, J.; Wang, H.; Li, D. CFFI-ViT: Enhanced vision transformer for the accurate classification of fish feeding intensity. J. Mar. Sci. Eng. 2024, 12, 1132. [Google Scholar] [CrossRef]
  85. Zhao, S.; Ding, W.; Zhao, S.; Gu, J. Adaptive neural fuzzy inference system for feeding decision-making of grass carp (Ctenopharyngodon idellus) in outdoor intensive culturing ponds. Aquaculture 2019, 498, 28–36. [Google Scholar] [CrossRef]
  86. Wang, G.X.; Muhammad, A.; Liu, C. Automatic recognition of fish behavior with a fusion of RGB and optical flow data based on deep learning. Animals 2021, 11, 2774. [Google Scholar] [CrossRef] [PubMed]
  87. Ubina, F.C.; Estuar, M.R.J.E.; Ubina, C.D. Optical flow neural network for fish swimming behavior and activity analysis. Appl. Artif. Intell. 2021, 35, 1409–1424. [Google Scholar]
  88. Yang, X.; Zhang, S.; Liu, J.; Gao, Q.; Dong, S.; Zhou, C. Deep learning for smart fish farming: Applications, opportunities and challenges. Rev. Aquac. 2021, 13, 66–90. [Google Scholar] [CrossRef]
  89. Saminiano, B. Feeding behavior classification of Nile Tilapia (Oreochromis niloticus) using convolutional neural network. Int. J. Adv. Trends Comput. Sci. Eng. 2020, 9, 259–263. [Google Scholar] [CrossRef]
  90. Zhang, Y.; Wang, J.; Duan, Q. Application of convolutional neural networks (CNN) for fish feeding detection. J. Aquac. Res. Dev. 2020, 11, 543–550. [Google Scholar]
  91. Fernandes, R.; Turra, E.M.; de Alvarenga, R.; Passafaro, T.L.; Lopes, F.B.; Alves, G.F.; Singh, V.; Rosa, G.J. Deep learning-based analysis of fish feeding images using CNNs. Aquac. Rep. 2020, 18, 100426. [Google Scholar]
  92. Zhou, C.; Xu, D.; Sun, C.; Yang, X.; Chen, L. Delaunay triangulation and texture analysis for fish behavior recognition in aquaculture. Aquac. Res. 2018, 49, 1751–1762. [Google Scholar]
  93. Liu, Z.; Li, X.; Fan, L.; Lu, H.; Liu, L.; Liu, Y. Measuring feeding activity of fish in RAS using computer vision. Aquac. Eng. 2014, 60, 20–27. [Google Scholar] [CrossRef]
  94. Adegboye, M.A.; Aibinu, A.M.; Kolo, J.G. Incorporating intelligence in fish feeding system for dispensing feed based on fish feeding intensity. IEEE Access 2020, 8, 91948–91960. [Google Scholar] [CrossRef]
  95. Atoum, Y.; Srivastava, S.; Liu, X.M. Automatic feeding control for dense aquaculture fish tanks. IEEE Signal Process. Lett. 2015, 22, 1089–1093. [Google Scholar] [CrossRef]
  96. Chandran, P.J.I.; Khalil, H.A.; Hashir, P.K.; Veerasingam, S. Smart technologies in aquaculture: An integrated IoT, AI, and blockchain framework for sustainable growth. Aquac. Eng. 2025, 111, 102584. [Google Scholar] [CrossRef]
  97. Cao, J.; Wang, Y.; Chen, H.; Zhou, C. Enhanced CNN frameworks for identifying feeding behavior in aquaculture. Aquaculture 2023, 561, 738682. [Google Scholar]
  98. Vijayalakshmi, M.; Sasithradevi, A. AquaYOLO: Advanced YOLO-based fish detection for optimized aquaculture pond monitoring. Sci. Rep. 2025, 15, 6151. [Google Scholar] [CrossRef]
  99. Zhou, C.; Xu, D.; Chen, L.; Zhang, S.; Sun, C.; Yang, X.; Wang, Y. Evaluation of fish feeding intensity in aquaculture using a convolutional neural network and machine vision. Aquaculture 2019, 507, 457–466. [Google Scholar] [CrossRef]
  100. Cai, K.; Yang, Z.; Gao, T.; Liang, M.; Liu, P.; Zhou, S.; Pang, H.; Liu, Y. Efficient recognition of fish feeding behavior: A novel two-stage framework pioneering intelligent aquaculture strategies. Comput. Electron. Agric. 2024, 224, 109129. [Google Scholar] [CrossRef]
  101. Gu, X.Y.; Zhao, S.L.; Duan, Y.Q. MMFINet: A multimodal fusion network for accurate fish feeding intensity assessment in recirculating aquaculture systems. Comput. Electron. Agric. 2025, 232, 110138. [Google Scholar] [CrossRef]
  102. Fini, C.; Amato, S.G.; Scutaru, D.; Biancardi, S.; Antonucci, F.; Violino, S.; Ortenzi, L.; Nemmi, E.N.; Mei, A.; Pallottino, F.; et al. Application of generative artificial intelligence in the aquacultural sector. Aquac. Eng. 2025, 111, 102568. [Google Scholar] [CrossRef]
  103. Ma, P.; Yang, X.; Hu, W.; Fu, T.; Zhou, C. Fish feeding behavior recognition using time-domain and frequency-domain signals fusion from six-axis inertial sensors. Comput. Electron. Agric. 2024, 227, 109652. [Google Scholar] [CrossRef]
  104. Du, M.; Cui, X.; Xu, Z.; Bai, J.; Han, W.; Li, J.; Yang, X.; Liu, C.; Wang, D. Harnessing multimodal data fusion to advance accurate identification of fish feeding intensity. Biosyst. Eng. 2024, 246, 135–149. [Google Scholar] [CrossRef]
  105. Nayan, A.A.; Saha, J.; Mozumder, A.N.; Mahmud, K.R.; Al Azad, A.K.; Kibria, M.G. A machine learning approach for early detection of fish diseases by analyzing water quality. Trends Sci. 2021, 18, 351. [Google Scholar] [CrossRef]
  106. O’Donncha, F.; Stockwell, C.L.; Planellas, S.R.; Micallef, G.; Palmes, P.; Webb, C.; Filgueira, R.; Grant, J. Data driven insight into fish behaviour and their use for precision aquaculture. Front. Anim. Sci. 2021, 2, 695054. [Google Scholar] [CrossRef]
  107. Mandal, A.; Ghosh, A.R. Role of artificial intelligence (AI) in fish growth and health status monitoring: A review on sustainable aquaculture. Aquac. Int. 2024, 32, 2791–2820. [Google Scholar] [CrossRef]
  108. Costa, C.S.; Goncalves, W.N.; Zanoni, V.A.G.; Dos Santos De Arruda, M.; de Araujo Carvalho, M.; Nascimento, E.; Marcato, J.; Diemer, O.; Pistori, H. Counting tilapia larvae using images captured by smartphones. Smart Agric. Technol. 2023, 4, 10016. [Google Scholar] [CrossRef]
  109. Cui, M.; Liu, X.B.; Liu, H.H. Fish tracking, counting and behaviour analysis in digital aquaculture: A comprehensive survey. Rev. Aquac. 2025, 17, e13001. [Google Scholar] [CrossRef]
  110. Sadoul, B.; Alfonso, S.; Cousin, X.; Prunet, P.; Bégout, M.L.; Leguen, I. Global assessment of the response to chronic stress in European sea bass. Aquaculture 2021, 544, 737072. [Google Scholar] [CrossRef]
  111. Carbonara, P.; Alfonso, S.; Zupa, W.; Manfrin, A.; Fiocchi, E.; Pretto, T.; Spedicato, M.T.; Lembo, G. Behavioral and physiological responses to stocking density in sea bream (Sparus aurata): Do coping styles matter? Physiol. Behav. 2019, 212, 112698. [Google Scholar] [CrossRef]
  112. Li, D.L.; Wang, G.X.; Du, L. Recent advances in intelligent recognition methods for fish stress behavior. Aquac. Eng. 2022, 96, 102222. [Google Scholar] [CrossRef]
  113. Kolarevic, J.; Aas-Hansen, Ø.; Espmark, Å.; Baeverfjord, G.; Terjesen, B.F.; Damsgård, B. The use of acoustic acceleration transmitter tags for monitoring of Atlantic salmon swimming activity in recirculating aquaculture systems (RAS). Aquac. Eng. 2016, 72, 30–39. [Google Scholar] [CrossRef]
  114. Martinez-Alpiste, I.; De Tailly, J.B.; Alcaraz-Calero, J.M. Machine learning-based understanding of aquatic animal behaviour in high-turbidity waters. Expert Syst. Appl. 2024, 255, 124804. [Google Scholar] [CrossRef]
  115. Wang, X.; Li, P.; Chen, R.; Zhang, J.; Liu, Z. Appearance-motion autoencoder network (AMA-Net) for behavior recognition of Oplegnathus punctatus. Aquaculture 2023, 569, 739302. [Google Scholar]
  116. Huang, J.; Yu, X.; Chen, X.; An, D.; Zhou, Y.; Wei, Y. Recognizing fish behavior in aquaculture with graph convolutional network. Aquac. Eng. 2022, 98, 102246. [Google Scholar] [CrossRef]
  117. Kong, L.; Xu, W.; Zhao, J.; Sun, F. Active learning with VGG16 for behavior detection in Oplegnathus punctatus. Aquac. Eng. 2022, 96, 102175. [Google Scholar]
  118. Hu, C.; Yang, X.; Xu, D.; Zhou, C.; Chen, L. Automated monitoring of fish behavior in recirculating aquaculture systems using edge detection and segmentation methods. Aquac. Eng. 2015, 67, 13–24. [Google Scholar] [CrossRef]
  119. Sadoul, B.; Vijayan, M.M.; Schram, E.; Aluru, N.; Wendelaar Bonga, S.E. Physiological and behavioral responses to multiple stressors in farmed fish: Application of imaging and dispersion indices. Aquaculture 2014, 432, 362–370. [Google Scholar]
  120. Pinkiewicz, T.H.; Purser, G.J.; Williams, R.N. A computer vision system to analyse the swimming behaviour of farmed fish in commercial aquaculture facilities: A case study using cage-held Atlantic salmon. Aquac. Eng. 2011, 45, 20–27. [Google Scholar] [CrossRef]
  121. Han, F.F.; Zhu, J.C.; Liu, B. Fish shoals behavior detection based on convolutional neural network and spatiotemporal information. IEEE Access 2020, 8, 126907–126926. [Google Scholar] [CrossRef]
  122. Hu, J.; Zhao, D.D.; Zhang, Y.F. Real-time nondestructive fish behavior detecting in mixed polyculture system using deep learning and low-cost devices. Expert Syst. Appl. 2021, 178, 115051. [Google Scholar] [CrossRef]
  123. Iqbal, U.; Li, D.L.; Akhter, M. Intelligent diagnosis of fish behavior using deep learning method. Fishes 2022, 7, 201. [Google Scholar] [CrossRef]
  124. Rutz, C.; Bronstein, M.; Raskin, A. Using machine learning to decode animal communication. Science 2023, 381, 152–155. [Google Scholar] [CrossRef] [PubMed]
  125. Saad Saoud, L.; Sultan, A.; Elmezain, M. Beyond observation: Deep learning for animal behavior and ecological conservation. Ecol. Inform. 2024, 84, 102893. [Google Scholar] [CrossRef]
  126. Wang, J.H.; Lee, S.K.; Lai, Y.C.; Lin, C.C.; Wang, T.Y.; Lin, Y.R.; Hsu, T.H.; Huang, C.W.; Chiang, C.P. Anomalous behaviors detection for underwater fish using AI techniques. IEEE Access 2020, 8, 224372–224382. [Google Scholar] [CrossRef]
  127. Zhao, Y.X.; Qin, H.X.; Xu, L.A. Review of deep learning-based stereo vision techniques for phenotype feature and behavioral analysis of fish in aquaculture. Artif. Intell. Rev. 2025, 58, 7. [Google Scholar] [CrossRef]
  128. Zheng, T.; Wu, J.F.; Kong, H. A video object segmentation-based fish individual recognition method for underwater complex environments. Ecol. Inform. 2024, 82, 102689. [Google Scholar] [CrossRef]
  129. Long, L.; Johnson, Z.V.; Li, J.; Lancaster, T.J.; Aljapur, V.; Streelman, J.T.; McGrath, P.T. Automatic classification of cichlid behaviors using 3D convolutional residual networks. iScience 2020, 23, 101591. [Google Scholar] [CrossRef]
  130. Yang, H.; Zhou, C.; Shi, Y.; Wang, X.; Wang, J.; Ye, H. BlendMask-VoNetV2: Robust detection of overlapping fish behavior in aquaculture. Comput. Electron. Agric. 2023, 212, 108023. [Google Scholar]
  131. Huntingford, F.A.; Adams, C.; Braithwaite, V.A.; Kadri, S.; Pottinger, T.G.; Sandøe, P.; Turnbull, J.F. Current issues in fish welfare. J. Fish Biol. 2006, 68, 332–372. [Google Scholar] [CrossRef]
  132. Chakravorty, H. New approach for disease fish identification using augmented reality and image processing technique. IPASJ Int. J. Comput. Sci. 2021, 9, 3. [Google Scholar]
  133. Alfonso, S.; Sadoul, B.; Cousin, X.; Bégout, M.L. Spatial distribution and activity patterns as welfare indicators in response to water quality changes in European sea bass (Dicentrarchus labrax). Appl. Anim. Behav. Sci. 2020, 226, 104974. [Google Scholar] [CrossRef]
  134. Ashley, P.J. Fish welfare: Current issues in aquaculture. Appl. Anim. Behav. Sci. 2007, 104, 199–235. [Google Scholar] [CrossRef]
  135. Bohara, K.; Joshi, P.; Acharya, K.P.; Ramena, G. Emerging technologies revolutionising disease diagnosis and monitoring in aquatic animal health. Rev. Aquac. 2024, 16, 836–854. [Google Scholar] [CrossRef]
  136. Ubina, N.A.; Lan, H.Y.; Cheng, S.C.; Chang, C.C.; Lin, S.S.; Zhang, K.X.; Lu, H.Y.; Cheng, C.Y.; Hsieh, Y.Z. Digital twin-based intelligent fish farming with Artificial Intelligence Internet of Things (AIoT). Smart Agric. Technol. 2023, 5, 100285. [Google Scholar] [CrossRef]
  137. Lim, L.W.K. Implementation of artificial intelligence in aquaculture and fisheries: Deep learning, machine vision, big data, internet of things, robots and beyond. J. Comput. Commun. Eng. 2024, 3, 112–118. [Google Scholar] [CrossRef]
Figure 1. The main stages of an intelligent feeding system for fish.
Figure 2. Example of a smart aquaculture farm.
Table 1. Traditional vs. smart farming systems.
| Problems of Traditional Farming Systems | Opportunities of Smart Farming Systems |
|---|---|
| Manual and poorly automated processes | Each stage can generate valuable information on feeding, welfare, quality, and product yield |
| Many data remain isolated and are not objectively measured | Possibility to digitalize every stage of the production chain |
| Difficulty in identifying waste and inefficiencies in real time | Enables integration of all available data |
| Difficulty in tracking and standardising processes | New technologies (AI, computer vision, and IoT) make it possible to collect, interpret, and enhance large volumes of data |
| High variability among animals and batches | The supply chain has access to thousands of hidden data points, previously uncollected or only partially used |
| Quality controls performed only on samples | Each animal becomes a data source that can be reused to improve genetic selection, production, and animal welfare |
| Difficulty in simultaneously analysing behaviour, product quality, and economic performance | Build an intelligent, transparent supply chain focused on quality control and continuous improvement |
| Welfare indicators estimated rather than measured | |
| Growing demand for transparency and ethical responsibility | |
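The automation that distinguishes smart from traditional systems ultimately reduces to sensor-driven control loops. As a minimal illustration (not taken from any cited system; thresholds, the `decide` function, and the reading series are hypothetical), a dissolved-oxygen reading can drive an aerator on/off decision with hysteresis so the actuator does not oscillate around a single threshold:

```python
# Illustrative rule-based smart-farm control loop: a dissolved-oxygen (DO)
# reading switches an aerator on/off with hysteresis. All thresholds and
# names here are hypothetical, chosen only to sketch the idea.

DO_LOW = 5.0    # mg/L: switch the aerator on below this level
DO_HIGH = 6.5   # mg/L: switch it off again only above this level

def decide(do_mg_per_l: float, aerator_on: bool) -> bool:
    """Return the next aerator state given the current DO reading."""
    if do_mg_per_l < DO_LOW:
        return True          # hypoxia risk: force aeration on
    if do_mg_per_l > DO_HIGH:
        return False         # recovered: aeration no longer needed
    return aerator_on        # inside the hysteresis band: keep current state

# Replay a short, synthetic series of sensor readings.
state = False
log = []
for reading in [6.8, 5.9, 4.7, 5.4, 6.9, 6.2]:
    state = decide(reading, state)
    log.append(state)
print(log)  # [False, False, True, True, False, False]
```

The hysteresis band (5.0–6.5 mg/L here) is the design choice that keeps the aerator from rapid on/off cycling when readings hover near a single set point.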
Table 2. AI feeding monitoring methods in aquaculture production.
| Method/Model | Data Type | Application | Reference |
|---|---|---|---|
| MMKDR | image-based | feeding behaviour and intensity quantification | [68] |
| YOLO-feed | image-based | feed intake | [69] |
| Feedforward neural network (FFN) | water surface fluctuations | feeding behaviour | |
| YOLOv5 | image-based | feeding behaviour (Dicentrarchus labrax) | [70] |
| MobileViT-SENet | image-based | fish density and feeding intensity in outdoor ponds | [58] |
| YOLOv8 model | image-based | fish swimming behaviour and activity degree | [71] |
| Mul-SEResNet50 | multi-information | sound and activity degree (Oncorhynchus mykiss) | [72] |
| DeformAtt-ViT | image-based | swimming behaviour and activity degree (largemouth bass) | [73] |
| RCNN | video-based | activity degree (Ctenopharyngodon idella) | [74] |
| FishFeed methods | video-based | fish density and spatial information | [75] |
| CNN | image-based | feeding behaviour classification | [76] |
| BlendMask-VoNetV2 | video-based | fish swimming behaviour and activity degree | [77] |
| Audio spectrum swin transformer network | multi-information | sound and activity level (Oncorhynchus mykiss) | [78] |
| STAN | multi-information | swimming and spatial features (pompanos) | [79] |
| MMTM | multi-information | sound and activity degree (Oncorhynchus aguabonita) | [56] |
| LC-GhostNet lightweight network | multi-information | sound and activity degree (Oplegnathus punctatus) | [80] |
| MSIF-MobileNetV3 | image-based | swimming behaviour and activity degree (Oplegnathus punctatus) | [6] |
| ResNet50-LSTM | video-based | swimming behaviour and activity degree | [81] |
| 3D ResNet-GloRe | video-based | swimming behaviour and activity degree (Oncorhynchus mykiss) | [82] |
| MobileNetV2-SENet | image-based | swimming behaviour and activity degree (Plectropomus leopardus) | [83] |
| Multi-task network | multi-information | group activity level (Oplegnathus punctatus) | [84] |
| Long-term recurrent convolutional network | image-based | feeding behaviour (grass and crucian carps) | [62] |
| YOLOv4 | image-based | water feed detection | [85] |
| CNN | image-based | water feed detection | [86] |
| CNN | image-based | water feed detection | [87] |
| Customised recurrent neural network | video-based | swimming behaviour and activity degree (American black bass) | [67] |
| Optical flow neural network | image-based | swimming behaviour and activity degree | [88] |
| Dual attention network-EfficientNetB2 | image-based | feeding behaviour | [89] |
| Optical flow model | optical flow | feeding behaviour recognition | [66] |
| CNN | image-based | feeding behaviour (Tilapia) | [90] |
| CNN | image-based | fish recognition/feeding detection | [91] |
| CNN | image-based | feeding behaviour detection | [92] |
| Dual-Stream Recurrent Network | video-based | spatial and motion information (Atlantic salmon) | [64] |
| CNN | image-based | feeding behaviour (Tilapia) | [93] |
| RNN | sensors | fish growth/environment modelling | [42] |
| Computer vision feeding index | image-based | feeding activity assessment | [94] |
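Several of the image-based entries above build on the same primitive: a feeding-activity score computed from pixel change between consecutive frames, which is then bucketed into intensity classes. The sketch below is a deliberately simplified illustration of that idea, not an implementation of any cited method; the frame data, class cut-offs, and function names are invented for the example:

```python
# Toy frame-difference feeding-activity index: activity is the mean absolute
# intensity change between two grayscale frames, mapped to a coarse class.
# Frames are plain lists of pixel rows; all values here are synthetic.

def activity_index(prev, curr):
    """Mean absolute pixel difference between two equal-size grayscale frames."""
    total = sum(abs(a - b) for row_p, row_c in zip(prev, curr)
                for a, b in zip(row_p, row_c))
    n_pixels = len(prev) * len(prev[0])
    return total / n_pixels

def feeding_intensity(index, weak=5.0, strong=20.0):
    """Map an activity index to a coarse feeding-intensity label."""
    if index >= strong:
        return "strong"
    if index >= weak:
        return "weak"
    return "none"

# Two tiny 2x3 synthetic frames; the second simulates surface agitation.
frame_a = [[10, 10, 10], [10, 10, 10]]
frame_b = [[40, 10, 55], [10, 35, 10]]
idx = activity_index(frame_a, frame_b)  # (30 + 45 + 25) / 6 = 16.67
print(feeding_intensity(idx))           # weak
```

Real systems replace the raw difference with optical flow, texture descriptors, or a learned CNN score, but the pipeline shape (frame pair → activity index → intensity class → feeder decision) is the same.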
Table 3. AI fish behaviour monitoring methods.
| Method/Model | Data Type | Application | References |
|---|---|---|---|
| AquaYOLO | image-based | fish detection | [99] |
| AMA-Net | video-based | appearance and motion (Oplegnathus punctatus) | [118] |
| Graph Convolution Networks (GCN) | image-based | swimming/spatial features (Oncorhynchus mykiss) | [119] |
| VGG16 + active learning | image-based | swimming behaviour | [120] |
| Multi-task network | multimodal | group activity level (Oplegnathus punctatus) | [83] |
| CNN | image-based | fish recognition | [9] |
| CNN | image-based | spatial information (Tilapia) | [90] |
| LeNet-5 | image-based | swimming behaviour and activity (Tilapia) | [98] |
| Recurrent network | image-based | fish behaviour recognition | [34] |
| SVM + image texture | image-based | fish behaviour analysis | [84] |
| Social force model | simulation | fish schooling dynamics | [25] |
| Grayscale + edge detection | image-based | fish location and quantity | [35] |
| Image edge detection + threshold segmentation | image-based | behaviour analysis | [121] |
| Image processing methods | image-based | group dispersion and activity index | [122] |
| Adaptive threshold + edge detection | image-based | swimming velocity and direction | [123] |
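Among the simpler metrics in the table, the group dispersion index is easy to make concrete: once per-frame fish centroids have been detected, dispersion is the mean distance of individuals from the school centroid, and a drop in dispersion (tighter schooling) is commonly read as a stress response. The sketch below is a minimal, hypothetical version of such an index; the coordinates are made up for illustration:

```python
# Toy group dispersion index: mean Euclidean distance of detected fish
# centroids from their group centroid. Input coordinates are synthetic.
import math

def dispersion_index(centroids):
    """Mean distance of fish centroids from the group centroid."""
    cx = sum(x for x, _ in centroids) / len(centroids)
    cy = sum(y for _, y in centroids) / len(centroids)
    return sum(math.hypot(x - cx, y - cy) for x, y in centroids) / len(centroids)

calm = [(0, 0), (10, 0), (0, 10), (10, 10)]   # spread out across the tank
stressed = [(4, 4), (5, 4), (4, 5), (5, 5)]   # tightly schooled
print(dispersion_index(calm) > dispersion_index(stressed))  # True
```

Tracked over time, such an index gives a cheap, per-frame welfare signal that can trigger closer inspection by the heavier video-based models listed above.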

Share and Cite

D’Agaro, E. Fish Farming 5.0: Advanced Tools for a Smart Aquaculture Management. Appl. Sci. 2025, 15, 12638. https://doi.org/10.3390/app152312638
