Artificial Intelligence as a Useful Tool in Behavioural Studies

A special issue of Animals (ISSN 2076-2615). This special issue belongs to the section "Animal System and Management".

Deadline for manuscript submissions: 31 August 2026 | Viewed by 17646

Special Issue Editors

Special Issue Information

Dear Colleagues,

Animal monitoring is essential for understanding behaviour and social dynamics. Ethologists use focal animal sampling to collect detailed information on individual animals, which helps measure activity budgets and shape conservation strategies. This approach is particularly important for studying captive animals, as their ability to display natural behaviours is vital for their welfare and for reintroduction success. The rapid advancement of subdisciplines within artificial intelligence, such as machine learning (ML), presents a unique opportunity for the behavioural sciences. The growing availability of user-friendly ML software allows behavioural scientists to enhance the value and reproducibility of their research findings while also reducing their workload. The automated recording and quantification of behavioural traits create numerous possibilities for comparative studies, enabling robust quantitative conclusions. Consequently, there is an urgent need to integrate artificial intelligence into the behavioural sciences for both in situ and ex situ studies of wild and captive populations. Furthermore, the application of AI techniques will facilitate quantitative assessments of animal welfare, which may also attract researchers in the animal production field.
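The activity budgets mentioned above reduce to a simple proportion-of-time calculation over behavioural records. A minimal sketch in Python, with illustrative behaviour labels and durations not drawn from any particular study:

```python
from collections import Counter

def time_budget(observations):
    """Proportion of observed time spent on each behaviour, given
    focal-sampling records as (behaviour, duration_seconds) tuples."""
    totals = Counter()
    for behaviour, seconds in observations:
        totals[behaviour] += seconds
    grand_total = sum(totals.values())
    return {b: s / grand_total for b, s in totals.items()}

# Ten minutes of hypothetical focal observations.
obs = [("rest", 300), ("feed", 120), ("locomote", 60), ("rest", 120)]
budget = time_budget(obs)  # fractions summing to 1.0
```

Automated detection pipelines produce exactly this kind of (behaviour, duration) stream, which is what makes ML-derived and manually collected budgets directly comparable.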

The aim of this Special Issue is to gather papers, including original research articles, reviews, short communications, perspectives, and technical notes, that provide insights into the application of artificial intelligence techniques in behavioural studies. By incorporating machine learning tools, the contributions to this Special Issue will enable precise quantification of the duration, speed, and direction of the behaviours under examination.

This Special Issue welcomes manuscripts that address one or more of the following themes:

  • Potential and perspectives of machine learning techniques for behavioural studies;
  • Use of machine learning techniques in behavioural science, including the automation of behavioural recording and analysis;
  • Synergistic effects of combining machine learning techniques with more conventional behavioural study methods.

We look forward to receiving your original research articles and reviews.

Prof. Dr. Cino Pertoldi
Dr. Sussie Pagh
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 250 words) can be sent to the Editorial Office for assessment.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Animals is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • machine learning
  • artificial intelligence
  • YOLO
  • DeepLabCut
  • SLEAP
  • Create ML
  • in situ conservation
  • ex situ conservation
  • behavioural studies
  • wildlife biology
  • animal production
  • ethograms
  • time budget

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • Reprint: MDPI Books provides the opportunity to republish successful Special Issues in book format, both online and in print.

Further information on MDPI's Special Issue policies can be found on the MDPI website.

Published Papers (9 papers)


Research


14 pages, 127365 KB  
Article
CGS-BR: Construction and Benchmarking of a Respiratory Behavior Dataset for the Chinese Giant Salamander
by Dingwei Mao, Yan Zhou, Maochun Wang, Chenyang Shi, Yuanqiong Chen and Qinghua Luo
Animals 2026, 16(8), 1272; https://doi.org/10.3390/ani16081272 - 21 Apr 2026
Viewed by 242
Abstract
The Chinese giant salamander (Andrias davidianus) is a nationally protected species in China, and its respiratory behavior serves as a key indicator of its physiological state, health status, and biological rhythm. However, research on intelligent monitoring of its respiratory behavior remains limited due to several challenges, including the species’ nocturnal habits, resulting in low image contrast and poor quality in dark environments; extremely subtle breathing movements; and high-cost manual annotation, leading to a scarcity of high-quality annotated visual data. These factors severely constrain the application of deep learning techniques in this field. To support research on respiratory behavior monitoring in the Chinese giant salamander, this study constructs and releases the CGS-BR dataset, which is the first vision-based dataset dedicated specifically to respiratory behavior detection in this species. The dataset was collected under controlled simulated breeding conditions and consists of 1732 images extracted from 215 high-definition video clips. Following a standardized procedure, each complete respiratory cycle is manually annotated into four stages: head-up, diving, exhalation, and inhalation. To validate the effectiveness of this dataset, this study selects YOLOv8n as the baseline model, which balances detection accuracy, speed, and parameter count, enabling efficient giant salamander respiratory detection under limited resources. By comparing it with several representative models, we provide a reliable evaluation of the dataset’s applicability. CGS-BR aims to provide fundamental data support for research on respiratory monitoring in the Chinese giant salamander, laying the foundation for subsequent applications in conservation management, captive breeding, health monitoring, and early disease warning.
(This article belongs to the Special Issue Artificial Intelligence as a Useful Tool in Behavioural Studies)
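The four-stage annotation scheme in the abstract above lends itself to a simple consistency check on per-frame stage labels. The sketch below assumes the stages occur in the listed order (head-up, diving, exhalation, inhalation), which the abstract does not state explicitly; the function and label names are hypothetical, not from the paper:

```python
STAGES = ("head-up", "diving", "exhalation", "inhalation")

def valid_cycle(labels, stages=STAGES):
    """True if a sequence of per-frame stage labels traverses all four
    stages of one respiratory cycle in order, allowing each stage to
    span several consecutive frames."""
    idx = 0
    for lab in labels:
        if lab == stages[idx]:
            continue                    # still within the current stage
        if idx + 1 < len(stages) and lab == stages[idx + 1]:
            idx += 1                    # advanced to the next stage
            continue
        return False                    # out-of-order or unknown label
    return idx == len(stages) - 1       # cycle must reach the final stage

ok = valid_cycle(["head-up", "head-up", "diving",
                  "exhalation", "exhalation", "inhalation"])
bad = valid_cycle(["head-up", "exhalation"])  # skips two stages
```

Checks of this kind are a cheap way to catch annotation errors before a dataset is used for training.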

17 pages, 55937 KB  
Article
Applicability of Machine Learning in Behavioural Monitoring of the Red Panda (Ailurus fulgens) in Zoos
by Amalie M. Worup, Anne S. Sonne, Jeppe Kudahl, Johanne H. Jacobsen, Sussie Pagh, Thea L. Faddersbøll and Cino Pertoldi
Animals 2026, 16(8), 1165; https://doi.org/10.3390/ani16081165 - 10 Apr 2026
Viewed by 706
Abstract
Welfare assessment for the endangered red panda (Ailurus fulgens) in captivity requires systematic behaviour monitoring, yet traditional direct observation is often limited by observer subjectivity and time constraints. This study evaluates the feasibility of employing machine learning (ML) to automate behavioural monitoring of a red panda in a complex, mixed-species enclosure at Aalborg Zoo, Denmark. Using video data from cameras in the enclosure of the red panda, and the ML model LabGym for animal detection and behavioural categorisation, models were trained to analyse activity patterns of the red panda. The results demonstrate that, while the behaviour categorizer is a promising tool with high classification confidence, the overall system effectiveness is currently limited by the object detector’s performance in a naturalistic environment. Challenges such as environmental obstructions (e.g., rocks, foliage, and trees) and the animal’s camouflage contributed to a significant amount of unclassified time, which may affect the overall assessment of behavioural distribution. We conclude that, while ML holds potential for non-invasive behaviour monitoring, its application in complex zoo settings requires improved detection capabilities to be fully reliable. Future iterations of this system could be enhanced by complementing standard object detection with pose estimation frameworks. Implementing alternative labelling strategies or background subtraction methods could additionally mitigate the detection challenges posed by environmental obstruction.

20 pages, 55314 KB  
Article
MSFN-YOLOv11: A Novel Multi-Scale Feature Fusion Recognition Model Based on Improved YOLOv11 for Real-Time Monitoring of Birds in Wetland Ecosystems
by Linqi Wang, Lin Ye, Xinbao Chen and Nan Chu
Animals 2025, 15(23), 3472; https://doi.org/10.3390/ani15233472 - 2 Dec 2025
Cited by 4 | Viewed by 1056
Abstract
Intelligent bird species recognition is vital for biodiversity monitoring and ecological conservation. This study tackles the challenge of declining recognition accuracy caused by occlusions and imaging noise in complex natural environments. Focusing on ten representative bird species from the Dongting Lake Wetland, we propose an improved YOLOv11n-based model named MSFN-YOLO11, which incorporates multi-scale feature fusion. After selecting YOLOv11n as the baseline through comparison with the most-stable version of YOLOv8n, we enhance its backbone by introducing an MSFN module. This module strengthens global and local feature extraction via parallel dilated convolution and a channel attention mechanism. Experiments are conducted on a self-built dataset containing 4540 images of ten species with 6824 samples. To simulate real-world conditions, 25% of samples are augmented using random occlusion, Gaussian noise (σ = 0.2, 0.3, 0.4), and Poisson noise. The improved model achieves a mAP@50 of 96.4% and mAP@50-95 of 83.2% on the test set. Although the mAP@50 shows a slight improvement of 0.3% compared to the original YOLOv11, it has contributed to an 18% reduction in training time. Furthermore, it also demonstrates practical efficacy in processing dynamic video, attaining an average 63.1% accuracy at 1920 × 1080@72fps on an NVIDIA_Tesla_V100_SXM2_32_GB. The proposed model provides robust technical support for real-time bird monitoring in wetlands and enhances conservation efforts for endangered species.
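Gaussian pixel-noise augmentation of the kind described in this abstract can be sketched in a few lines of pure Python. Real pipelines would use NumPy or an augmentation library, and the clamping of noisy values back into [0, 1] here is an assumption about how out-of-range pixels are handled, not a detail taken from the paper:

```python
import random

def add_gaussian_noise(pixels, sigma=0.2, seed=0):
    """Augment a normalised [0, 1] greyscale image (list of rows) with
    Gaussian pixel noise at standard deviation sigma, clamping the
    result back into range."""
    rng = random.Random(seed)  # fixed seed for reproducible augmentation
    return [[min(1.0, max(0.0, p + rng.gauss(0.0, sigma)))
             for p in row] for row in pixels]

# A flat mid-grey 4x4 toy image; sigma matches one of the paper's settings.
img = [[0.5] * 4 for _ in range(4)]
noisy = add_gaussian_noise(img, sigma=0.2)
```

Perturbing a fixed fraction of training samples this way lets the model see degraded inputs without changing the labels, which is what makes the robustness comparison in the abstract possible.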

24 pages, 6719 KB  
Article
Structure-Aware Multi-Animal Pose Estimation for Space Model Organism Behavior Analysis
by Kang Liu, Shengyang Li, Yixuan Lv, Rong Yang and Xuzhi Li
Animals 2025, 15(21), 3139; https://doi.org/10.3390/ani15213139 - 29 Oct 2025
Viewed by 1441
Abstract
Multi-animal pose estimation is a critical technique for enabling fine-grained quantification of group animal behaviors, which holds significant scientific value for uncovering behavioral changes under space environmental factors such as microgravity and radiation. Currently, the China Space Station has conducted a series of space biology experiments involving typical model organisms, including Caenorhabditis elegans (C. elegans), zebrafish, and Drosophila. However, substantial differences in species types, body scales, and posture dynamics among these animals pose serious challenges to the generalization and robustness of traditional pose estimation methods. To address this, we propose a novel, flexible, and general single-stage multi-animal pose estimation method. The method constructs species-specific pose group representations based on anatomical priors, incorporates a multi-scale feature-sampling module to integrate shallow and deep visual cues, and employs a structure-guided learning mechanism to enhance keypoint localization robustness under occlusion and overlap. We evaluate our method on the SpaceAnimal dataset—the first public benchmark for pose estimation and tracking of model organisms in space—containing multi-species samples from both spaceflight and ground-based experiments. Our method achieves AP scores of 72.8%, 62.1%, and 67.1% on C. elegans, zebrafish, and Drosophila, respectively, surpassing the state-of-the-art performance. These findings demonstrate the effectiveness and robustness of the proposed method across species and imaging conditions, offering strong technical support for on-orbit behavior modeling and large-scale quantitative analysis.

11 pages, 1777 KB  
Communication
Comparing Manual and Automated Spatial Tracking of Captive Spider Monkeys Using Heatmaps
by Silje Marquardsen Lund, Frej Gammelgård, Jonas Nielsen, Laura Liv Nørgaard Larsen, Ninette Christensen, Sisse Puck Hansen, Trine Kristensen, Henriette Høyer Ørneborg Rodkjær, Shanthiya Manoharan Sivagnanasundram, Bianca Østergaard Thomsen, Sussie Pagh, Thea Loumand Faddersbøll and Cino Pertoldi
Animals 2025, 15(20), 3056; https://doi.org/10.3390/ani15203056 - 21 Oct 2025
Cited by 2 | Viewed by 1758
Abstract
Animal welfare assessments increasingly aim to quantify enclosure use and activity to support naturalistic behavior and improve Quality of Life (QoL). Traditionally, this is achieved through manual observations, which are time-consuming, subject to observer bias, and limited in temporal resolution due to short observation periods. Here, we compared manual tracking using ZooMonitor with automated pose estimation (SLEAP) in a mother–son pair of black-headed spider monkeys (Ateles fusciceps) at Aalborg Zoo. We collected manual observations on six non-consecutive days (median daily duration: 62 min, mean: 66 min, range: 52–90 min) and visualized this as spatial heatmaps. We applied pose estimation to the same video footage, tracking four body parts to generate corresponding heatmaps. Across most days, the methods showed strong agreement (overlap 83–99%, Pearson’s r = 0.93–1.00), with both highlighting core activity areas on the floor near the central climbing structures and by the door with feeding gutters. Both methods also produced comparable estimates of time spent being active, with no significant difference across days (p = 0.952). Our results demonstrate that computer vision technology can provide a reliable and scalable tool for monitoring enclosure use and activity, enhancing the efficiency and consistency of zoo-based welfare assessments while reducing reliance on labor-intensive manual observations.

20 pages, 2770 KB  
Article
Foundations of Livestock Behavioral Recognition: Ethogram Analysis of Behavioral Definitions and Its Practices in Multimodal Large Language Models
by Siling Zhou, Wenjie Li, Mengting Zhou, Ryan N. Dilger, Isabella C. F. S. Condotta, Zhonghong Wu, Xiangfang Tang, Yiqi Wu, Tao Wang and Jiangong Li
Animals 2025, 15(20), 3030; https://doi.org/10.3390/ani15203030 - 19 Oct 2025
Viewed by 1885
Abstract
Computer vision offers a promising approach to automating the observation of animal behavior, thereby contributing to improved animal welfare and precision livestock management. However, the absence of standardized behavioral definitions limits the accuracy and generalizability of artificial intelligence models used for behavior recognition. This study applied natural language processing techniques to analyze 655 behavior definitions related to feeding, drinking, resting, and moving, as reported in the livestock research literature published between 2000 and 2023. Clustering and structural analyses revealed consistent semantic patterns across behavior categories. Feeding and drinking behaviors were concisely defined in 6–10 words, including the semantic elements of body parts, actions, and action objects. Resting and moving behaviors were described in 6–15 words. Resting behavior was defined by actions and action objects, while moving behaviors were characterized by action words only. By integrating these structured definitions into prompts, ChatGPT-4o achieved an average correspondence score of 4.53 out of 5 in an image-based piglet behavior annotation task. These findings highlight the value of standardized behavior definitions in supporting more accurate and generalizable behavior recognition models for precision livestock farming.

16 pages, 25315 KB  
Article
A Deep Learning Framework for Multi-Object Tracking in Space Animal Behavior Studies
by Zhuang Zhou, Shengyang Li, Yixuan Lv, Kang Liu, Yuxuan Cao and Shicheng Guo
Animals 2025, 15(16), 2448; https://doi.org/10.3390/ani15162448 - 20 Aug 2025
Viewed by 1723
Abstract
In space environments, microgravity, high radiation, and weak magnetic fields induce behavioral alterations in animals, resulting in erratic movement patterns that complicate tracking. These challenges impede accurate behavioral analysis, especially in multi-object scenarios. To address this issue, this study proposes a deep learning-based multi-object tracking (MOT) framework specifically designed for space animals. The proposed method decouples appearance and motion features through dual-stream inputs and employs modality-specific encoders (MSEs), which are fused via a heterogeneous graph network to model cross-modal spatio-temporal relationships. Additionally, an object re-detection module is integrated to maintain identity continuity during occlusions or rapid movements. This approach is validated using public datasets of space-observed Drosophila and zebrafish, with experimental results demonstrating superior performance compared with existing tracking methods. This work highlights the potential of artificial intelligence as a valuable tool in behavioral studies, enabling reliable animal tracking and analysis under extreme space conditions and supporting future research in space life sciences.
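Maintaining identity continuity across frames is the core problem in any MOT pipeline. The paper's graph-based association is far more sophisticated, but the underlying idea of matching tracked boxes to new detections can be illustrated with a greedy intersection-over-union (IoU) assignment, a standard baseline sketch rather than the authors' method:

```python
def iou(a, b):
    """Intersection-over-union of two axis-aligned (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)

    def area(r):
        return (r[2] - r[0]) * (r[3] - r[1])

    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def greedy_match(tracks, detections, threshold=0.3):
    """Assign each existing track (id -> box) to its best-overlapping
    unclaimed detection; detections left unmatched would start new
    identities in a full tracker."""
    claimed, matches = set(), {}
    for tid, tbox in tracks.items():
        best, best_iou = None, threshold
        for di, dbox in enumerate(detections):
            if di in claimed:
                continue
            score = iou(tbox, dbox)
            if score > best_iou:
                best, best_iou = di, score
        if best is not None:
            matches[tid] = best
            claimed.add(best)
    return matches

# Two tracks, two detections that have each drifted by one pixel.
tracks = {1: (0, 0, 10, 10), 2: (20, 20, 30, 30)}
detections = [(21, 21, 31, 31), (1, 1, 11, 11)]
matches = greedy_match(tracks, detections)
```

Greedy IoU matching breaks down exactly where the paper focuses: occlusions and erratic motion, which is why learned appearance features and re-detection modules are layered on top.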

Review


34 pages, 1465 KB  
Review
AI and Data Analytics in the Dairy Farms: A Scoping Review
by Osvaldo Palma, Lluis M. Plà-Aragonés, Alejandro Mac Cawley and Víctor M. Albornoz
Animals 2025, 15(9), 1291; https://doi.org/10.3390/ani15091291 - 30 Apr 2025
Cited by 11 | Viewed by 6260
Abstract
The strong growth of the world population will cause a major increase in demand for bovine milk, making it necessary to use various technologies to increase milk production efficiently. Some technologies that can contribute to solving part of this problem are those related to data analytics tools, big data, and sensor development. It is timely to review modern technologies and data analytics methods for milk predictions in view of supporting decision-making in dairy farms. To this end, a scoping review was carried out, which resulted in 151 articles of interest. Among the most important results, we found that (i) the identified studies are relatively recent with an average publication time of 5.95 years; (ii) the scope of the selected studies is mostly concentrated on milk and prediction (29%), early detection of lameness (26%), and timely detection of mastitis (13%); (iii) the type of analysis is mostly predictive (87%), and prescriptive is barely present (3%); (iv) the types of input data used in the studies are preferably historical (70%), and real-time data (25%) are used less frequently; (v) we found that the method of artificial neural networks (47%) and the convolutional neural networks (24%) are the most used for the studies regarding bovine milk output predictions. In the selected studies, the artificial neural network methods have considerable accuracy, recall, precision, and F1 Scores on average but with high ranges and standard deviations. (vi) Simulation tools are scarcely used, being present in 4% of cases. In the treatment of variability, the models reviewed are mostly deterministic (77%), and the stochastic models (5%) are considered in a small number of cases. Based on our analysis, we conclude that future research on decision-making tools will benefit from the advantages of artificial neural networks in data analytics combined with optimization–simulation methods.

Other


23 pages, 3606 KB  
Protocol
Optimizing Feeding Schedule and Live-Weight Prediction for Native Chicken Based on Machine Learning
by Chung-Liang Chang and Rui-Yi Xu
Animals 2026, 16(1), 75; https://doi.org/10.3390/ani16010075 - 26 Dec 2025
Cited by 1 | Viewed by 1032
Abstract
To meet market supply and demand, producers must accurately schedule processing dates to ensure optimal pricing. This study developed a practical feeding program system for local Taiwanese chicken breeds, including Guzao males, Huangjin females, and Red Junglefowl males. The system integrates daily predictions of cage-level body weight to guide each flock toward a target weight before the planned processing date. Four prediction models were evaluated, including random forest, XGBoost, Extra Trees, and an artificial neural network. The best-performing model was embedded into the system, and an Extra Trees model was used to estimate the total remaining ration and update daily feed allocations under standard feeding conditions. A validation experiment was conducted using a 54-day batch of Guzao males, during which cage-level data were collected. The feed conversion ratio of birds managed under the feeding program was compared with that of conventional feeding. The results provide preliminary support for the feasibility of a data-guided feeding program system with potential agricultural application value, although additional batches and cross-farm evaluations are needed to confirm generalizability and operational performance.
