Mathematical Modeling and Computer Vision in Animal Activity or Behavior: 2nd Edition

A special issue of Animals (ISSN 2076-2615). This special issue belongs to the section "Animal System and Management".

Deadline for manuscript submissions: 31 December 2025 | Viewed by 5552

Special Issue Editors


Guest Editor
College of Electronic Engineering, South China Agricultural University, Guangzhou 510642, China
Interests: precision livestock farming; computer vision; behavior detection and analysis; animal tracking; animal welfare

Guest Editor
College of Electronic Engineering, South China Agricultural University, Guangzhou 510642, China
Interests: precision livestock farming; computer vision; behavior detection and analysis; animal tracking; animal welfare

Special Issue Information

Dear Colleagues,

The Special Issue "Mathematical Modeling and Computer Vision in Animal Activity or Behavior: 2nd Edition" of the journal Animals focuses on the application of advanced computational techniques to the study of animal behavior and activity. It highlights the use of mathematical modeling and computer vision technologies to understand, monitor, and predict animal behavior across species ranging from domestic animals to wildlife.

The scope of the Special Issue encompasses the development and implementation of algorithms and models that analyze animals’ movements, behavior patterns, and interactions within their environment. It encourages submissions that apply techniques such as machine learning, image processing, and sensor-based systems to capture and interpret behavioral data in an automated and objective manner. These approaches allow for real-time monitoring and detailed analysis that can improve animal welfare, conservation, and management.

This Special Issue invites contributions that explore novel computational methods, interdisciplinary approaches, and case studies demonstrating practical applications in fields such as ethology, ecology, animal husbandry, and veterinary science. By advancing the technological frontiers of behavioral studies, it will offer valuable insights into animal biology, improve welfare standards, and optimize productivity in agricultural and conservation settings.

Dr. Haiming Gan
Prof. Dr. Yueju Xue
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Animals is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and written in good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • animal behavior
  • mathematical modeling
  • computer vision
  • machine learning
  • behavior prediction
  • image processing
  • automated monitoring
  • sensor-based systems
  • animal welfare
  • livestock farming

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • Reprint: MDPI Books provides the opportunity to republish successful Special Issues in book format, both online and in print.

Further information on MDPI's Special Issue policies can be found on the MDPI website.


Published Papers (6 papers)


Research

29 pages, 21305 KiB  
Article
Collaborative Optimization of Model Pruning and Knowledge Distillation for Efficient and Lightweight Multi-Behavior Recognition in Piglets
by Yizhi Luo, Kai Lin, Zixuan Xiao, Yuankai Chen, Chen Yang and Deqin Xiao
Animals 2025, 15(11), 1563; https://doi.org/10.3390/ani15111563 - 27 May 2025
Viewed by 373
Abstract
In modern intensive pig farming, accurately monitoring piglet behavior is crucial for health management and production efficiency. However, the complexity of existing models demands high computational resources, limiting the application of piglet behavior recognition in farming environments. In this study, the piglet multi-behavior recognition approach is divided into three stages. In the first stage, the LAMP pruning algorithm prunes redundant channels, yielding the lightweight YOLOv8-Prune. In the second stage, the AIFI module and the Gather–Distribute mechanism are incorporated into YOLOv8, yielding YOLOv8-GDA. In the third stage, with YOLOv8-GDA as the teacher model and YOLOv8-Prune as the student model, knowledge distillation further enhances detection accuracy, producing the YOLOv8-Piglet model, which balances detection accuracy and speed. Compared to the baseline model, YOLOv8-Piglet significantly reduces model complexity while improving detection performance, with a 6.3% increase in precision, an 11.2% increase in recall, and an mAP@0.5 of 91.8%. The model was deployed on the NVIDIA Jetson Orin NX edge computing platform for evaluation; the average inference time fell from 353.9 ms to 163.2 ms, a 53.8% reduction in processing time. This study balances model compression and recognition accuracy through the collaborative optimization of pruning and knowledge distillation. Full article
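The teacher–student stage described above transfers the soft output distribution of the teacher (YOLOv8-GDA) to the pruned student. As a minimal, hypothetical sketch (the paper's actual distillation formulation is not reproduced here), the classic temperature-softened KL-divergence loss can be written in plain Python:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-softened softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    """KL divergence between softened teacher and student distributions.

    The T^2 factor is the standard scaling that keeps gradient magnitudes
    comparable across temperatures.
    """
    p = softmax(teacher_logits, temperature)  # teacher: soft targets
    q = softmax(student_logits, temperature)  # student: predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return temperature ** 2 * kl
```

Identical logits give zero loss; the further the student's distribution drifts from the teacher's, the larger the penalty, which is what drives the student toward the teacher's behavior during training.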

18 pages, 4718 KiB  
Article
SDGTrack: A Multi-Target Tracking Method for Pigs in Multiple Farming Scenarios
by Tao Liu, Dengfei Jie, Junwei Zhuang, Dehui Zhang and Jincheng He
Animals 2025, 15(11), 1543; https://doi.org/10.3390/ani15111543 - 24 May 2025
Viewed by 306
Abstract
In pig farming, multi-object tracking (MOT) algorithms are effective tools for identifying individual pigs and monitoring their health, enhancing management efficiency and intelligence. However, because breeding environments vary considerably across pig farms, existing models often struggle in unfamiliar settings. To improve generalization across diverse tracking scenarios, we propose SDGTrack. The method improves tracking performance across farming environments by enhancing the model's adaptability to different domains and integrating an optimized tracking strategy. To evaluate SDGTrack comprehensively, we constructed a multi-scenario dataset that includes both public and private data spanning ten distinct pig farming environments. Only a portion of the daytime scenes was used for training; the remaining daytime and nighttime scenes served as the validation set. Experimental results show that SDGTrack achieved a MOTA of 80.9%, 24 ID switches (IDSW), and an IDF1 of 85.1% across scenarios. Compared to the original CSTrack method, SDGTrack improved MOTA and IDF1 by 16.7% and 33.3%, respectively, while reducing ID switches by 94.6%. These findings indicate that SDGTrack offers robust tracking in previously unseen farming environments, providing a strong technical foundation for monitoring pigs in different settings. Full article
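The MOTA and IDF1 figures reported above are the standard CLEAR-MOT and identification metrics. A small sketch of how those headline numbers are computed from raw matching counts (variable names are illustrative, not from the paper):

```python
def mota(num_gt, false_negatives, false_positives, id_switches):
    """Multi-Object Tracking Accuracy: 1 - (FN + FP + IDSW) / GT,
    where GT is the total number of ground-truth object instances."""
    return 1.0 - (false_negatives + false_positives + id_switches) / num_gt

def idf1(idtp, idfp, idfn):
    """Identification F1: harmonic mean of ID precision and ID recall,
    computed from identity-level true/false positives and false negatives."""
    return 2 * idtp / (2 * idtp + idfp + idfn)
```

For example, 1000 ground-truth instances with 100 misses, 80 false alarms, and 11 identity switches yield a MOTA of 0.809, matching the scale of the scores quoted in the abstract.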

16 pages, 1891 KiB  
Article
Detecting Equine Gaits Through Rider-Worn Accelerometers
by Jorn Schampheleer, Anniek Eerdekens, Wout Joseph, Luc Martens and Margot Deruyck
Animals 2025, 15(8), 1080; https://doi.org/10.3390/ani15081080 - 8 Apr 2025
Viewed by 479
Abstract
Automatic horse gait classification offers insights into training intensity, but direct sensor attachment to horses raises concerns about discomfort, behavioral disruption, and entanglement risks. To address this, our study leverages rider-centric accelerometers for movement classification. The position of a sensor, sampling frequency, and window size of segmented signal data have a major impact on classification accuracy in activity recognition. Yet, no studies have evaluated the effect of all these factors simultaneously using accelerometer data from four distinct rider locations (the knee, backbone, chest, and arm) across five riders and seven horses performing three gaits. A total of eight models were compared, and an LSTM-convolutional network (ConvLSTM2D) achieved the highest accuracy, with an average accuracy of 89.72% across four movements (halt, walk, trot, and canter). The model performed best with an interval width of four seconds and a sampling frequency of 25 Hz. Additionally, an F1-score of 86.18% was achieved and validated using LOSOCV (Leave-One-Subject-Out Cross-Validation). Full article
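The four-second window at 25 Hz reported as optimal corresponds to 100-sample segments. A minimal sketch of that segmentation step, assuming a 50% overlap that the abstract does not specify:

```python
def segment_windows(samples, fs_hz=25, window_s=4.0, overlap=0.5):
    """Split a 1-D accelerometer stream into fixed-length windows.

    fs_hz * window_s gives the samples per window (100 at 25 Hz / 4 s);
    the stride is derived from the (assumed) fractional overlap.
    """
    win = int(fs_hz * window_s)
    step = max(1, int(win * (1 - overlap)))
    return [samples[i:i + win]
            for i in range(0, len(samples) - win + 1, step)]
```

Each resulting window would then be fed to the classifier as one training or inference example; in practice the samples are 3-axis vectors rather than scalars, but the slicing logic is the same.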

26 pages, 73296 KiB  
Article
Leveraging Thermal Infrared Imaging for Pig Ear Detection Research: The TIRPigEar Dataset and Performances of Deep Learning Models
by Weihong Ma, Xingmeng Wang, Simon X. Yang, Lepeng Song and Qifeng Li
Animals 2025, 15(1), 41; https://doi.org/10.3390/ani15010041 - 27 Dec 2024
Viewed by 1003
Abstract
The stable physiological structure and rich vascular network of pig ears give them distinct thermal characteristics that reflect temperature variations. While pig ear temperature does not directly represent core body temperature, owing to the ear's role in thermoregulation, thermal infrared imaging offers a feasible approach to analyzing individual pig status. Against this background, a dataset comprising 23,189 thermal infrared images of pig ears (TIRPigEar) was established. The TIRPigEar dataset was collected by a pig-house inspection robot equipped with an infrared thermal imaging device, with post-processing conducted via manual annotation. Labeling the pig ears within these images produced 69,567 label files, which can be used directly to train pig ear detection models and, combined with the corresponding thermal imaging data, to analyze pig temperature information. To validate the dataset's utility, it was evaluated across various object detection algorithms. Experimental results show that the dataset achieves the highest precision, recall, and mAP50 on the YOLOv9m model, reaching 97.35%, 98.1%, and 98.6%, respectively. Detecting pig ear information with thermal infrared imaging provides a non-contact, rapid, and effective method. The TIRPigEar dataset is a valuable resource for AI and precision livestock farming researchers to validate and improve their algorithms, supporting efficient pig ear temperature analysis. Full article
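The mAP50 figure reported above is the mean over classes of average precision at an IoU threshold of 0.5. A hypothetical sketch of single-class AP via all-point interpolation (matching detections against ground truth is assumed to have already happened):

```python
def average_precision(detections, num_gt):
    """AP via all-point interpolation of the precision-recall curve.

    detections: list of (confidence, is_true_positive) pairs, one per
    predicted box after IoU matching against ground truth.
    """
    ranked = sorted(detections, key=lambda d: -d[0])
    tp = fp = 0
    points = []  # (recall, precision) after each detection in rank order
    for _, is_tp in ranked:
        if is_tp:
            tp += 1
        else:
            fp += 1
        points.append((tp / num_gt, tp / (tp + fp)))
    # Interpolate: each recall level takes the max precision to its right.
    interp, max_prec = [], 0.0
    for r, p in reversed(points):
        max_prec = max(max_prec, p)
        interp.append((r, max_prec))
    interp.reverse()
    # Integrate the interpolated precision over recall.
    ap, prev_recall = 0.0, 0.0
    for r, p in interp:
        ap += (r - prev_recall) * p
        prev_recall = r
    return ap
```

Averaging this quantity over all object classes at IoU 0.5 gives mAP50, the metric quoted for YOLOv9m on TIRPigEar.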

19 pages, 7047 KiB  
Article
A Real-Time Lightweight Behavior Recognition Model for Multiple Dairy Goats
by Xiaobo Wang, Yufan Hu, Meili Wang, Mei Li, Wenxiao Zhao and Rui Mao
Animals 2024, 14(24), 3667; https://doi.org/10.3390/ani14243667 - 19 Dec 2024
Cited by 1 | Viewed by 1141
Abstract
Livestock behavior serves as a crucial indicator of physiological health. Leveraging deep learning to automatically recognize dairy goat behaviors, particularly abnormal ones, enables early detection of potential health and environmental issues. To address the challenges of recognizing small-target behaviors in complex environments, a multi-scale, lightweight behavior recognition model for dairy goats, GSCW-YOLO, was proposed. The model integrates Gaussian Context Transformation (GCT) and the Content-Aware Reassembly of Features (CARAFE) upsampling operator, sharpening the YOLOv8n framework's attention to behavioral features, reducing interference from complex backgrounds, and improving the ability to distinguish subtle behavioral differences. Additionally, GSCW-YOLO incorporates a small-target detection layer and optimizes the Wise-IoU loss function, increasing its effectiveness in detecting distant small-target behaviors and transient abnormal behaviors in surveillance videos. Data were collected via video surveillance under varying lighting conditions, and the model was evaluated on a self-constructed dataset of 9213 images. GSCW-YOLO achieved a precision of 93.5%, a recall of 94.1%, and a mean Average Precision (mAP) of 97.5%, improvements of 3, 3.1, and 2 percentage points, respectively, over the YOLOv8n model. GSCW-YOLO is also highly efficient, with a model size of just 5.9 MB and a frame rate of 175 frames per second (FPS). It outperforms popular models such as CenterNet, EfficientDet, and other YOLO-series networks, providing significant technical support for the intelligent management and welfare-focused breeding of dairy goats and advancing the modernization of the dairy goat industry. Full article
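The Wise-IoU loss mentioned above builds a dynamic focusing weight on top of the plain IoU overlap between predicted and ground-truth boxes. A minimal sketch of that base quantity, with boxes given as (x1, y1, x2, y2) corner tuples:

```python
def iou(box_a, box_b):
    """Intersection over Union of two axis-aligned (x1, y1, x2, y2) boxes."""
    # Corners of the intersection rectangle (may be empty).
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)
```

Wise-IoU then reweights the per-box loss 1 - IoU according to the box's "outlier" degree relative to the batch, so that low-quality examples do not dominate training; the exact weighting scheme is defined in the Wise-IoU paper rather than here.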

16 pages, 6692 KiB  
Article
Behavior Tracking and Analyses of Group-Housed Pigs Based on Improved ByteTrack
by Shuqin Tu, Haoxuan Ou, Liang Mao, Jiaying Du, Yuefei Cao and Weidian Chen
Animals 2024, 14(22), 3299; https://doi.org/10.3390/ani14223299 - 16 Nov 2024
Cited by 3 | Viewed by 1467
Abstract
Daily behavioral analysis of group-housed pigs provides critical insights for early warning of pig health issues and for animal welfare in smart pig farming. Our main objective was to develop an automated method, named Pig-ByteTrack, for monitoring and analyzing the behavior of group-reared pigs so that health problems can be detected promptly and animal welfare improved. The approach addresses target detection, multi-object tracking (MOT), and behavioral time computation for each pig. The YOLOX-X detection model is employed for pig detection and behavior recognition, followed by Pig-ByteTrack for tracking behavioral information. On 1 min videos, Pig-ByteTrack achieved a Higher Order Tracking Accuracy (HOTA) of 72.9%, a Multi-Object Tracking Accuracy (MOTA) of 91.7%, an identification F1 score (IDF1) of 89.0%, and 41 ID switches (IDs). Compared with ByteTrack and TransTrack, Pig-ByteTrack achieved significant improvements in HOTA, IDF1, MOTA, and IDs. On 10 min videos, Pig-ByteTrack achieved a HOTA of 59.3%, a MOTA of 89.6%, an IDF1 of 53.0%, and 198 IDs. Experiments on video datasets demonstrate the method's efficacy in behavior recognition and tracking, offering technical support for health and welfare monitoring of pig herds. Full article
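The behavioral time computation stage can be sketched as accumulating per-frame behavior labels along each pig's track and converting frame counts to seconds. A hypothetical illustration (the label format is assumed, not taken from the paper):

```python
def behavior_times(track_labels, fps=25):
    """Accumulate seconds spent in each behavior from per-frame labels.

    track_labels: {pig_id: [behavior label for each frame, ...]}
    Returns {pig_id: {behavior: seconds}}, dividing frame counts by fps.
    """
    result = {}
    for pig_id, labels in track_labels.items():
        counts = {}
        for lab in labels:
            counts[lab] = counts.get(lab, 0) + 1
        result[pig_id] = {b: n / fps for b, n in counts.items()}
    return result
```

For instance, a track with 50 "lying" frames and 25 "eating" frames at 25 fps yields 2.0 s of lying and 1.0 s of eating; summing such totals per pig over a day gives the daily behavior budget the abstract describes.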
