Precision Livestock Farming: New Techniques for Monitoring the Behaviour and Welfare of Farm Animals

A special issue of Animals (ISSN 2076-2615). This special issue belongs to the section "Animal Welfare".

Deadline for manuscript submissions: 31 October 2025 | Viewed by 6905

Special Issue Editor


Guest Editor
Department of Management, Development and Technology, School of Science and Engineering, São Paulo State University (UNESP), Av. Domingos da Costa Lopes, 780., Tupã 17602-496, SP, Brazil
Interests: poultry farming; heat stress; animal welfare assessment; image processing; computer vision; machine learning

Special Issue Information

Dear Colleagues,

Welfare is an intrinsic state of animals that is difficult to assess. Animals with good welfare are in good health, express natural behaviors, and reach their full genetic potential in production. Over the years, industrial production systems have guaranteed adequate feed, thermal environment, safety, and health for animals; however, they have neglected conditions important for well-being, such as the space to express natural behaviors.

Monitoring animal behavior is an arduous task for humans, yet behavior provides important information about animal welfare. The increased capacity and processing speed of modern computers make it possible to analyze large volumes of varied and unstructured data, develop complex multivariable models, and implement autonomous monitoring systems that record and analyze signals from sensors installed on the farm. This expanded capacity to study animals in the production environment has broadened our knowledge of the behavior and well-being of farm animals.

I invite you to submit original research articles, reviews, and case studies on new animal monitoring systems and new data analysis techniques that increase our knowledge of farm animals and enable a more accurate assessment of their welfare.

Prof. Dr. Danilo Florentino Pereira
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Animals is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • artificial intelligence
  • automation
  • big data
  • environmental control
  • image analysis
  • instrumentation
  • Internet of Things (IoT)
  • machine learning
  • remote monitoring
  • smart housing

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (7 papers)


Research

19 pages, 2887 KiB  
Article
Equivalence Between Optical Flow, the Unrest Index, and Walking Distance to Estimate the Welfare of Broiler Chickens
by Danilo Florentino Pereira, Irenilza de Alencar Nääs and Saman Abdanan Mehdizadeh
Animals 2025, 15(9), 1311; https://doi.org/10.3390/ani15091311 - 1 May 2025
Viewed by 124
Abstract
Modern poultry production demands scalable and non-invasive methods to monitor animal welfare, particularly as broiler strains are increasingly bred for rapid growth, often at the expense of mobility and health. This study evaluates two advanced computer vision techniques—Optical Flow and the Unrest Index—to assess movement patterns in broiler chickens. Three commercial broiler strains (Hybro®, Cobb®, and Ross®) were housed in controlled environments and continuously monitored using ceiling-mounted video systems. Chicken movements were detected and tracked using a YOLO model, with centroid data informing both the Unrest Index and distance walked metrics. Optical Flow velocity metrics (mean, variance, skewness, and kurtosis) were extracted using the Farnebäck algorithm. Pearson correlation analyses revealed strong associations between Optical Flow variables and traditional movement indicators, with average velocity showing the strongest correlation to walked distance and the Unrest Index. Among the evaluated strains, Cobb® demonstrated the strongest correlation between Optical Flow variance and the Unrest Index, indicating a distinct movement profile. The equipment’s movement and the camera’s slight instability had a minimal effect on the Optical Flow measurement. Still, its strong correlation with the Unrest Index and walking distance accredits it as an effective method for high-resolution behavioral monitoring. This study supports the integration of Optical Flow and Unrest Index technologies into precision livestock systems, offering a foundation for predictive welfare management at scale. Full article
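The Optical Flow velocity metrics named in the abstract (mean, variance, skewness, and kurtosis) amount to summary statistics over a dense flow field. A minimal sketch, assuming the flow has already been computed for a frame pair (e.g. with OpenCV's `cv2.calcOpticalFlowFarneback`); the function name and the synthetic field below are illustrative, not the authors' code:

```python
import numpy as np

def flow_stats(flow):
    """Summary statistics of per-pixel flow velocity magnitudes.

    flow: array of shape (H, W, 2) holding x/y displacement components,
    i.e. the output format of a dense optical-flow algorithm such as
    Farneback's.
    """
    mag = np.hypot(flow[..., 0], flow[..., 1]).ravel()
    mean = mag.mean()
    var = mag.var()
    std = mag.std()
    if std == 0.0:                 # zero-motion frame: avoid divide-by-zero
        std = 1.0
    skew = np.mean(((mag - mean) / std) ** 3)
    kurt = np.mean(((mag - mean) / std) ** 4) - 3.0  # excess kurtosis
    return {"mean": mean, "variance": var, "skewness": skew, "kurtosis": kurt}

# Synthetic example: uniform rightward motion of 2 px/frame everywhere
flow = np.zeros((48, 64, 2))
flow[..., 0] = 2.0
stats = flow_stats(flow)
```

In a real pipeline these four numbers would be computed per frame pair and correlated against the Unrest Index and walked-distance series, as the study describes.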

24 pages, 4262 KiB  
Article
PigFRIS: A Three-Stage Pipeline for Fence Occlusion Segmentation, GAN-Based Pig Face Inpainting, and Efficient Pig Face Recognition
by Ruihan Ma, Seyeon Chung, Sangcheol Kim and Hyongsuk Kim
Animals 2025, 15(7), 978; https://doi.org/10.3390/ani15070978 - 28 Mar 2025
Viewed by 327
Abstract
Accurate animal face recognition is essential for effective health monitoring, behavior analysis, and productivity management in smart farming. However, environmental obstructions and animal behaviors complicate identification tasks. In pig farming, fences and frequent movements often occlude essential facial features, while high inter-class similarity makes distinguishing individuals even more challenging. To address these issues, we introduce the Pig Face Recognition and Inpainting System (PigFRIS). This integrated framework enhances recognition accuracy by removing occlusions and restoring missing facial features. PigFRIS employs state-of-the-art occlusion detection with the YOLOv11 segmentation model, a GAN-based inpainting reconstruction module using AOT-GAN, and a lightweight recognition module tailored for pig face classification. In doing so, our system detects occlusions, reconstructs obscured regions, and emphasizes key facial features, thereby improving overall performance. Experimental results validate the effectiveness of PigFRIS. For instance, YOLO11l achieves a recall of 94.92% and an AP50 of 96.28% for occlusion detection, AOT-GAN records an FID of 51.48 and an SSIM of 91.50% for image restoration, and EfficientNet-B2 attains an accuracy of 91.62% with an F1 Score of 91.44% in classification. Additionally, heatmap analysis reveals that the system successfully focuses on relevant facial features rather than irrelevant occlusions, enhancing classification reliability. This work offers a novel and practical solution for animal face recognition in smart farming. It overcomes the limitations of existing methods and contributes to more effective livestock management and advancements in agricultural technology. Full article
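The three-stage detect-inpaint-recognize flow can be sketched as a simple pipeline. Every function below is a hypothetical stand-in illustrating only the data flow; the actual system uses a YOLOv11 segmentation model, AOT-GAN, and EfficientNet-B2, none of which are reproduced here:

```python
import numpy as np

def segment_fence(image):
    """Stand-in occlusion detector: flags pixels darker than a threshold."""
    return image < 0.2                      # boolean occlusion mask

def inpaint(image, mask):
    """Stand-in inpainter: fills occluded pixels with the visible mean."""
    out = image.copy()
    out[mask] = image[~mask].mean()
    return out

def classify(image):
    """Stand-in recognizer: derives a dummy class id from mean intensity."""
    return int(image.mean() * 10) % 5

def pigfris(image):
    """Stage 1: find occlusions; stage 2: restore; stage 3: recognize."""
    mask = segment_fence(image)
    restored = inpaint(image, mask)
    return classify(restored)

rng = np.random.default_rng(0)
image = rng.uniform(size=(64, 64))          # fake grayscale pig-face crop
label = pigfris(image)
```

The design point is the decoupling: each stage consumes the previous stage's output, so any module can be swapped for a stronger model without touching the others.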

26 pages, 7964 KiB  
Article
Pig Face Open Set Recognition and Registration Using a Decoupled Detection System and Dual-Loss Vision Transformer
by Ruihan Ma, Hassan Ali, Malik Muhammad Waqar, Sang Cheol Kim and Hyongsuk Kim
Animals 2025, 15(5), 691; https://doi.org/10.3390/ani15050691 - 27 Feb 2025
Viewed by 465
Abstract
Effective pig farming relies on precise and adaptable animal identification methods, particularly in dynamic environments where new pigs are regularly added to the herd. However, pig face recognition is challenging due to high individual similarity, lighting variations, and occlusions. These factors hinder accurate identification and monitoring. To address these issues under Open-Set conditions, we propose a three-phase Pig Face Open-Set Recognition (PFOSR) system. In the Training Phase, we adopt a decoupled design, first training a YOLOv8-based pig face detection model on a small labeled dataset to automatically locate pig faces in raw images. We then refine a Vision Transformer (ViT) recognition model via a dual-loss strategy—combining Sub-center ArcFace and Center Loss—to enhance both inter-class separation and intra-class compactness. Next, in the Known Pig Registration Phase, we utilize the trained detection and recognition modules to extract representative embeddings from 56 identified pigs, storing these feature vectors in a Pig Face Feature Gallery. Finally, in the Unknown and Known Pig Recognition and Registration Phase, newly acquired pig images are processed through the same detection–recognition pipeline, and the resulting embeddings are compared against the gallery via cosine similarity. If the system classifies a pig as unknown, it dynamically assigns a new ID and updates the gallery without disrupting existing entries. Our system demonstrates strong Open-Set recognition, achieving an AUROC of 0.922, OSCR of 0.90, and F1-Open of 0.94. In the closed set, it attains a precision@1 of 0.97, NMI of 0.92, and mean average precision@R of 0.96. These results validate our approach as a scalable, efficient solution for managing dynamic farm environments with high recognition accuracy, even under challenging conditions. Full article
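The gallery-matching step described above — compare an embedding against registered pigs via cosine similarity, and register a new ID when no match clears a threshold — can be sketched as follows. The class, the threshold value, and the 2-D toy embeddings are illustrative assumptions; the abstract does not specify these details:

```python
import numpy as np

def cosine_sim(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

class FaceGallery:
    """Open-set gallery: known pigs are matched, unknown pigs get new IDs."""

    def __init__(self, threshold=0.6):
        self.threshold = threshold
        self.embeddings = {}                # pig ID -> reference embedding
        self.next_id = 0

    def identify(self, emb):
        # Compare against every registered pig; the best match wins.
        best_id, best_sim = None, -1.0
        for pid, ref in self.embeddings.items():
            s = cosine_sim(emb, ref)
            if s > best_sim:
                best_id, best_sim = pid, s
        if best_sim >= self.threshold:
            return best_id, best_sim
        # Unknown pig: assign a new ID without disrupting existing entries.
        pid = self.next_id
        self.embeddings[pid] = emb
        self.next_id += 1
        return pid, best_sim

gallery = FaceGallery(threshold=0.6)
first, _ = gallery.identify(np.array([1.0, 0.0]))   # empty gallery -> new ID
again, _ = gallery.identify(np.array([1.0, 0.0]))   # matches the gallery
other, _ = gallery.identify(np.array([0.0, 1.0]))   # dissimilar -> new ID
```

In the paper's setting the embeddings come from the dual-loss ViT, and the gallery is seeded in the registration phase with the 56 known pigs.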

23 pages, 12095 KiB  
Article
Research on a High-Efficiency Goat Individual Recognition Method Based on Machine Vision
by Yi Xue, Weiwei Wang, Mei Fang, Zhiming Guo, Keke Ning and Kui Wang
Animals 2024, 14(23), 3509; https://doi.org/10.3390/ani14233509 - 4 Dec 2024
Cited by 1 | Viewed by 992
Abstract
Accurate identification of individual goats is necessary for precision farming. Previous studies have primarily focused on using front face images for goat identification, leaving the potential of other appearances and multi-source appearance fusion unexplored. In this study, we used a self-developed multi-view appearance image acquisition platform to capture five different appearances (left face, right face, front face, back body, and side body) from 54 Wanlin white goats. The recognition ability of different goat appearance images and their multi-source fusion for identity recognition was then systematically examined based on four basic network models, namely, MobileNetV3, MobileViT, ResNet18, and VGG16, and the best combination of goat appearance and network was screened. When only one kind of goat appearance image was used, the combination of side body image and MobileViT was the best, with an accuracy of 99.63%; under identity recognition based on multi-source appearance fusion, all recognition models after fusion of two viewpoints generally outperformed single-viewpoint identity recognition models in recognizing individual goats; when three or more kinds of goat appearance images were fused, any of the four models was capable of identifying an individual goat with 100% accuracy. Based on these results, a goat individual identity recognition strategy was proposed that balances accuracy, computation, and time, providing new ideas for goat individual identity recognition in complex farming contexts. Full article
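One common way to combine predictions from several appearance views is score-level fusion: average each view's class probabilities and take the argmax. The abstract does not state which fusion scheme the authors used, so the sketch below is a generic illustration, with made-up three-goat probability vectors:

```python
import numpy as np

def fuse_predictions(view_probs):
    """Average per-view class probabilities and pick the top class.

    view_probs: list of 1-D softmax vectors, one per appearance view
    (e.g. left face, right face, front face, back body, side body).
    """
    fused = np.mean(view_probs, axis=0)
    return int(np.argmax(fused)), fused

# Two views disagree on their top class, but fusion settles on goat #2.
p_side  = np.array([0.1, 0.2, 0.7])   # side-body view favors goat 2
p_front = np.array([0.2, 0.5, 0.3])   # front-face view favors goat 1
goat_id, fused = fuse_predictions([p_side, p_front])
```

This illustrates why fusing even two viewpoints tends to beat single-view models: errors that are uncorrelated across views average out.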

27 pages, 23565 KiB  
Article
CAMLLA-YOLOv8n: Cow Behavior Recognition Based on Improved YOLOv8n
by Qingxiang Jia, Jucheng Yang, Shujie Han, Zihan Du and Jianzheng Liu
Animals 2024, 14(20), 3033; https://doi.org/10.3390/ani14203033 - 19 Oct 2024
Cited by 2 | Viewed by 1668
Abstract
Cow behavior carries important health information. The timely and accurate detection of standing, grazing, lying, estrus, licking, fighting, and other behaviors is crucial for individual cow monitoring and understanding of their health status. In this study, a model called CAMLLA-YOLOv8n is proposed for Holstein cow behavior recognition. We use a hybrid data augmentation method to provide the model with rich Holstein cow behavior features and improve the YOLOv8n model to optimize the Holstein cow behavior detection results under challenging conditions. Specifically, we integrate the Coordinate Attention mechanism into the C2f module to form the C2f-CA module, which strengthens the expression of inter-channel feature information, enabling the model to more accurately identify and understand the spatial relationship between different Holstein cows’ positions, thereby improving the sensitivity to key areas and the ability to filter background interference. Secondly, the MLLAttention mechanism is introduced in the P3, P4, and P5 layers of the Neck part of the model to better cope with the challenges of Holstein cow behavior recognition caused by large-scale changes. In addition, we also innovatively improve the SPPF module to form the SPPF-GPE module, which optimizes small target recognition by combining global average pooling and global maximum pooling processing and enhances the model’s ability to capture the key parts of Holstein cow behavior in the environment. Given the limitations of traditional IoU loss in cow behavior detection, we replace CIoU loss with Shape–IoU loss, focusing on the shape and scale features of the Bounding Box, thereby improving the matching degree between the Prediction Box and the Ground Truth Box. In order to verify the effectiveness of the proposed CAMLLA-YOLOv8n algorithm, we conducted experiments on a self-constructed dataset containing 23,073 Holstein cow behavior instances.
The experimental results show that, compared with models such as YOLOv3-tiny, YOLOv5n, YOLOv5s, YOLOv7-tiny, YOLOv8n, and YOLOv8s, the improved CAMLLA-YOLOv8n model achieved increases in Precision of 8.79%, 7.16%, 6.06%, 2.86%, 2.18%, and 2.69%, respectively, when detecting the states of Holstein cows grazing, standing, lying, licking, estrus, fighting, and empty bedding. Finally, although the Params and FLOPs of the CAMLLA-YOLOv8n model increased slightly compared with the YOLOv8n model, it achieved significant improvements of 2.18%, 1.62%, 1.84%, and 1.77% in the four key performance indicators of Precision, Recall, mAP@0.5, and mAP@0.5:0.95, respectively. This model, named CAMLLA-YOLOv8n, effectively meets the need for the accurate and rapid identification of Holstein cow behavior in actual agricultural environments. This research is significant for improving the economic benefits of farms and promoting the transformation of animal husbandry towards digitalization and intelligence. Full article
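All of the IoU-based losses mentioned above (CIoU, Shape–IoU) build on plain Intersection-over-Union between a prediction box and a ground-truth box; Shape–IoU then adds shape- and scale-aware penalty terms not reproduced here. A minimal sketch of the base quantity, with corner-format boxes:

```python
def iou(box_a, box_b):
    """Intersection-over-Union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)   # overlap area
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

# Two unit-offset 2x2 boxes: overlap area 1, union area 7
score = iou((0, 0, 2, 2), (1, 1, 3, 3))
```

A detector's loss is typically `1 - IoU` plus the chosen variant's penalty terms, so a better-aligned prediction box yields a smaller loss.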

14 pages, 3115 KiB  
Article
Estimating the Energy Expenditure of Grazing Farm Animals Based on Dynamic Body Acceleration
by Pedro Gonçalves, João Magalhães and Daniel Corujo
Animals 2024, 14(15), 2140; https://doi.org/10.3390/ani14152140 - 23 Jul 2024
Viewed by 1244
Abstract
Indirect methods of measuring the energy expenditure of grazing animals using heartbeat variation or accelerometers are very convenient due to their low cost and low intrusiveness, allowing animals to maintain their usual routine. In the case of accelerometers, it is possible to use them to measure activity, as well as to classify animal behavior, allowing their usage in other scenarios. Despite the obvious convenience of use, it is important to evaluate the measurement error and understand the validity of the measurement through a simplistic method. In this paper, data from accelerometers were used to classify behavior and measure animal activity, and an algorithm was developed to calculate the energy expended by sheep. The results of the energy expenditure calculations were subsequently compared with the values reported in the literature, and it was verified that the values obtained were within the reference ranges. Although it cannot be used as a real metering of energy expended, the method is promising, as it can be integrated with other complementary sources of information, such as the evolution of the animal’s weight and ingestion time, thus providing assistance in animals’ dietary management. Full article
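A standard accelerometer-based activity metric of the kind the abstract describes is VeDBA (vectorial dynamic body acceleration): estimate the static (gravity) component per axis with a running mean, subtract it, and take the norm of what remains. The sketch below shows that computation only; the paper's actual algorithm and its sheep-specific energy calibration are not reproduced, and the window length is an assumption:

```python
import numpy as np

def vedba(acc, window=5):
    """Vectorial dynamic body acceleration from tri-axial accelerometry.

    acc: array of shape (N, 3) in m/s^2. A running mean per axis estimates
    the static (gravity) component; VeDBA is the norm of the residual
    dynamic component. Energy expenditure is then commonly estimated from
    VeDBA via a species-specific linear calibration (not shown here).
    Note: the zero-padded running mean is biased at the first and last
    window/2 samples.
    """
    kernel = np.ones(window) / window
    static = np.column_stack(
        [np.convolve(acc[:, i], kernel, mode="same") for i in range(3)]
    )
    dyn = acc - static
    return np.sqrt((dyn ** 2).sum(axis=1))

# A stationary animal: constant gravity on z, no dynamic movement,
# so interior VeDBA samples are (numerically) zero.
acc = np.tile([0.0, 0.0, 9.8], (100, 1))
v = vedba(acc)
```

Summing or averaging VeDBA over behavior-classified time windows is what allows the same accelerometer stream to serve both activity measurement and dietary-management estimates.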

14 pages, 2015 KiB  
Article
Machine Vision Analysis of Ujumqin Sheep’s Walking Posture and Body Size
by Qing Qin, Chongyan Zhang, Mingxi Lan, Dan Zhao, Jingwen Zhang, Danni Wu, Xingyu Zhou, Tian Qin, Xuedan Gong, Zhixin Wang, Ruiqiang Zhao and Zhihong Liu
Animals 2024, 14(14), 2080; https://doi.org/10.3390/ani14142080 - 16 Jul 2024
Viewed by 1085
Abstract
The ability to recognize the body sizes of sheep is significantly influenced by posture, especially without artificial fixation, leading to more noticeable changes. This study presents a recognition model using the Mask R-CNN convolutional neural network to identify the sides and backs of sheep. The proposed approach includes an algorithm for extracting key frames through mask calculation and specific algorithms for head-down, head-up, and jumping postures of Ujumqin sheep. The study reported an accuracy of 94.70% in posture classification. We measured the body size parameters of Ujumqin sheep of different sexes and in different walking states, including observations of head-down and head-up. The errors for the head-down position of rams, in terms of body slanting length, withers height, hip height, and chest depth, were recorded as 0.08 ± 0.06, 0.09 ± 0.07, 0.07 ± 0.05, and 0.12 ± 0.09, respectively. For rams in the head-up position, the corresponding errors were 0.06 ± 0.05, 0.06 ± 0.05, 0.07 ± 0.05, and 0.13 ± 0.07, respectively. The errors for the head-down position of ewes, in terms of body slanting length, withers height, hip height, and chest depth, were recorded as 0.06 ± 0.05, 0.09 ± 0.08, 0.07 ± 0.06, and 0.13 ± 0.10, respectively. For ewes in the head-up position, the corresponding errors were 0.06 ± 0.05, 0.08 ± 0.06, 0.06 ± 0.04, and 0.16 ± 0.12, respectively. The study observed that sheep walking through a passage exhibited a more curved knee posture compared to normal measurements, often with a lowered head. This research presents a cost-effective data collection scheme for studying multiple postures in animal husbandry. Full article
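Once a segmentation mask for the animal is available (the paper uses Mask R-CNN), pixel-space body-size proxies can be read off the mask and scaled by a camera calibration factor. The sketch below is a deliberate simplification — the study's posture-specific algorithms for withers height, hip height, and chest depth are more involved — and the function name and calibration value are hypothetical:

```python
import numpy as np

def body_size_from_mask(mask, cm_per_px):
    """Rough length/height proxies from a side-view segmentation mask.

    mask: boolean array (H, W), True on the animal (e.g. a Mask R-CNN
    output). Measures the mask's bounding box and scales it to cm.
    """
    rows = np.any(mask, axis=1)           # image rows touching the animal
    cols = np.any(mask, axis=0)           # image columns touching the animal
    top, bottom = np.where(rows)[0][[0, -1]]
    left, right = np.where(cols)[0][[0, -1]]
    length = (right - left + 1) * cm_per_px
    height = (bottom - top + 1) * cm_per_px
    return length, height

# Synthetic 40 px tall x 100 px long "sheep" at 0.5 cm per pixel
mask = np.zeros((200, 300), dtype=bool)
mask[60:100, 50:150] = True
length, height = body_size_from_mask(mask, cm_per_px=0.5)
```

Posture matters precisely because a head-down or knee-bent frame shifts this bounding box, which is why the paper extracts key frames and classifies posture before measuring.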
