Open Access Article
Attention-Guided Edge-Optimized Network for Real-Time Detection and Counting of Pre-Weaning Piglets in Farrowing Crates
by Ning Kong 1,*, Tongshuai Liu 2,3, Guoming Li 4,5, Lei Xi 2,3, Shuo Wang 1 and Yuepeng Shi 6
1 School of Energy and Intelligence Engineering, Henan University of Animal Husbandry and Economy, Zhengzhou 450046, China
2 College of Animal Science & Technology, Henan University of Animal Husbandry and Economy, Zhengzhou 450046, China
3 Henan Engineering Research Center on Animal Healthy Environment and Intelligent Equipment, Zhengzhou 450046, China
4 Department of Poultry Science, The University of Georgia, Athens, GA 30602, USA
5 Institute for Artificial Intelligence, The University of Georgia, Athens, GA 30602, USA
6 Science and Technology Division, Henan University of Animal Husbandry and Economy, Zhengzhou 450046, China
* Author to whom correspondence should be addressed.
Animals 2025, 15(17), 2553; https://doi.org/10.3390/ani15172553
Submission received: 10 July 2025 / Revised: 22 August 2025 / Accepted: 27 August 2025 / Published: 30 August 2025
(This article belongs to the Section Pigs)
Simple Summary
To improve the survival and management of pre-weaning piglets, accurate, real-time detection and counting in farrowing crates are necessary. However, frequent occlusion of the piglets, their social behaviors, and cluttered backgrounds make this task difficult, especially for lightweight models in resource-limited environments. In this study, we propose an improved piglet detection model based on YOLOv8n. The model replaces the original backbone module with a Multi-Scale Spatial Pyramid Attention (MSPA) module, introduces an improved Gather-and-Distribute (GD) mechanism in the neck, and optimizes the detection head and the sample assignment strategy. The experimental results show that, compared with the baseline YOLOv8n, our model reduces the parameters, floating point operations, and model size by 58.45%, 46.91%, and 56.45%, respectively, while increasing the detection precision by 2.6% and reducing the counting error by 4.41%. In addition, the model was successfully deployed on a Raspberry Pi 4B, achieving an average inference time of less than 87 ms per image. These results demonstrate that the proposed method combines high accuracy with a lightweight design, providing a practical solution for intelligent pig farming.
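The counting error reported in the summary above is a mean absolute error (MAE) over per-image piglet counts. A minimal sketch of how such a metric is computed, with purely illustrative counts (the function name and the example data are assumptions, not taken from the paper):

```python
# Hedged sketch: counting MAE over per-image piglet counts.
# The detector yields a count per image; the MAE compares those counts
# against manually annotated ground truth. All numbers below are made up.

def counting_mae(predicted_counts, true_counts):
    """Mean absolute error between predicted and ground-truth counts."""
    assert len(predicted_counts) == len(true_counts), "one count per image"
    n = len(true_counts)
    return sum(abs(p - t) for p, t in zip(predicted_counts, true_counts)) / n

# Illustrative example: four farrowing-crate images.
pred = [11, 9, 12, 10]   # piglets detected per image (hypothetical)
truth = [12, 9, 12, 11]  # manually annotated counts (hypothetical)
print(counting_mae(pred, truth))  # 0.5
```

A relative version of this error (MAE divided by the mean true count) is one plausible way a percentage-style counting error could be derived.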
Abstract
Accurate, real-time, and cost-effective detection and counting of pre-weaning piglets are critical for improving piglet survival rates. However, this task remains technically challenging due to high computational demands, frequent occlusion, social behaviors, and cluttered backgrounds in commercial farming environments. To address these challenges, this study proposes a lightweight, attention-enhanced piglet detection and counting network based on an improved YOLOv8n architecture. The design includes three key innovations: (i) the standard C2f modules in the backbone were replaced with a novel, efficient Multi-Scale Spatial Pyramid Attention (MSPA) module to enhance multi-scale feature representation while maintaining a low computational cost; (ii) an improved Gather-and-Distribute (GD) mechanism was incorporated into the neck to facilitate feature fusion and accelerate inference; and (iii) the detection head and the sample assignment strategy were optimized to better align the classification and localization tasks, thereby improving overall performance. Experiments on a custom dataset demonstrated the model’s superiority over state-of-the-art counterparts, achieving 88.5% precision and a 93.8% mAP. Furthermore, ablation studies showed that the model reduced the parameters, floating point operations (FLOPs), and model size by 58.45%, 46.91%, and 56.45%, respectively, compared to the baseline YOLOv8n, while achieving a 2.6% improvement in detection precision and a 4.41% reduction in the counting MAE. The trained model was deployed on a Raspberry Pi 4B to verify the effectiveness of the lightweight design, reaching an average inference time of less than 87 ms per image. These findings confirm that the proposed method offers a practical, scalable solution for intelligent pig farming, combining high accuracy, efficiency, and real-time performance in resource-limited environments.
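The reduction percentages quoted in the abstract follow the standard relative-reduction formula. A hedged sketch of that arithmetic, using the publicly documented stock YOLOv8n figure of roughly 3.2 M parameters as the baseline; the improved model's parameter count below is back-computed for illustration, not a value reported by the paper:

```python
# Hedged sketch: relative reduction of a model statistic versus a baseline,
# as in "reduced the parameters ... by 58.45%". The baseline of ~3.2M
# parameters is the stock YOLOv8n figure; the shrunken value is assumed.

def reduction(baseline, improved):
    """Percentage by which `improved` is smaller than `baseline`."""
    return (baseline - improved) / baseline * 100

# Example: a model shrinking from 3.2M to a hypothetical ~1.33M parameters.
print(reduction(3.2e6, 1.33e6))  # ~58.44, close to the reported 58.45%
```

The same formula applies to FLOPs and on-disk model size, which is how all three reported percentages can be read consistently.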
Share and Cite
MDPI and ACS Style
Kong, N.; Liu, T.; Li, G.; Xi, L.; Wang, S.; Shi, Y.
Attention-Guided Edge-Optimized Network for Real-Time Detection and Counting of Pre-Weaning Piglets in Farrowing Crates. Animals 2025, 15, 2553.
https://doi.org/10.3390/ani15172553
Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.