Broiler Behavior Detection and Tracking Method Based on Lightweight Transformer
Abstract
1. Introduction
- We propose replacing the original ResNet and HG series backbone networks with a lightweight backbone, FasterNet. By employing partial convolution (PConv), FasterNet minimizes redundant computation and memory access, thereby improving the efficiency of spatial feature extraction. This reduces the model's demand on device computing power and increases detection speed.
- To enhance information interaction within the model, we propose a novel cross-scale feature fusion network that includes a feature enhancement module (DDConv) and a feature fusion module (MSDCC). The feature enhancement module expands the convolution kernel, thereby increasing the receptive field of the feature prediction layer. Concurrently, the feature fusion module improves the model’s capacity to capture broader contextual information through residual connections, resulting in a richer gradient flow that compensates for the inherent limitations of feature extraction. This enhancement enables the model to more accurately comprehend the input data and generate more informative and discriminative representations, ultimately improving classification accuracy and generalization ability.
- The proposed combination of FCBD-DETR and Bytetrack [30] effectively facilitates the recognition, tracking, and counting of chicken behaviors. The FCBD-DETR model demonstrates outstanding performance across various downstream tasks. The implementation of these technologies can enhance the management efficiency of chicken farms, optimize resource allocation, and promote the sustainable development of smart farming practices.
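The PConv operation at the heart of FasterNet can be sketched as follows. This is an illustrative PyTorch reconstruction: the channel split ratio and layer configuration are our assumptions, not the paper's exact settings.

```python
import torch
import torch.nn as nn

class PConv(nn.Module):
    """Partial convolution in the spirit of FasterNet: apply a regular
    convolution to only a fraction of the input channels and pass the
    rest through untouched, cutting FLOPs and memory traffic.
    The 1/4 split ratio is an assumption for illustration."""

    def __init__(self, channels: int, ratio: float = 0.25, kernel_size: int = 3):
        super().__init__()
        self.conv_ch = max(1, int(channels * ratio))  # channels actually convolved
        self.conv = nn.Conv2d(self.conv_ch, self.conv_ch, kernel_size,
                              padding=kernel_size // 2, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        head, tail = x[:, :self.conv_ch], x[:, self.conv_ch:]
        # only `head` is convolved; `tail` is forwarded unchanged
        return torch.cat((self.conv(head), tail), dim=1)
```

Because the convolution sees only a quarter of the channels on both input and output, its FLOPs drop to roughly (1/4)² = 1/16 of a full convolution of the same kernel size, which is the source of the efficiency gain described above.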
2. Materials and Methods
2.1. Data Collection
2.2. Methods
2.3. Data Definition and Annotation
2.4. RT-DETR
- (1) Compared to other lightweight detection models, the Transformer requires the storage of large-scale key-value pairs due to its fully connected attention mechanism. This requirement can lead to memory limitations and slow detection speeds, particularly given the restricted computing resources available at actual breeding sites.
- (2) The single-neck feature structure of RT-DETR often results in the loss of contextual semantic information. Chickens possess similar appearance characteristics and tend to congregate, which poses significant challenges for distinguishing individual behaviors. This is especially problematic during detection and tracking, where mutual occlusion between individuals can easily result in target loss.
- (3) The varying shooting equipment utilized by different breeding institutions and researchers leads to significant discrepancies in the acquired images. Additionally, slight variations in shooting angles and heights can cause substantial changes in the image background. Consequently, the model must demonstrate strong generalization capabilities to effectively manage variable scenes and target features.
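Point (1) can be made concrete with a back-of-the-envelope estimate: the dense attention map grows quadratically with the number of feature tokens. The function below is purely illustrative (fp32, a single head, and no fused kernels are assumed).

```python
def attention_memory_mb(num_tokens: int, dim: int = 256,
                        bytes_per_val: int = 4) -> float:
    """Rough memory (MB) for one full self-attention layer: the K and V
    caches (num_tokens x dim each) plus the dense attention map
    (num_tokens x num_tokens). Illustrative only; real implementations
    differ (fp16, multi-head layouts, fused kernels)."""
    kv = 2 * num_tokens * dim            # key + value tensors
    attn_map = num_tokens * num_tokens   # dense attention weights
    return (kv + attn_map) * bytes_per_val / 2**20
```

For example, a 640×640 image at stride 8 yields 80 × 80 = 6400 tokens, giving on the order of 170 MB for a single layer under these assumptions, most of it in the quadratic attention map. This is why memory becomes the bottleneck on edge devices at breeding sites.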
2.5. Modified FCBD-DETR
2.5.1. FasterNet
2.5.2. DDConv and MSDCC Module
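DDConv expands the convolution kernel to grow the receptive field of the feature prediction layer, and the fusion path uses residual connections to enrich gradient flow. One common way to enlarge the receptive field without adding parameters is dilation combined with a depthwise separable structure; the sketch below is our assumption of such a design, not the paper's exact module.

```python
import torch
import torch.nn as nn

class DilatedDWBlock(nn.Module):
    """Receptive-field-expanding block in the spirit of DDConv: a dilated
    depthwise convolution enlarges the effective kernel at constant
    parameter count, a pointwise convolution mixes channels, and a
    residual connection preserves gradient flow. Kernel size and
    dilation rate are illustrative assumptions."""

    def __init__(self, channels: int, kernel_size: int = 3, dilation: int = 2):
        super().__init__()
        pad = dilation * (kernel_size - 1) // 2  # keep spatial size unchanged
        self.dw = nn.Conv2d(channels, channels, kernel_size, padding=pad,
                            dilation=dilation, groups=channels, bias=False)
        self.pw = nn.Conv2d(channels, channels, 1, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.pw(self.dw(x))  # residual keeps the gradient path short
```

With dilation 2, a 3 × 3 depthwise kernel covers a 5 × 5 neighborhood, which is one way to realize the "expanded kernel, larger receptive field" effect described in the contributions.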
2.6. Evaluation Metrics and Experimental Environment
2.6.1. Evaluation Metrics
2.6.2. Experimental Environment
3. Results and Discussion
3.1. Backbone Improvements
3.2. DDConv and MSDCC Ablation Experiments
3.3. Comparison Experiments of Different Models
3.4. Fine-Tuning Experiment
4. Chicken Behavior Tracking Extension Application
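The tracking extension pairs FCBD-DETR detections with ByteTrack [30], whose key idea is two-stage association: high-confidence detections are matched to existing tracks first, and low-confidence boxes (often occluded birds) are kept for a second matching pass rather than discarded. A minimal sketch of the detection split follows; the thresholds are illustrative, not the paper's tuned values.

```python
def split_detections(dets, high=0.6, low=0.1):
    """ByteTrack-style two-stage split: boxes scoring >= `high` drive the
    first association pass; boxes in [`low`, `high`) are retained for a
    second pass to recover occluded targets; the rest are dropped.
    Each detection is a dict with a "score" key (an assumed format)."""
    high_dets = [d for d in dets if d["score"] >= high]
    low_dets = [d for d in dets if low <= d["score"] < high]
    return high_dets, low_dets
```

In a flock where chickens frequently occlude one another, this second pass is precisely what prevents the target loss described in challenge (2) of the Introduction.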
5. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
- OECD; FAO. OECD-FAO Agricultural Outlook 2023–2032; OECD Publishing: Paris, France, 2023; Available online: https://www.fao.org/3/cc6361en/cc6361en.pdf (accessed on 15 October 2024).
- Welfare Quality®. Welfare Quality® Assessment Protocol for Poultry (Broilers, Laying Hens); Welfare Quality® Consortium: Lelystad, The Netherlands, 2009. [Google Scholar]
- Appleby, M.C.; Mench, J.A.; Hughes, B.O. Poultry Behaviour and Welfare; CABI: Wallingford, UK, 2004. [Google Scholar]
- Boissy, A.; Manteuffel, G.; Jensen, M.B.; Moe, R.O.; Spruijt, B.; Keeling, L.J.; Winckler, C.; Forkman, B.; Dimitrov, I.; Langbein, J.; et al. Assessment of positive emotions in animals to improve their welfare. Physiol. Behav. 2007, 92, 375–397. [Google Scholar] [CrossRef]
- Bradshaw, R.; Kirkden, R.D.; Broom, D.M. A review of the aetiology and pathology of leg weakness in broilers in relation to welfare. Avian Poult. Biol. Rev. 2002, 13, 45–103. [Google Scholar] [CrossRef]
- Tablante, N.L. Common Poultry Diseases and Their Prevention; University of Maryland Extension: College Park, MD, USA, 2013; pp. 1–53. [Google Scholar]
- Weeks, C.A.; Danbury, T.D.; Davies, H.C.; Hunt, P.; Kestin, S.C. The behaviour of broiler chickens and its modification by lameness. Appl. Anim. Behav. Sci. 2000, 67, 111–125. [Google Scholar] [CrossRef] [PubMed]
- Morris, T. Review of Poultry Production Systems: Behaviour, Management and Welfare, by M.C. Appleby, B.O. Hughes and H.A. Elson; CAB International: Wallingford, UK, 1992. J. Agric. Sci. 1993, 120, 420–421. [Google Scholar] [CrossRef]
- Kashiha, M.; Pluk, A.; Bahr, C.; Vranken, E.; Berckmans, D. Development of an early warning system for a broiler house using computer vision. Biosyst. Eng. 2013, 116, 36–45. [Google Scholar] [CrossRef]
- Huang, J.; Wang, W.; Zhang, T. Method for detecting avian influenza disease of chickens based on sound analysis. Biosyst. Eng. 2019, 180, 16–24. [Google Scholar] [CrossRef]
- Zhuang, X.; Zhang, T. Detection of sick broilers by digital image processing and deep learning. Biosyst. Eng. 2019, 179, 106–116. [Google Scholar] [CrossRef]
- Zhuang, X.; Bi, M.; Guo, J.; Wu, S.; Zhang, T. Development of an early warning algorithm to detect sick broilers. Comput. Electron. Agric. 2018, 144, 102–113. [Google Scholar] [CrossRef]
- Rushton, J.; Viscarra, R.; Bleich, E.G.; McLeod, A. Impact of avian influenza outbreaks in the poultry sectors of five South East Asian countries (Cambodia, Indonesia, Lao PDR, Thailand, Viet Nam) outbreak costs, responses and potential long term control. Worlds Poult. Sci. J. 2005, 61, 491–514. [Google Scholar] [CrossRef]
- Blatchford, R.A.; Archer, G.S.; Mench, J.A. Contrast in light intensity, rather than day length, influences the behavior and health of broiler chickens. Poult. Sci. 2012, 91, 1768–1774. [Google Scholar] [CrossRef]
- Franco, B.R.; Shynkaruk, T.; Crowe, T.; Fancher, B.; French, N.; Gillingham, S.; Schwean-Lardner, K. Light color and the commercial broiler: Effect on behavior, fear, and stress. Poult. Sci. 2022, 101, 102052. [Google Scholar] [CrossRef] [PubMed]
- Girard, M.T.E.; Zuidhof, M.J.; Bench, C.J. Feeding, foraging, and feather pecking behaviours in precision-fed and skip-a-day-fed broiler breeder pullets. Appl. Anim. Behav. Sci. 2017, 188, 42–49. [Google Scholar] [CrossRef]
- Li, L.; Zhao, Y.; Oliveira, J.; Verhoijsen, W.; Liu, K.; Xin, H. A UHF RFID system for studying individual feeding and nesting behaviors of group-housed laying hens. Trans. ASABE 2017, 60, 1337–1347. [Google Scholar] [CrossRef]
- Aydin, A. Using 3D vision camera system to automatically assess the level of inactivity in broiler chickens. Comput. Electron. Agric. 2017, 135, 4–10. [Google Scholar] [CrossRef]
- Li, G.; Zhao, Y.; Hailey, R.; Zhang, N.; Liang, Y.; Purswell, J.L. Radio-frequency Identification (RFID) System for Monitoring Specific Behaviors of Group Housed Broilers. In Proceedings of the 10th International Livestock Environment Symposium (ILES X), Omaha, NE, USA, 25–27 September 2018. [Google Scholar] [CrossRef]
- Van Der Stuyft, E.; Schofield, C.P.; Randall, J.; Wambacq, P.; Goedseels, V. Development and application of computer vision systems for use in livestock production. Comput. Electron. Agric. 1991, 6, 243–265. [Google Scholar] [CrossRef]
- Xiao, D.; Zeng, R.; Zhou, M.; Huang, Y.; Wang, W. Monitoring key behaviors of herd Magang geese based on DH-YoloX. Trans. Chin. Soc. Agric. Eng. 2023, 39. Available online: https://link.cnki.net/urlid/11.2047.S.20230303.1347.004 (accessed on 15 October 2024).
- Wang, J. Research on Key Techniques for Behavior Recognition of Caged Breeder Chickens. Ph.D. Dissertation, Agricultural University of Hebei, Baoding, China, 2020. [Google Scholar]
- Qi, H.; Li, C.; Huang, G. Dead chicken detection algorithm based on lightweight YOLOv4. J. Chin. Agric. Mech. 2024, 45, 195–201. [Google Scholar] [CrossRef]
- Gu, Y.; Wang, S.; Yan, Y.; Tang, S.; Zhao, S. Identification and analysis of emergency behavior of cage-reared laying ducks based on YoloV5. Agriculture 2022, 12, 485. [Google Scholar] [CrossRef]
- Guo, Y.; Aggrey, S.E.; Wang, P.; Oladeinde, A.; Chai, L. Monitoring behaviors of broiler chickens at different ages with deep learning. Animals 2022, 12, 3390. [Google Scholar] [CrossRef]
- Lin, A.; Chen, B.; Xu, J.; Zhang, Z.; Lu, G.; Zhang, D. DS-TransUNet: Dual Swin Transformer U-Net for Medical Image Segmentation. IEEE Trans. Instrum. Meas. 2022, 71, 4005615. [Google Scholar] [CrossRef]
- Liu, L.; Ouyang, W.; Wang, X.; Fieguth, P.; Chen, J.; Liu, X.; Pietikäinen, M. Deep learning for generic object detection: A survey. Int. J. Comput. Vis. 2020, 128, 261–318. [Google Scholar] [CrossRef]
- Wang, W.; Dai, J.; Chen, Z.; Huang, Z.; Li, Z.; Zhu, X.; Hu, X.; Lu, T.; Lu, L.; Li, H.; et al. Internimage: Exploring large-scale vision foundation models with deformable convolutions. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Vancouver, BC, Canada, 17–24 June 2023; pp. 14408–14419. [Google Scholar]
- Carion, N.; Massa, F.; Synnaeve, G.; Usunier, N.; Kirillov, A.; Zagoruyko, S. End-to-end object detection with transformers. In Computer Vision—ECCV 2020: Proceedings of the 16th European Conference, Glasgow, UK, 23–28 August 2020; Springer International Publishing: Cham, Switzerland, 2020; pp. 213–229. [Google Scholar]
- Zhang, Y.; Sun, P.; Jiang, Y.; Yu, D.; Weng, F.; Yuan, Z.; Luo, P.; Liu, W.; Wang, X. Bytetrack: Multi-object tracking by associating every detection box. In Computer Vision—ECCV 2022: Proceedings of the 17th European Conference, Tel Aviv, Israel, 23–27 October 2022; Springer Nature: Cham, Switzerland, 2022; pp. 1–21. [Google Scholar]
- Chen, J.; Kao, S.H.; He, H.; Zhuo, W.; Wen, S.; Lee, C.H.; Chan, S.H.G. Run, don’t walk: Chasing higher FLOPS for faster neural networks. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Vancouver, BC, Canada, 17–24 June 2023; pp. 12021–12031. [Google Scholar]
- Zhang, X.; Zhou, X.; Lin, M.; Sun, J. ShuffleNet: An extremely efficient convolutional neural network for mobile devices. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Salt Lake City, UT, USA, 18–23 June 2018. [Google Scholar]
- Li, H.; Li, J.; Wei, H.; Liu, Z.; Zhan, Z.; Ren, Q. Slim-neck by GSConv: A better design paradigm of detector architectures for autonomous vehicles. arXiv 2022, arXiv:2206.02424. [Google Scholar]
- Ding, X.; Zhang, X.; Ma, N.; Han, J.; Ding, G.; Sun, J. RepVGG: Making VGG-style convnets great again. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Nashville, TN, USA, 20–25 June 2021; pp. 13733–13742. [Google Scholar]
- Padilla, R.; Netto, S.L.; Da Silva, E.A. A survey on performance metrics for object-detection algorithms. In Proceedings of the 2020 International Conference on Systems, Signals and Image Processing (IWSSIP), Niteroi, Brazil, 1–3 July 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 237–242. [Google Scholar]
- Ding, X.; Zhang, X.; Han, J.; Ding, G. Scaling up your kernels to 31×31: Revisiting large kernel design in CNNs. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, New Orleans, LA, USA, 18–24 June 2022; pp. 11963–11975. [Google Scholar]
- Chollet, F. Xception: Deep learning with depthwise separable convolutions. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 1251–1258. [Google Scholar]
- Howard, A.G.; Zhu, M.; Chen, B.; Kalenichenko, D.; Wang, W.; Weyand, T.; Andreetto, M.; Adam, H. Mobilenets: Efficient convolutional neural networks for mobile vision applications. arXiv 2017, arXiv:1704.04861. [Google Scholar]
- Luo, W.; Li, Y.; Urtasun, R.; Zemel, R. Understanding the effective receptive field in deep convolutional neural networks. In Advances in Neural Information Processing Systems 29, Proceedings of the Annual Conference on Neural Information Processing Systems 2016, Barcelona, Spain, 5–10 December 2016; Neural Information Processing Systems Foundation, Inc. (NeurIPS): La Jolla, CA, USA, 2016. [Google Scholar]
- Bommasani, R.; Hudson, D.A.; Adeli, E.; Altman, R.; Arora, S.; von Arx, S.; Bernstein, M.S.; Bohg, J.; Bosselut, A.; Brunskill, E.; et al. On the opportunities and risks of foundation models. arXiv 2021, arXiv:2108.07258. [Google Scholar]
- Wu, D.; Cui, D.; Zhou, M.; Ying, Y. Information perception in modern poultry farming: A review. Comput. Electron. Agric. 2022, 199, 107131. [Google Scholar] [CrossRef]
- Zou, Z.; Chen, K.; Shi, Z.; Guo, Y.; Ye, J. Object detection in 20 years: A survey. Proc. IEEE 2023, 111, 257–276. [Google Scholar] [CrossRef]
| Data Set | Stocking Density | Shooting Height | Shooting Angle |
|---|---|---|---|
| A | 0.04 m²/individual | 1.5 m | 45° downward tilt from directly in front of the coop |
| B | 0.04 m²/individual | 1 m | 45° downward tilt from the side of the coop |
| Behavior Name | Standard of Judgment | Sketch |
|---|---|---|
| Lying | Belly on the ground | (a) |
| Feed | Head in the trough | (b) |
| Stand | Legs visible | (c) |
| Drink | Head in the sink | (d) |
| Dataset | Lying Instances | Feed Instances | Stand Instances | Drink Instances |
|---|---|---|---|---|
| Validation set | 12,604 | 5334 | 5839 | 2307 |
| Test set A | 12,627 | 5464 | 5709 | 2208 |
| Test set B | 2586 | 641 | 1177 | 122 |
| Parameter | Setting | Parameter | Setting |
|---|---|---|---|
| Optimizer | SGD | lrf | 1.0 |
| Epochs | 300 | weight_decay | 0.0001 |
| Batch size | 4 | momentum | 0.9 |
| Workers | 4 | warmup_epochs | 2000 |
| imgsz | 640 | warmup_momentum | 0.8 |
| lr0 | 0.0001 | close_mosaic | 10 |
| Backbone | Test Set A mAP50 | Test Set A mAP50-95 | Test Set B mAP50 | Test Set B mAP50-95 | Params (M) | FLOPs (G) |
|---|---|---|---|---|---|---|
| ResNet18 | 98.6 | 84.9 | 31.2 | 20.1 | 20 | 60 |
| ResNet34 | 99.4 | 88.6 | 31.0 | 18.4 | 31 | 92 |
| ResNet50 | 99.2 | 89.0 | 37.7 | 23.5 | 36 | 136 |
| HGNetv2 | 99.3 | 88.4 | 32.2 | 19.1 | 32 | 110 |
| FasterNet | 98.3 | 84.3 | 27.6 | 17.4 | 10.8 | 28.5 |
| Model | t = 20% | t = 30% | t = 50% | t = 99% |
|---|---|---|---|---|
| RT-DETR-ResNet18 | 0.3% | 0.5% | 1.3% | 77.9% |
| RT-DETR-FasterNet | 1% | 2.3% | 10.4% | 95.9% |
| RT-DETR-FasterNet-DDConv | 2.3% | 4.4% | 10.4% | 88.7% |
| Model | Test Set A mAP50 | Test Set A mAP50-95 | Test Set B mAP50 | Test Set B mAP50-95 | Params (M) | FLOPs (G) |
|---|---|---|---|---|---|---|
| RT-DETR-ResNet18 | 98.6 | 84.9 | 31.2 | 20.1 | 20 | 60 |
| RT-DETR-FasterNet | 98.3 | 84.3 | 27.6 | 17.4 | 10.8 | 28.5 |
| RT-DETR-FasterNet + DDConv | 99.0 | 86.7 | 25.6 | 15.7 | 9.9 | 26.7 |
| Ours (RT-DETR-FasterNet + DDConv + MSDCC) | 99.4 | 88.4 | 31.6 | 19.1 | 8.9 | 18.7 |
| Model | Test Set A mAP50 (%) | Test Set B mAP50 (%) | GFLOPs | Params (M) | FPS |
|---|---|---|---|---|---|
| SSD | 98.2 | 26.5 | 137.2 | 24.01 | 23.5 |
| Faster-RCNN | 98.6 | 30.5 | 200.9 | 136.75 | 11.8 |
| YOLOv7 | 99.3 | 30.4 | 103.2 | 36.93 | 45.9 |
| YOLOv8m | 99.1 | 27.4 | 78.7 | 25.8 | 57.6 |
| RT-DETR-R18 | 98.6 | 31.2 | 31.2 | 20.1 | 49.5 |
| Ours | 99.4 | 31.6 | 18.3 | 8.9 | 68.5 |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Qi, H.; Chen, Z.; Liang, G.; Chen, R.; Jiang, J.; Luo, X. Broiler Behavior Detection and Tracking Method Based on Lightweight Transformer. Appl. Sci. 2025, 15, 3333. https://doi.org/10.3390/app15063333