Tracking Poultry Drinking Behavior and Floor Eggs in Cage-Free Houses with Innovative Depth Anything Model
Abstract
1. Introduction
2. Materials and Methods
2.1. Experimental Setup
2.2. Data Collection and Preparation
2.3. Deep Learning Model for Automated Detection of Perch, Chickens, and Eggs in Cage-Free Systems
2.4. Drinking Behavior Detection
Algorithm 1. ROI Depth Analysis for Drinking Behavior Detection

import cv2
import numpy as np

# Load the depth map as a grayscale image
image_path = '/mnt/data/image.png'  # update with the actual path
image = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)

# Define the region of interest (ROI) covering the drinking line
x_start, x_end = 1300, 1400
roi = image[:, x_start:x_end]

# Target depth band (pixel values) for the drinker region
min_depth, max_depth = 16, 20
average_depth = np.mean(roi)

# Compare the pixel values in the ROI with the depth band
within_depth_range = (roi >= min_depth) & (roi <= max_depth)
proportion_within_range = np.sum(within_depth_range) / roi.size

print(f"Average depth in ROI: {average_depth}")
print(f"Proportion of pixels within the depth range: {proportion_within_range:.2f}")
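The thresholding step that turns these ROI statistics into a drinking/non-drinking decision can be sketched as follows. The 0.30 cutoff is an illustrative assumption, not the paper's calibrated value, and `detect_drinking` is a hypothetical helper name:

```python
import numpy as np

def detect_drinking(depth_roi, min_depth=16, max_depth=20, threshold=0.30):
    """Flag a drinking event when a sufficient share of ROI pixels
    falls inside the depth band of the drinker line.

    depth_roi: 2-D array of depth values (same units as min/max_depth).
    threshold: illustrative fraction of in-band pixels required.
    """
    within = (depth_roi >= min_depth) & (depth_roi <= max_depth)
    proportion = np.sum(within) / depth_roi.size
    return proportion >= threshold, proportion

# Synthetic ROI: half the pixels inside the band, half far outside it
roi = np.array([[17, 18], [40, 50]])
is_drinking, prop = detect_drinking(roi)
print(is_drinking, prop)
```

In practice the decision would be smoothed over consecutive frames, since a single frame crossing the threshold may reflect a bird merely passing the drinker line.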
2.5. Optimized Route Distance Calculation
Algorithm 2. Optimized Egg Collection Routes Using Depth and Clustering Techniques

def calculate_optimized_route(cluster_routes):
    # cluster_routes: dict mapping cluster number -> ordered list of egg indices
    # distance_matrix_eggs: precomputed pairwise distance matrix (defined elsewhere)
    full_route = []
    total_distance = 0
    route_details = []
    # Collect eggs within clusters in a pre-determined order (cluster_routes)
    for cluster_num in cluster_routes.keys():
        eggs_in_cluster = cluster_routes[cluster_num]
        full_route.extend(eggs_in_cluster)
    # Calculate total distance for the optimized route
    for i in range(len(full_route) - 1):
        dist = distance_matrix_eggs[full_route[i], full_route[i + 1]]
        total_distance += dist
        route_details.append((full_route[i], full_route[i + 1], dist))
    return total_distance, full_route, route_details
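Algorithm 2 assumes two inputs prepared beforehand: a pairwise distance matrix over detected eggs and a cluster-to-route mapping. A minimal sketch of how those inputs might be built and consumed, using hypothetical egg coordinates and a fixed two-cluster assignment in place of the paper's clustering step:

```python
import numpy as np

# Hypothetical egg coordinates on the litter floor (metres); illustrative only
eggs = np.array([[0.0, 0.0], [0.5, 0.2], [4.0, 4.0], [4.2, 3.8]])

# Pairwise Euclidean distances: the distance_matrix_eggs that Algorithm 2 indexes
distance_matrix_eggs = np.linalg.norm(eggs[:, None, :] - eggs[None, :, :], axis=-1)

# Two spatial clusters visited in order (a stand-in for the clustering step)
cluster_routes = {0: [0, 1], 1: [2, 3]}

# Flatten clusters into one visiting order and sum leg distances
full_route = [e for cluster in cluster_routes.values() for e in cluster]
total = sum(distance_matrix_eggs[full_route[i], full_route[i + 1]]
            for i in range(len(full_route) - 1))
print(full_route, round(float(total), 3))
```

Visiting eggs cluster by cluster keeps the long inter-cluster legs to a minimum, which is the rationale for clustering before route construction.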
2.6. Model Evaluation and Statistical Data Analysis
3. Results and Discussion
3.1. Results of Monocular Depth Estimation for Perch Frame Evaluation
3.2. Evaluation of Applying the DAM to Detect Drinking Behavior
3.3. Evaluation of Applying DAM to Determine the Short Distance for Egg Collector Robotics
3.4. Exploring the Application of DAM in Commercial Cage-Free Houses
4. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
| Perch Level | Predicted Disparity | Adjusted Depth (m) | Actual Depth (m) |
|---|---|---|---|
| 1 | 0.00456 | 1.18 | 1.00 |
| 2 | 0.00575 | 1.49 | 1.60 |
| 3 | 0.00748 | 1.93 | 2.00 |
| 4 | 0.00984 | 2.31 | 2.40 |
| 5 | 0.01048 | 2.71 | 2.70 |
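The adjusted depths in the table suggest an affine calibration from the model's relative disparity predictions to metric depth. A least-squares sketch of such a calibration on the tabulated values, where the linear form is an assumption (the paper may align depth differently, e.g. in inverse-disparity space):

```python
import numpy as np

# Disparity / ground-truth depth pairs for perch levels 1-5 (from the table)
disparity = np.array([0.00456, 0.00575, 0.00748, 0.00984, 0.01048])
actual_m = np.array([1.00, 1.60, 2.00, 2.40, 2.70])

# Fit depth ~ a * disparity + b by least squares (assumed calibration form)
a, b = np.polyfit(disparity, actual_m, 1)

# Apply the fit and measure the residual error against ground truth
adjusted = a * disparity + b
rmse = np.sqrt(np.mean((adjusted - actual_m) ** 2))
print(np.round(adjusted, 2), round(float(rmse), 3))
```

With only five calibration points the fit is indicative rather than robust; in a commercial house the mapping would need refitting whenever the camera height or tilt changes.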
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Yang, X.; Lu, G.; Zhang, J.; Paneru, B.; Dhungana, A.; Dahal, S.; Bist, R.B.; Chai, L. Tracking Poultry Drinking Behavior and Floor Eggs in Cage-Free Houses with Innovative Depth Anything Model. Appl. Sci. 2025, 15, 6625. https://doi.org/10.3390/app15126625