Search Results (73)

Search Parameters:
Keywords = intelligent livestock farming

22 pages, 14158 KiB  
Article
Enhanced YOLOv8 for Robust Pig Detection and Counting in Complex Agricultural Environments
by Jian Li, Wenkai Ma, Yanan Wei and Tan Wang
Animals 2025, 15(14), 2149; https://doi.org/10.3390/ani15142149 - 21 Jul 2025
Viewed by 308
Abstract
Accurate pig counting is crucial for precision livestock farming, enabling optimized feeding management and health monitoring. Detection-based counting methods face significant challenges due to mutual occlusion, varying illumination conditions, diverse pen configurations, and substantial variations in pig densities. Previous approaches often struggle with complex agricultural environments where lighting conditions, pig postures, and crowding levels create challenging detection scenarios. To address these limitations, we propose EAPC-YOLO (enhanced adaptive pig counting YOLO), a robust architecture integrating density-aware processing with advanced detection optimizations. The method consists of (1) an enhanced YOLOv8 network incorporating multiple architectural improvements for better feature extraction and object localization. These improvements include DCNv4 deformable convolutions for irregular pig postures, BiFPN bidirectional feature fusion for multi-scale information integration, EfficientViT linear attention for computational efficiency, and PIoU v2 loss for improved overlap handling. (2) A density-aware post-processing module with intelligent NMS strategies that adapt to different crowding scenarios. Experimental results on a comprehensive dataset spanning diverse agricultural scenarios (nighttime, controlled indoor, and natural daylight environments with density variations from 4 to 30 pigs) demonstrate our method achieves 94.2% mAP@0.5 for detection performance and 96.8% counting accuracy, representing 12.3% and 15.7% improvements compared to the strongest baseline, YOLOv11n. This work enables robust, accurate pig counting across challenging agricultural environments, supporting precision livestock management. Full article
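The detection figure reported here, mAP@0.5, scores a predicted box as correct only when its Intersection over Union (IoU) with a ground-truth box reaches 0.5. A minimal IoU sketch for axis-aligned boxes (illustrative only, not code from the paper):

```python
def iou(box_a, box_b):
    """IoU for axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda b: (b[2] - b[0]) * (b[3] - b[1])
    union = area(box_a) + area(box_b) - inter
    return inter / union if union else 0.0

# Two boxes sharing half their width overlap by 50 out of 150 units
# of union area, so IoU = 1/3 and the match would fail the 0.5
# threshold used for mAP@0.5.
print(iou((0, 0, 10, 10), (5, 0, 15, 10)))
```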

42 pages, 3505 KiB  
Review
Computer Vision Meets Generative Models in Agriculture: Technological Advances, Challenges and Opportunities
by Xirun Min, Yuwen Ye, Shuming Xiong and Xiao Chen
Appl. Sci. 2025, 15(14), 7663; https://doi.org/10.3390/app15147663 - 8 Jul 2025
Viewed by 974
Abstract
The integration of computer vision (CV) and generative artificial intelligence (GenAI) into smart agriculture has revolutionised traditional farming practices by enabling real-time monitoring, automation, and data-driven decision-making. This review systematically examines the applications of CV in key agricultural domains, such as crop health monitoring, precision farming, harvesting automation, and livestock management, while highlighting the transformative role of GenAI in addressing data scarcity and enhancing model robustness. Advanced techniques, including convolutional neural networks (CNNs), YOLO variants, and transformer-based architectures, are analysed for their effectiveness in tasks like pest detection, fruit maturity classification, and field management. The survey reveals that generative models, such as generative adversarial networks (GANs) and diffusion models, significantly improve dataset diversity and model generalisation, particularly in low-resource scenarios. However, challenges persist, including environmental variability, edge deployment limitations, and the need for interpretable systems. Emerging trends, such as vision–language models and federated learning, offer promising avenues for future research. The study concludes that the synergy of CV and GenAI holds immense potential for advancing smart agriculture, though scalable, adaptive, and trustworthy solutions remain critical for widespread adoption. This comprehensive analysis provides valuable insights for researchers and practitioners aiming to harness AI-driven innovations in agricultural ecosystems. Full article
(This article belongs to the Section Electrical, Electronics and Communications Engineering)

28 pages, 1634 KiB  
Review
AI-Powered Vocalization Analysis in Poultry: Systematic Review of Health, Behavior, and Welfare Monitoring
by Venkatraman Manikandan and Suresh Neethirajan
Sensors 2025, 25(13), 4058; https://doi.org/10.3390/s25134058 - 29 Jun 2025
Viewed by 1006
Abstract
Artificial intelligence and bioacoustics represent a paradigm shift in non-invasive poultry welfare monitoring through advanced vocalization analysis. This comprehensive systematic review critically examines the transformative evolution from traditional acoustic feature extraction—including Mel-Frequency Cepstral Coefficients (MFCCs), spectral entropy, and spectrograms—to cutting-edge deep learning architectures encompassing Convolutional Neural Networks (CNNs), Long Short-Term Memory (LSTM) networks, attention mechanisms, and groundbreaking self-supervised models such as wav2vec2 and Whisper. The investigation reveals compelling evidence for edge computing deployment via TinyML frameworks, addressing critical scalability challenges in commercial poultry environments characterized by acoustic complexity and computational constraints. Advanced applications spanning emotion recognition, disease detection, and behavioral phenotyping demonstrate unprecedented potential for real-time welfare assessment. Through rigorous bibliometric co-occurrence mapping and thematic clustering analysis, this review exposes persistent methodological bottlenecks: dataset standardization deficits, evaluation protocol inconsistencies, and algorithmic interpretability limitations. Critical knowledge gaps emerge in cross-species domain generalization and contextual acoustic adaptation, demanding urgent research prioritization. The findings underscore explainable AI integration as essential for establishing stakeholder trust and regulatory compliance in automated welfare monitoring systems. This synthesis positions acoustic AI as a cornerstone technology enabling ethical, transparent, and scientifically robust precision livestock farming, bridging computational innovation with biological relevance for sustainable poultry production systems. 
Future research directions emphasize multi-modal sensor integration, standardized evaluation frameworks, and domain-adaptive models capable of generalizing across diverse poultry breeds, housing conditions, and environmental contexts while maintaining interpretability for practical farm deployment. Full article
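Of the traditional acoustic features the review surveys, spectral entropy is the simplest to sketch: treat the normalized power spectrum of a call as a probability distribution and take its Shannon entropy, so tonal vocalizations score low and noise-like sounds score high. An illustrative sketch of this assumed feature definition (not code from the review):

```python
import numpy as np

def spectral_entropy(signal, eps=1e-12):
    """Shannon entropy (bits) of the normalized power spectrum."""
    psd = np.abs(np.fft.rfft(signal)) ** 2
    p = psd / (psd.sum() + eps)
    return float(-np.sum(p * np.log2(p + eps)))

# A pure tone concentrates power in one FFT bin (entropy near zero),
# while white noise spreads power across all bins (entropy near its
# maximum of log2(number of bins)).
t = np.linspace(0, 1, 8000, endpoint=False)   # 1 s at 8 kHz
tone = np.sin(2 * np.pi * 440 * t)
noise = np.random.default_rng(0).standard_normal(8000)
print(spectral_entropy(tone) < spectral_entropy(noise))  # True
```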
(This article belongs to the Special Issue Feature Papers in Smart Agriculture 2025)

24 pages, 348 KiB  
Review
Knowledge Gaps in the Nutrient Requirements of Beef Cattle
by Michael L. Galyean, Karen A. Beauchemin, Joel S. Caton, N. Andy Cole, Joan H. Eisemann, Terry E. Engle, Galen E. Erickson, Clint R. Krehbiel, Ronald P. Lemenager and Luis O. Tedeschi
Ruminants 2025, 5(3), 29; https://doi.org/10.3390/ruminants5030029 - 29 Jun 2025
Viewed by 679
Abstract
The 8th revised edition of the Nutrient Requirements of Beef Cattle was released in 2016, with the recommendations provided in the publication being used extensively in both research and production settings. In the context of research needs identified in that publication, our objective was to review research on beef cattle nutrient requirements published since 2016 and identify knowledge gaps that should be addressed. Relative to energy requirements, the effects of environmental temperature and grazing activity, along with stress and disease, on maintenance requirements are inadequately characterized or defined. In addition, relationships between retained energy and protein should be more fully elucidated, and additional guidance on body weight at a target compositional endpoint is needed. Areas of continuing concern include accurately and precisely predicting microbial protein supply, predicting N recycling, and the metabolizable protein requirements for maintenance. Mineral and vitamin requirements are often challenging because of a lack of consistency in models used to determine requirements and potential effects of unique production settings on requirements. Based on recent research with feedlot cattle, zinc and chromium requirements should be examined more closely. Because predictions of dry matter intake are critical to supplying nutrients, additional development of prediction equations is needed, especially for beef cows and grazing beef cattle in general. Given considerable research in prediction of greenhouse gases, reevaluation of 2016 recommendations is warranted, along with a need for the updating of equations to predict excretions of N and P. Composition of feeds, particularly byproducts from ethanol production or other industrial streams, represents a knowledge gap, with obtaining reliable energy values of these feeds being a notable challenge. 
Nutritional models provide the means to integrate nutrient requirement recommendations into practice, and moving towards mechanistic models that take advantage of artificial intelligence and precision livestock farming technologies will be critical to developing future modeling systems. Full article
18 pages, 2046 KiB  
Review
Ethics, Animal Welfare, and Artificial Intelligence in Livestock: A Bibliometric Review
by Taize Calvacante Santana, Cristiane Guiselini, Héliton Pandorfi, Ricardo Brauer Vigoderis, José Antônio Delfino Barbosa Filho, Rodrigo Gabriel Ferreira Soares, Maria de Fátima Araújo, Nicoly Farias Gomes, Leandro Dias de Lima and Paulo César da Silva Santos
AgriEngineering 2025, 7(7), 202; https://doi.org/10.3390/agriengineering7070202 - 24 Jun 2025
Viewed by 960
Abstract
This study presents a bibliometric review aimed at mapping and analyzing the scientific literature related to the ethical implications of artificial intelligence (AI) in livestock farming, which is a rapidly emerging yet still underexplored field in international research. Based on the Scopus database, 151 documents published between 2015 and 2025 were identified and analyzed using the VOSviewer version 1.6.20 and Biblioshiny for Bibliometrix (RStudio version 2023.12.1) tools. The results show a significant increase in publications from 2021 onwards, reflecting the growing maturity of discussions around the integration of digital technologies in the agricultural sector. Keyword co-occurrence and bibliographic coupling analyses revealed the formation of four main thematic clusters, covering technical applications in precision livestock farming as well as reflections on governance, animal welfare, and algorithmic justice. The most influential authors, high-impact journals, and leading countries in the field were also identified. As a key contribution, this study highlights the lack of robust ethical guidelines and proposes future research directions for the development of regulatory frameworks, codes of conduct, and interdisciplinary approaches. The findings underscore the importance of aligning technological innovation with ethical responsibility and social inclusion in the transition to digital livestock farming. Full article
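The keyword co-occurrence mapping performed here in VOSviewer reduces, at its core, to counting how often each pair of keywords appears together in a document. A toy sketch with hypothetical keyword sets (not the actual Scopus data):

```python
from collections import Counter
from itertools import combinations

def cooccurrence(docs):
    """Count, per document, every pair of keywords appearing together --
    the raw pair-count matrix behind VOSviewer-style co-occurrence maps."""
    pairs = Counter()
    for keywords in docs:
        for a, b in combinations(sorted(set(keywords)), 2):
            pairs[(a, b)] += 1
    return pairs

# Hypothetical keyword sets for three documents.
docs = [
    {"AI", "animal welfare", "ethics"},
    {"AI", "ethics", "governance"},
    {"AI", "animal welfare"},
]
print(cooccurrence(docs)[("AI", "ethics")])  # 2
```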

48 pages, 9168 KiB  
Review
Socializing AI: Integrating Social Network Analysis and Deep Learning for Precision Dairy Cow Monitoring—A Critical Review
by Sibi Chakravathy Parivendan, Kashfia Sailunaz and Suresh Neethirajan
Animals 2025, 15(13), 1835; https://doi.org/10.3390/ani15131835 - 20 Jun 2025
Viewed by 1018
Abstract
This review critically analyzes recent advancements in dairy cow behavior recognition, highlighting novel methodological contributions through the integration of advanced artificial intelligence (AI) techniques such as transformer models and multi-view tracking with social network analysis (SNA). Such integration offers transformative opportunities for improving dairy cattle welfare, but current applications remain limited. We describe the transition from manual, observer-based assessments to automated, scalable methods using convolutional neural networks (CNNs), spatio-temporal models, and attention mechanisms. Although object detection models, including You Only Look Once (YOLO), EfficientDet, and sequence models, such as Bidirectional Long Short-Term Memory (BiLSTM) and Convolutional Long Short-Term Memory (convLSTM), have improved detection and classification, significant challenges remain, including occlusions, annotation bottlenecks, dataset diversity, and limited generalizability. Existing interaction inference methods rely heavily on distance-based approximations (i.e., assuming that proximity implies social interaction), lacking the semantic depth essential for comprehensive SNA. To address this, we propose innovative methodological intersections such as pose-aware SNA frameworks and multi-camera fusion techniques. Moreover, we explicitly discuss ethical challenges and data governance issues, emphasizing data transparency and animal welfare concerns within precision livestock contexts. We clarify how these methodological innovations directly impact practical farming by enhancing monitoring precision, herd management, and welfare outcomes. Ultimately, this synthesis advocates for strategic, empathetic, and ethically responsible precision dairy farming practices, significantly advancing both dairy cow welfare and operational effectiveness. Full article
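The distance-based interaction inference the review critiques reduces to thresholding pairwise distances: any two animals closer than a cutoff are treated as interacting. A toy sketch with hypothetical positions, which makes the limitation concrete, since proximity alone carries none of the semantic depth the authors call for:

```python
from itertools import combinations
from math import dist

def proximity_network(positions, threshold=2.0):
    """Distance-based interaction inference: add an edge between two
    animals whenever they are within `threshold` (e.g. metres).
    Proximity does not guarantee an actual social interaction."""
    edges = set()
    for a, b in combinations(positions, 2):
        if dist(positions[a], positions[b]) <= threshold:
            edges.add(frozenset((a, b)))
    return edges

# Hypothetical cow positions (id -> (x, y) in metres).
cows = {"cow_1": (0.0, 0.0), "cow_2": (1.5, 0.0), "cow_3": (10.0, 10.0)}
print(proximity_network(cows))  # only cow_1 and cow_2 are linked
```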
(This article belongs to the Section Animal Welfare)

12 pages, 1228 KiB  
Article
Multi-Stage Data Processing for Enhancing Korean Cattle (Hanwoo) Weight Estimations by Automated Weighing Systems
by Dong-Hyeon Kim, Jae-Woo Song, Hyunjin Cho, Mingyung Lee, Dae-Hyun Lee, Seongwon Seo and Wang-Hee Lee
Animals 2025, 15(12), 1785; https://doi.org/10.3390/ani15121785 - 17 Jun 2025
Viewed by 303
Abstract
Weight is the most basic and important indicator in cattle management, and automation of its measurement serves as a fundamental step toward modern smart livestock farming. Automated weighing systems (AWS) capable of continuously measuring cattle weight, even during movement, have been explored as key monitoring components in smart livestock farming. However, owing to the high measurement variability caused by environmental factors, the accuracy of AWSs has been questioned. These factors include real-time fluctuations due to animal activities (e.g., feeding and locomotion), as well as measurement errors caused by residual feed or excreta within the AWS. Therefore, this study aimed to develop an algorithm to enhance the reliability of steer weight measurements using an AWS, ensuring close alignment with actual cattle body weight. Accordingly, daily weight data from 36 Hanwoo steers were processed using a three-stage approach consisting of outlier detection and removal, weight estimation, and post-processing for weight adjustment. The best-performing algorithm that combined Tukey’s fences for outlier detection, mean-based estimation, and post-processing based on daily weight gain recommended by the National Institute of Animal Science achieved a root mean square error of 12.35 kg, along with an error margin of less than 10% for individual steers. Overall, the study concluded that the AWS measured steer weight with high reliability through the developed algorithm, thereby contributing to data-driven intelligent precision feeding. Full article
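The first two stages of the pipeline described above, Tukey's fences for outlier removal followed by a mean-based estimate, can be sketched with the standard library. The readings below are hypothetical, not data from the study:

```python
import statistics

def tukey_filter(weights, k=1.5):
    """Stage 1: drop measurements outside Tukey's fences
    [Q1 - k*IQR, Q3 + k*IQR]."""
    q1, _, q3 = statistics.quantiles(weights, n=4)
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return [w for w in weights if lo <= w <= hi]

def daily_weight(weights):
    """Stage 2: mean-based estimate from the surviving measurements."""
    return statistics.fmean(tukey_filter(weights))

# Hypothetical one-day AWS readings for one steer (kg); the 95 kg
# value might come from residual feed or a partial step onto the
# platform, the kind of error the algorithm is built to reject.
readings = [452.1, 449.8, 455.0, 450.3, 95.0, 453.6]
print(round(daily_weight(readings), 1))  # 452.2
```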
(This article belongs to the Section Cattle)

24 pages, 412 KiB  
Review
Application of Convolutional Neural Networks in Animal Husbandry: A Review
by Rotimi-Williams Bello, Roseline Oluwaseun Ogundokun, Pius A. Owolawi, Etienne A. van Wyk and Chunling Tu
Mathematics 2025, 13(12), 1906; https://doi.org/10.3390/math13121906 - 6 Jun 2025
Viewed by 756
Abstract
Convolutional neural networks (CNNs) and their application in animal husbandry have in-depth mathematical expressions, which usually revolve around how well they map input data such as images or video frames of animals to meaningful outputs like health status, behavior class, and identification. Likewise, computer vision and deep learning models are driven by CNNs to act intelligently in improving productivity and animal management for sustainable animal husbandry. In animal husbandry, CNNs play a vital role in the management and monitoring of livestock’s health and productivity due to their high-performance accuracy in analyzing images and videos. Monitoring animals’ health is important for their welfare, food abundance, safety, and economic productivity. This paper aims to comprehensively review recent advancements and applications of relevant models that are based on CNNs for livestock health monitoring, covering the detection of their various diseases and classification of their behavior, for overall management gain. We selected relevant articles with various experimental results addressing animal detection, localization, tracking, and behavioral monitoring, validating the high-performance accuracy and efficiency of CNNs. Prominent anchor-based object detection models such as R-CNN (series), YOLO (series) and SSD (series), and anchor-free object detection models such as key-point based and anchor-point based are often used, demonstrating great versatility and robustness across various tasks. From the analysis, it is evident that more significant research contributions to animal husbandry have been made by CNNs. Limited labeled data, variation in data, low-quality or noisy images, complex backgrounds, computational demand, species-specific models, high implementation cost, scalability, modeling complex behaviors, and compatibility with current farm management systems are good examples of several notable challenges when applying CNNs in animal husbandry. 
Through continued research efforts, these challenges can be addressed to realize sustainable animal husbandry. Full article
(This article belongs to the Section E: Applied Mathematics)

19 pages, 6390 KiB  
Article
AI-Based Smart Monitoring Framework for Livestock Farms
by Moonsun Shin, Seonmin Hwang and Byungcheol Kim
Appl. Sci. 2025, 15(10), 5638; https://doi.org/10.3390/app15105638 - 18 May 2025
Viewed by 1310
Abstract
Smart farms refer to spaces and technologies that utilize networks and automation to monitor and manage the environment and livestock without the constraints of time and space. As various devices installed on farms are connected to a network and automated, farm conditions can be observed remotely anytime and anywhere via smartphones or computers. These smart farms have evolved into smart livestock farming, which involves collecting, analyzing, and sharing data across the entire process from livestock production and growth to post-shipment distribution and consumption. This data-driven approach aids decision-making and creates new value. However, in the process of evolving smart farm technology into smart livestock farming, challenges remain in the essential requirements of data collection and intelligence. Many livestock farms face difficulties in applying intelligent technologies. In this paper, we propose an intelligent monitoring system framework for smart livestock farms using artificial intelligence technology and implement deep learning-based intelligent monitoring. To detect cattle lesions and inactive individuals within the barn, we apply the RT-DETR method instead of the traditional YOLO model. YOLOv5 and YOLOv8 are representative models in the YOLO series, both of which utilize Non-Maximum Suppression (NMS). NMS is a postprocessing technique used to eliminate redundant bounding boxes by calculating the Intersection over Union (IoU) between all predicted boxes. However, this process can be computationally intensive and may negatively impact both speed and accuracy in object detection tasks. In contrast, RT-DETR (Real-Time Detection Transformer) is a Transformer-based real-time object detection model that does not require NMS and achieves higher accuracy compared to the YOLO models. 
In environments where large-scale datasets can be obtained via CCTV, Transformer-based detection methods such as RT-DETR are expected to outperform traditional YOLO approaches in detection performance. RT-DETR also reduces computational costs and optimizes query initialization, making it better suited to the real-time detection of cattle maintenance behaviors and related abnormal behaviors. Comparative analysis against the existing YOLO technique confirms that RT-DETR outperforms YOLOv8. This research helps resolve the low accuracy and high redundancy of traditional YOLO models in behavior recognition, increasing the efficiency of livestock management and improving productivity by applying deep learning to the smart monitoring of livestock farms. Full article
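The NMS post-processing step that RT-DETR eliminates can be sketched as a greedy loop: keep the highest-scoring box, discard remaining boxes whose IoU with it exceeds a threshold, and repeat. A minimal illustration, not a production implementation:

```python
def nms(boxes, scores, iou_thresh=0.5):
    """Greedy Non-Maximum Suppression over (x1, y1, x2, y2) boxes.
    Returns the indices of the boxes that survive."""
    def iou(a, b):
        x1, y1 = max(a[0], b[0]), max(a[1], b[1])
        x2, y2 = min(a[2], b[2]), min(a[3], b[3])
        inter = max(0, x2 - x1) * max(0, y2 - y1)
        area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
        return inter / (area(a) + area(b) - inter)

    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    while order:
        best = order.pop(0)
        keep.append(best)
        # Drop every remaining box that overlaps the kept one too much.
        order = [i for i in order if iou(boxes[best], boxes[i]) <= iou_thresh]
    return keep

# Two near-duplicate detections of one animal plus one distinct detection.
boxes = [(0, 0, 10, 10), (1, 1, 11, 11), (50, 50, 60, 60)]
scores = [0.9, 0.8, 0.7]
print(nms(boxes, scores))  # [0, 2]
```

The pairwise IoU pass over all predicted boxes is the computational overhead the abstract refers to; DETR-style decoders avoid it by predicting a fixed set of non-redundant boxes directly.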
(This article belongs to the Special Issue Future Information & Communication Engineering 2024)

7 pages, 1525 KiB  
Communication
Prediction of Weight and Body Condition Score of Dairy Goats Using Random Forest Algorithm and Digital Imaging Data
by Mateus Alves Gonçalves, Maria Samires Martins Castro, Eula Regina Carrara, Camila Raineri, Luciana Navajas Rennó and Erica Beatriz Schultz
Animals 2025, 15(10), 1449; https://doi.org/10.3390/ani15101449 - 16 May 2025
Viewed by 552
Abstract
The aim of this study was to evaluate the use of digital images to predict body weight (BW) and classify the body condition score (BCS) of dairy goats. A total of 154 female Saanen and Alpine goats were used to obtain eight body measurement features from digital images: withers height (WH), rump height (RH), body length (BL), chest depth (D), paw height (PH), chest width (CW), rump width (RW), and rump length (RL). All animals were weighed using manual scales, and their BCS was evaluated on a scale of 1 to 5. For classification purposes, the BCS was grouped into three categories: low (1–2), moderate (2–3), and high (>3). Pearson's correlation analysis and the Random Forest algorithm were performed. It was possible to predict BW using image features with an R2 of 0.87, with D (22.14%), CW (18.93%), and BL (15.47%) being the most important variables. For the BCS, the classification accuracy was 0.4054, with CW (20.38%) being the most important variable, followed by RH and RL with 15.78% and 12.63%, respectively. It was concluded that digital image features can be used to obtain precise estimates of body weight, but it is necessary to increase data variability to improve the BCS classification of dairy goats. Full article
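The Pearson correlation screening mentioned in the abstract is straightforward to sketch. The feature name and the numbers below are hypothetical, not the goat data from the study:

```python
from statistics import fmean

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = fmean(x), fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical chest-depth (cm) vs body-weight (kg) pairs; a strongly
# linear relationship gives r close to 1, flagging the feature as a
# promising predictor before fitting the Random Forest.
chest_depth = [28.0, 30.5, 31.0, 33.2, 35.1]
body_weight = [41.0, 45.5, 46.0, 50.3, 54.0]
print(round(pearson_r(chest_depth, body_weight), 3))
```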

13 pages, 2855 KiB  
Article
Research on Video Behavior Detection and Analysis Model for Sow Estrus Cycle Based on Deep Learning
by Kaidong Lei, Bugao Li, Shan Zhong, Hua Yang, Hao Wang, Xiangfang Tang and Benhai Xiong
Agriculture 2025, 15(9), 975; https://doi.org/10.3390/agriculture15090975 - 30 Apr 2025
Viewed by 598
Abstract
Against the backdrop of precision livestock farming, sow behavior analysis holds significant theoretical and practical value. Traditional production methods face challenges such as low production efficiency, high labor intensity, and increased disease prevention risks. With the rapid advancement of optoelectronic technology and deep learning, more technologies are being integrated into smart agriculture. Intelligent large-scale pig farming has become an effective means to improve sow quality and productivity, with behavior recognition technology playing a crucial role in intelligent pig farming. Specifically, monitoring sow behavior enables an effective assessment of health conditions and welfare levels, ensuring efficient and healthy sow production. This study constructs a 3D-CNN model based on video data from the sow estrus cycle, achieving analysis of SOB, SOC, SOS, and SOW behaviors. In typical behavior classification, the model attains accuracy, recall, and F1-score values of (1.00, 0.90, 0.95; 0.96, 0.98, 0.97; 1.00, 0.96, 0.98; 0.86, 1.00, 0.93), respectively. Additionally, under conditions of multi-pig interference and non-specifically labeled data, the accuracy, recall, and F1-scores for the semantic recognition of SOB, SOC, SOS, and SOW behaviors based on the 3D-CNN model are (1.00, 0.90, 0.95; 0.89, 0.89, 0.89; 0.91, 1.00, 0.95; 1.00, 1.00, 1.00), respectively. These findings provide key technical support for establishing the classification and semantic recognition of typical sow behaviors during the estrus cycle, while also offering a practical solution for rapid video-based behavior detection and welfare monitoring in precision livestock farming. Full article
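The abstract reports accuracy, recall, and F1 per behavior class; in per-class detection reporting, the first figure commonly corresponds to precision. A sketch of the standard precision/recall/F1 computation, with hypothetical counts chosen so the output matches the (1.00, 0.90, 0.95) triple reported for SOB:

```python
def prf1(tp, fp, fn):
    """Precision, recall, and F1 from per-class true-positive,
    false-positive, and false-negative counts."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Hypothetical counts: every prediction correct, 10% of events missed.
p, r, f = prf1(tp=90, fp=0, fn=10)
print(round(p, 2), round(r, 2), round(f, 2))  # 1.0 0.9 0.95
```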
(This article belongs to the Special Issue Computer Vision Analysis Applied to Farm Animals)

27 pages, 1868 KiB  
Article
MACA-Net: Mamba-Driven Adaptive Cross-Layer Attention Network for Multi-Behavior Recognition in Group-Housed Pigs
by Zhixiong Zeng, Zaoming Wu, Runtao Xie, Kai Lin, Shenwen Tan, Xinyuan He and Yizhi Luo
Agriculture 2025, 15(9), 968; https://doi.org/10.3390/agriculture15090968 - 29 Apr 2025
Viewed by 734
Abstract
The accurate recognition of pig behaviors in intensive farming is crucial for health monitoring and growth assessment. To address multi-scale recognition challenges caused by perspective distortion (non-frontal camera angles), this study proposes MACA-Net, a YOLOv8n-based model capable of detecting four key behaviors: eating, lying on the belly, lying on the side, and standing. The model incorporates a Mamba Global–Local Extractor (MGLE) Module, which leverages Mamba to capture global dependencies while preserving local details through convolutional operations and channel shuffle, overcoming Mamba’s limitation in retaining fine-grained visual information. Additionally, an Adaptive Multi-Path Attention (AMPA) mechanism integrates spatial-channel attention to enhance feature focus, ensuring robust performance in complex environments and low-light conditions. To further improve detection, a Cross-Layer Feature Pyramid Transformer (CFPT) neck employs non-upsampled feature fusion, mitigating semantic gap issues where small target features are overshadowed by large target features during feature transmission. Experimental results demonstrate that MACA-Net achieves a precision of 83.1% and mAP of 85.1%, surpassing YOLOv8n by 8.9% and 4.4%, respectively. Furthermore, MACA-Net significantly reduces parameters by 48.4% and FLOPs by 39.5%. When evaluated in comparison to leading detectors such as RT-DETR, Faster R-CNN, and YOLOv11n, MACA-Net demonstrates a consistent level of both computational efficiency and accuracy. These findings provide a robust validation of the efficacy of MACA-Net for intelligent livestock management and welfare-driven breeding, offering a practical and efficient solution for modern pig farming. Full article
(This article belongs to the Special Issue Modeling of Livestock Breeding Environment and Animal Behavior)

15 pages, 3328 KiB  
Article
AGRARIAN: A Hybrid AI-Driven Architecture for Smart Agriculture
by Michael C. Batistatos, Tomaso de Cola, Michail Alexandros Kourtis, Vassiliki Apostolopoulou, George K. Xilouris and Nikos C. Sagias
Agriculture 2025, 15(8), 904; https://doi.org/10.3390/agriculture15080904 - 21 Apr 2025
Cited by 1 | Viewed by 1484
Abstract
Modern agriculture is increasingly challenged by the need for scalable, sustainable, and connectivity-resilient digital solutions. While existing smart farming platforms offer valuable insights, they often rely heavily on centralized cloud infrastructure, which can be impractical in rural or remote settings. To address this gap, this paper presents AGRARIAN, a hybrid AI-driven architecture that combines IoT sensor networks, UAV-based monitoring, satellite connectivity, and edge-cloud computing to deliver real-time, adaptive agricultural intelligence. AGRARIAN supports a modular and interoperable architecture structured across four layers—Sensor, Network, Data Processing, and Application—enabling flexible deployment in diverse use cases such as precision irrigation, livestock monitoring, and pest forecasting. A key innovation lies in its localized edge processing and federated AI models, which reduce reliance on continuous cloud access while maintaining analytical performance. Pilot scenarios demonstrate the system’s ability to provide timely, context-aware decision support, enhancing both operational efficiency and digital inclusion for farmers. AGRARIAN offers a robust and scalable pathway for advancing autonomous, sustainable, and connected farming systems. Full article
(This article belongs to the Special Issue Computational, AI and IT Solutions Helping Agriculture)
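The federated AI models mentioned above are typically trained by aggregating locally updated parameters at the edge rather than shipping raw data to the cloud. A minimal sketch of federated averaging (FedAvg), the canonical aggregation rule; the parameter vectors and client sample counts are illustrative assumptions, not AGRARIAN's actual scheme:

```python
import numpy as np

def federated_average(client_weights, client_sizes):
    """Sample-size-weighted average of client model parameters (FedAvg).

    client_weights: list of parameter vectors, one per edge node
    client_sizes:   number of local samples each node trained on
    """
    total = sum(client_sizes)
    agg = np.zeros_like(client_weights[0], dtype=float)
    for w, n in zip(client_weights, client_sizes):
        agg += (n / total) * w
    return agg

# three hypothetical edge nodes with different amounts of local data
w_a = np.array([1.0, 2.0])
w_b = np.array([3.0, 4.0])
w_c = np.array([5.0, 6.0])
global_w = federated_average([w_a, w_b, w_c], client_sizes=[10, 10, 20])
print(global_w)  # [3.5 4.5]
```

Only the aggregated vector needs to traverse the (possibly satellite) uplink, which is what makes this pattern attractive for connectivity-constrained farms.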
28 pages, 9610 KiB  
Review
A Review of You Only Look Once Algorithms in Animal Phenotyping Applications
by Guangbo Li, Rui Jian, Xie Jun and Guolong Shi
Animals 2025, 15(8), 1126; https://doi.org/10.3390/ani15081126 - 13 Apr 2025
Cited by 1 | Viewed by 976
Abstract
Animal phenotyping recognition is a pivotal component of precision livestock management, holding significant importance for intelligent farming practices and animal welfare assurance. In recent years, with the rapid advancement of deep learning technologies, the YOLO algorithm, as the pioneering single-stage detection framework, has revolutionized the field of object detection with its fast, single-pass approach and has been widely applied across agricultural domains. This review takes animal phenotyping as its research target and is structured around four key aspects: (1) the evolution of YOLO algorithms, (2) datasets and preprocessing methodologies, (3) application domains of YOLO algorithms, and (4) future directions. This paper aims to offer readers fresh perspectives and insights into animal phenotyping research. Full article
(This article belongs to the Section Animal System and Management)
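Single-stage detectors like YOLO score candidate boxes against ground truth, and against each other during non-maximum suppression, using intersection-over-union. A minimal sketch of the standard IoU computation on corner-format boxes (the box coordinates below are illustrative):

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# two 2x2 boxes overlapping on half their width: IoU = 2 / 6
print(iou((0, 0, 2, 2), (1, 0, 3, 2)))  # 0.333...
```

Thresholding this score (commonly at 0.5, as in the mAP@0.5 metric reported by several of the papers above) decides whether a prediction counts as a match.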
24 pages, 32213 KiB  
Article
ACMSPT: Automated Counting and Monitoring System for Poultry Tracking
by Edmanuel Cruz, Miguel Hidalgo-Rodriguez, Adiz Mariel Acosta-Reyes, José Carlos Rangel, Keyla Boniche and Franchesca Gonzalez-Olivardia
AgriEngineering 2025, 7(3), 86; https://doi.org/10.3390/agriengineering7030086 - 19 Mar 2025
Viewed by 2047
Abstract
The poultry industry faces significant challenges in efficiently monitoring large populations, especially under resource constraints and limited connectivity. This paper introduces the Automated Counting and Monitoring System for Poultry Tracking (ACMSPT), an innovative solution that integrates edge computing, Artificial Intelligence (AI), and the Internet of Things (IoT). The study begins by collecting a custom dataset of 1300 high-resolution images from real broiler farm environments, encompassing diverse lighting conditions, occlusions, and growth stages. Each image was manually annotated and used to train the YOLOv10 object detection model with carefully selected hyperparameters. The trained model was then deployed on an Orange Pi 5B single-board computer equipped with a Neural Processing Unit (NPU), enabling on-site inference and real-time poultry tracking. System performance was evaluated in both small- and commercial-scale sheds, achieving a precision of 93.1% and recall of 93.0%, with an average inference time under 200 milliseconds. The results demonstrate that ACMSPT can autonomously detect anomalies in poultry movement, facilitating timely interventions while reducing manual labor. Moreover, its cost-effective, low-connectivity design supports broader adoption in remote or resource-limited environments. Future work will focus on improving adaptability to extreme conditions and extending this approach to other livestock management contexts. Full article
(This article belongs to the Special Issue Precision Farming Technologies for Monitoring Livestock and Poultry)
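The reported precision (93.1%) and recall (93.0%) follow the standard detection definitions over true positives, false positives, and false negatives. A short sketch with hypothetical counts chosen only to reproduce similar values; these are not the paper's actual confusion counts:

```python
def precision_recall(tp: int, fp: int, fn: int):
    """Standard detection metrics from true/false positives and false negatives."""
    precision = tp / (tp + fp) if tp + fp else 0.0  # of predicted birds, how many were real
    recall = tp / (tp + fn) if tp + fn else 0.0     # of real birds, how many were found
    return precision, recall

# hypothetical counts from a validation shed
p, r = precision_recall(tp=930, fp=69, fn=70)
print(f"precision={p:.3f}, recall={r:.3f}")  # precision=0.931, recall=0.930
```

For a counting system, recall errors (missed birds) and precision errors (phantom detections) bias the count in opposite directions, which is why both are reported.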