Review

How AI Improves Sustainable Chicken Farming: A Literature Review of Welfare, Economic, and Environmental Dimensions

by Zhenlong Wu, Sam Willems, Dong Liu and Tomas Norton *
M3 BIORES-Measure, Model and Manage Bioresponses, Division Animal and Human Health Engineering, Department of Biosystems, Catholic University of Leuven, Kasteelpark Arenberg 30, 3001 Leuven, Belgium
* Author to whom correspondence should be addressed.
Agriculture 2025, 15(19), 2028; https://doi.org/10.3390/agriculture15192028
Submission received: 9 September 2025 / Revised: 24 September 2025 / Accepted: 26 September 2025 / Published: 27 September 2025
(This article belongs to the Section Farm Animal Production)

Abstract

Artificial Intelligence (AI) is widely recognized as a force that will fundamentally transform traditional chicken farming models. It can reduce labor costs while safeguarding welfare and, at the same time, increase output and quality. However, the breadth of AI’s contribution to chicken farming has not been systematically quantified on a large scale; few people know how far current AI has actually progressed or how it can improve chicken farming to enhance the sector’s sustainability. Therefore, taking “AI + sustainable chicken farming” as the theme, this study retrieved 254 research papers from the Web of Science (May 2003 to March 2025) for a comprehensive descriptive analysis and examined AI’s contribution to the sector’s sustainability in recent years. The results show that, in the welfare dimension, AI primarily targets disease surveillance, behavior monitoring, stress detection, and health scoring, enabling earlier, less-invasive interventions and more stable, longer productive lifespans. In the economic dimension, tools such as automated counting, vision-based weighing, and precision feeding improve labor productivity and feed use while enhancing product quality. In the environmental dimension, AI supports odor prediction, ventilation monitoring, and control strategies that lower emissions and energy use, reducing farms’ environmental footprint. However, large-scale adoption remains constrained by the lack of open and interoperable model and data standards, the compute and reliability burden of continuous multi-sensor monitoring, the gap between AI-based detection and fully automated control, and economic hurdles such as high upfront costs, unclear long-term returns, and limited farmer acceptance, particularly in resource-constrained settings. Environmental applications are also underrepresented because research has been overly vision-centric, while audio and IoT sensing receive less attention. Looking ahead, AI development should prioritize solutions that are low cost, robust, animal friendly, and transparent in their benefits so that return on investment is visible in practice, supported by open benchmarks and standards, edge-first deployment, and staged cost–benefit pilots. Technically, integrating video, audio, and environmental sensors into a perception–cognition–action loop and updating policies through online learning can enable full-process adaptive management that improves welfare, enhances resource efficiency, reduces emissions, and increases adoption across diverse production contexts.

1. Introduction

Chicken meat and eggs are among the most accessible and globally valued foods, resulting in consistently high demand [1,2]. Recent outbreaks of chicken diseases have driven prices sharply upward, underscoring the urgent need to improve the sustainability of chicken farms [3,4], including stronger welfare and biosecurity, stable productivity, and a smaller environmental footprint. In this context, AI tools represent a particularly promising solution.
Sustainability in chicken production is inherently multidimensional, spanning three domains: (1) environmental care, (2) economic viability, and (3) social responsibility [5,6,7]. The aim is to balance these interdependent areas. Environmentally, efforts focus on shrinking the production footprint by reducing resource use, managing waste and emissions, and protecting surrounding ecosystems. Economically, long-term viability is essential, with profitability, cost-efficient resource use, market competitiveness, and resilience to fluctuations encouraging continued adoption of good practices. Socially, equity is central, emphasizing fair labor conditions, animal welfare, and constructive engagement with local communities. In practice, sustainable poultry systems safeguard workers, ensure humane treatment of birds, and foster positive community relations [8].
The application of AI in chicken farming marks a significant step toward more efficient, data-driven, and sustainable livestock management [9,10]. With advancements in machine learning, deep learning, and computer vision, traditional chicken farming methods have transformed, aligning more closely with the principles of Precision Livestock Farming (PLF) [11,12]. As shown in Figure 1, current AI technologies can, through image data, audio data, Internet of Things (IoT) and sensor data, as well as production data, automate monitoring processes, optimize production, and ensure the health and welfare of chickens, thereby reducing reliance on manual labor and improving overall operational efficiency while lowering inputs, waste, and emissions intensity [13,14]. As the scale of chicken farming continues to grow to meet global demand, the need for AI-based solutions becomes increasingly essential to deliver higher output with better welfare outcomes and a reduced environmental footprint [15,16,17].
However, despite the growing body of AI research in chicken farming, most studies remain fragmented, focusing on narrow aspects of production or welfare without a chicken-specific sustainability framework that enables cross-study comparison. For example, one study might focus on detecting a single behavior (e.g., feeding), another on environmental control, and another on disease detection, but there is no unifying framework or comparison between them. They also remain largely independent, with different approaches, datasets, and evaluation methods used in parallel rather than in a coordinated or cumulative way. While several other review papers have addressed related themes, most either provide deep dives into a single technology area or offer broad poultry-wide overviews, and thus lack a large-scale, chicken-specific synthesis framed by sustainability. To fill this gap, a systematic classification is needed to clarify:
  • What role AI currently plays in sustainable chicken farming
  • What challenges are encountered in its practical deployment
  • What future directions are emerging
To address these questions, this study retrieved publications from the Web of Science (May 2003 to March 2025) using a reproducible search-terms protocol, yielding 254 research papers. This review organized the papers around the welfare, economic, and environmental dimensions of sustainability and linked findings to their outcomes, focusing on how AI converts data into actionable insights. It identifies mainstream research directions and consolidates the literature into a map, enabling researchers and farmers to quickly locate methods aligned with their goals. Its broad scope provides a concise synthesis across all application areas to serve as a practical reference and to support the development of new insights and tailored solutions.
This paper is structured as follows: Section 2 describes the literature screening process and data-processing methods; Section 3 presents the classified analysis of the data, first giving an overall analysis and then grouping the papers according to the three pillars of sustainability (welfare, economic, and environmental impacts) to explore the role of AI in sustainable chicken farming; Section 4 discusses the current technological challenges and forecasts future directions; and Section 5 concludes the study.

2. Methodology

2.1. Search Strategy

This study employs literature analysis to assess the development of scientific research in AI applications within chicken farming, as illustrated in Figure 2. Literature analysis offers a comprehensive overview of scientific and technological advancements by examining the body of literature available in databases, thereby enabling an analysis of how AI enhances the sustainability of chicken farming in terms of economics, welfare, and environmental impact [18,19].

2.2. Selection of Search Terms

The Web of Science Core Collection was selected as the primary database for scientific literature retrieval, as it is widely used in literature studies and is one of the most comprehensive scientific article repositories globally. Moreover, it provides quality metrics such as citation counts and the h-index, which are frequently utilized in literature research.
Search terms were organized into three groups covering the farming context, chicken-related terms, and AI technologies; the complete list is provided in Table 1 [20]. To maximize relevance to chicken-specific studies while controlling noise, broad taxa such as bird, fowl, and poultry were excluded. Terms related to laying hens are captured under hen, and records using layer or Gallus domesticus typically also index hen or chicken. Likewise, most AI studies co-index artificial intelligence or deep learning. Expanding to broader synonyms would likely add limited recall but substantially increase irrelevant hits, so the current term set balances recall and precision.
In the Web of Science, the search was conducted using the “Topic” field, which searches within the title, abstract, author keywords, and Keywords Plus of each record. Boolean operators and wildcard symbols were applied to improve accuracy. Parentheses were used to group terms, and the “OR” operator was used within groups to capture any of the terms, while the “AND” operator was used between groups to ensure that records contained terms from each group [21]. Wildcards (e.g., asterisks) were used to capture both singular and plural forms of the terms. An example syntax used for the search is:
TS=((farm OR breed OR house) AND (broiler OR chicken OR chick OR cock OR hen) AND (artificial intelligence OR machine learning OR deep learning OR neural network OR natural language processing OR transformer OR generative adversarial network)).
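For illustration, the following minimal Python sketch assembles such a Topic query programmatically from the three term groups, including wildcarded forms of the kind described above; the specific wildcarded terms shown are illustrative assumptions, and the authoritative term list is given in Table 1.

# Hedged sketch: assembling a Web of Science "Topic" (TS) query from the three term groups.
# The wildcarded forms below are illustrative; the exact terms used in this study are listed in Table 1.
farming_terms = ["farm*", "breed*", "hous*"]
chicken_terms = ["broiler*", "chicken*", "chick*", "cock*", "hen*"]
ai_terms = [
    '"artificial intelligence"', '"machine learning"', '"deep learning"',
    '"neural network*"', '"natural language processing"',
    "transformer*", '"generative adversarial network*"',
]

def or_group(terms):
    # Join one group of terms with OR and wrap it in parentheses.
    return "(" + " OR ".join(terms) + ")"

query = "TS=(" + " AND ".join(or_group(g) for g in [farming_terms, chicken_terms, ai_terms]) + ")"
print(query)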
The search was limited to documents published between May 2003 and March 2025. The year 2003 was chosen because it marks the publication of the first study containing this set of keywords. This temporal scope allows for a comprehensive analysis of the development and trends in AI applications in chicken farming over the past two decades.

2.3. Inclusion and Exclusion Criteria

The data selection and organization process involved reviewing and refining the information obtained. After the initial database search, a total of 366 articles were identified. These articles were then filtered to exclude conference papers, non-English articles, review papers, and duplicate works, resulting in 316 articles (with 50 articles removed).
Articles deemed irrelevant were those that, although containing some keywords related to the research topic, primarily focused on other animals, different aspects of animal production, or biochemical analyses unrelated to AI applications in chicken farming. Therefore, a further screening process was carried out to exclude articles not directly relevant to the research focus, reducing the total to 254 articles (with 62 additional articles removed).

2.4. Data Extraction from Included Papers

After identifying 254 research papers, titles, abstracts, conclusions, and keywords were extracted into a structured template. When needed, the methods and results were consulted to clarify targets and outcomes. Each paper was assigned one primary sustainability contribution (Table 2) and one primary technical modality (Table 3), determined by the study’s main objective and dominant endpoints stated in the conclusions or main results. All counts and visualizations use these primary labels (sustainability contribution and technical modality). Application cases and tasks were also recorded for downstream analysis.
Based on the extracted information, the corpus was analyzed in the following steps:
(1) computed annual publication and citation counts;
(2) aggregated author keywords across all papers and retained those occurring more than 10 times, then grouped these frequent keywords into Method, Object, and Purpose categories (a minimal sketch of this step follows the list);
(3) applied the decision rules in Table 2 and Table 3 to produce tallies by sustainability contribution and technical modality for all papers;
(4) summarized the typical AI processing workflow for sustainable chicken farming;
(5) used the contribution tallies to address what role AI currently plays in sustainable chicken farming;
(6) synthesized the above analyses to discuss the challenges of practical deployment and the emerging directions.
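To make step (2) concrete, the following minimal Python sketch tallies author keywords and keeps those occurring more than 10 times; the file name, column name, and example category assignments are assumptions for illustration rather than the exact template used in this study.

# Hedged sketch of step (2): counting author keywords and retaining those with more than 10 occurrences.
# The file name, column name, and category examples are illustrative assumptions.
from collections import Counter

import pandas as pd

records = pd.read_csv("wos_extraction_template.csv")  # hypothetical export of the structured template

counts = Counter()
for cell in records["Author Keywords"].dropna():
    # Web of Science exports typically separate keywords with semicolons.
    for kw in cell.split(";"):
        kw = kw.strip().lower()
        if kw:
            counts[kw] += 1

frequent = {kw: n for kw, n in counts.items() if n > 10}

# Manual grouping into Method / Object / Purpose categories (example assignments only).
groups = {
    "deep learning": "Method",
    "machine learning": "Method",
    "laying hens": "Object",
    "animal welfare": "Purpose",
}
for kw, n in sorted(frequent.items(), key=lambda item: -item[1]):
    print(f"{kw:<25} {n:>4}  {groups.get(kw, 'unassigned')}")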

3. Results

This section analyzes and classifies the 254 selected articles, covering an overall keyword analysis together with analyses of data types, algorithm types, and applications.

3.1. Overall Description of Literature

The annual publication volume and citation growth trends of articles on AI in chicken farming from May 2003 to March 2025 are depicted in Figure 3.
The data in Figure 3 illustrates that although PLF introduced relevant concepts as early as 2003, practical AI implementations in chicken farming remained limited due to technological constraints. Early research predominantly involved theoretical frameworks and lacked substantive applied outcomes. However, a significant shift began in 2017, marked by rapid technological advancements and increased attention to AI-driven methods. This period saw a considerable increase in both publications and citations. Specifically, in 2024, the number of articles published reached 56, with citations soaring to 905, indicating growing academic interest and industry relevance. Preliminary data from March 2025 suggests that this upward trend continues, reflecting sustained and expanding interest in integrating AI within chicken farming practices. This trajectory also mirrors system-wide advances: maturing sensing (humidity, gas and weighing sensors), progress in computer vision (from SVM/CNN classifiers to real-time detectors such as YOLOv1-v12), affordable edge compute (Raspberry Pi and Jetson Nano enabling on-site inference), and the emergence of large language models (e.g., ChatGPT and local runtimes such as ollama), which together broaden practical application options in chicken farming.
Table 4 presents keywords that frequently appeared in the reviewed literature, categorized by method, object, and research purpose. Among methodological keywords, deep learning and machine learning dominate, reflecting a strong preference for data-driven approaches. Computer vision and machine vision technologies are significant subcategories, and their presence among the methodological terms underscores their role in practical applications that depend on image-based and visual data analysis. Regarding research subjects, the literature predominantly specifies laying hens explicitly, whereas broilers and other poultry types are grouped under broader categories such as chickens or poultry. As for research purposes, improving animal welfare is the primary motivation driving most studies. Subsequent priorities include enhancing poultry performance (growth, production), alongside predictive analytics and behavioral classification tasks critical for welfare assessments.
Figure 4 presents the breakdown of the 254 papers by primary contribution. Research focusing on improving chicken welfare through AI accounts for the largest share, with 133 papers (52%), mainly concentrating on image-based approaches. Next are studies aimed at enhancing economic efficiency, totaling 107 papers (42%), which largely combine image analysis with production-performance analysis. Finally, only 14 papers (6%) target the optimization of environmental impacts, possibly because environmental issues are more often tackled with IoT sensor solutions unrelated to AI.
AI research workflows in this field generally proceed through five stages, as shown in Figure 5: data acquisition, data preprocessing, feature extraction, model training, and recognition evaluation. Each stage is analyzed in terms of statistical approaches, methodological choices, outcomes, identified challenges, and future prospects, forming the backbone of this review. In practice, model selection typically follows the modality and the recent state of the art: researchers test newly released popular architectures (e.g., YOLOv11, Transformer variants) against established baselines on their own datasets and then adopt the better-performing option as the starting point for their specific research aims.
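As a hedged illustration of this model-selection practice, the following Python sketch fine-tunes a newly released detector and an established baseline on the same dataset via the Ultralytics API and keeps the better performer; the dataset file and model weights named here are assumptions, not configurations drawn from the reviewed studies.

# Hedged sketch: benchmarking a newer detector against an established baseline on a custom
# chicken dataset, then keeping the better performer. The dataset YAML and weights are
# illustrative assumptions.
from ultralytics import YOLO

candidates = ["yolov8n.pt", "yolo11n.pt"]  # established baseline vs. newly released model
results = {}

for weights in candidates:
    model = YOLO(weights)
    # Fine-tune on a hypothetical annotated dataset of chicken images.
    model.train(data="chicken_behavior.yaml", epochs=50, imgsz=640)
    metrics = model.val()                 # evaluate on the validation split
    results[weights] = metrics.box.map50  # mAP@0.5 as the comparison metric

best = max(results, key=results.get)
print(f"Selected starting point: {best} (mAP@0.5 = {results[best]:.3f})")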

3.2. AI-Driven Enhancements to Chicken Welfare

As shown in Figure 4, improving chicken welfare has become one of the most important goals in recent AI-related studies on chicken farming. We reviewed 133 papers that are directly relevant to welfare promotion and identified four major objectives: disease monitoring and prevention, behavior monitoring, stress monitoring, and health scoring. These objectives are addressed in 102 papers published between 2021 and March 2025; the discussion below focuses on that period to illustrate how AI helps achieve each welfare goal for long-term sustainability. The remaining 31 papers are listed but not analyzed in detail because they overlap with the four categories and because earlier trends are not homogeneous given rapid post-2020 methodological and hardware advances. Table 5 presents the main objective, modality, and cases for these papers.
Regarding disease monitoring, the aim of AI techniques is to identify abnormal postures, unusual droppings, skin or organ lesions, and other early signs of illness as quickly and accurately as possible so that preventive or intervention measures can be taken without delay. Most studies use image data, audio data, and IoT sensor data (such as RFID), together with production data, to provide real-time monitoring and early warning of diseases such as Newcastle disease, parasitic infections, and aflatoxin poisoning [68,69,70].
For behavior monitoring, AI mainly relies on image analysis and sensor data to automatically track feeding, drinking, dust-bathing, and activity levels, thereby optimizing the environment and improving living comfort [71,72]. AI systems can also detect problematic behaviors such as feather pecking and piling in real time, allowing timely intervention and preventing injury or stress [73]. Moreover, IMU and RFID sensors combined with machine-learning algorithms help identify aggressive and other abnormal behaviors, further safeguarding flock safety and quality of life [74].
Stress monitoring is achieved mainly through computer vision, audio analysis, and production-data analytics. By combining visual techniques with ambient temperature and humidity data, AI can automatically recognize signs of heat stress such as open-beak panting or retching, thereby improving environmental control efficiency [75]. Audio analysis uses deep-learning models to recognize the birds’ emotional states for real-time stress monitoring [76]. Production-data analytics can also assess heat-stress levels and guide environmental adjustments to raise welfare [77].
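A minimal sketch of the audio route is given below: MFCC features are extracted from vocalization clips and passed to a simple classifier that outputs a stress probability. The file names, feature settings, and classifier are illustrative assumptions rather than the pipeline of any cited study.

# Hedged sketch: MFCC-based stress detection from vocalization clips.
# File names, feature settings, and the classifier are illustrative assumptions.
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier

def mfcc_features(path, sr=22050, n_mfcc=20):
    # Load a clip and summarize it as the mean and standard deviation of its MFCCs.
    y, sr = librosa.load(path, sr=sr)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

# Hypothetical labeled clips: 1 = stressed vocalization, 0 = normal vocalization.
train_paths = ["clip_001.wav", "clip_002.wav", "clip_003.wav", "clip_004.wav"]
train_labels = [1, 0, 1, 0]

X = np.vstack([mfcc_features(p) for p in train_paths])
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, train_labels)

new_clip = mfcc_features("incoming_clip.wav").reshape(1, -1)
print("stress probability:", clf.predict_proba(new_clip)[0, 1])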
For health scoring, AI primarily uses image recognition and sound analysis to evaluate health status. Image-based methods include assessing feather quality and monitoring eye and comb temperatures to support management decisions [78]. Sound analysis evaluates vocal traits after vaccination to judge vaccine efficacy and overall health, thus optimizing immunization strategies and improving flock welfare [79].
In summary, AI research on chicken welfare now concentrates on disease surveillance and prevention. A promising next step is instant dead-bird detection followed by contact-free removal using robots; although robot prototypes exist, the diversity of housing conditions and chicken breeds still limits large-scale commercial adoption. Automated monitoring of abnormal behavior, stress, and overall health scores is just as important. Rapidly identifying and addressing discomfort not only improves welfare but also preserves meat and egg quality, offering clear benefits for both producers and animal-welfare advocates. In this way, welfare-oriented AI improves sustainable chicken production, benefiting producers, animals, and the planet alike.

3.3. Economic Impacts of AI Applications in Chicken Farming

In recent years, AI has not only improved profit margins in chicken production but has also helped make those profits more sustainable by trimming feed, labor, and energy inputs. A total of 107 papers address these topics (as shown in Figure 4); following the approach in Section 3.2, this review analyzes 68 papers published between 2021 and March 2025 to highlight the economic and resource-efficiency gains achieved during the past five years. The remaining 39 papers are listed but not analyzed in detail because they overlap with the four categories and because earlier trends are not homogeneous given rapid post-2020 methodological and hardware advances. Table 6 presents the main objective, modality, and cases for these papers.
Regarding production optimization, AI systems seek to replace manual checks with continuous, high-resolution metrics that guide feeding, labor and logistics in real time. Most studies fuse computer-vision streams with audio or IoT data to estimate body weight, count birds and eggs, and flag floor-egg or carcass events, thereby cutting labor while lifting output [126,127,128]. Integrated analysis of feed-intake acoustics and sensor logs refines ration allocation and improves feed-conversion efficiency [129]. Robot patrols that remove dead birds or floor-eggs streamline daily operations [30], and machine-learning inspection of transport losses spots anomalies early, protecting economic returns [95].
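As a hedged illustration of vision-based weighing, the following sketch regresses body weight on simple silhouette features extracted from top-view images; the segmentation step, features, and calibration data are assumptions for illustration, not a method reported in the cited studies.

# Hedged sketch: estimating broiler body weight from top-view silhouette features.
# Image names, the Otsu-threshold segmentation, and the toy calibration data are illustrative assumptions.
import cv2
import numpy as np
from sklearn.linear_model import LinearRegression

def silhouette_features(path):
    # Return pixel area and perimeter of the largest bright blob, assumed to be the bird.
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    largest = max(contours, key=cv2.contourArea)
    return [cv2.contourArea(largest), cv2.arcLength(largest, True)]

# Hypothetical calibration set: image paths paired with scale-measured weights (g).
images = ["bird_01.png", "bird_02.png", "bird_03.png"]
weights_g = [1850.0, 2010.0, 2230.0]

X = np.array([silhouette_features(p) for p in images])
model = LinearRegression().fit(X, weights_g)

print("estimated weight (g):", model.predict([silhouette_features("bird_new.png")])[0])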
Research on performance scoring shows that machine-learning models predict laying performance and egg output more accurately, supporting better production planning and higher profits [130]. Neural networks that measure body dimensions and weights supply precise data for evaluating growth, pushing management decisions toward a more scientific basis [98]. AI also integrates environmental data from inside the house to fine-tune conditions and optimize growth and productivity.
For product quality, multi-modal pipelines couple spectral imaging and genomics to grade meat and eggs and to flag Salmonella risk, boosting food safety and market value while lowering recall costs [131,132]. High-resolution spectra combined with SNP profiles also strengthen traceability, enhancing brand trust [133].
Genetic-improvement studies show that combining bioinformatics with machine learning for genomic prediction reduces hereditary-disease risk and increases economic returns on large farms [134]. Adaptive feeding control that uses deep learning with genetic algorithms markedly improves feed-conversion ratios while cutting costs [135]. AI-based analysis of chick vocalizations enables early sex determination, and neural-network inspection of candled eggs identifies fertility and embryo sex, reducing labor and improving hatchery efficiency [136,137].
Overall, AI tools improve economic efficiency across the entire chicken-production chain while advancing sustainability (genomic trait selection, early sexing, real-time weight and count tracking, and quality checks). By converting high-frequency production data into actionable decisions, these AI applications trim feed use, labor hours, and recall risk while raising product uniformity and market value, advancing both economic performance and sustainability.

3.4. AI-Driven Environmental Optimization in Chicken Farming

In recent years, research applying AI to optimize the chicken-house environment has been limited, yet the existing work already supports the sector’s sustainability goals by cutting emissions, improving air quality, and reducing energy waste. From 2012 to 2024, 14 related studies were published (as shown in Figure 4), concentrating mainly on odor-concentration prediction and correlation, ventilation performance monitoring, environmental control strategies, and environmental impact assessment. Because the corpus is small, this part of the study forgoes a pooled synthesis and instead summarizes each study narratively (Table 7).
Regarding odor control, AI aims to track ammonia, hydrogen sulfide, and other volatiles fast enough for ventilation to respond before bird health is affected; most studies fuse gas-sensor or electronic-nose data with machine-learning models to predict peaks and trigger fans [138,139]. For ventilation health, time-series models analyze fan and controller signals to detect faults early and schedule maintenance, keeping airflow stable and energy use in check [142,143]. Concerning climate strategy, feature extraction plus decision-tree or random-forest classifiers label temperature–humidity–air-speed states and simulate egg output under alternative set-points, guiding fine-tuned environmental control [144,145]. Finally, AI-enhanced life-cycle assessments combine Monte Carlo sampling, genetic algorithms, and neural predictors to quantify manure value, groundwater risk, and per-kilogram emissions, giving a full-chain view of sustainability gains [147].
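A minimal sketch of such an odor-control loop, assuming a simple lag-feature regressor and a hypothetical fan-control hook, is shown below; the sensor series, threshold, and set_fan_speed() function are illustrative assumptions rather than components of the cited systems.

# Hedged sketch: predicting short-horizon ammonia concentration from recent readings and
# boosting ventilation when the forecast crosses a threshold. The threshold, lag window,
# and set_fan_speed() hook are illustrative assumptions.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

LAGS = 6               # use the last 6 readings (e.g., one hour at 10-min intervals)
NH3_LIMIT_PPM = 20.0   # hypothetical action threshold

def make_supervised(series, lags=LAGS):
    # Turn a 1-D concentration series into lagged features and next-step targets.
    X = [series[i:i + lags] for i in range(len(series) - lags)]
    y = [series[i + lags] for i in range(len(series) - lags)]
    return np.array(X), np.array(y)

# Hypothetical historical NH3 readings (ppm) from a house-level gas sensor.
history = np.array([8, 9, 11, 12, 14, 15, 17, 18, 19, 21, 22, 20, 18, 17], dtype=float)

X, y = make_supervised(history)
model = GradientBoostingRegressor(random_state=0).fit(X, y)

forecast = model.predict(history[-LAGS:].reshape(1, -1))[0]

def set_fan_speed(fraction):
    # Placeholder for the real ventilation controller interface.
    print(f"fan speed set to {fraction:.0%}")

if forecast > NH3_LIMIT_PPM:
    set_fan_speed(0.9)  # increase ventilation ahead of the predicted peak
else:
    set_fan_speed(0.4)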
Overall, the number of AI studies targeting environmental optimization in chicken houses remains small, partly because many environmental tasks are still handled by conventional expert-system approaches that rely on straightforward statistical analysis rather than AI. Rapid advances in language models and edge computing, however, open new possibilities: large language models can autonomously read and interpret sensor streams, trigger control actions, and store the resulting data for continual retraining. Such closed-loop, self-improving systems could make AI-driven environmental management a practical reality in commercial farms.

4. Discussion

This study conducted a literature analysis of AI applications in sustainable chicken farming, uncovering global research trends, key technologies, and contributors in the field. Using the Web of Science database and filtering relevant papers through keyword selection, the analysis identified 254 research papers, providing a comprehensive view of AI’s role and development in sustainable chicken farming. This discussion synthesizes the findings around three guiding questions:

4.1. What Role AI Currently Plays in Sustainable Chicken Farming

The analysis shows that interest in applying AI to chicken farming has surged since 2017, with annual publications rising from 5 to 56 by 2024 and annual citations from 28 to 905 over the same period. Most studies leverage deep learning models for predictive tasks on farm data. The overarching goal of these AI applications is to automate farm management decisions, thereby reducing manual labor requirements and minimizing human error while improving decision precision and animal welfare.
Current AI applications yield benefits across animal welfare, economic efficiency, and environmental management. For animal welfare, researchers are using AI for tasks such as disease surveillance, early mortality detection, and stress monitoring, enabling faster interventions that improve flock health and product quality. On the economic front, data-driven tools (such as automated weight tracking, early sex identification, and real-time quality checks) help producers minimize feed waste, save labor, and enhance yield and product uniformity, thereby boosting profitability. In terms of environmental management, although relatively few studies currently use AI for climate control or waste handling, emerging sensor networks and intelligent control algorithms could soon maintain optimal housing conditions automatically, reducing resource consumption and environmental impact. Together, these outcomes support progress toward more sustainable production.

4.2. What Challenges Are Encountered in Practical Deployment

Although each dimension offers valuable insights, several cross-cutting barriers still prevent AI tools from achieving large-scale commercial deployment and delivering tangible sustainability gains in chicken farming:

4.2.1. Lack of Standardized Model-Optimization Pipelines

Many studies treat incremental network tweaks (such as swapping model backbones or inserting attention blocks) as novel contributions, yet there is no agreed recipe for selecting or tuning these enhancements. Differences in house layout, sensor specifications, and mounting positions mean that the same architecture can perform very differently from one farm to the next, forcing researchers to repeat large numbers of experiments. Establishing industry standards for house dimensions, camera resolution, and mounting angles would harmonize data inputs, promote model sharing, and raise generalizability. Until a widely accepted public dataset is available, chasing marginal accuracy gains is of limited value. Sensor hardware adds another layer of variability; precision, sampling rate, and failure modes differ across projects, making cross-study benchmarking difficult. What is needed is an open, large-scale dataset (covering environmental, behavioral, and health modalities) that can serve as a common test bed for algorithm validation. Once that foundation is in place, accuracy gains translate directly into wider deployment, fewer redundant on-farm trials, and faster diffusion of best practice, cutting research waste and resource use and thereby improving sustainability.
Furthermore, this review does not aggregate model-level accuracy or economic effect sizes across studies. Reported metrics are highly heterogeneous in tasks, datasets, label definitions, thresholds, and evaluation protocols, and many papers claim accuracies above 90% under lab-specific conditions while real-world performance remains uncertain. To avoid drawing misleading comparisons, we refrain from pooling “accuracy gains” and instead underscore the need for a widely accepted, public benchmark spanning image, audio, IoT, and production records with fixed splits, cross-farm validation, and standardized reporting of inference latency, energy use, maintenance cost, and downstream impacts on feed efficiency and air quality. Establishing such a benchmark would make results comparable, clarify external validity, and convert methodological improvements into deployable gains.

4.2.2. Long-Term, Real-Time Use of Multi-Modal Sensor Networks

Most existing AI algorithms assume only a few video or audio streams, yet commercial facilities may host dozens. Limited edge compute and bandwidth force systems to down-sample, while gas sensors drift and wearables exhaust their batteries. Future work must deliver lightweight inference pipelines, zero-drift gas probes and energy-harvesting wearables. Achieving multi-year, maintenance-free sensing will give farms an uninterrupted view of welfare and emissions, allowing ventilation, feeding and lighting to adjust exactly when needed, which in turn cuts energy demand, feed waste and medication use.
Most published studies are still experimental, either conducted in pilot barns or as short on-farm trials without longitudinal evidence of sustained use. As the technology stack continues to mature, what is needed are multi-month to multi-season deployments in commercial houses with continuous logging of uptime, failure rates, recalibration intervals, battery life, bandwidth, and energy budgets, and with clear records of control actions taken. Such longitudinal evidence will demonstrate real-world maturity, quantify seasonality and maintenance burden, and show whether welfare, emissions, feed use, and medication actually improve over time. Studies that deliver durable, long-term operation in production settings will be of the greatest value for adoption.

4.2.3. Disconnect Between Detection and Control

Most studies halt at status recognition or risk scoring, so a human still has to convert alerts into concrete actions such as tweaking ventilation, revising feed schedules, or dispatching a robot. A recent proof-of-concept by Leite et al. [150] showed that a retrieval-augmented language model can translate sensor summaries directly into executable control instructions, demonstrating that deeper automation is within reach. Looking ahead, edge-based expert systems could combine causal reasoning with control logic and issue real-time commands to programmable-logic controllers or robot operating systems. With online learning to close the perception–cognition–action loop, chicken farms could become largely self-optimizing, continuously fine-tuning indoor climate, feeding, and sanitation while further reducing labor input, production losses, and environmental impacts.
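To make the idea concrete, the following sketch shows one possible shape of such a loop: a sensor summary is embedded in a prompt, a language model (reached through a hypothetical query_llm() helper that returns a canned reply here) produces a structured action, and a whitelist guardrail forwards it to a controller stub. This is a hedged illustration, not the architecture of the cited proof-of-concept.

# Hedged sketch: turning a sensor summary into an executable control action via a language model.
# query_llm() and apply_action() are hypothetical stand-ins for a real model endpoint and a real
# PLC/ROS interface; the JSON schema and whitelist are illustrative assumptions.
import json

def query_llm(prompt: str) -> str:
    # Placeholder for a retrieval-augmented language model call; returns a canned reply here.
    return '{"device": "tunnel_fan", "setting": 0.9, "reason": "NH3 and panting above limits"}'

def apply_action(action: dict) -> None:
    # Placeholder for forwarding a validated command to a PLC or robot controller.
    print("executing:", action)

sensor_summary = {
    "nh3_ppm": 23.4,
    "temperature_c": 31.2,
    "relative_humidity_pct": 74,
    "panting_birds_detected": 18,
}

prompt = (
    "You control a broiler house. Given the sensor summary below, respond ONLY with "
    'JSON of the form {"device": str, "setting": float, "reason": str}.\n'
    f"Sensor summary: {json.dumps(sensor_summary)}"
)

action = json.loads(query_llm(prompt))

# Guardrail: only accept actions on a whitelist of known, safe-to-actuate devices.
if action.get("device") in {"tunnel_fan", "cooling_pad", "feeder_line"}:
    apply_action(action)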

4.2.4. Economic Considerations for AI Adoption

In practice, AI remains a capital-intensive technology with delayed payoffs. Most studies argue that long-term gains will accrue through reduced labor, automated feed control, and improved climate management, yet the upfront costs of cameras, sensors, networking, and computers are substantial. Retrofitting legacy houses often requires structural adjustments, reliable power and networking, and ongoing maintenance and basic technical staffing, which further raise operating costs. Many of the promised returns are still insufficiently evidenced in longitudinal, commercial settings, since the current wave of AI chicken farm research only began to scale after 2016. These factors dampen farmers’ willingness to make large initial investments. Data ownership and access also remain unsettled: researchers and vendors seek broad data rights to improve models, while farmers treat operational data as proprietary assets, which constrains data sharing and slows technical refinement. Geographically, the literature is concentrated in a few high-income countries. Whether these solutions translate to resource-constrained regions is unclear, especially where poultry production is largely cage-based, budgets are tight, and skilled technical support is scarce. The current evidence base therefore risks a techno-optimistic bias that underestimates cost, capacity, and context barriers outside early-adopting markets.

4.3. What Future Directions Are Emerging

Future progress will hinge on turning sensing and prediction into reliable, low-burden control that improves outcomes in real flocks and real houses. Research should move from short pilots to longitudinal deployments that integrate video, audio, IoT sensors, and production records within a single perception–cognition–action loop. Continuous data streams can trigger targeted interventions, log post-intervention outcomes, and update policies through online learning, creating systems that improve with use while remaining interpretable and animal friendly (Figure 6).
For welfare, the emphasis will shift toward earlier, less invasive interventions aligned with the 3R (Replacement, Reduction and Refinement) principles. Models that detect subtle precursors of disease and stress, or early deviations in activity and social behavior, can reduce frequent manual handling.
For the economy of adoption, the agenda should prioritize visible return on investment through cost-aware designs. Key directions include low-cost edge deployment, modular retrofits that work in legacy cage systems, and staged cost–benefit pilots that report labor saved, feed conserved, energy reduced, and downtime avoided. Clear data-governance agreements and lightweight service models will help address ownership concerns and reduce the staffing burden. Extending evidence beyond a few high-income regions to resource-constrained settings is essential to avoid optimism bias and to ensure solutions are robust to local constraints.
Technically and environmentally, the field should broaden beyond vision to incorporate audio analytics, gas sensing, microclimate telemetry, and power-aware scheduling. Closed-loop ventilation and heating informed by ammonia and airflow sensing, coupled with predictive maintenance of fans and inlets, can improve indoor comfort while reducing emissions and energy use. Community benchmarks with fixed splits and cross-farm protocols, open standards for sensors and mounting, and multi-season evaluations will make results comparable and accelerate translation from promising prototypes to commercially durable, self-optimizing systems.

4.4. Study Limitations

The corpus is heavily skewed toward vision-based datasets and models, while audio sensing and broader multimodal fusion are comparatively underexplored. This imbalance is most visible in environmental applications, where only 14 studies were identified, despite the central importance of emissions, air quality, and energy use to sector-level sustainability. Greater emphasis on audio, gas sensing, microclimate telemetry, and integrated multimodal pipelines is needed to capture phenomena that cameras alone cannot represent and to support actionable environmental control.
This review relied on the Web of Science Core Collection as the primary source because it indexes most mainstream journals in this field and shows substantial overlap with Scopus and PubMed for AI-in-chicken-farming applications. Nonetheless, using a single database can lead to omissions. A scoped query of IEEE Xplore retrieved only 11 eligible items, largely conference papers whose technical directions were already represented in the included journal literature; given this limited yield and the journal focus of the present review, IEEE Xplore was not used as a core source. Other databases such as CAB Abstracts and Google Scholar were not employed due to retrieval complexity and weaker filtering for de-duplication and relevance at scale. Finally, conference proceedings were excluded from formal synthesis because of space-limited reporting and heterogeneous review standards, even though some high-quality contributions exist. Future work should broaden coverage across multiple databases, incorporate conference literature where appropriate, implement cross-database de-duplication, and conduct sensitivity analyses to assess how database choice affects conclusions.
Although every effort was made to follow a transparent search protocol, topic selection and classification inevitably contain some subjectivity. All metadata and inclusion criteria are provided in the Data Availability Statement so that other researchers can replicate the workflow, refine the categories, or extend the corpus for further study.

5. Conclusions

This literature review shows how AI improves sustainable chicken farming across welfare, economic, and environmental aspects: earlier and less invasive welfare surveillance, data-driven management that raises efficiency and product quality, and environmental monitoring that supports targeted ventilation and lower emissions.
The analysis also highlights current gaps that limit large-scale adoption: the absence of shared standards and public benchmarks, limited longitudinal evidence for multi-modal sensing at farm scale, the gap between detection and closed-loop control, high upfront costs with uncertain long-term economic returns and unresolved data-ownership issues, and a vision-centric research bias that underrepresents environmental applications.
Looking ahead, priorities are low-cost and robust solutions that make benefits visible, unified perception–cognition–action loops with online learning, broader use of audio/IoT modalities, and cross-farm, multi-season evaluations under open standards to translate promising prototypes into durable commercial practice.

Author Contributions

Conceptualization, Z.W., S.W.; methodology, Z.W., S.W.; validation, Z.W.; formal analysis, Z.W.; investigation, Z.W.; resources, T.N.; data curation, Z.W.; writing—original draft preparation, Z.W.; writing—review and editing, T.N., S.W. and D.L.; visualization, Z.W.; supervision, T.N.; project administration, T.N.; funding acquisition, T.N. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable for studies not involving humans or animals.

Data Availability Statement

The data including 254 papers retrieved from Web of Science and the corresponding statistical analysis tables are available at the following link: https://osf.io/jgnbq/?view_only=7e2865edba224e97ab200c2337bf2b81 (accessed on 25 September 2025).

Acknowledgments

Thanks to the Guangzhou Elite Project for its joint funding support for the doctoral training program (JY2024017). During the preparation of this work the authors used ChatGPT® (version 5) in order to improve readability and language. After using this tool, the authors reviewed and edited the content as needed and took full responsibility for the content of the publication.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
AI: Artificial intelligence
IoT: Internet of Things
PLF: Precision Livestock Farming
CNN: Convolutional Neural Network
SVM: Support Vector Machine
GAN: Generative Adversarial Network
LSTM: Long Short-Term Memory
RFID: Radio-Frequency Identification
IMU: Inertial Measurement Unit
NH3: Ammonia

References

  1. Abbas, A.O.; Nassar, F.S.; Al Ali, A.M. Challenges of Ensuring Sustainable Poultry Meat Production and Economic Resilience under Climate Change for Achieving Sustainable Food Security. Res. World Agric. Econ. 2025, 6, 159–171. [Google Scholar] [CrossRef]
  2. Usturoi, M.G.; Rațu, R.N.; Crivei, I.C.; Veleșcu, I.D.; Usturoi, A.; Stoica, F.; Radu Rusu, R.M. Unlocking the Power of Eggs: Nutritional Insights, Bioactive Compounds, and the Advantages of Omega-3 and Omega-6 Enriched Varieties. Agriculture 2025, 15, 242. [Google Scholar] [CrossRef]
  3. Reynolds, S. Breaking the Expected: Rethinking the Impact of Bird Losses on Egg Prices. 2025. Available online: https://opus.govst.edu/research_day/2025/thurs/9/ (accessed on 25 September 2025).
  4. Kagaya, S.; Widmar, N.O.; Kilders, V. The price of attention: An analysis of the intersection of media coverage and public sentiments about eggs and egg prices. Poult. Sci. 2025, 104, 104482. [Google Scholar] [CrossRef]
  5. Brundtland, G.H. Our Common Future: World Commission on Environment and Development. Available online: https://sustainabledevelopment.un.org/content/documents/5987our-common-future.pdf (accessed on 25 September 2025).
  6. FAO. Production|Gateway to Poultry Production and Products|Food and Agriculture Organization of the United Nations. Available online: https://www.fao.org/poultry-production-products/production/en/ (accessed on 25 September 2025).
  7. Vaarst, M.; Steenfeldt, S.; Horsted, K. Sustainable development perspectives of poultry production. World’s Poult. Sci. J. 2015, 71, 609–620. [Google Scholar] [CrossRef]
  8. Bist, R.B.; Bist, K.; Poudel, S.; Subedi, D.; Yang, X.; Paneru, B.; Mani, S.; Wang, D.; Chai, L. Sustainable poultry farming practices: A critical review of current strategies and future prospects. Poult. Sci. 2024, 103, 104295. [Google Scholar] [CrossRef]
  9. Berckmans, D. General introduction to precision livestock farming. Anim. Front. 2017, 7, 6–11. [Google Scholar] [CrossRef]
  10. Werner, A.; Jarfe, A. (Eds.) Programme Book of the Joint Conference of ECPA-ECPLF; Wageningen Academic: Wageningen, The Netherlands, 2003; p. 846. Available online: https://www.cabidigitallibrary.org/doi/full/10.5555/20033115161 (accessed on 25 September 2025).
  11. Norton, T.; Berckmans, D. Engineering advances in precision livestock farming. Biosyst. Eng. 2018, 173, 1–3. [Google Scholar] [CrossRef]
  12. Roy, A.; Rana, T. Precision Livestock Farming and Its Advantage to the Environment. Epidemiol. Environ. Hyg. Vet. Public Health 2025, 6, 343–347. [Google Scholar] [CrossRef]
  13. Bretas, I.L.; Dubeux, J.C., Jr.; Cruz, P.J.; Oduor, K.T.; Queiroz, L.D.; Valente, D.S.; Chizzotti, F.H. Precision livestock farming applied to grazingland monitoring and management—A review. Agron. J. 2024, 116, 1164–1186. [Google Scholar] [CrossRef]
  14. Hsieh, C.W.; Hsu, S.Y.; Chao, C.H.; Lee, M.C.; Chen, C.Y.; Wu, C.H. Implementing an intelligent chicken aviary using deep learning techniques. Multimed. Tools Appl. 2025, 84, 1–30. [Google Scholar] [CrossRef]
  15. Van Hertem, T.; Rooijakkers, L.; Berckmans, D.; Fernández, A.P.; Norton, T.; Vranken, E. Appropriate data visualisation is key to Precision Livestock Farming acceptance. Comput. Electron. Agric. 2017, 138, 1–10. [Google Scholar] [CrossRef]
  16. Kaswan, S.; Chandratre, G.A.; Upadhyay, D.; Sharma, A.; Sreekala, S.M.; Badgujar, P.C.; Ruchay, A. Applications of sensors in livestock management. In Engineering Applications in Livestock Production; Academic Press: Cambridge, MA, USA, 2024; pp. 63–92. [Google Scholar] [CrossRef]
  17. Brassó, L.D.; Komlósi, I.; Várszegi, Z. Modern technologies for improving broiler production and welfare: A Review. Animals 2025, 15, 493. [Google Scholar] [CrossRef] [PubMed]
  18. Rowe, E.; Dawkins, M.S.; Gebhardt-Henrich, S.G. A systematic review of precision livestock farming in the poultry sector: Is technology focussed on improving bird welfare? Animals 2019, 9, 614. [Google Scholar] [CrossRef] [PubMed]
  19. Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E.; et al. The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. bmj 2021, 372, 71. [Google Scholar] [CrossRef] [PubMed]
  20. Coulibaly, S.; Kamsu-Foguem, B.; Kamissoko, D.; Traore, D. Deep learning for precision agriculture: A bibliometric analysis. Intell. Syst. Appl. 2022, 16, 200102. [Google Scholar] [CrossRef]
  21. Zhu, J.; Liu, W. A tale of two databases: The use of Web of Science and Scopus in academic papers. Scientometrics 2020, 123, 321–335. [Google Scholar] [CrossRef]
  22. Ali, W.; Din, I.U.; Almogren, A.; Rodrigues, J.J. Poultry Health Monitoring With Advanced Imaging: Toward Next-Generation Agricultural Applications in Consumer Electronics. IEEE Trans. Consum. Electron. 2024, 70, 7147–7154. [Google Scholar] [CrossRef]
  23. Bist, R.B.; Yang, X.; Subedi, S.; Bist, K.; Paneru, B.; Li, G.; Chai, L. An automatic method for scoring poultry footpad dermatitis with deep learning and thermal imaging. Comput. Electron. Agric. 2024, 226, 109481. [Google Scholar] [CrossRef]
  24. Dayan, J.; Goldman, N.; Halevy, O.; Uni, Z. Research Note: Prospects for early detection of breast muscle myopathies by automated image analysis. Poult. Sci. 2024, 103, 103680. [Google Scholar] [CrossRef]
  25. Elmessery, W.M.; Gutiérrez, J.; El-Wahhab, G.G.A.; Elkhaiat, I.A.; El-Soaly, I.S.; Alhag, S.K.; Al-Shuraym, L.A.; Akela, M.A.; Moghanm, F.S.; Abdelshafie, M.F. YOLO-based model for automatic detection of broiler pathological phenomena through visual and thermal images in intensive poultry houses. Agriculture 2023, 13, 1527. [Google Scholar] [CrossRef]
  26. Tong, Q.; Zhang, E.; Wu, S.; Xu, K.; Sun, C. A real-time detector of chicken healthy status based on modified YOLO. Signal Image Video Process. 2023, 17, 4199–4207. [Google Scholar] [CrossRef]
  27. Qin, W.; Yang, X.; Liu, C.; Zheng, W. A deep learning method based on YOLOv5 and SuperPoint-SuperGlue for digestive disease warning and cage location backtracking in stacked cage laying hen systems. Comput. Electron. Agric. 2024, 222, 108999. [Google Scholar] [CrossRef]
  28. Pakuła, A.; Paśko, S.; Marć, P.; Kursa, O.; Jaroszewicz, L.R. AI Classification of Eggs’ Origin from Mycoplasma synoviae-Infected or Non-Infected Poultry via Analysis of the Spectral Response. Appl. Sci. 2023, 13, 12360. [Google Scholar] [CrossRef]
  29. Nakrosis, A.; Paulauskaite-Taraseviciene, A.; Raudonis, V.; Narusis, I.; Gruzauskas, V.; Gruzauskas, R.; Lagzdinyte-Budnike, I. Towards early poultry health prediction through non-invasive and computer vision-based dropping classification. Animals 2023, 13, 3041. [Google Scholar] [CrossRef] [PubMed]
  30. Yang, X.; Zhang, J.; Paneru, B.; Lin, J.; Bist, R.B.; Lu, G.; Chai, L. Precision Monitoring of Dead Chickens and Floor Eggs with a Robotic Machine Vision Method. AgriEngineering 2025, 7, 35. [Google Scholar] [CrossRef]
  31. Khanal, R.; Wu, W.; Lee, J. Automated Dead Chicken Detection in Poultry Farms Using Knowledge Distillation and Vision Transformers. Appl. Sci. 2024, 15, 136. [Google Scholar] [CrossRef]
  32. Ma, W.; Wang, X.; Yang, S.X.; Xue, X.; Li, M.; Wang, R.; Yu, L.; Song, L.; Li, Q. Autonomous inspection robot for dead laying hens in caged layer house. Comput. Electron. Agric. 2024, 227, 109595. [Google Scholar] [CrossRef]
  33. Luo, S.; Ma, Y.; Jiang, F.; Wang, H.; Tong, Q.; Wang, L. Dead laying hens detection using TIR-NIR-depth images and deep learning on a commercial farm. Animals 2023, 13, 1861. [Google Scholar] [CrossRef]
  34. Fodor, I.; van der Sluis, M.; Jacobs, M.; de Klerk, B.; Bouwman, A.C.; Ellen, E.D. Automated pose estimation reveals walking characteristics associated with lameness in broilers. Poult. Sci. 2023, 102, 102787. [Google Scholar] [CrossRef]
  35. Nasiri, A.; Yoder, J.; Zhao, Y.; Hawkins, S.; Prado, M.; Gan, H. Pose estimation-based lameness recognition in broiler using CNN-LSTM network. Comput. Electron. Agric. 2022, 197, 106931. [Google Scholar] [CrossRef]
  36. Sun, Z.; Zhang, M.; Liu, J.; Wang, J.; Wu, Q.; Wang, G. Research on white feather broiler health monitoring method based on sound detection and transfer learning. Comput. Electron. Agric. 2023, 214, 108319. [Google Scholar] [CrossRef]
  37. Cuan, K.; Zhang, T.; Li, Z.; Huang, J.; Ding, Y.; Fang, C. Automatic Newcastle disease detection using sound technology and deep learning method. Comput. Electron. Agric. 2022, 194, 106740. [Google Scholar] [CrossRef]
  38. Welch, M.; Sibanda, T.Z.; De Souza Vilela, J.; Kolakshyapati, M.; Schneider, D.; Ruhnke, I. An initial study on the use of machine learning and radio frequency identification data for predicting health outcomes in free-range laying hens. Animals 2023, 13, 1202. [Google Scholar] [CrossRef] [PubMed]
  39. Mei, W.; Yang, X.; Zhao, Y.; Wang, X.; Dai, X.; Wang, K. Identification of aflatoxin-poisoned broilers based on accelerometer and machine learning. Biosyst. Eng. 2023, 227, 107–116. [Google Scholar] [CrossRef]
  40. Ahmed, G.; Malick, R.A.S.; Akhunzada, A.; Zahid, S.; Sagri, M.R.; Gani, A. An approach towards IoT-based predictive service for early detection of diseases in poultry chickens. Sustainability 2021, 13, 13396. [Google Scholar] [CrossRef]
  41. Ram Das, A.; Pillai, N.; Nanduri, B.; Rothrock, M.J., Jr.; Ramkumar, M. Exploring pathogen presence prediction in pastured poultry farms through transformer-based models and attention mechanism explainability. Microorganisms 2024, 12, 1274. [Google Scholar] [CrossRef]
  42. Jeon, K.M.; Jung, J.; Lee, C.M.; Yoo, D.S. Identification of Pre-Emptive Biosecurity Zone Areas for Highly Pathogenic Avian Influenza Based on Machine Learning-Driven Risk Analysis. Animals 2023, 13, 3728. [Google Scholar] [CrossRef]
  43. Liu, Y.; Zhuang, Y.; Yu, L.; Li, Q.; Zhao, C.; Meng, R.; Zhu, J.; Guo, X. A Machine learning framework based on extreme gradient boosting to predict the occurrence and development of infectious diseases in laying hen farms, taking H9N2 as an example. Animals 2023, 13, 1494. [Google Scholar] [CrossRef]
  44. Pillai, N.; Ayoola, M.B.; Nanduri, B.; Rothrock, M.J., Jr.; Ramkumar, M. An ensemble learning approach to identify pastured poultry farm practice variables and soil constituents that promote Salmonella prevalence. Heliyon 2022, 8. [Google Scholar] [CrossRef]
  45. Merenda, V.R.; Bodempudi, V.U.; Pairis-Garcia, M.D.; Li, G. Development and validation of machine-learning models for monitoring individual behaviors in group-housed broiler chickens. Poult. Sci. 2024, 103, 104374. [Google Scholar] [CrossRef]
  46. Paneru, B.; Bist, R.; Yang, X.; Chai, L. Tracking dustbathing behavior of cage-free laying hens with machine vision technologies. Poult. Sci. 2024, 103, 104289. [Google Scholar] [CrossRef]
  47. Yang, X.; Dai, H.; Wu, Z.; Bist, R.B.; Subedi, S.; Sun, J.; Lu, G.; Li, C.; Liu, T.; Chai, L. An innovative segment anything model for precision poultry monitoring. Comput. Electron. Agric. 2024, 222, 109045. [Google Scholar] [CrossRef]
  48. Teterja, D.; Garcia-Rodriguez, J.; Azorin-Lopez, J.; Sebastian-Gonzalez, E.; Nedić, D.; Leković, D.; Knežević, P.; Drajić, D.; Vukobratović, D. A Video Mosaicing-Based Sensing Method for Chicken Behavior Recognition on Edge Computing Devices. Sensors 2024, 24, 3409. [Google Scholar] [CrossRef] [PubMed]
  49. Jensen, D.B.; Toscano, M.; van der Heide, E.; Grønvig, M.; Hakansson, F. Comparison of strategies for automatic video-based detection of piling behaviour in laying hens. Smart Agric. Technol. 2025, 10, 100745. [Google Scholar] [CrossRef]
  50. Bist, R.B.; Subedi, S.; Yang, X.; Chai, L. A novel YOLOv6 object detector for monitoring piling behavior of cage-free laying hens. AgriEngineering 2023, 5, 905–923. [Google Scholar] [CrossRef]
  51. Subedi, S.; Bist, R.; Yang, X.; Chai, L. Tracking pecking behaviors and damages of cage-free laying hens with machine vision technologies. Comput. Electron. Agric. 2023, 204, 107545. [Google Scholar] [CrossRef]
  52. Fujinami, K.; Takuno, R.; Sato, I.; Shimmura, T. Evaluating behavior recognition pipeline of laying hens using wearable inertial sensors. Sensors 2023, 23, 5077. [Google Scholar] [CrossRef]
  53. Shahbazi, M.; Mohammadi, K.; Derakhshani, S.M.; Groot Koerkamp, P.W. Deep learning for laying hen activity recognition using wearable sensors. Agriculture 2023, 13, 738. [Google Scholar] [CrossRef]
  54. Oso, O.M.; Mejia-Abaunza, N.; Bodempudi, V.U.C.; Chen, X.; Chen, C.; Aggrey, S.E.; Li, G. Automatic analysis of high, medium, and low activities of broilers with heat stress operations via image processing and machine learning. Poult. Sci. 2025, 104, 104954. [Google Scholar] [CrossRef]
  55. Yeh, Y.H.; Chen, B.L.; Hsieh, K.Y.; Huang, M.H.; Kuo, Y.F. Designing an Autonomous Robot for Monitoring Open-Mouth Behavior of Chickens in Commercial Chicken Farms. J. ASABE 2025, 68, 25–36. [Google Scholar] [CrossRef]
  56. Yan, Y.; Sheng, Z.; Gu, Y.; Heng, Y.; Zhou, H.; Wang, S. Research note: A method for recognizing and evaluating typical behaviors of laying hens in a thermal environment. Poult. Sci. 2024, 103, 104122. [Google Scholar] [CrossRef] [PubMed]
  57. Bai, Y.; Zhang, J.; Chen, Y.; Yao, H.; Xin, C.; Wang, S.; Yu, J.; Chen, C.; Xiao, M.; Zou, X. Research into Heat Stress Behavior Recognition and Evaluation Index for Yellow-Feathered Broilers, Based on Improved Cascade Region-Based Convolutional Neural Network. Agriculture 2023, 13, 1114. [Google Scholar] [CrossRef]
  58. de Carvalho Soster, P.; Grzywalski, T.; Hou, Y.; Thomas, P.; Dedeurwaerder, A.; De Gussem, M.; Tuyttens, F.; Devos, P.; Botteldooren, D.; Antonissen, G. Automated detection of broiler vocalizations a machine learning approach for broiler chicken vocalization monitoring. Poult. Sci. 2025, 104, 104962. [Google Scholar] [CrossRef] [PubMed]
  59. Srinivasagan, R.; El Sayed, M.S.; Al-Rasheed, M.I.; Alzahrani, A.S. Edge intelligence for poultry welfare: Utilizing tiny machine learning neural network processors for vocalization analysis. PLoS ONE 2025, 20, e0316920. [Google Scholar] [CrossRef]
  60. Lev-Ron, T.; Yitzhaky, Y.; Halachmi, I.; Druyan, S. Classifying vocal responses of broilers to environmental stressors via artificial neural network. Animal 2025, 19, 101378. [Google Scholar] [CrossRef]
  61. Lv, M.; Sun, Z.; Zhang, M.; Geng, R.; Gao, M.; Wang, G. Sound recognition method for white feather broilers based on spectrogram features and the fusion classification model. Measurement 2023, 222, 113696. [Google Scholar] [CrossRef]
  62. Massari, J.M.; Moura, D.J.D.; Nääs, I.D.A.; Pereira, D.F.; Oliveira, S.R.D.M.; Branco, T.; Barros, J.D.S.G. Sequential Behavior of Broiler Chickens in Enriched Environments under Varying Thermal Conditions Using the Generalized Sequential Pattern Algorithm: A Proof of Concept. Animals 2024, 14, 2010. [Google Scholar] [CrossRef]
  63. Solis, I.L.; de Oliveira-Boreli, F.P.; de Sousa, R.V.; Martello, L.S.; Pereira, D.F. Using Thermal Signature to Evaluate Heat Stress Levels in Laying Hens with a Machine-Learning-Based Classifier. Animals 2024, 14, 1996. [Google Scholar] [CrossRef]
  64. De-Sousa, K.T.; Deniz, M.; Santos, M.P.D.; Klein, D.R.; Vale, M.M.D. Decision support system to classify the vulnerability of broiler production system to heat stress based on fuzzy logic. Int. J. Biometeorol. 2023, 67, 475–484. [Google Scholar] [CrossRef]
  65. Zhang, X.; Xu, T.; Zhang, Y.; Gao, Y.; Pan, J.; Rao, X. FCS-Net: Feather condition scoring of broilers based on dense feature fusion of RGB and thermal infrared images. Biosyst. Eng. 2024, 247, 132–142. [Google Scholar] [CrossRef]
  66. Lamping, C.; Derks, M.; Koerkamp, P.G.; Kootstra, G. ChickenNet-an end-to-end approach for plumage condition assessment of laying hens in commercial farms using computer vision. Comput. Electron. Agric. 2022, 194, 106695. [Google Scholar] [CrossRef]
  67. Lins, A.C.D.S.; Lourençoni, D.; Yanagi, T.; Miranda, I.B.; Santos, I.E.D.A. Neuro-fuzzy modeling of eyeball and crest temperatures in egg-laying hens. Eng. Agrícola 2021, 41, 34–38. [Google Scholar] [CrossRef]
  68. Ansarimovahed, A.; Banakar, A.; Li, G.; Javidan, S.M. Separating Chickens’ Heads and Legs in Thermal Images via Object Detection and Machine Learning Models to Predict Avian Influenza and Newcastle Disease. Animals 2025, 15, 1114. [Google Scholar] [CrossRef] [PubMed]
  69. Shetty, S.; Shetty, M. Poultry Disease Detection: A Comparative Analysis of CNN, SVM, and YOLO v3 Algorithms for Accurate Diagnosis. Gener. Artif. Intell. Concepts Appl. 2025, 155–171. [Google Scholar] [CrossRef]
  70. Pal, S.; Ghosh, A.; Sarkar, S.K. Development of Efficient Algorithm for Detection and Tracking of Infected Chicken at an Early Stage of Bird Flu with a Suitable Surveillance System Using RFID Technology. Power Devices Internet Things Intell. Syst. Des. 2025, 303–325. [Google Scholar] [CrossRef]
  71. Khan, I.; Peralta, D.; Fontaine, J.; de Carvalho, P.S.; Martinez-Caja, A.M.; Antonissen, G.; Tuyttens, F.; De Poorter, E. Monitoring Welfare of Individual Broiler Chickens Using Ultra-Wideband and Inertial Measurement Unit Wearables. Sensors 2025, 25, 811. [Google Scholar] [CrossRef]
  72. Hui, X.; Zhang, D.; Jin, W.; Ma, Y.; Li, G. Fine-tuning faster region-based convolution neural networks for detecting poultry feeding behaviors. Int. J. Agric. Biol. Eng. 2025, 18, 64–73. [Google Scholar] [CrossRef]
  73. Smith, B.; Long, Y.; Morris, D. An Automated LED Intervention System for Poultry Piling. In 2025 ASABE Annual International Meeting; American Society of Agricultural and Biological Engineers: St. Joseph, MI, USA, 2025; p. 1. [Google Scholar] [CrossRef]
  74. Xue, H.; Ma, J.; Yang, Y.; Qu, H.; Wang, L.; Li, L. Aggressive behavior recognition and welfare monitoring in yellow-feathered broilers using FCTR and wearable identity tags. Comput. Electron. Agric. 2025, 235, 110284. [Google Scholar] [CrossRef]
  75. Karatsiolis, S.; Panagi, P.; Vassiliades, V.; Kamilaris, A.; Nicolaou, N.; Stavrakis, E. Towards understanding animal welfare by observing collective flock behaviors via AI-powered Analytics. In Proceedings of the 2024 19th Conference on Computer Science and Intelligence Systems (FedCSIS), Belgrade, Serbia, 8–11 September 2024; IEEE: New York, NY, USA, 2024; pp. 643–648. [Google Scholar] [CrossRef]
  76. Neethirajan, S. Decoding Vocal Indicators of Stress in Laying Hens: A CNN-MFCC Deep Learning Framework. Smart Agric. Technol. 2025, 11, 101056. [Google Scholar] [CrossRef]
  77. van Veen, L.A.; van den Brand, H.; van den Oever, A.C.; Kemp, B.; Youssef, A. An adaptive expert-in-the-loop algorithm for flock-specific anomaly detection in laying hen production. Comput. Electron. Agric. 2025, 229, 109755. [Google Scholar] [CrossRef]
  78. Niu, J.; Li, T.; Qi, K.; Liu, Y.; Deng, H.; Hu, Y.; Xu, D.; Wu, L.; Amevor, F.K.; Wang, Y.; et al. Research Note: Application of Convolutional Neural Networks for Feather Classification in Chickens. Poult. Sci. 2025, 104, 105254. [Google Scholar] [CrossRef]
  79. Zhou, H.; Zhu, Q.; Norton, T. Cough sound recognition in poultry using portable microphones for precision medication guidance. Comput. Electron. Agric. 2025, 237, 110541. [Google Scholar] [CrossRef]
  80. Zheng, H.; Ma, C.; Liu, D.; Huang, J.; Chen, R.; Fang, C.; Yang, J.; Berckmans, D.; Norton, T.; Zhang, T. Weight prediction method for individual live chickens based on single-view point cloud information. Comput. Electron. Agric. 2025, 234, 110232. [Google Scholar] [CrossRef]
  81. Campbell, M.; Miller, P.; Díaz-Chito, K.; Irvine, S.; Baxter, M.; Del Rincón, J.M.; Hong, X.; McLaughlin, N.; Arumugam, T.; O’Connell, N. Automated precision weighing: Leveraging 2D video feature analysis and machine learning for live body weight estimation of broiler chickens. Smart Agric. Technol. 2025, 10, 100793. [Google Scholar] [CrossRef]
  82. Oh, Y.; Lyu, P.; Ko, S.; Min, J.; Song, J. Enhancing Broiler Weight Estimation through Gaussian Kernel Density Estimation Modeling. Agriculture 2024, 14, 809. [Google Scholar] [CrossRef]
  83. Li, X.; Wu, J.; Zhao, Z.; Zhuang, Y.; Sun, S.; Xie, H.; Gao, Y.; Xiao, D. An improved method for broiler weight estimation integrating multi-feature with gradient boosting decision tree. Animals 2023, 13, 3721. [Google Scholar] [CrossRef]
  84. Wu, Z.; Yang, J.; Zhang, H.; Fang, C. Enhanced Methodology and Experimental Research for Caged Chicken Counting Based on YOLOv8. Animals 2025, 15, 853. [Google Scholar] [CrossRef]
  85. Guo, Y.; Wu, Z.; Su, Z.; Zhao, J.; Li, X. PCCNet: A Point Supervised Dense Chickens Flock Counting Network. Smart Agric. Technol. 2025, 10, 100795. [Google Scholar] [CrossRef]
  86. Li, X.; Cai, M.; Tan, X.; Yin, C.; Chen, W.; Liu, Z.; Wen, J.; Han, Y. An efficient transformer network for detecting multi-scale chicken in complex free-range farming environments via improved RT-DETR. Comput. Electron. Agric. 2024, 224, 109160. [Google Scholar] [CrossRef]
  87. Liu, Y.; Zhou, H.; Ni, Z.; Jiang, Z.; Wang, X. An Accurate and Lightweight Algorithm for Caged Chickens Detection based on Deep Learning. Pak. J. Agric. Sci. 2024, 61, 403–415. [Google Scholar] [CrossRef]
  88. Subedi, S.; Bist, R.; Yang, X.; Chai, L. Tracking floor eggs with machine vision in cage-free hen houses. Poult. Sci. 2023, 102, 102637. [Google Scholar] [CrossRef]
  89. Ren, Y.; Huang, Y.; Wang, Y.; Zhang, S.; Qu, H.; Ma, J.; Wang, L.; Li, L. A high-performance day-age classification and detection model for chick based on attention encoder and convolutional neural network. Animals 2022, 12, 2425. [Google Scholar] [CrossRef] [PubMed]
  90. Amirivojdan, A.; Nasiri, A.; Zhou, S.; Zhao, Y.; Gan, H. ChickenSense: A Low-Cost Deep Learning-Based Solution for Poultry Feed Consumption Monitoring Using Sound Technology. AgriEngineering 2024, 6, 2115–2129. [Google Scholar] [CrossRef]
  91. Nasiri, A.; Amirivojdan, A.; Zhao, Y.; Gan, H. Estimating the feeding time of individual broilers via convolutional neural network and image processing. Animals 2023, 13, 2428. [Google Scholar] [CrossRef] [PubMed]
  92. Xin, C.; Li, H.; Li, Y.; Wang, M.; Lin, W.; Wang, S.; Zhang, W.; Xiao, M.; Zou, X. Research on an identification and grasping device for dead yellow-feather broilers in flat houses based on deep learning. Agriculture 2024, 14, 1614. [Google Scholar] [CrossRef]
  93. Zhang, Y.; Lai, Z.; Wang, H.; Jiang, F.; Wang, L. Autonomous navigation using machine vision and self-designed fiducial marker in a commercial chicken farming house. Comput. Electron. Agric. 2024, 224, 109179. [Google Scholar] [CrossRef]
  94. Li, G.; Chesser, G.D.; Purswell, J.L.; Magee, C.; Gates, R.S.; Xiong, Y. Design and development of a broiler mortality removal robot. Appl. Eng. Agric. 2022, 38, 853–863. [Google Scholar] [CrossRef]
  95. Pirompud, P.; Sivapirunthep, P.; Punyapornwithaya, V.; Chaosap, C. Application of machine learning algorithms to predict dead on arrival of broiler chickens raised without antibiotic program. Poult. Sci. 2024, 103, 103504. [Google Scholar] [CrossRef]
  96. You, J.; Lou, E.; Afrouziyeh, M.; Zukiwsky, N.M.; Zuidhof, M.J. A supervised machine learning method to detect anomalous real-time broiler breeder body weight data recorded by a precision feeding system. Comput. Electron. Agric. 2021, 185, 106171. [Google Scholar] [CrossRef]
  97. Johansen, S.V.; Jensen, M.R.; Chu, B.; Bendtsen, J.D.; Mogensen, J.; Rogers, E. Broiler FCR optimization using norm optimal terminal iterative learning control. IEEE Trans. Control Syst. Technol. 2019, 29, 580–592. [Google Scholar] [CrossRef]
  98. Ma, C.; Zhang, T.; Zheng, H.; Yang, J.; Chen, R.; Fang, C. Measurement method for live chicken shank length based on improved ResNet and fused multi-source information. Comput. Electron. Agric. 2024, 221, 108965. [Google Scholar] [CrossRef]
  99. Zhu, R.; Li, J.; Yang, J.; Sun, R.; Yu, K. In vivo prediction of breast muscle weight in broiler chickens using X-ray images based on deep learning and machine learning. Animals 2024, 14, 628. [Google Scholar] [CrossRef] [PubMed]
  100. Li, Z.; Zheng, J.; An, B.; Ma, X.; Ying, F.; Kong, F.; Wen, J.; Zhao, G. Several models combined with ultrasound techniques to predict breast muscle weight in broilers. Poult. Sci. 2023, 102, 102911. [Google Scholar] [CrossRef] [PubMed]
  101. Castro, S.L.D.; Silva, I.J.D.; Nazareno, A.C.; Mota, M.D.O. Computer vision for morphometric evaluation of broiler chicken bones. Eng. Agrícola 2022, 42, e20210150. [Google Scholar] [CrossRef]
  102. Ji, H.; Xu, Y.; Teng, G. Predicting egg production rate and egg weight of broiler breeders based on machine learning and Shapley additive explanations. Poult. Sci. 2025, 104, 104458. [Google Scholar] [CrossRef]
  103. Bumanis, N.; Kviesis, A.; Paura, L.; Arhipova, I.; Adjutovs, M. Hen egg production forecasting: Capabilities of machine learning models in scenarios with limited data sets. Appl. Sci. 2023, 13, 7607. [Google Scholar] [CrossRef]
  104. Qin, X.; Lai, C.; Pan, Z.; Pan, M.; Xiang, Y.; Wang, Y. Recognition of abnormal-laying hens based on fast continuous wavelet and deep learning using hyperspectral images. Sensors 2023, 23, 3645. [Google Scholar] [CrossRef]
  105. Oliveira, E.B.; Almeida, L.G.B.D.; Rocha, D.T.D.; Furian, T.Q.; Borges, K.A.; Moraes, H.L.D.S.; Nascimento, V.P.D.; Salle, C.T.P. Artificial neural networks to predict egg-production traits in commercial laying breeder hens. Braz. J. Poult. Sci. 2022, 24, eRBCA-2021. [Google Scholar] [CrossRef]
  106. Quintana-Ospina, G.A.; Alfaro-Wisaquillo, M.C.; Oviedo-Rondon, E.O.; Ruiz-Ramirez, J.R.; Bernal-Arango, L.C.; Martinez-Bernal, G.D. Effect of environmental and farm-associated factors on live performance parameters of broilers raised under commercial tropical conditions. Animals 2023, 13, 3312. [Google Scholar] [CrossRef]
  107. Akinsola, O.M.; Sonaiya, E.B.; Bamidele, O.; Hassan, W.A.; Yakubu, A.; Ajayi, F.O.; Ogundu, U.; Alabi, O.O.; Adebambo, O.A. Comparison of five mathematical models that describe growth in tropically adapted dual-purpose breeds of chicken. J. Appl. Anim. Res. 2021, 49, 158–166. [Google Scholar] [CrossRef]
  108. Peng, J.; Xiao, R.; Wu, C.; Zheng, Z.; Deng, Y.; Chen, K.; Xiang, Y.; Xu, C.; Zou, L.; Liao, M.; et al. Characterization of the prevalence of Salmonella in different retail chicken supply modes using genome-wide and machine-learning analyses. Food Res. Int. 2024, 191, 114654. [Google Scholar] [CrossRef]
  109. Spyrelli, E.D.; Ozcan, O.; Mohareb, F.; Panagou, E.Z.; Nychas, G.J.E. Spoilage assessment of chicken breast fillets by means of fourier transform infrared spectroscopy and multispectral image analysis. Curr. Res. Food Sci. 2021, 4, 121–131. [Google Scholar] [CrossRef]
  110. Sekulska-Nalewajko, J.; Gocławski, J.; Korzeniewska, E.; Kiełbasa, P.; Dróżdż, T. The verification of hen egg types by the classification of ultra-weak photon emission data. Expert Syst. Appl. 2024, 238, 122130. [Google Scholar] [CrossRef]
  111. Chen, Z.; He, P.; He, Y.; Wu, F.; Rao, X.; Pan, J.; Lin, H. Eggshell biometrics for individual egg identification based on convolutional neural networks. Poult. Sci. 2023, 102, 102540. [Google Scholar] [CrossRef] [PubMed]
  112. Bischof, G.; Januschewski, E.; Juadjur, A. Authentication of laying hen housing systems based on egg yolk using 1H NMR spectroscopy and machine learning. Foods 2024, 13, 1098. [Google Scholar] [CrossRef] [PubMed]
  113. Cheng, T.; Li, P.; Ma, J.; Tian, X.; Zhong, N. Identification of four chicken breeds by hyperspectral imaging combined with chemometrics. Processes 2022, 10, 1484. [Google Scholar] [CrossRef]
  114. Seo, D.; Cho, S.; Manjula, P.; Choi, N.; Kim, Y.-K.; Koh, Y.J.; Lee, S.H.; Kim, H.-Y.; Lee, J.H. Identification of target chicken populations by machine learning models using the minimum number of SNPs. Animals 2021, 11, 241. [Google Scholar] [CrossRef]
  115. Horkaew, P.; Kupittayanant, S.; Kupittayanant, P. Noninvasive in ovo sexing in Korat chicken by pattern recognition of its embryologic vasculature. J. Appl. Poult. Res. 2024, 33, 100424. [Google Scholar] [CrossRef]
  116. Ghaderi, M.; Mireei, S.A.; Masoumi, A.; Sedghi, M.; Nazeri, M. Fertility detection of unincubated chicken eggs by hyperspectral transmission imaging in the Vis-SWNIR region. Sci. Rep. 2024, 14, 1289. [Google Scholar] [CrossRef]
  117. Groves, I.; Holmshaw, J.; Furley, D.; Manning, E.; Chinnaiya, K.; Towers, M.; Evans, B.D.; Placzek, M.; Fletcher, A.G. Accurate staging of chick embryonic tissues via deep learning of salient features. Development 2023, 150, dev202068. [Google Scholar] [CrossRef]
  118. Jia, N.; Li, B.; Zhao, Y.; Fan, S.; Zhu, J.; Wang, H.; Zhao, W. Exploratory study of sex identification for chicken embryos based on blood vessel images and deep learning. Agriculture 2023, 13, 1480. [Google Scholar] [CrossRef]
  119. Wang, J.; Cao, R.; Wang, Q.; Ma, M. Nondestructive prediction of fertilization status and growth indicators of hatching eggs based on respiration. Comput. Electron. Agric. 2023, 208, 107779. [Google Scholar] [CrossRef]
  120. Li, Z.; Zhang, T.; Cuan, K.; Fang, C.; Zhao, H.; Guan, C.; Yang, Q.; Qu, H. Sex detection of chicks based on audio technology and deep learning methods. Animals 2022, 12, 3106. [Google Scholar] [CrossRef] [PubMed]
  121. Cuan, K.; Li, Z.; Zhang, T.; Qu, H. Gender determination of domestic chicks based on vocalization signals. Comput. Electron. Agric. 2022, 199, 107172. [Google Scholar] [CrossRef]
  122. Li, X.; Chen, X.; Wang, Q.; Yang, N.; Sun, C. Integrating bioinformatics and machine learning for genomic prediction in chickens. Genes 2024, 15, 690. [Google Scholar] [CrossRef]
  123. Bouwman, A.C.; Hulsegge, I.; Hawken, R.J.; Henshall, J.M.; Veerkamp, R.F.; Schokker, D.; Kamphuis, C. Classifying aneuploidy in genotype intensity data using deep learning. J. Anim. Breed. Genet. 2023, 140, 304–315. [Google Scholar] [CrossRef]
  124. Serva, L.; Marchesini, G.; Cullere, M.; Ricci, R.; Dalle Zotte, A. Testing two NIRs instruments to predict chicken breast meat quality and exploiting machine learning approaches to discriminate among genotypes and presence of myopathies. Food Control 2023, 144, 109391. [Google Scholar] [CrossRef]
  125. Klotz, D.F.; Ribeiro, R.; Enembreck, F.; Denardin, G.W.; Barbosa, M.A.; Casanova, D.; Teixeira, M. Estimating and tuning adaptive action plans for the control of smart interconnected poultry condominiums. Expert Syst. Appl. 2022, 187, 115876. [Google Scholar] [CrossRef]
  126. Shams, M.Y.; Elmessery, W.M.; Oraiath, A.A.T.; Elbeltagi, A.; Salem, A.; Kumar, P.; El-Messery, T.M.; El-Hafeez, T.A.; Abdelshafie, M.F.; El-Wahhab, G.G.A.; et al. Automated on-site broiler live weight estimation through YOLO-based segmentation. Smart Agric. Technol. 2025, 10, 100828. [Google Scholar] [CrossRef]
  127. Pangestu, G. Real-Time Chicken Counting System using YOLO for FCR Optimization in Small and Medium Poultry Farms. J. Dev. Res. 2025, 9, 1–9. [Google Scholar] [CrossRef]
  128. Wu, Z.; Zhang, H.; Fang, C. Research on machine vision online monitoring system for egg production and quality in cage environment. Poult. Sci. 2025, 104, 104552. [Google Scholar] [CrossRef] [PubMed]
  129. Yang, X.; Zhu, L.; Jiang, W.; Yang, Y.; Gan, M.; Shen, L.; Zhu, L. Machine Learning-Based Prediction of Feed Conversion Ratio: A Feasibility Study of Using Short-Term FCR Data for Long-Term Feed Conversion Ratio (FCR) Prediction. Animals 2025, 15, 1773. [Google Scholar] [CrossRef] [PubMed]
  130. Adejola, Y.A.; Sibanda, T.Z.; Ruhnke, I.; Boshoff, J.; Pokhrel, S.; Welch, M. Analyzing the Risk of Short-Term Losses in Free-Range Egg Production Using Commercial Data. Agriculture 2025, 15, 743. [Google Scholar] [CrossRef]
  131. Hélène, O.K.; Kuradusenge, M.; Sibomana, L.; Mwaisekwa, I.I. TinyML and IoT-Enabled System for Automated Chicken Egg Quality Analysis and Monitoring. Smart Agric. Technol. 2025, 12, 101162. [Google Scholar] [CrossRef]
  132. Sari, O.F.; Bader-El-Den, M.; Leadley, C.; Esmeli, R.; Mohasseb, A.; Ince, V. AI-driven food safety risk prediction: A transformer-based approach with RASFF database. Br. Food J. 2025, 127, 3427–3445. [Google Scholar] [CrossRef]
  133. Kong, Y.; Wen, Z.; Cai, X.; Tan, L.; Liu, Z.; Wang, Q.; Li, Q.; Yang, N.; Wang, Y.; Zhao, Y. Genetic traceability, conservation effectiveness, and selection signatures analysis based on ancestral information: A case study of Beijing-You chicken. BMC Genom. 2025, 26, 402. [Google Scholar] [CrossRef]
  134. Zhi, Y.; Geng, W.; Li, S.; Chen, X.; Challioui, M.K.; Chen, B.; Wang, D.; Li, Z.; Tian, Y.; Li, H.; et al. Advanced molecular system for accurate identification of chicken genetic resources. Comput. Electron. Agric. 2025, 231, 109989. [Google Scholar] [CrossRef]
  135. Gao, Z.; Zheng, J.; Xu, G. Molecular Mechanisms and Regulatory Factors Governing Feed Utilization Efficiency in Laying Hens: Insights for Sustainable Poultry Production and Breeding Optimization. Int. J. Mol. Sci. 2025, 26, 6389. [Google Scholar] [CrossRef]
  136. Rodriguez, M.V.; Phan, T.; Fernandes, A.F.; Breen, V.; Arango, J.; Kidd, M.T.; Le, N. Facial Chick Sexing: An Automated Chick Sexing System From Chick Facial Image. Smart Agric. Technol. 2025, 12, 101044. [Google Scholar] [CrossRef]
  137. Saetiew, J.; Nongkhunsan, P.; Saenjae, J.; Yodsungnoen, R.; Molee, A.; Jungthawan, S.; Fongkaew, I.; Meemon, P. Automated chick gender determination using optical coherence tomography and deep learning. Poult. Sci. 2025, 104, 105033. [Google Scholar] [CrossRef]
  138. Barbosa, L.V.S.; Lima, N.D.d.S.; Barros, J.d.S.G.; de Moura, D.J.; Estellés, F.; Ramón-Moragues, A.; Calvet-Sanz, S.; García, A.V. Predicting Risk of Ammonia Exposure in Broiler Housing: Correlation with Incidence of Health Issues. Animals 2024, 14, 615. [Google Scholar] [CrossRef]
  139. Liu, Y.; Zhuang, Y.; Ji, B.; Zhang, G.; Rong, L.; Teng, G.; Wang, C. Prediction of laying hen house odor concentrations using machine learning models based on small sample data. Comput. Electron. Agric. 2022, 195, 106849. [Google Scholar] [CrossRef]
  140. Weng, X.; Kong, C.; Jin, H.; Chen, D.; Li, C.; Li, Y.; Ren, L.; Xiao, Y.; Chang, Z. Detection of volatile organic compounds (VOCs) in livestock houses based on electronic nose. Appl. Sci. 2021, 11, 2337. [Google Scholar] [CrossRef]
  141. Cosan, A.; Atilgan, A. Setting up of an expert system to determine ammonia gas in animal livestock. J. Food Agric. Environ. 2012, 10, 910–912. Available online: https://hero.epa.gov/hero/index.cfm/reference/details/reference_id/2044779 (accessed on 25 September 2025).
  142. Kim, S.J.; Lee, M.H. Design and implementation of a malfunction detection system for livestock ventilation devices in smart poultry farms. Agriculture 2022, 12, 2150. [Google Scholar] [CrossRef]
  143. Silva, W.; Moura, D.; Carvalho-Curi, T.; Seber, R.; Massari, J. Evaluation system of exhaust fans used on ventilation system in commercial broiler house. Eng. Agrícola 2017, 37, 887–899. [Google Scholar] [CrossRef]
  144. Gonzalez-Mora, A.F.; Rousseau, A.N.; Larios, A.D.; Godbout, S.; Fournel, S. Assessing environmental control strategies in cage-free aviary housing systems: Egg production analysis and Random Forest modeling. Comput. Electron. Agric. 2022, 196, 106854. [Google Scholar] [CrossRef]
  145. Martinez, A.A.G.; Nääs, I.D.A.; de Carvalho-Curi, T.M.R.; Abe, J.M.; da Silva Lima, N.D. A Heuristic and data mining model for predicting broiler house environment suitability. Animals 2021, 11, 2780. [Google Scholar] [CrossRef]
  146. Yelmen, B.; Şahin, H.; Cakir, M.T. The use of artificial neural networks in energy use modeling in broiler farms: A case study of Mersin province in the Mediterranean region. Appl. Ecol. Environ. Res. 2019, 17, 13169–13183. [Google Scholar] [CrossRef]
  147. López-Andrés, J.J.; Aguilar-Lasserre, A.A.; Morales-Mendoza, L.F.; Azzaro-Pantel, C.; Pérez-Gallardo, J.R.; Rico-Contreras, J.O. Environmental impact assessment of chicken meat production via an integrated methodology based on LCA, simulation and genetic algorithms. J. Clean. Prod. 2018, 174, 477–491. [Google Scholar] [CrossRef]
  148. Rico-Contreras, J.O.; Aguilar-Lasserre, A.A.; Méndez-Contreras, J.M.; Cid-Chama, G.; Alor-Hernández, G. Moisture content prediction in poultry litter to estimate bioenergy production using an artificial neural network. Rev. Mex. De Ing. Química 2014, 13, 933–955. Available online: https://www.scielo.org.mx/pdf/rmiq/v13n3/v13n3a24.pdf (accessed on 25 September 2025).
  149. Karadurmus, E.; Cesmeci, M.; Yuceer, M.; Berber, R. An artificial neural network model for the effects of chicken manure on ground water. Appl. Soft Comput. 2012, 12, 494–497. [Google Scholar] [CrossRef]
  150. Leite, M.V.; Abe, J.M.; Souza, M.L.H.; de Alencar Nääs, I. Enhancing Environmental Control in Broiler Production: Retrieval-Augmented Generation for Improved Decision-Making with Large Language Models. AgriEngineering 2025, 7, 12. [Google Scholar] [CrossRef]
Figure 1. A data-driven overview of AI applications in chicken farming. IoT: Internet of Things.
Figure 2. PRISMA workflow of the literature analysis.
Figure 3. Publication and citation trends for AI in chicken farming from May 2003 to March 2025. (Citation counts were retrieved on 16 April 2025 and may have changed since that access date.)
Figure 4. Data classification of the 254 papers on AI in chicken farming. Note: Individual papers may address multiple objectives but are counted here by their principal contribution.
Figure 5. AI data processing workflow in chicken farming. CNN: Convolutional Neural Network; SVM: Support Vector Machine; GAN: Generative Adversarial Network; LSTM: Long Short-Term Memory.
Figure 6. Perception-Cognition-Action loop with online adaptation.
Table 1. Search terms and Boolean grouping used for the Web of Science query.
Group (AND) | Keyword (OR)
Farming environment | “farm”, “breed”, “house”
Chicken type | “broiler”, “chicken”, “chick”, “cock”, “hen”
AI technologies | “artificial intelligence”, “machine learning”, “deep learning”, “neural network”, “natural language processing”, “transformer”, “generative adversarial network”
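For readers who want to reproduce the retrieval, the grouped keywords above translate directly into a Boolean topic query: terms within a group are OR-ed and the three groups are AND-ed. The short Python sketch below shows one way to assemble such a query string; it is purely illustrative, does not call the Web of Science API, and the variable names are the author of this edit's own.

```python
# Minimal sketch: assembling a Boolean topic query from the keyword groups in Table 1.
# Terms are OR-ed within a group; the three groups are AND-ed together.

groups = {
    "farming_environment": ["farm", "breed", "house"],
    "chicken_type": ["broiler", "chicken", "chick", "cock", "hen"],
    "ai_technologies": [
        "artificial intelligence", "machine learning", "deep learning",
        "neural network", "natural language processing", "transformer",
        "generative adversarial network",
    ],
}

def build_query(groups: dict) -> str:
    """OR the terms inside each group, then AND the parenthesized groups."""
    clauses = []
    for terms in groups.values():
        clause = " OR ".join(f'"{t}"' for t in terms)
        clauses.append(f"({clause})")
    return " AND ".join(clauses)

print(build_query(groups))
# -> ("farm" OR "breed" OR "house") AND ("broiler" OR ...) AND ("artificial intelligence" OR ...)
```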
Table 2. Sustainability contribution categories and decision rules.
Category | Definition | Decision Rule
Welfare | Research that benefits the birds directly: improved health, reduced stress, better behavior, lower pain or handling time | Assign when the main objective and endpoints concern bird condition, health status, stress or behavior indices, mortality or lameness reduction, or longer productive lifespan
Economic | Research that improves production efficiency or profitability | Assign when primary outcomes focus on productivity, cost, labor, feed efficiency, yield or uniformity, product quality grades, or return on investment
Environment | Research that improves environmental management or reduces environmental burden | Assign when endpoints center on emissions, waste handling, microclimate quality, energy consumption, or environmental impact indicators (for example ammonia levels)
Table 3. Technical modalities and decision rules.
Modality | Decision Rule | Example
Image | Images or video are the main data source used by the model | Object detection and tracking, segmentation, pose or activity recognition
Audio | Acoustic signals are the main data source | Vocalization analysis, distress or cough detection, barn acoustics anomalies
IoT Sensor | Non-image, non-audio physical sensors and integrated systems | Temperature and humidity sensing, ammonia or CO2 monitoring, RFID or accelerometers, robotics, ventilation telemetry
Production | Structured production or biological records are the main data source | Growth and weight records, egg counts and quality grades, feed intake, mortality, genetic or breeding records
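The modality labels in Table 3 were assigned manually during screening, but the decision rules can be approximated programmatically. The sketch below is only an illustration of such a keyword-cue tagger; the cue lists are hypothetical examples and are not the criteria actually used in this review.

```python
# Illustrative sketch: a keyword-cue tagger approximating the modality rules in Table 3.
# The cue lists are hypothetical; the review's coding was performed manually.

MODALITY_CUES = {
    "Image": ["camera", "video", "image", "computer vision", "thermal imaging"],
    "Audio": ["sound", "vocalization", "microphone", "acoustic", "cough"],
    "IoT Sensor": ["rfid", "accelerometer", "ammonia sensor", "temperature sensor",
                   "ventilation", "electronic nose"],
    "Production": ["body weight records", "egg production", "feed intake",
                   "mortality records", "genomic"],
}

def tag_modality(abstract: str) -> str:
    """Return the modality whose cue words occur most often in an abstract."""
    text = abstract.lower()
    scores = {m: sum(text.count(cue) for cue in cues)
              for m, cues in MODALITY_CUES.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "Unclassified"

print(tag_modality("A CNN analyses thermal imaging and video of broilers."))  # -> Image
```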
Table 4. Keywords with more than 10 appearances in the field of AI in chicken farming from May 2003 to March 2025. (Keyword counts were retrieved on 16 April 2025 and may have changed since that access date.)
Category | Keyword | Year of First Appearance | Number
Method | Deep Learning | 2020 | 46
Method | Machine Learning | 2019 | 43
Method | System | 2019 | 31
Method | Computer Vision | 2016 | 26
Method | Artificial Intelligence | 2019 | 18
Method | Machine Vision | 2021 | 13
Method | Artificial Neural Network | 2010 | 13
Object | Chickens | 2012 | 21
Object | Poultry | 2016 | 15
Object | Laying Hens | 2020 | 16
Purpose | Animal Welfare | 2019 | 29
Purpose | Performance | 2011 | 27
Purpose | Behavior | 2016 | 27
Purpose | Prediction | 2003 | 26
Purpose | Classification | 2016 | 11
Purpose | Growth | 2007 | 12
Note: “keywords” refer to the 5–10 author-provided terms, not arbitrary words in the text.
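Counts of this kind can be derived directly from a bibliographic export. The sketch below assumes a hypothetical CSV export of the retrieved records with "Author Keywords" and "Publication Year" columns; the file name and column names are assumptions for illustration, not the actual workflow used here.

```python
# Sketch: tallying author keywords and their first year of appearance from a
# hypothetical Web of Science CSV export (file and column names are assumed).
import pandas as pd

records = pd.read_csv("wos_export.csv")  # hypothetical export of the retrieved papers

# One row per author keyword (WoS separates keywords with semicolons).
kw = (records.assign(keyword=records["Author Keywords"].str.split(";"))
             .explode("keyword"))
kw["keyword"] = kw["keyword"].str.strip().str.lower()

summary = (kw.groupby("keyword")
             .agg(first_year=("Publication Year", "min"),
                  n_papers=("keyword", "size")))
summary = summary[summary["n_papers"] > 10].sort_values("n_papers", ascending=False)
print(summary)
```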
Table 5. Overview of recent AI objectives and research implementations for improving chicken welfare.
Welfare Objective | Modality | AI Research Cases | References
Disease Monitoring and Prevention | Image | Deep neural networks analyze external features such as feathers, head, and feet to enable real-time monitoring and early prevention of diseases including fowlpox, avian cholera, footpad dermatitis, avian influenza, and others | [22,23,24,25,26]
Disease Monitoring and Prevention | Image | Image recognition of abnormal appearance in chicken droppings, eggs, and other products allows indirect inference of flock health, enabling early disease detection and welfare assurance | [27,28,29]
Disease Monitoring and Prevention | Image | An automated dead-bird detection and alarm system based on image recognition prevents disease spread and safeguards overall flock welfare | [30,31,32,33]
Disease Monitoring and Prevention | Image | Computer vision analysis of chicken locomotion postures accurately and promptly identifies lameness and other disorders, improving monitoring of locomotor health | [34,35]
Disease Monitoring and Prevention | Audio | Acoustic analysis and machine learning monitor and classify abnormal sounds such as coughing to rapidly detect Newcastle disease, avian influenza, and similar illnesses, strengthening health surveillance | [36,37]
Disease Monitoring and Prevention | IoT Sensor | Machine-learning analysis of RFID sensor data identifies parasitic infestations and aflatoxin poisoning, enabling early intervention and improved welfare | [38,39,40]
Disease Monitoring and Prevention | Production | AI tools analyze feces and carcass sample data to quickly identify potential pathogens in the production environment, optimizing disease-control strategies and enhancing overall welfare | [41,42,43,44]
Behavior Monitoring | Image | Computer vision automatically monitors daily behaviors such as feeding, drinking, dust-bathing, and activity level, quantifies comfort, and supports environmental optimization to improve welfare | [45,46,47,48]
Behavior Monitoring | Image | AI-based video analysis detects abnormal behaviors such as feather pecking and piling in real time, enabling timely intervention and preventing injury or stress | [49,50,51]
Behavior Monitoring | IoT Sensor | Sensors such as IMUs and RFID tags record daily behavioral data; machine-learning algorithms evaluate aggressive and abnormal acts to ensure flock safety and quality of life | [52,53]
Stress Monitoring | Image/IoT Sensor | Combining computer-vision analysis of mouth movements (open beak, gaping) with ambient temperature and humidity data automatically identifies heat stress, improving environmental control and welfare | [54,55,56,57]
Stress Monitoring | Audio | Deep-learning models recognize and classify vocal expressions of different emotional states, providing real-time monitoring and management of stress and emotional welfare | [58,59,60,61]
Stress Monitoring | Production | AI analyzes behavioral data to automatically detect and categorize degrees of heat stress, facilitating environmental adjustments and welfare improvement | [62,63,64]
Health Scoring | Image | Image recognition assesses feather quality to evaluate individual growth and health status, supporting management decisions and enhancing flock welfare | [65,66]
Health Scoring | Image | Thermal imaging monitors eye and comb temperatures to ensure environmental comfort and health | [67]
Health Scoring | Audio | AI analyzes vocal characteristics after vaccination to evaluate vaccine efficacy and health status, optimizing immunization strategies and improving overall welfare | [68]
Note: For each research direction, no more than five of the most recent papers are listed; the review covers studies published from 2021 to March 2025.
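To make the audio-based entries more concrete, the sketch below illustrates the general idea behind vocalization classification (cf. [58,59,60,61]): extract MFCC features from short labelled clips and train a conventional classifier. The file names and labels are placeholders, and published systems typically rely on larger deep-learning pipelines rather than this minimal example.

```python
# Minimal sketch of MFCC-based vocalization classification (placeholder data).
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

def mfcc_features(path: str, sr: int = 16000, n_mfcc: int = 13) -> np.ndarray:
    """Mean and standard deviation of MFCCs over a clip -> fixed-length vector."""
    y, sr = librosa.load(path, sr=sr, mono=True)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

# Placeholder (path, label) pairs; real datasets contain thousands of annotated clips.
clips = [("barn_001.wav", "distress"), ("barn_002.wav", "normal"),
         ("barn_003.wav", "distress"), ("barn_004.wav", "normal")]
X = np.stack([mfcc_features(p) for p, _ in clips])
y = np.array([label for _, label in clips])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```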
Table 6. Overview of AI objectives and research aimed at improving economic performance.
Economic Objective | Modality | AI Research Cases | References
Farming Management Optimization | Image | Use of 2D/3D computer vision to estimate body weight accurately, enabling automated individual and flock management and lowering labor costs while raising efficiency | [80,81,82,83]
Farming Management Optimization | Image | Neural-network video analysis for automatic bird counting, reducing manual inventory and improving management efficiency | [84,85,86,87]
Farming Management Optimization | Image | Rapid floor-egg detection via image recognition, cutting losses and increasing egg yield and quality | [88]
Farming Management Optimization | Image | Image recognition for fast and accurate estimation of chick age, supporting precision rearing and cost reduction | [89]
Farming Management Optimization | Audio/IoT Sensor | AI analysis of feed intake patterns to optimize ration allocation and lower feed costs | [90,91]
Farming Management Optimization | IoT Sensor | Robot patrols and automated dead-bird and egg collection to improve daily management while reducing labor and boosting efficiency | [92,93,94]
Farming Management Optimization | Production | Machine-learning analysis of dead-on-arrival rates during transport to cut economic losses and raise logistics efficiency | [95]
Farming Management Optimization | Production | Real-time AI analysis of body-weight data to detect anomalies and optimize feed-conversion ratio | [96,97]
Growth and Performance Scoring | Image/IoT Sensor | Neural networks accurately measure body dimensions and weights for precise performance evaluation and management optimization | [98,99,100,101]
Growth and Performance Scoring | Production | Machine-learning models predict and evaluate laying performance, improving production planning and profits | [102,103,104,105]
Growth and Performance Scoring | Production | AI integrates environmental factors within the farm to fine-tune conditions and optimize growth and productivity | [106,107]
Product Quality Monitoring | Image/IoT Sensor | Spectral imaging, whole-genome sequencing, and machine learning assess meat quality and Salmonella risk, enhancing food safety, uniformity, and market competitiveness | [108,109]
Product Quality Monitoring | Image/IoT Sensor | AI combining image and optical-sensor data automatically grades eggs, boosting market value | [110,111]
Product Quality Monitoring | Image/IoT Sensor/Production | Spectral and genomic data analyzed by AI verify meat and egg provenance, strengthening branding and consumer trust | [112,113,114]
Gender Identification and Genetic Improvement | Image | Neural-network analysis of candled-egg images for early detection of fertilization and embryo sex, improving hatchery efficiency and profitability | [115,116,117,118,119]
Gender Identification and Genetic Improvement | Audio | AI analysis of chick vocalizations for early sex determination, lowering labor costs and increasing productivity | [120,121]
Gender Identification and Genetic Improvement | Production | Integration of bioinformatics and machine learning for genomic prediction, reducing hereditary disease risk and enhancing large-scale economic returns | [122,123,124]
Gender Identification and Genetic Improvement | Production | Deep learning combined with genetic algorithms for adaptive feeding control, significantly improving feed-conversion ratio and reducing costs | [125]
Note: For each research direction, no more than five of the most recent papers are listed; the review covers studies published from 2021 to March 2025.
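As an illustration of the vision-based weighing entries (cf. [80,81,82,83]), the sketch below segments a bird in a top-down image, extracts simple shape features, and regresses body weight with gradient boosting. The thresholding choice, file names, and tiny training set are placeholders rather than any published pipeline.

```python
# Sketch: 2D shape features from a top-down image, regressed to body weight (placeholders).
import cv2
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

def shape_features(image_path: str) -> np.ndarray:
    """Area, perimeter, and bounding-box aspect ratio of the largest bright blob."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    c = max(contours, key=cv2.contourArea)
    x, y, w, h = cv2.boundingRect(c)
    return np.array([cv2.contourArea(c), cv2.arcLength(c, True), w / h])

# Hypothetical training images paired with scale-measured weights (grams).
samples = [("bird_01.png", 1850.0), ("bird_02.png", 2100.0), ("bird_03.png", 1990.0)]
X = np.stack([shape_features(p) for p, _ in samples])
y = np.array([w for _, w in samples])

model = GradientBoostingRegressor(random_state=0).fit(X, y)
print("predicted weight (g):", model.predict(shape_features("bird_04.png")[None, :]))
```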
Table 7. AI research objectives and implementations for improving the chicken farming environment.
Environmental Objective | Modality | AI Research Cases | References
Odor-concentration prediction and correlation | Production | A multilayer neural network trained on health records and ammonia levels at different stocking densities predicts in-house ammonia exposure and links it to respiratory-disease incidence | [138]
Odor-concentration prediction and correlation | IoT Sensor | A gradient-boosted decision-tree model, fed with small samples from ammonia (NH3) and hydrogen-sulfide sensors, forecasts odor levels in layer houses and confirms ammonia as the dominant driver | [139]
Odor-concentration prediction and correlation | IoT Sensor | Gas-chromatography–mass-spectrometry data combined with “electronic-nose” readings and a support-vector machine identify abnormal volatile compounds in real time | [140]
Odor-concentration prediction and correlation | IoT Sensor | An AI expert system that relies on low-cost ammonia sensors provides continuous NH3 monitoring and automatically adjusts the ventilation rate | [141]
Ventilation-performance monitoring | IoT Sensor | A recurrent neural network analyzes time-series data from fan sensors and controllers, detecting faults and abnormal airflow | [142]
Ventilation-performance monitoring | IoT Sensor | A deep-learning platform uses camera-based analysis to evaluate exhaust-fan performance in real time and issue maintenance alerts | [143]
Environmental-control strategies | IoT Sensor | Principal-component and linear-discriminant analyses extract key features in cage-free houses; a random-forest model then simulates egg-production changes under alternative control strategies | [144]
Environmental-control strategies | IoT Sensor | A decision-tree classifier labels conditions as suitable, marginal, or unsuitable based on temperature, humidity, air speed, and ammonia concentration, guiding environmental adjustments | [145]
Environmental-control strategies | Production | A neural network fed with feed, energy-use, and output records estimates total farm energy demand and suggests retrofit options for saving power | [146]
Environmental-impact assessment | Production | Within a life-cycle-assessment framework, Monte Carlo simulation and a genetic algorithm optimize the entire broiler-production chain, lowering environmental impact per kilogram of meat | [147]
Environmental-impact assessment | Production | A neural network coupled with risk simulation predicts manure moisture content, evaluating its value as a bio-energy feedstock | [148]
Environmental-impact assessment | Production | A neural network refined by the Levenberg–Marquardt method models groundwater-contamination risk, forecasting how chicken manure affects fecal-coliform counts in nearby water sources | [149]
Note: Because the literature pool is small, all relevant studies from 2012 to 2024 are included.
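As a concrete illustration of the sensor-driven odor prediction entries (cf. [139]), the sketch below fits a gradient-boosted regressor to a handful of gas and climate readings and forecasts odor concentration for a new reading. The feature set and numerical values are invented for illustration only and do not reproduce any published dataset or model configuration.

```python
# Sketch: gradient-boosted regression from gas/climate sensor readings to odor level
# (synthetic illustrative data).
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

# columns: NH3 (ppm), H2S (ppb), temperature (degrees C), relative humidity (%)
X = np.array([
    [12.0, 35.0, 24.5, 62.0],
    [18.0, 48.0, 26.0, 70.0],
    [ 8.0, 22.0, 22.0, 55.0],
    [15.0, 40.0, 25.0, 66.0],
    [20.0, 55.0, 27.5, 74.0],
    [10.0, 30.0, 23.0, 58.0],
])
y = np.array([310.0, 460.0, 220.0, 390.0, 520.0, 260.0])  # odor concentration (OU/m3)

model = GradientBoostingRegressor(random_state=0)
scores = cross_val_score(model, X, y, cv=3, scoring="neg_mean_absolute_error")
print("CV mean absolute error:", -scores.mean())

model.fit(X, y)
print("forecast for a new reading:", model.predict([[14.0, 38.0, 25.0, 64.0]]))
```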