Artificial Intelligence in Livestock Farming

A special issue of Agriculture (ISSN 2077-0472). This special issue belongs to the section "Farm Animal Production".

Deadline for manuscript submissions: closed (15 July 2023) | Viewed by 17450

Special Issue Editors


Guest Editor: Prof. Dr. Imke Traulsen
Department of Animal Sciences, Livestock Systems, University of Göttingen, 37077 Göttingen, Germany
Interests: animal husbandry systems; precision livestock farming; automatic behavior monitoring; complex methods; deep learning; animal welfare and behavior

Guest Editor: Prof. Dr. Mehmet Gültas
Statistics and Data Science Group, Faculty of Agriculture, South Westphalia University of Applied Sciences, Lübecker Ring 2, 59494 Soest, Germany
Interests: machine learning; computer vision; animal welfare; smart livestock farming; big data; deep learning; random forest

Special Issue Information

Dear Colleagues,

In recent years, more and more sensor technologies have become available for livestock husbandry systems, and their use is no longer limited to research farms. From a research perspective, a major task is to develop systems that turn these data into practical support, such as assistance for stockpersons in animal management and culling decisions. Moreover, monitoring animal welfare is of high importance, and new knowledge on how to assist welfare monitoring is needed.

Artificial intelligence (AI) as a methodology is not new, but the rapidly increasing availability of data storage and, more importantly, working memory has given these methods and algorithms an enormous boost. Recent developments show great potential for applications in livestock farming across a wide range of tasks and species. However, knowledge on how to transfer successful research results into continuous monitoring or commercial farming remains limited.

This Special Issue focuses on new research results on the application of AI in livestock farming, regardless of species, husbandry system, or readiness for use in commercial farming. Contributions on the monitoring of animal-related characteristics, such as behavior or welfare, are as welcome as those on the monitoring of the housing environment, such as climate or bedding material.

Prof. Dr. Imke Traulsen
Prof. Dr. Mehmet Gültas
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Agriculture is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • automatic behavior assessment
  • animal monitoring
  • individual personalities
  • precision livestock farming

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (6 papers)


Research

15 pages, 26726 KiB  
Article
Visual Detection of Lost Ear Tags in Breeding Pigs in a Production Environment Using the Enhanced Cascade Mask R-CNN
by Fang Wang, Xueliang Fu, Weijun Duan, Buyu Wang and Honghui Li
Agriculture 2023, 13(10), 2011; https://doi.org/10.3390/agriculture13102011 - 17 Oct 2023
Cited by 6 | Viewed by 1837
Abstract
As the unique identifier of individual breeding pigs, the loss of ear tags can result in the loss of breeding pigs’ identity information, leading to data gaps and confusion in production and genetic breeding records, which can have catastrophic consequences for breeding efforts. Detecting the loss of ear tags in breeding pigs can be challenging in production environments due to factors such as overlapping breeding pig clusters, imbalanced pig-to-tag ratios, and relatively small-sized ear tags. This study proposes an improved method for the detection of lost ear tags in breeding pigs based on Cascade Mask R-CNN. Firstly, the model utilizes ResNeXt combined with a feature pyramid network (FPN) as the feature extractor; secondly, the classification branch incorporates the online hard example mining (OHEM) technique to improve the utilization of ear tags and low-confidence samples; finally, the regression branch employs a decay factor of Soft-NMS to reduce the overlap of redundant bounding boxes. The experiment employs a sliding window detection method to evaluate the algorithm’s performance in detecting lost ear tags in breeding pigs in a production environment. The results show that the accuracy of the detection can reach 92.86%. This improvement effectively enhances the accuracy and real-time performance of lost ear tag detection, which is highly significant for the production and breeding of breeding pigs.
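The Soft-NMS decay step mentioned in the abstract can be illustrated with a short, self-contained sketch. The following is a minimal NumPy implementation of Gaussian Soft-NMS, not the authors' code; the box coordinates, sigma value, and score threshold are illustrative assumptions.

```python
import numpy as np

def box_iou(a, b):
    """IoU of two boxes given as [x1, y1, x2, y2]."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def soft_nms(boxes, scores, sigma=0.5, score_thresh=0.001):
    """Gaussian Soft-NMS: decay the scores of overlapping boxes
    instead of discarding them outright (illustrative sketch)."""
    scores = scores.astype(float).copy()
    keep, idxs = [], list(range(len(scores)))
    while idxs:
        best = max(idxs, key=lambda i: scores[i])  # highest remaining score
        keep.append(best)
        idxs.remove(best)
        for i in idxs:
            iou = box_iou(boxes[best], boxes[i])
            scores[i] *= np.exp(-(iou ** 2) / sigma)  # heavier overlap -> stronger decay
        idxs = [i for i in idxs if scores[i] >= score_thresh]
    return keep

# Example: two heavily overlapping ear-tag detections and one distinct one
boxes = np.array([[10, 10, 50, 50], [12, 12, 52, 52], [100, 100, 140, 140]], float)
scores = np.array([0.95, 0.90, 0.80])
print(soft_nms(boxes, scores))  # indices kept after score decay
```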
(This article belongs to the Special Issue Artificial Intelligence in Livestock Farming)

20 pages, 6327 KiB  
Article
Describing Behavior Sequences of Fattening Pigs Using Process Mining on Video Data and Automated Pig Behavior Recognition
by Andreas Melfsen, Arvid Lepsien, Jan Bosselmann, Agnes Koschmider and Eberhard Hartung
Agriculture 2023, 13(8), 1639; https://doi.org/10.3390/agriculture13081639 - 21 Aug 2023
Cited by 9 | Viewed by 2798
Abstract
This study aimed to demonstrate the application of process mining on video data of pigs, facilitating the analysis of behavioral patterns. Video data were collected over a period of 5 days from a pig pen in a mechanically ventilated barn and used for analysis. The approach in this study relies on a series of individual steps to allow process mining on this data set. These steps include object detection and tracking, spatiotemporal activity recognition in video data, and process model analysis. Each step gives insights into pig behavior at different time points and locations within the pen, offering increasing levels of detail to describe typical pig behavior up to process models reflecting different behavior sequences for clustered datasets. Our data-driven approach proves suitable for the comprehensive analysis of behavioral sequences in conventional pig farming.
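As a rough illustration of the final step (process model analysis), the sketch below derives a directly-follows relation from an event log of recognized pig activities using pandas. The column names, the case definition (one tracked pig per case), and the example events are assumptions, not the authors' pipeline; a dedicated process-mining library would build a full process model from such a log.

```python
import pandas as pd

# Hypothetical event log: one row per recognized activity of a tracked pig
log = pd.DataFrame({
    "case_id":   ["pig_1", "pig_1", "pig_1", "pig_2", "pig_2"],
    "activity":  ["lying", "walking", "feeding", "lying", "drinking"],
    "timestamp": pd.to_datetime([
        "2023-05-01 08:00", "2023-05-01 08:12", "2023-05-01 08:20",
        "2023-05-01 08:03", "2023-05-01 08:30",
    ]),
})

# Directly-follows counts: how often activity A is immediately followed by B per pig
log = log.sort_values(["case_id", "timestamp"])
log["next_activity"] = log.groupby("case_id")["activity"].shift(-1)
dfg = (log.dropna(subset=["next_activity"])
          .groupby(["activity", "next_activity"])
          .size()
          .reset_index(name="count"))
print(dfg)
```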
(This article belongs to the Special Issue Artificial Intelligence in Livestock Farming)

16 pages, 3410 KiB  
Article
Detection of Cattle Key Parts Based on the Improved Yolov5 Algorithm
by Dangguo Shao, Zihan He, Hongbo Fan and Kun Sun
Agriculture 2023, 13(6), 1110; https://doi.org/10.3390/agriculture13061110 - 23 May 2023
Cited by 11 | Viewed by 2918
Abstract
Accurate detection of key body parts of cattle is of great significance to Precision Livestock Farming (PLF), using artificial intelligence for video analysis. As the background in cattle farms is complex and the target features of the cattle are not obvious, traditional object-detection algorithms cannot detect the key parts with high precision. This paper proposes the Filter_Attention attention mechanism to detect the key parts of cattle. Since the image is unstable during training and initialization, particle noise is generated in the feature map after the convolution calculation; this paper therefore proposes an attention mechanism based on bilateral filtering to reduce this interference. We also designed a Pooling_Module, based on the soft pooling algorithm, which reduces information loss relative to the initial activation map compared with maximum pooling. Our dataset contained 1723 images of cattle, in which the body, head, legs, and tail were manually labeled. This dataset was divided into a training set, validation set, and test set at a ratio of 7:2:1 for training the model proposed in this paper. The effectiveness of the proposed modules is demonstrated by ablation experiments in terms of mAP, AP, and F1 values. This paper also compares the proposed model with other mainstream object-detection algorithms. The experimental results show that our model obtained 90.74% mAP, and the F1 value and AP value of the four parts were improved.
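The soft pooling idea referenced above (weighting activations by a softmax of their own values rather than keeping only the maximum) can be sketched in a few lines of PyTorch. This is a generic SoftPool-style operator under assumed kernel settings, not the authors' Pooling_Module.

```python
import torch
import torch.nn.functional as F

def soft_pool2d(x, kernel_size=2, stride=2):
    """Softmax-weighted pooling: each activation contributes in proportion
    to exp(activation), so less information is discarded than with max
    pooling (illustrative sketch; activations may be shifted for stability)."""
    weights = torch.exp(x)
    weighted_avg = F.avg_pool2d(x * weights, kernel_size, stride)
    weight_avg = F.avg_pool2d(weights, kernel_size, stride)
    return weighted_avg / (weight_avg + 1e-9)

# Example: pool a random feature map of shape (batch, channels, H, W)
feat = torch.randn(1, 8, 32, 32)
pooled = soft_pool2d(feat)
print(pooled.shape)  # torch.Size([1, 8, 16, 16])
```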
(This article belongs to the Special Issue Artificial Intelligence in Livestock Farming)

15 pages, 6167 KiB  
Article
Facial Region Analysis for Individual Identification of Cows and Feeding Time Estimation
by Yusei Kawagoe, Ikuo Kobayashi and Thi Thi Zin
Agriculture 2023, 13(5), 1016; https://doi.org/10.3390/agriculture13051016 - 6 May 2023
Cited by 11 | Viewed by 2539
Abstract
With the increasing number of cows per farmer in Japan, automatic cow monitoring systems are being introduced. One important aspect of such a system is the ability to identify individual cows and estimate their feeding time. In this study, we propose a method for achieving this goal through facial region analysis. We used a YOLO detector to extract the cow head region from video images captured during feeding, with the head region cropped as a face region image. The face region image was then used for individual identification, with transfer learning employed to train the identification model. In the context of cow identification, transfer learning can be used to train a pre-existing deep neural network to recognize individual cows based on their unique physical characteristics, such as their head shape, markings, or ear tags. To estimate feeding time, we divided the feeding area into vertical strips, one per cow, and established a horizontal line just above the feeding material; whether a cow was feeding or not was determined using Hough transform techniques. We tested our method using real-life data from a large farm, and the experimental results showed promise in achieving our objectives. This approach has the potential to support the diagnosis of diseases and movement disorders in cows and could provide valuable insights for farmers.
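The feeding-time logic (a horizontal reference line above the feed, crossed by the cow's head while eating) can be approximated with OpenCV's probabilistic Hough transform. The file name, thresholds, and the head-box test below are illustrative assumptions, not the authors' exact procedure.

```python
import cv2
import numpy as np

def detect_feed_line(frame_gray, angle_tol_deg=5):
    """Find a near-horizontal line just above the feeding material."""
    edges = cv2.Canny(frame_gray, 50, 150)
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=80,
                            minLineLength=200, maxLineGap=20)
    if lines is None:
        return None
    best_y = None
    for x1, y1, x2, y2 in lines[:, 0]:
        angle = abs(np.degrees(np.arctan2(y2 - y1, x2 - x1)))
        if angle < angle_tol_deg:                       # keep near-horizontal lines
            y = (y1 + y2) / 2
            best_y = y if best_y is None else min(best_y, y)  # topmost candidate
    return best_y

def is_feeding(head_box, feed_line_y):
    """A cow counts as feeding when its head box reaches below the line."""
    _, _, _, y_bottom = head_box                        # (x1, y1, x2, y2)
    return feed_line_y is not None and y_bottom > feed_line_y

frame = cv2.imread("barn_frame.png", cv2.IMREAD_GRAYSCALE)  # assumed video frame
if frame is None:
    frame = np.zeros((480, 640), np.uint8)              # placeholder so the sketch runs
line_y = detect_feed_line(frame)
print(is_feeding((420, 310, 560, 455), line_y))
```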
(This article belongs to the Special Issue Artificial Intelligence in Livestock Farming)

17 pages, 4526 KiB  
Article
An Approach towards a Practicable Assessment of Neonatal Piglet Body Core Temperature Using Automatic Object Detection Based on Thermal Images
by Steffen Küster, Lion Haverkamp, Martin Schlather and Imke Traulsen
Agriculture 2023, 13(4), 812; https://doi.org/10.3390/agriculture13040812 - 31 Mar 2023
Cited by 5 | Viewed by 2266
Abstract
Body core temperature (BCT) is an important characteristic for the vitality of pigs. Suboptimal BCT might indicate or lead to increased stress or diseases. Thermal imaging technologies offer the opportunity to determine BCT in a non-invasive, stress-free way, potentially reducing the manual effort. The current approaches often use multiple close-up images of different parts of the body to estimate the rectal temperature, which is laborious under practical farming conditions. Additionally, images need to be manually annotated for the regions of interest inside the manufacturer’s software. Our approach only needs a single (top view) thermal image of a piglet to automatically estimate the BCT. We first trained a convolutional neural network for the detection of the relevant areas, followed by a background segmentation using the Otsu algorithm to generate precise mean, median, and max temperatures of each detected area. The best fit of our method had an R² = 0.774. The standardized setup consists of a “FLIROnePro” attached to an Android tablet. To sum up, this approach could be an appropriate tool for animal monitoring under commercial and research farming conditions.
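The post-detection step (Otsu background segmentation inside each detected region, followed by simple temperature statistics) can be sketched as follows. The temperature matrix, detection box, and synthetic values are assumed inputs; this is not the authors' implementation.

```python
import numpy as np
from skimage.filters import threshold_otsu

def region_temperatures(temp_matrix, box):
    """Segment the piglet from the background inside a detected box via
    Otsu thresholding, then return mean/median/max body temperature."""
    x1, y1, x2, y2 = box
    roi = temp_matrix[y1:y2, x1:x2]
    thresh = threshold_otsu(roi)            # warm body vs. cooler background
    body = roi[roi > thresh]
    return float(body.mean()), float(np.median(body)), float(body.max())

# Example with a synthetic thermal image: cool floor (~22 °C), warm piglet (~38 °C)
temps = np.full((120, 160), 22.0) + np.random.normal(0, 0.3, (120, 160))
temps[40:80, 60:120] = 38.0 + np.random.normal(0, 0.3, (40, 60))
print(region_temperatures(temps, (50, 30, 130, 90)))
```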
(This article belongs to the Special Issue Artificial Intelligence in Livestock Farming)

18 pages, 5494 KiB  
Article
Estimating Body Weight in Captive Rabbits Based on Improved Mask RCNN
by Enze Duan, Hongyun Hao, Shida Zhao, Hongying Wang and Zongchun Bai
Agriculture 2023, 13(4), 791; https://doi.org/10.3390/agriculture13040791 - 30 Mar 2023
Cited by 5 | Viewed by 2220
Abstract
Automated body weight (BW) estimation is an important indicator of the automation level of breeding and can effectively reduce harm to animals during the breeding process. In order to manage meat rabbits accurately, reduce the frequency of manual intervention, and improve the level of intelligent management in meat rabbit breeding, this study constructed a meat rabbit weight estimation system to replace manual weighing. The system consists of a meat rabbit image acquisition robot and a weight estimation model. The robot stops at each cage in turn and takes a top view of the rabbit through an RGB camera. The images from the robot are automatically processed by the weight estimation model, which consists of a meat rabbit segmentation network based on improved Mask RCNN and a BW fitting network. An attention mechanism, the PointRend algorithm, and an improved activation function are introduced to improve the performance of Mask RCNN. Six morphological parameters (relative projected area, contour perimeter, body length, body width, skeleton length, and curvature) are extracted from the obtained mask and fed into the BW fitting network based on SVR-SSA-BPNN. The experiments show that the system achieves a 4.3% relative error and a 172.7 g mean absolute error in BW estimation for 441 rabbits, while the meat rabbit segmentation network achieves a 99.1% mean average precision (mAP) and a 98.7% mean pixel accuracy (MPA). The system provides technical support for automatic BW estimation of meat rabbits in commercial breeding, which is helpful to promote precision breeding.
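As a rough sketch of the downstream fitting step, the code below extracts a few mask-based shape features with OpenCV and fits a plain support vector regressor to body weight. The feature choice, SVR hyperparameters, and synthetic data are assumptions; the paper's full model additionally uses an SSA-optimized BPNN component.

```python
import cv2
import numpy as np
from sklearn.svm import SVR

def mask_features(mask):
    """Area, perimeter, and oriented bounding-box length/width of a binary mask."""
    contours, _ = cv2.findContours(mask.astype(np.uint8),
                                   cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    c = max(contours, key=cv2.contourArea)
    area = cv2.contourArea(c)
    perimeter = cv2.arcLength(c, True)
    (_, _), (w, h), _ = cv2.minAreaRect(c)
    return [area, perimeter, max(w, h), min(w, h)]   # length = long side, width = short side

# Synthetic training data: elliptical masks of varying size with made-up weights
rng = np.random.default_rng(0)
X, y = [], []
for _ in range(40):
    mask = np.zeros((200, 300), np.uint8)
    axes = (int(rng.uniform(40, 90)), int(rng.uniform(25, 55)))
    cv2.ellipse(mask, (150, 100), axes, 0, 0, 360, 1, -1)   # filled ellipse as pseudo rabbit
    X.append(mask_features(mask))
    y.append(0.002 * axes[0] * axes[1] + rng.normal(0, 0.1))  # pseudo weight (kg)

model = SVR(kernel="rbf", C=10.0).fit(X, y)
print(model.predict([X[0]])[0])   # predicted pseudo weight for the first mask
```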
(This article belongs to the Special Issue Artificial Intelligence in Livestock Farming)
