Review

Intelligent Perception-Based Cattle Lameness Detection and Behaviour Recognition: A Review

1 Australian Centre for Field Robotics (ACFR), Faculty of Engineering, The University of Sydney, Sydney, NSW 2006, Australia
2 Livestock Production and Welfare Group, School of Life and Environmental Sciences, Faculty of Science, The University of Sydney, Sydney, NSW 2006, Australia
3 College of Engineering, China Agricultural University, Beijing 100083, China
* Author to whom correspondence should be addressed.
Animals 2021, 11(11), 3033; https://doi.org/10.3390/ani11113033
Submission received: 24 August 2021 / Revised: 14 October 2021 / Accepted: 20 October 2021 / Published: 22 October 2021
(This article belongs to the Special Issue Automation, Big Data, and New Technologies in Animal Research)

Simple Summary

Cattle lameness detection and behaviour recognition are the two main objectives in applications of precision livestock farming (PLF). Over the last five years, developments in smart sensors, big data, and artificial intelligence have offered more automated tools. In this review, we discuss over 100 papers that used automated techniques to detect cattle lameness and to recognise animal behaviours. To assist researchers and policy-makers in promoting various livestock technologies for monitoring cattle welfare and productivity, we conducted a comprehensive investigation of intelligent perception for cattle lameness detection and behaviour analysis in the PLF domain. Based on the literature review, we anticipate that PLF will develop in an objective, autonomous, and real-time direction. Additionally, we suggest that further research should be dedicated to improving data quality, modelling accuracy, and commercial availability.

Abstract

The growing world population has increased the demand for animal-sourced protein. However, animal farming productivity faces challenges from traditional farming practices, socioeconomic conditions, and climate change. In recent years, smart sensors, big data, and deep learning have been applied to animal welfare measurement and livestock farming applications, including behaviour recognition and health monitoring. To facilitate research in this area, this review summarises and analyses the main techniques used in smart livestock farming, focusing on those related to cattle lameness detection and behaviour recognition. More than 100 relevant papers on cattle lameness detection and behaviour recognition have been evaluated and discussed. Based on a review and comparison of recent technologies and methods, we anticipate that intelligent perception for cattle behaviour and welfare monitoring will develop towards standardisation, larger scale, and greater intelligence, combined with Internet of Things (IoT) and deep learning technologies. In addition, the key challenges and opportunities for future research are highlighted and discussed.

1. Introduction

Livestock production is the second largest supplier of food for human consumption, after vegetable/cereal agriculture. The livestock sector contributes up to 50% of the global agricultural gross domestic product and supports the livelihoods and food security of almost 1.3 billion people in developing countries [1]. The increasing demand for livestock products is the result of human population growth, urbanisation, and growing incomes. The United Nations Food and Agriculture Organisation predicts a 60% increase in demand for animal products (i.e., meat, milk, and eggs) by 2050 [2]. To meet this rising global demand, the livestock industry will need to improve its efficiency and operations to enhance productivity per animal. Along with this, there are growing concerns about wastewater, air pollutants from animal manure, CO2 emissions, and animal ethics. Moreover, rising labour and maintenance costs alongside the increasing number of animals per farm have reduced the level of individual animal care [3,4]. In addition, problems such as high levels of lameness, inefficient use of resources, reduced species diversity, reproduction issues, and cows' short lifespans also need to be considered.
In order to provide sufficient care for each individual animal to increase productivity and yield, automatic livestock monitoring and management technologies are needed. With the ability to increase efficiency by shifting the focus towards the individual animal, precision livestock farming (PLF) has attracted much attention from both governments and industry in recent years [5]. Conceptually, PLF involves the integration and interpretation of relevant sensor information to enable the management of individual animals through continuous real-time monitoring of their health, behaviour, production/reproduction, and environmental impact [6,7]. Through technologies such as machine learning and the Internet of Things (IoT, i.e., the interconnection between computing devices via the Internet), decision making in PLF can be better managed by fusing and analysing different sensor data streams, thereby reducing operational costs and improving animal health and welfare while increasing productivity, yield, and environmental sustainability. The intelligent perception and analysis of individual animals' behaviours, welfare, and production is fundamental for improving sustainable production systems [4]. The concept of intelligent perception for animal monitoring was proposed by [8] and refers to perceiving animal bio-responses arising from the animal–environment interaction using multi-sensor data and applying adaptive learning to analyse animal welfare and health status [9]. In recent years, sensors such as cameras, microphones, 3D accelerometers, temperature sensors, and glucose sensors, together with technologies such as deep learning and the IoT, have made it increasingly feasible to model, monitor, and control animal bio-responses and to provide accurate feedback to the farmer.
Grounded on this basis, combined with the development of a Decision Support System (DSS) or expert systems, intelligent perception technologies can make large-scale animal husbandry more cost-effective, efficient, and sustainable [10,11].
Cattle lameness is a key factor behind reduced performance on many farms. The timely detection of lameness is important for providing effective and inexpensive treatment and for preventing future ailments [12]. Meanwhile, cattle behaviour is an important indicator of animal health and welfare, which in turn influence the quantity and quality of cattle products [13]. However, traditional lameness detection and behaviour recognition approaches are time-consuming and labour-intensive, which is a major concern on farms. Here, we focus on intelligent perception and analysis technologies relevant to two main tasks: (1) lameness detection and (2) cattle behaviour recognition and analysis. We summarise and analyse recent work in these areas and discuss future research directions, opportunities, and challenges. For an overview of existing technologies used for cattle identification, body condition score evaluation, and weight estimation, we refer the reader to [14].
In this study, we focus on intelligent perception techniques for precision beef cattle farming, while also partially reviewing existing studies on dairy cattle, since some technologies and methods can address concerns common to both beef and dairy cattle. A general intelligent perception-based animal farming process is presented in Figure 1. In this framework, animal welfare measurement with relevant smart sensors is the key part, and a DSS uses this information to manage farming and environmental protocols [15,16,17].

2. Cattle Lameness Detection and Scoring

2.1. Cattle Lameness

Lameness is a prevalent health issue in cattle production, impacting both animal welfare and livestock productivity. Painful disorders in the locomotor system cause the animal to modify its gait and posture to minimise pain, which is observed as impaired motion or a non-standard gait or posture [18,19,20]. The main causes of lameness include hoof lesions [12], limb lesions, and locomotor deficiencies [21]. Lameness restricts locomotion and movement and leads to reduced milk production, lower fertility, and higher culling rates [22]. As a consequence, it is the third most costly health issue in the dairy industry, after reproduction issues and mastitis [23]. Lameness thus affects not only animal welfare but also yield and profit, and due to its high prevalence on farms, it is regarded as a major health and economic concern in modern cattle farming. As such, detecting lameness in an accurate and timely manner is of great significance [24,25]. The cause and prevalence of lameness vary between production systems (pasture and barn) and farm management practices, but lameness is typically found in 10% to 30% of the herd [26].

2.2. Manual Cattle Lameness Detection Approaches

Lameness can be detected manually by visually observing behavioural changes, as lame animals reduce their speed, change their pace, arch their backs, and drop their heads during walking [27]. A manual Locomotion Scoring System (MLSS) is a systematic method of assessing lameness in which the locomotion of an animal is scored on an ordinal scale by human observers watching for specific locomotion traits [28]. Sprecher et al. [22] and Winckler and Willen [29] scored cow lameness by considering the step consistency, step size, and load of a dairy cow's gait. More recently, Thomsen et al. [30] used a threshold judgment method to make the lameness scoring system more reliable. It should be noted that the score from an MLSS is subjective, as it is influenced by the examiner's experience and perceptions [31,32]. Moreover, as the intensity and scale of cattle farming increase, farmers tend to have less time to conduct manual lameness assessments.

2.3. Automatic Cattle Lameness Detection Approaches

In recent years, electronic sensors and artificial intelligence techniques have been introduced into the livestock industry for lameness detection. With these emerging technologies, lameness detection can be performed in a more timely and accurate manner [32]. Advances in sensor technologies and other fields have been adapted to automatic lameness detection; for example, there is recent research dedicated to detecting lameness automatically with Automatic Locomotion Scoring Systems (ALSSs) [19,33]. In contrast to MLSSs, ALSSs can provide a more objective and consistent lameness assessment [20]. Recent studies have classified locomotion into lameness levels using more advanced metrics such as body movement pattern [34], gait [35], and step frequency [36].
The most popular sensors used in automatic cattle lameness detection include force platforms [37], two-dimensional (2D) and three-dimensional (3D) cameras [38], and on-limb accelerometers [39]. Measurements from these sensors are the input to algorithms that compute lameness traits such as step overlap [33,40] and back curvature [38].
According to [19,27], automatic lameness detection approaches can be categorised into three classes: kinetic (measuring forces involved in locomotion), kinematic (measuring limb trajectories in space and time, and related specific posture variables), and indirect measurement techniques (measuring behavioural or production variables). An overview of the most popular lameness detection methods and the corresponding locomotion traits observed is given in Table 1.

2.3.1. Kinetic Approaches

Cattle lameness can be detected through an analysis of cattle motion and the causes of that motion, such as forces and torques. This is known as the kinetic framework. In kinetic approaches, hoof forces while walking or weight distribution while standing are often used to evaluate locomotion scores. For example, Liu et al. [37] and Dunthorn et al. [41] used a force plate to measure leg forces and applied logistic regression to detect cattle lameness. It should be noted that early work on kinetic approaches considered only vertical ground reaction forces for lameness detection. In recent years, ground reaction forces in three dimensions have been measured and utilised for lameness detection [38,41].
Deep learning approaches have also gained some recent interest. For example, Wu et al. [42] proposed a lameness detection approach for dairy cows based on YOLOv3. It is worth mentioning that, in practical experiments, the lameness detection accuracy of kinetic approaches is affected by the position of the hooves on the weighing units during measurement [43] and by the walking speed of the cows [44].
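The force-plate-plus-logistic-regression pipeline described above can be sketched in a few lines. This is a minimal illustration only: the asymmetry feature and the logistic coefficients below are hypothetical, whereas in [37,41] the model is fitted to measured force-plate data.

```python
import math

def force_asymmetry(forces_kgf):
    """Weight share carried by the least-loaded leg; a sound cow loads each
    of its four legs roughly equally (share near 0.25), while a lame cow
    unloads the painful limb."""
    return min(forces_kgf) / sum(forces_kgf)

def lameness_probability(forces_kgf, w0=14.0, w1=-80.0):
    """Logistic model p = sigmoid(w0 + w1 * asymmetry); the coefficients
    here are illustrative, not fitted values from the cited studies."""
    x = force_asymmetry(forces_kgf)
    return 1.0 / (1.0 + math.exp(-(w0 + w1 * x)))

# Hypothetical per-leg vertical ground reaction forces (kgf):
sound = [150, 148, 152, 150]   # near-even load, share of lightest leg ~0.247
lame = [160, 170, 165, 60]     # one hind leg unloaded, share ~0.108
```

With these illustrative weights, the sound example yields a probability near zero and the lame example a probability near one; a real system would fit `w0` and `w1` on scored force-plate recordings.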
Table 1. Main studies on lameness detection.
| Work | Sensor | Dataset Size (Cattle) | Traits | Model | Automation Level | Results |
| --- | --- | --- | --- | --- | --- | --- |
| Kinetic |  |  |  |  |  |  |
| Liu et al. [37] | force plate | 346 | vertical kinetics | logistic regression | medium | sensitivity = 51.92% |
| Dunthorn et al. [41] | 3D force plate | 85 | leg force | logistic regression | medium | sensitivity = 90.0% |
| Nechanitzky et al. [45] | weighing platform | 44 | weight and lying time | logistic regression | medium | sensitivity = 81.0% |
| Chapinal et al. [36] | camera, weighing platform | 57 | step frequency, lying time, weight | logistic regression | high | area under the curve = 83.0% |
| Chapinal and Tucker [46] | camera, weighing platform | 257 | step number and gait | statistical analysis | high | sensitivity ≥ 0.96 |
| Zillner et al. [47] | clock | 53 | walking speed | analysis of variance | low | sensitivity = 71.43% |
| Kinematic |  |  |  |  |  |  |
| Van Nuffel et al. [48] | Gaitwise system | 61 | gait | linear discriminant | medium | sensitivity = 88.0% |
| Pluk et al. [40] | camera | 85 | step overlap | regression model | medium | R² = 80.90% |
| Poursaberi et al. [49] | camera | 156 | back curvature | image analysis | high | accuracy = 96.7% |
| Poursaberi et al. [50] | camera | 1200 | posture and movement | image analysis | high | accuracy = 92.0% |
| Viazzi et al. [34] | camera | 90 | posture and movement | image analysis | high | accuracy = 76.0% |
| Viazzi et al. [38] | 3D camera | 273 | back posture | decision tree | high | accuracy = 90.0% |
| Van Hertem et al. [35] | 3D camera | 186 | gait | logistic regression model | high | accuracy = 60.2% |
| Van Hertem et al. [51] | 3D camera | 208 | back posture | binary GLMM | high | accuracy = 79.8% |
| Wu et al. [42] | camera | 50 | step size | long short-term memory | high | accuracy = 98.57% |
| Zhao et al. [12] | camera | 98 | leg swing | decision tree classifier | high | sensitivity = 90.25% |
| Beer et al. [52] | camera | 63 | gait | logistic regression model | medium | sensitivity = 90.2% |
| Jiang et al. [53] | camera | 30 | walking characteristics | double normal distribution statistical model | high | accuracy = 93.75% |
| Jabbar et al. [54] | 3D camera | 22 | height variation | support vector machine | high | accuracy = 95.7% |
| Kang et al. [55] | camera | 100 | supporting phase | data analysis | high | accuracy = 95.7% |
| Piette et al. [56] | camera | 209 | back posture | threefold cross validation | high | accuracy = 82.0% |
| Indirect |  |  |  |  |  |  |
| De Mol et al. [57] | 3D accelerometers | 100 | lying time | dynamic linear model | high | sensitivity = 85.5% |
| Kamphuis et al. [58] | pedometers, weigh scales, milk meters | 292 | live weight, steps, milk yield | dynamic linear model | high | sensitivity = 80.0% |
| Miekley et al. [59] | milk meter, pedometers | 338 | milk yield, feeding patterns | principal component analysis | high | sensitivity = 87.8% |
| Kramer et al. [60] | milk meter, neck transponders | 125 | milk yield and activity | fuzzy logic model | high | sensitivity ≥ 70.0% |
| Chapinal et al. [44] | camera | 153 | gait score, walking speed, lying behaviour | discriminant analysis | high | sensitivity = 67.0% |
| Garcia et al. [61] | automatic milking system | 88 | milk yield and activity variables | discriminant analysis | high | sensitivity ≥ 80.0% |
| Wood et al. [62] | infrared thermometry | 153 | foot temperature | linear regression | high | coefficient = 62.3% |
| Lin et al. [63] | infrared thermometers | 990 | foot-surface temperatures | linear regression | high | sensitivity = 78.5% |
| Jabbar et al. [54] | 3D camera | 22 | shape index and curvedness | SVM | high | accuracy = 95.7% |
| Taneja et al. [64] | camera | 150 | step count, lying time, swaps | K-nearest neighbours | high | accuracy = 87.0% |
| Jiang et al. [53] | camera | 30 | pixel distribution characteristics | statistical model | high | accuracy = 93.75% |

Note: SVM = support vector machine; GLMM = generalised linear mixed model; Gaitwise is a pressure-sensitive measurement system.

2.3.2. Kinematic Approaches

Unlike kinetic approaches that try to measure ground reaction forces, kinematic approaches focus on kinematic variables (i.e., how the cattle move spatially and temporally) [20]. In other words, the kinematic approaches only study the motion itself without considering the cause of the motion.
In kinematic approaches, different techniques can be used to obtain locomotion variables such as step size, step length, height, and back curvature [40,48,65]. In general, kinematic gait variables can be computed from hoof location patterns. For example, Pluk et al. [65] used the pressure-sensitive, mat-based Gaitwise system to measure hoof locations together with the vertical reaction force and timing.
Image/video processing and analysis have also been used for cattle lameness detection, where recorded cattle videos are transformed into image sequences for kinematics extraction [65]. Here, the hooves, limb joints, and withers were tracked using attached reflective markers, and kinematic gait parameters (e.g., stride duration, stance duration, swing duration, and hoof speed) were calculated to detect ulcers. Moreover, back postures extracted from video frames were used for automatic lameness detection in [49], where a binary classification based on a back arch metric resulted in a sensitivity of 100%, a specificity of 97.6%, and correct classification rates in the order of 96.5%. Furthermore, Beer et al. [52] evaluated the feasibility of newly described gait parameters (e.g., "calculated walking speed" and "lying bout duration") for the early detection of lameness.
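A back arch metric of the kind used for binary classification can be illustrated with three keypoints along the spine. The geometry below (the angle at the mid-back between the withers and the tailhead) and the 170° threshold are a hypothetical sketch, not the specific metric of [49]:

```python
import math

def back_arch_angle(withers, mid_back, tailhead):
    """Angle (degrees) at the mid-back point between the withers and the
    tailhead, with y pointing up: a straight back gives about 180 degrees,
    an arched back noticeably less."""
    ax, ay = withers[0] - mid_back[0], withers[1] - mid_back[1]
    bx, by = tailhead[0] - mid_back[0], tailhead[1] - mid_back[1]
    cos_theta = (ax * bx + ay * by) / (math.hypot(ax, ay) * math.hypot(bx, by))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_theta))))

def arched_back(withers, mid_back, tailhead, threshold_deg=170.0):
    """Binary back-arch test; the 170-degree threshold is illustrative and
    would be tuned on manually scored footage in practice."""
    return back_arch_angle(withers, mid_back, tailhead) < threshold_deg
```

For example, keypoints (0, 100), (50, 100), (100, 100) describe a flat back (180°), while raising the mid-back to (50, 112) produces an angle of roughly 153°, which the sketch flags as arched.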
Other techniques use accelerometers attached to the cattle's limbs to measure leg acceleration while the animals are walking [66]. Based on the collected measurements, the ratio of acceleration variance and the ratio of wavelet detail between the left and right limbs can be used for lameness detection.
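The variance-ratio idea can be sketched as follows; the synthetic sinusoidal traces stand in for real leg-acceleration signals, and the decision threshold on the ratio would have to be learned from labelled walking bouts:

```python
import math

def variance(signal):
    mean = sum(signal) / len(signal)
    return sum((v - mean) ** 2 for v in signal) / len(signal)

def limb_variance_ratio(left_acc, right_acc):
    """Ratio of acceleration variance between contralateral limbs: near 1.0
    for a symmetric gait, well above 1.0 when one limb is being favoured."""
    v_left, v_right = variance(left_acc), variance(right_acc)
    return max(v_left, v_right) / min(v_left, v_right)

# Hypothetical leg-acceleration traces over one walking bout:
left = [math.sin(i / 5.0) for i in range(100)]
symmetric_right = list(left)                      # sound: both legs swing alike
favoured_right = [0.3 * v for v in left]          # lame: reduced swing on one leg
```

Here the symmetric pair gives a ratio of exactly 1.0, while scaling one trace to 30% amplitude raises the ratio to roughly 11, a clear asymmetry signal.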

2.3.3. Indirect Approaches

Some variables not directly related to lameness or motion have also been used for lameness detection. The methods in this category are usually called indirect approaches [58,67,68]. Generally speaking, these approaches use sensors to measure behavioural variables (e.g., lying, standing, and walking time) and production variables (e.g., milking order and milk yield) [61]. For example, the ALSSs in Buisman et al. [25] relied on on-cow sensors such as accelerometers to detect alterations in behaviour, such as the duration of lying or standing bouts and the total time lying down or standing per day. Lying time is a commonly used behaviour metric in several works [69,70,71]. Other ALSSs are based on production information, mainly live weight or milk yield and collection time; such production data may be obtained from off-cow sensors such as milk meters or weigh scales [58].
Miekley et al. [59] proposed a lameness detection method based on pedometer activity and feeding patterns. Kramer et al. [60] used the milk yield and feeding behaviour to predict lameness with the aid of fuzzy logic and achieved specificity in the range of 75% for 125 cows. Gardenier et al. [20] used Faster R-CNN to detect hooves and carpal/tarsal joints to obtain individual trajectories per limb. Jiang et al. [68] learned video representations using neural networks with single-stream long-term optical flow convolution and achieved 98.24% lameness behaviour recognition accuracy with a speed of 564 FPS (1.77 ms/image).
Garcia et al. [61] used milking, feeding, and behavioural parameters to predict lameness with a partial least squares discriminant analysis model; their binary lameness classification reached 77% and 83% accuracy for cows of the two parity groups. To improve lameness detection performance, Ishihara et al. [72] proposed a multivariate model combining several variables, including body posture, daytime activity, rear back angle, walking speed, milk yield, and milk flow rate.
Additionally, 2D cameras and 3D sensors have gained popularity in lameness detection [49]. In these approaches, after image acquisition (see Figure 2), visual features (such as an uneven gait or back arch) are extracted to build a lameness detection model. Viazzi et al. [38] compared 2D and 3D camera systems for cow lameness detection and found that both can achieve more than 90% accuracy. In general, compared with 2D cameras, 3D cameras capture more comprehensive information and are more suitable for long-term observation and data collection for lameness detection. However, processing 3D information is complex and time-consuming due to the larger amount of data.
In Song et al. [33], trackway information containing real-world hoof locations and their corresponding times in a video was calculated for the automatic detection of lameness. Van Hertem et al. [51] used a 3D video recording system to automatically quantify the back posture of cows and achieved 79.8% lameness classification accuracy using generalised linear mixed models (GLMMs). Jabbar et al. [54] proposed a nonintrusive lameness detection method for dairy cows using 3D video and achieved an overall lameness classification accuracy of 95.7%. Zhao et al. [12] developed an automatic system for scoring the locomotion of cows, quantified the movement patterns of cows to classify lameness using features extracted from movement analysis, and achieved 90.18% accuracy.
Recently, Jiang et al. [53] proposed a double normal background statistical model for lameness detection using side-view images and achieved 93.75% detection accuracy. Piette et al. [56] proposed a lameness monitoring algorithm based on back posture values derived from a camera for individual cows and tuned the deviation thresholds and the quantity of the historical data being used. Taneja et al. [64] developed an end-to-end approach that leverages fog computing and K-Nearest Neighbours techniques to identify lame cattle and achieved 87% accuracy for an early lameness detection window of 3 days before visual signs.
Apart from traditional 2D and 3D cameras, thermal infrared cameras have also been used to check hoof temperatures for cattle lameness detection [73]. This is based on the fact that hoof lesions and infection can change the hoof surface temperature due to increased blood flow. Hence, when a cow’s hoof is damaged, the surface temperature increases [74].
For example, Lin et al. [63] proposed a lameness detection approach using infrared thermometers. They analysed ambient-temperature-adjusted foot-surface temperatures and the temperature differences between the hind feet of individual cows to optimise lameness detection. According to their results, the optimal threshold was 23.3 °C, with 78.5% sensitivity and 39.2% specificity. However, given that different hoof positions result in varying temperatures, the selection of the threshold value still needs further study. Nevertheless, infrared thermography has great potential as an early diagnostic method for lameness and can complement 2D or 3D lameness detection, because the motor signs of early lameness may not yet be pronounced [75].
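The thresholding logic behind such thermographic screening can be sketched as below. Both the ambient-adjustment slope and the 2.0 °C between-feet threshold are hypothetical placeholders, not the values fitted in [63]:

```python
def adjusted_foot_temperature(foot_temp_c, ambient_c,
                              reference_ambient_c=20.0, slope=0.3):
    """Adjust a foot-surface reading towards a reference ambient temperature.
    The slope of 0.3 degC per degC of ambient is purely illustrative; in
    practice the adjustment is estimated by regression on field data."""
    return foot_temp_c - slope * (ambient_c - reference_ambient_c)

def flag_lame_by_temperature(left_hind_c, right_hind_c, ambient_c,
                             threshold_c=2.0):
    """Flag a cow when the adjusted hind-foot temperatures differ by more
    than a threshold (the 2.0 degC value here is hypothetical)."""
    left = adjusted_foot_temperature(left_hind_c, ambient_c)
    right = adjusted_foot_temperature(right_hind_c, ambient_c)
    return abs(left - right) > threshold_c
```

A cow with hind feet at 30.5 °C and 26.0 °C would be flagged under this rule, while a 0.5 °C difference would not; choosing the threshold trades sensitivity against specificity exactly as discussed above.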

2.4. Limitations of Automated Lameness Detection Systems

The availability of low-cost, validated automatic lameness detection systems would make monitoring lameness behaviour quite feasible. However, the majority of lameness detection systems are still in the research phase and have not yet been commercialised and implemented under field conditions [76].
The use of ALSSs can be influenced by many factors (e.g., the users' experience, space limitations, and the investment budget). Investment cost, product efficiency, maintenance complexity, robustness, and equipment applicability are the main factors to consider when choosing an automatic lameness detection system [27]. A good lameness detection system should integrate with existing farm infrastructure (e.g., weighing platforms) and should allow the manager to check the data regularly (e.g., two or three times a day). Fusing measurements from different sensors, such as feeding, milking, and grooming data, has the potential to improve the accuracy of lameness detection at an early stage [53].
In terms of automated lameness detection methods, a combination of multiple methods can potentially further improve the robustness and accuracy of detection, because it is difficult to detect all lame cows accurately based on a single feature alone [64]. Meanwhile, both temporal and spatial characteristics of the data can be combined to improve detection performance [77]. High-quality data should be fed to lameness detection systems to help farmers make decisions and to provide early warnings. In addition, parts of lameness detection also rely on the recognition of behaviours such as walking, lying, and feeding; therefore, comprehensive behaviour recognition and analysis is also helpful in some cases.
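One simple way to combine multiple detectors, sketched below, is a voting rule over the binary outputs of independent methods (e.g., gait, weight distribution, and lying-time detectors). This is an illustrative rule only; tuned fusion models on the raw features generally perform better:

```python
def fuse_detections(flags, min_votes=2):
    """Flag a cow as lame when at least `min_votes` of the independent
    detector outputs agree. A deliberately simple majority-style rule;
    weighted or learned fusion would replace it in a real system."""
    return sum(bool(f) for f in flags) >= min_votes
```

For example, a cow flagged by the gait and lying-time detectors but not by the weight detector would still be reported, while a single isolated alarm would be suppressed, reducing false positives from any one noisy sensor.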

2.5. Cattle Behaviours

Cattle behaviour mainly refers to the animals' continuous interaction with the environment and the way they express themselves. Hence, it is a valuable indicator for assessing the health and welfare of animals [78]. The behaviour of domestic cattle has evolved over a long time, initially in response to domestication [79]. According to recent research, the main cattle activity behaviours in PLF can be classified into grazing, exploring, grooming, mounting, ruminating, lying, walking, standing, and aggressive behaviour [80]. Measuring and assessing the behaviour of livestock is important, as it can indicate pain [81], lameness [67], and welfare status [82]. When animals are ill, behavioural changes include a decrease in exploratory activity, reproductive activity, food and water intake, grooming, and other social behaviours. Hence, monitoring and analysing changes in behavioural activity can provide useful information for timely management decisions to optimise animal performance, genetic selection and breeding, welfare, and environmental outcomes [83]. In Table 2, descriptions of the main cattle behaviours are given.
In particular, grazing is an important behaviour from an economic and welfare point of view in PLF [80]. Lying behaviour is a parameter frequently quantified by precision dairy monitoring technologies, since the time an animal spends lying down can indicate changes in comfort, welfare, and health [84,85]. Mounting behaviour is the most widely used indicator of reproductive behaviour for estrus detection [86]. Aggressive behaviour can be observed during feeding times, when animals compete for food, water, or other resources. There is also some association between aggressiveness and a high level of feeding in a half-open feedlot production system, as investigated in [87].
Recent progress in cattle behaviour monitoring and analysis can be classified into three categories: the first focuses only on behaviour detection, the second on long-term behaviour monitoring and detection, and the third on the automatic detection and quantification of behavioural changes based on long-term monitoring [88]. Currently, most existing results in the literature fall into the second category (monitoring behaviour over time), with few reports on the third category (detection and quantification of behavioural changes).
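The third category, detecting behavioural change against a long-term baseline, can be sketched with a plain z-score rule on a single behaviour stream. The seven-day window and two-standard-deviation threshold are assumptions for illustration; deployed systems combine several behaviours and tuned models:

```python
def detect_behaviour_change(daily_lying_hours, window=7, threshold=2.0):
    """Flag each day whose lying time deviates from the mean of the
    preceding `window` days by more than `threshold` standard deviations.
    Returns one boolean per day after the initial baseline window."""
    flags = []
    for i in range(window, len(daily_lying_hours)):
        history = daily_lying_hours[i - window:i]
        mean = sum(history) / window
        std = (sum((h - mean) ** 2 for h in history) / window) ** 0.5
        std = std if std > 0 else 1e-9   # guard against a flat baseline
        flags.append(abs(daily_lying_hours[i] - mean) / std > threshold)
    return flags
```

Against a stable baseline of about 12 h of lying per day, a sudden 14-hour day is flagged while ordinary day-to-day variation is not; this per-animal baseline is what distinguishes change detection from plain behaviour detection.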

2.6. Manual Approaches for Cattle Behaviour Monitoring and Recognition

The traditional human observation method for cattle behaviour recognition is time-consuming [89]. For example, Geers et al. [90] reported that mounting behaviour detection accounts for 30% of the labour involved in commercial farming. Sambraus et al. [91] mentioned that even with continuous monitoring of mounting behaviour, 20% of oestrus events go undetected. Moreover, recognising individual cattle in a large herd for key management decisions, such as the identification of estrus, is too labour-intensive [92].

2.7. Automatic Approaches for Cattle Behaviour Monitoring and Recognition

Recently, the increasing availability of sensors and machine learning technologies has made the automated monitoring and recognition of animal behaviour practicable [93,94]. Sensors that provide information about animal behaviour can be classified as contact or non-contact. On the one hand, contact sensors are usually fitted on (or sometimes in) the animal, for example, tags, collars, Global Positioning System (GPS) receivers, accelerometers, pedometers, and magnetometers [95]. On the other hand, non-contact sensors such as cameras and LiDAR provide cheap, easy, non-stressful, and noninvasive methods. Moreover, non-contact sensors can be adapted to different animals, in both indoor and outdoor situations, using the animals' natural features (e.g., shape, colour, and movement) to monitor their behaviours [96].
It should be noted that an automatic activity monitoring system needs to record in the animals' normal environment without influencing their behaviour. Additionally, cattle vary in size and shape (spatially) and over time (temporally). Therefore, to collect behavioural phenotypic information, temporal or spatial features (e.g., velocity, acceleration, speed, shape, and contour) can be extracted from sensor data for behaviour recognition.
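A typical first step is to summarise raw sensor streams over fixed windows before classification. The sketch below computes illustrative per-window statistics for a triaxial accelerometer trace; the specific feature set is an assumption, though mean, variance, and vector magnitude are common choices:

```python
import math

def window_features(ax, ay, az):
    """Per-window summary features of a triaxial accelerometer trace:
    mean and variance per axis, plus the mean vector magnitude as a
    movement-intensity proxy for a downstream behaviour classifier."""
    feats = {}
    for name, axis in (("x", ax), ("y", ay), ("z", az)):
        mean = sum(axis) / len(axis)
        feats[f"mean_{name}"] = mean
        feats[f"var_{name}"] = sum((v - mean) ** 2 for v in axis) / len(axis)
    feats["mean_magnitude"] = sum(
        math.sqrt(x * x + y * y + z * z) for x, y, z in zip(ax, ay, az)
    ) / len(ax)
    return feats
```

Each window then becomes one fixed-length feature vector, which is the form that the machine learning methods discussed next expect as input.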
The concept of features should also include external factors such as temperature and air quality. In addition, the timing of feed and water distribution contains useful information that can explain the current conditions influencing cattle behaviour. The feature extraction process needs to be practical with respect to computational cost and efficiency. After the features are extracted, machine learning methods can be applied to identify cattle behaviours. In Table 3, the main contact and non-contact sensor-based cattle behaviour recognition studies are presented.

2.7.1. Contact Sensor-Based Approaches

Contact sensor-based approaches mainly collect individual animal data through sensors fixed on cattle and recognise behaviours according to animal posture (standing or lying), behavioural activity (walking, resting, grazing, and ruminating), and geolocation [101]. For example, Yin et al. [118] used a wireless sensor monitoring system to capture cattle body temperature, respiratory rate, and movement acceleration parameters; then, they used a K-means clustering algorithm to distinguish cattle behaviours. Barriuso et al. [119] presented a multi-agent architecture based on virtual organisations to help farmers monitor the cattle remotely.
González et al. [80] analysed data from collar-mounted motion and GPS sensors on grazing cattle to perform automatic, real-time behaviour monitoring with high spatial and temporal resolution. Werner et al. [120] validated the RumiWatchSystem as a research tool for measuring the detailed grazing behaviour of cows. To improve cattle welfare monitoring and to reduce labour requirements, Mattachini et al. [70] proposed an automated lying behaviour measurement method for monitoring lactating dairy cows. In the work of [121], pedometers were used to record step counts, and the relationships between cattle step counts, behavioural estrous parameters, and ovulation time were studied; it was argued that the pedometer is a promising tool to detect estrus and to predict ovulation. Palmer et al. [122] combined visual observations, tail paint, and radiotelemetry (HeatWatch) for estrus detection in 23 cows. The results in Gibbons et al. [123] highlighted the complexity of the aggressive behaviour of cows during feeding and illustrated that some measures of aggressive feeding behaviour were repeatable within cows. Šimić et al. [124] reported that an enriched environment reduced the occurrence of aggressive behaviour in beef cattle.
More recently, Rahman et al. [104] classified cattle behaviour based on time series of accelerometer data from collar, halter, and ear tag sensors. Riaboff et al. [107] used 3D accelerometer data to predict the behaviours of dairy cows. Peng et al. [93] developed a recurrent neural network (RNN) with a long short-term memory (LSTM) model to detect and recognise calving-related behaviours using inertial measurement unit (IMU) data. Shen et al. [109] used a triaxial acceleration sensor to collect mandibular movement data from dairy cows and divided dairy cow behaviours into three categories: eating, ruminating, and other behaviours. In that work, using a K-nearest neighbour algorithm, the recognition accuracies for eating and ruminating reached 92.8% and 93.7%, respectively.
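To illustrate how such an accelerometer pipeline fits together, the hedged sketch below classifies synthetic acceleration windows with a K-nearest neighbour model, loosely following the eating/ruminating/other split of Shen et al. [109]. The `window_features` summary (per-window mean and variance) and all signal parameters are illustrative assumptions rather than the published method.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def window_features(signal, width=50):
    """Summarise a 1-D acceleration trace as per-window mean and variance,
    a simple proxy for chewing-intensity features."""
    windows = signal[: len(signal) // width * width].reshape(-1, width)
    return np.column_stack([windows.mean(axis=1), windows.var(axis=1)])

# Synthetic traces: assumed intensity levels for each behaviour class.
rng = np.random.default_rng(0)
eating = window_features(rng.normal(0.8, 0.30, 5000))     # strong jaw motion
ruminating = window_features(rng.normal(0.4, 0.10, 5000)) # milder, rhythmic
other = window_features(rng.normal(0.0, 0.05, 5000))      # near-still

X = np.vstack([eating, ruminating, other])
y = np.repeat(["eating", "ruminating", "other"], len(eating))

clf = KNeighborsClassifier(n_neighbors=5).fit(X, y)
pred = clf.predict(window_features(rng.normal(0.8, 0.30, 500)))
```

In practice, the windowed features would come from labelled field recordings rather than synthetic noise, and richer statistics (axis correlations, spectral energy) are common.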
Although contact sensors can offer high precision, they may cause the cattle stress. Moreover, the service life of these devices can be shortened by factors such as scraping and moisture infiltration. In addition, their cost and vulnerability make contact sensors impractical for scoring group behaviours.

2.7.2. Non-Contact Sensor-Based Approaches

In recent years, a number of non-contact sensor-based approaches have been proposed for behaviour monitoring and recognition. As non-contact sensors can operate continuously without operator involvement, it is generally believed that they can assess animal behaviour more quantitatively, provided the monitoring process remains largely unchanged [125,126]. For these reasons, vision/LiDAR-based animal behaviour recognition methods have attracted considerable attention in the literature [69].
For example, Huang et al. [127] investigated cattle body dimension reconstruction from LiDAR measurements using transfer learning. Gao et al. [128] extracted moving cattle for behaviour tracking and recognition using dynamic analysis. Gu et al. [112] used minimum bounding boxes and contour mapping to identify cattle behaviour and hoof or back characteristics. Meunier et al. [129] integrated a number of image analysis techniques to help determine cows’ main activities (except drinking behaviour). In other studies, 2D and 3D cameras have been utilised to quantify how much feed an individual animal consumed [88,130]. In general, 2D camera monitoring classifies behaviour based on shape and colour features, while 3D cameras distinguish behaviours more accurately via 3D motion detection during feeding and drinking.
Porto et al. [111] modelled and verified feeding and standing behaviour detection in dairy cows with a method based on the Viola–Jones algorithm and a multi-camera video recording system; the sensitivity of this system to feeding and standing behaviours was about 0.87 and 0.86, respectively. Guo et al. [114] used region geometry (for example, inter-frame difference and background subtraction), optical flow characteristics, and a support vector machine to recognise cow mounting behaviour, achieving a recognition accuracy of 0.98 on 30 videos. Tracking an animal around its enclosure can also reveal other important information, such as the time spent at the feeder or drinker, and can help optimise farm decisions, e.g., the number of feed stations or space requirements [125].
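The inter-frame difference cue mentioned above can be reduced to a few lines; the sketch below scores motion as the fraction of changed pixels between consecutive greyscale frames. The `motion_score` helper and its threshold are illustrative assumptions, not the pipeline of Guo et al. [114], which adds background subtraction, optical flow, and an SVM on top of such cues.

```python
import numpy as np

def motion_score(frames, diff_threshold=20):
    """Return, for each consecutive frame pair, the fraction of pixels whose
    intensity changed by more than diff_threshold -- a crude motion cue."""
    frames = np.asarray(frames, dtype=np.int16)  # signed to allow subtraction
    diffs = np.abs(np.diff(frames, axis=0)) > diff_threshold
    return diffs.mean(axis=(1, 2))  # one score per frame transition

# Synthetic 8x8 'video': three still frames, then a bright block appears.
still = np.zeros((3, 8, 8))
moved = np.zeros((1, 8, 8))
moved[0, 2:6, 2:6] = 255
scores = motion_score(np.concatenate([still, moved]))
```

A real system would threshold these scores over time to segment active periods before any classification step.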
Additionally, sound-based cattle behaviour recognition approaches have attracted some attention in the cattle industry. Nunes et al. [131] trained a recurrent neural network (RNN) with a long short-term memory (LSTM) layer to detect and distinguish cattle behaviours via chews, bites, and noise. Jung et al. [132] proposed a deep learning-based cattle vocal classification model and a real-time livestock monitoring system with noise filtering; the proposed approach achieved 81.96% accuracy after sound filtering. Meen et al. [133] reported a potential welfare monitoring system that observes the vocalisations and behaviours of Holstein Friesian cattle using audio and video recordings. Röttgen et al. [134] reported that vocalisation rate is a suitable indicator of estrus status, suggesting that cattle status can be monitored through voice analysis. Chelotti et al. [135] estimated grazing and rumination bouts from acoustic signals in grazing cattle and achieved F1-scores of 0.75 for both activities. However, effectively acquiring sound and accurately interpreting it in a livestock facility remains a challenge.
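As a much simpler stand-in for the learned acoustic models above, chew-like events can be segmented from a sound-energy envelope by thresholding; the sketch below counts contiguous above-threshold bursts. The envelope values and threshold are illustrative assumptions, and real systems (e.g., the RNN-LSTM of Nunes et al. [131]) learn far richer representations than this.

```python
import numpy as np

def count_events(envelope, threshold):
    """Count contiguous above-threshold bursts in a sound envelope --
    a crude stand-in for chew/bite event segmentation."""
    above = envelope > threshold
    onsets = above[1:] & ~above[:-1]          # False -> True transitions
    return int(onsets.sum()) + int(above[0])  # count a burst at t=0 too

# Synthetic envelope: three 'chew' bursts over a quiet baseline.
env = np.full(300, 0.05)
for start in (40, 140, 240):
    env[start:start + 20] = 0.8
n = count_events(env, threshold=0.3)
```

On real audio the envelope would come from short-time energy of a microphone signal, and the threshold would need calibration against barn noise.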
Apart from the above, the potential to identify welfare-compromised animals through motion or spatial characteristics has also been explored. Fuentes et al. [77] extracted temporal-context features (3D-CNN) and motion information (optical flow) from videos, achieving 78.8% recognition for 15 different hierarchical behaviours. Yin et al. [115] proposed an EfficientNet-LSTM model to extract spatial features for the recognition of cows’ motion behaviours, achieving 97.87% behaviour recognition accuracy while remaining robust to environmental variation. Wu et al. [13] proposed a CNN-LSTM (a fusion of a convolutional neural network and long short-term memory) for recognising the basic behaviours of a single cow. In that work, the experimental results showed that, across the five recognised behaviours, precision ranged from 0.958 to 0.995, recall from 0.950 to 0.985, and specificity from 0.974 to 0.991.

2.8. Cattle Behavioural Change Detection and Quantification

Although comprehensive knowledge of animals’ behavioural activities is fundamental, changes in behavioural activity are also important, as they reflect exceptional and potentially challenging situations caused by internal or external stimuli [88]. Methods for assessing behavioural changes have emerged in recent years with the development of smart sensors and data analysis techniques.
Behavioural change is a good indicator of disease or compromised welfare [136]. González et al. [81] used data on feed intake, feeding time, and the number of daily feeder visits to describe and quantify changes in short-term feeding behaviour; their research showed that such quantification is helpful in the early identification of sick cows. Overton et al. [137] recorded dairy cow behavioural patterns using time-lapse video photography and examined factors affecting changes in lying behaviour during summer conditions. Butt [138] investigated the influence of dryland seasonality on the space–time dynamics of cattle behaviour using data from GPS collars.
In addition, MacKay et al. [139] illustrated the links between short-term temperament tests and longer-term behaviour data in beef steers. Subtle behavioural changes, such as walking speed, the frequency of standing episodes, or the amount of food intake, may also indicate compromised animal health [88]. However, quantifying variable and complex animal behaviour is challenging, and some subtle changes are hard to detect at early stages. Therefore, leveraging long-term videos to measure and quantify behavioural change through automated tracking and analysis is of significant value for health and welfare monitoring.
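One simple way to operationalise such change detection is to flag days that deviate strongly from an animal's own recent baseline; the sketch below applies a rolling median/MAD rule to a synthetic daily feeding-time series. The window length, the threshold `k`, and the data itself are illustrative assumptions, not a method drawn from the cited studies.

```python
import numpy as np

def flag_changes(daily_minutes, window=7, k=3.0):
    """Flag days that deviate more than k MADs from the rolling median
    of the preceding `window` days (a robust change indicator)."""
    flags = np.zeros(len(daily_minutes), dtype=bool)
    for i in range(window, len(daily_minutes)):
        past = np.asarray(daily_minutes[i - window:i], dtype=float)
        med = np.median(past)
        mad = np.median(np.abs(past - med)) + 1e-9  # guard against zero spread
        flags[i] = abs(daily_minutes[i] - med) > k * mad
    return flags

# Synthetic feeding time: stable ~240 min/day, then a sharp drop (sick day?).
rng = np.random.default_rng(2)
series = list(rng.normal(240, 5, 20)) + [150.0]
flags = flag_changes(series)
```

Comparing each animal against its own baseline, rather than a herd average, is what makes this kind of rule robust to stable between-animal differences.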

2.9. Limitations of Existing Approaches

Most studies focus on the basic behaviours of a single animal, such as walking, standing, lying, feeding, and drinking. There is little research on more advanced and group behaviours such as rumination, limping, reproduction, and aggression [79]. On large farms, group and interaction behaviours are also important for animal welfare and the corresponding management. Meanwhile, subtle behaviours such as limping are embedded in basic walking behaviour and are difficult to detect with a general network [13]. In terms of behavioural analysis, environmental conditions tend to be ignored, yet conditions such as temperature, humidity, and carbon dioxide concentration affect cattle activity and motion behaviours [124].
On the other hand, the majority of the abovementioned behaviour recognition methods require high-definition videos, which may limit their practicability in complex conditions such as low-quality farm cameras, night-time, and rainy days. Additionally, quantifying variable and complex animal behaviours from video data is challenging: the instances of a given behaviour must be recorded and analysed to detect changes with statistical analysis, which is time-consuming and error-prone. Some small behavioural changes are also difficult to detect from visual data. All of these issues severely limit the quantification of animal behaviour changes [77]. It is believed that combining computer vision with motion sensor systems could achieve a more economical and accurate behavioural analysis system.
Moreover, the majority of existing monitoring techniques use ground-level sensors such as smart ear tags, camera traps [140], and infrared thermal cameras. Hence, these approaches have limitations in relatively large geographic areas with complex terrain. Remotely sensed imagery can identify dead animals, or live animals with poor mobility, by tracking animal movement; this could be a potential alternative and complement to ground-based animal monitoring [141]. The use of quadcopter or satellite data in conjunction with machine learning algorithms is likely to become an emerging and promising direction that could revolutionise livestock management.

3. Challenges and Future Research Trends

Based on the above literature review and the livestock development requirements, some main challenges of lameness detection and behaviour recognition are summarised in the following:
(1) Lack of high-quality public data and data fusion methods: Machine learning methods rely on large-scale data to train reliable models. However, considering the high value of data and the issues of ownership, security, and confidentiality, farms and other commercial entities seldom release their collected data into the public domain [142]. In addition, complex datasets generated from different sources, such as images and motion information, may be difficult to fuse effectively because of unknown interactions across multiple variables.
(2) Demand for smart management systems: Various sensor data and information could be used to support farm-level decision making, but few management systems can handle complex, large-scale data across broader geographic contexts. How to establish a production and cost management system, and how to use it to balance the economic and non-economic values of emerging technologies, remain challenges to be explored [143].
(3) Lack of commercial availability: Laboratory research and production practice remain disconnected, and validation in actual production settings is still lacking. Given the techniques used, the performance of many of the newly proposed systems reviewed here would be questionable outside the laboratory environment, and there is a lack of effective tools for the wide use of existing data, knowledge, and models [10]. Therefore, practical systems need to be designed for application in commercial farm environments.
With the development and maturity of various smart sensors, big data, and artificial intelligence, precision livestock farming will develop in the direction of standardisation, large scales, and intelligence with the support of modern equipment. Based on the above review of the research in lameness detection and behaviour recognition, the future research opportunities are discussed in the following:
(1) Animal pose estimation and behaviour change detection: Pose estimation could help ensure that cattle in abnormal conditions are identified in time, reducing the possibility of infection and improving the quality of dairy or meat products. In addition, perceiving an animal’s behaviour changes can provide a basis for automatically determining its health status, for precision breeding, and for other applications. However, research on pose estimation and behaviour change detection is still in its infancy; advanced pose estimation, behavioural models, and detection theories and methods are important and desirable for future PLF development.
(2) Livestock growth models and intelligent decision support systems: Based on big data, perception technology, automatic control technology, and livestock breeding technology, the whole life cycle of an animal can be monitored and analysed. By analysing and processing massive amounts of animal data and information, livestock growth models can be constructed to achieve fine control of livestock and to maximise farmers’ benefits. Meanwhile, decision support systems that explore the trade-offs among conflicting objectives and offer farmers feasible solutions are desirable. Such systems should consider various data sources, including economic (farm income, profit, and gross domestic product), social (public support subsidy and farm employment), animal welfare and health (body condition, weight, behaviour, reproduction, and growth), and environmental indicators (soil cover, nitrogen, pesticide, and energy). Based on these indicators, an intelligent growth model can be used to assess the trade-offs among economic, social, and environmental objectives.
(3) New strategies for environmental regulation based on livestock welfare and production performance: Environmental conditions have a significant influence on animal growth rate, behaviours, health status, and productivity. Creating a comfortable growth and production environment for livestock is not only related to the welfare and health of the livestock itself but also closely related to the quality of livestock products, food safety, and economic benefits of the farm. It is necessary to monitor the dynamic changes in the ecological environment parameters in real time. Based on animal behaviour changes, nutrition, growth, and health status, regulatory decisions for the fine control of environmental dynamics and fine feeding of livestock can be made.
(4) The development of more advanced livestock monitoring equipment: It is desirable to develop intelligent equipment and production process robots with embedded perception and intelligent control, from breeding stocks to commercial stock. Industrial applications of intelligent breeding equipment, especially robots, should be combined with breeding modes and livestock facilities in order to improve the overall process efficiency. At the same time, it will also be challenging and rewarding to study animal physiology, growth, and behaviour for better mutual adaptation of equipment and animals and to improve the welfare of animals.

4. Conclusions

The global livestock industry has been developing in the direction of standardisation, large scale, and intelligence. Intelligent perception for cattle monitoring is the key to the development of precision livestock farming. The low cost, high efficiency, safety, and sustainability of a large-scale livestock industry can be promoted through the acquisition, processing, analysis, and application of information on cattle welfare and behaviour. Cattle lameness and behaviour are two important indicators for determining disease and health status, and the early, real-time detection of both normal behaviours (e.g., feeding and drinking) and abnormal behaviours (e.g., aggression) is therefore valuable.
Hence, monitoring cattle lameness and behaviour can reduce the cost of animal production, reduce losses from diseases and mortality, and improve the efficiency of livestock management. In this paper, we conducted a comprehensive survey of intelligent perception for cattle lameness detection and behaviour analysis in the precision livestock farming domain. We anticipate that contactless, automated, real-time, and continuous detection will play an important role in PLF. Based on the literature review, we have also discussed emerging future research trends. We hope that this survey will assist researchers in the field of precision livestock farming, especially in solving livestock problems involving the monitoring of cattle welfare and productivity.

Author Contributions

Investigation, data curation, methodology (lead), formal analysis, and writing—original draft, Y.Q.; experiment coordination in the field, funding acquisition, and revision, C.C.; experiment coordination in the field, resources, and revision, S.L.; methodology (supporting), and writing—review and editing, H.K.; conceptualization, methodology (supporting), and writing—review and editing, D.S. and S.E.; resources and funding acquisition, S.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Meat & Livestock Australia Donor Company (grant number P.PSH.0819).

Institutional Review Board Statement

Not applicable.

Data Availability Statement

No new data were created or analyzed in this study. Data sharing is not applicable to this article.

Acknowledgments

The authors express their gratitude to Khalid Rafique, Javier Martinez, Amanda Doughty, Ashraful Islam, and Mike Reynolds for their help in experiment organisation and data collection.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

1. Herrero, M.; Henderson, B.; Havlík, P.; Thornton, P.K.; Conant, R.T.; Smith, P.; Wirsenius, S.; Hristov, A.N.; Gerber, P.; Gill, M.; et al. Greenhouse gas mitigation potentials in the livestock sector. Nat. Clim. Chang. 2016, 6, 452–461.
2. Henchion, M.; Hayes, M.; Mullen, A.M.; Fenelon, M.; Tiwari, B. Future protein supply and demand: Strategies and factors influencing a sustainable equilibrium. Foods 2017, 6, 53.
3. Thornton, P.K. Livestock production: Recent trends, future prospects. Philos. Trans. R. Soc. B Biol. Sci. 2010, 365, 2853–2867.
4. Fournel, S.; Rousseau, A.N.; Laberge, B. Rethinking environment control strategy of confined animal housing systems through precision livestock farming. Biosyst. Eng. 2017, 155, 96–123.
5. García, R.; Aguilar, J.; Toro, M.; Pinto, A.; Rodríguez, P. A systematic literature review on the use of machine learning in precision livestock farming. Comput. Electron. Agric. 2020, 179, 105826.
6. Neethirajan, S.; Reimert, I.; Kemp, B. Measuring Farm Animal Emotions—Sensor-Based Approaches. Sensors 2021, 21, 553.
7. Buller, H.; Blokhuis, H.; Lokhorst, K.; Silberberg, M.; Veissier, I. Animal welfare management in a digital world. Animals 2020, 10, 1779.
8. Kendrick, K.M. Intelligent perception. Appl. Anim. Behav. Sci. 1998, 57, 213–231.
9. King, A. The future of agriculture. Nature 2017, 544, S21–S23.
10. Bahlo, C.; Dahlhaus, P.; Thompson, H.; Trotter, M. The role of interoperable data standards in precision livestock farming in extensive livestock systems: A review. Comput. Electron. Agric. 2019, 156, 459–466.
11. Qiao, Y.; Truman, M.; Sukkarieh, S. Cattle segmentation and contour extraction based on Mask R-CNN for precision livestock farming. Comput. Electron. Agric. 2019, 165, 104958.
12. Zhao, K.; Bewley, J.; He, D.; Jin, X. Automatic lameness detection in dairy cattle based on leg swing analysis with an image processing technique. Comput. Electron. Agric. 2018, 148, 226–236.
13. Wu, D.; Wang, Y.; Han, M.; Song, L.; Shang, Y.; Zhang, X.; Song, H. Using a CNN-LSTM for basic behaviors detection of a single dairy cow in a complex environment. Comput. Electron. Agric. 2021, 182, 106016.
14. Qiao, Y.; Kong, H.; Clark, C.; Lomax, S.; Su, D.; Eiffert, S.; Sukkarieh, S. Intelligent perception for cattle monitoring: A review for cattle identification, body condition score evaluation, and weight estimation. Comput. Electron. Agric. 2021, 185, 106143.
15. Berckmans, D. Precision livestock farming technologies for welfare management in intensive livestock systems. Sci. Tech. Rev. Off. Int. Des Epizoot. 2014, 33, 189–196.
16. He, D.; Liu, D.; Zhao, K. Review of perceiving animal information and behavior in precision livestock farming. Trans. Chin. Soc. Agric. Mach. 2016, 47, 231–244.
17. Qiao, Y.; Su, D.; Kong, H.; Sukkarieh, S.; Lomax, S.; Clark, C. Individual Cattle Identification Using a Deep Learning Based Framework. IFAC-PapersOnLine 2019, 52, 318–323.
18. Van Nuffel, A.; Zwertvaegher, I.; Van Weyenberg, S.; Pastell, M.; Thorup, V.; Bahr, C.; Sonck, B.; Saeys, W. Lameness detection in dairy cows: Part 2. Use of sensors to automatically register changes in locomotion or behavior. Animals 2015, 5, 861–885.
19. Schlageter-Tello, A.; Bokkers, E.A.; Koerkamp, P.W.G.; Van Hertem, T.; Viazzi, S.; Romanini, C.E.; Halachmi, I.; Bahr, C.; Berckmans, D.; Lokhorst, K. Manual and automatic locomotion scoring systems in dairy cows: A review. Prev. Vet. Med. 2014, 116, 12–25.
20. Gardenier, J.; Underwood, J.; Clark, C. Object Detection for Cattle Gait Tracking. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, QLD, Australia, 21–25 May 2018; pp. 2206–2213.
21. O’Leary, N.; Byrne, D.; O’Connor, A.; Shalloo, L. Invited review: Cattle lameness detection with accelerometers. J. Dairy Sci. 2020, 103, 3895–3911.
22. Sprecher, D.; Hostetler, D.; Kaneene, J. A lameness scoring system that uses posture and gait to predict dairy cattle reproductive performance. Theriogenology 1997, 47, 1179–1187.
23. Enting, H.; Kooij, D.; Dijkhuizen, A.; Huirne, R.; Noordhuizen-Stassen, E. Economic losses due to clinical lameness in dairy cattle. Livest. Prod. Sci. 1997, 49, 259–267.
24. Kang, X.; Zhang, X.D.; Liu, G. A Review: Development of Computer Vision-Based Lameness Detection for Dairy Cows and Discussion of the Practical Applications. Sensors 2021, 21, 753.
25. Buisman, L.L.; Alsaaod, M.; Bucher, E.; Kofler, J.; Steiner, A. Objective assessment of lameness in cattle after foot surgery. PLoS ONE 2018, 13, e0209783.
26. Bennett, R.; Barker, Z.; Main, D.; Whay, H.; Leach, K. Investigating the value dairy farmers place on a reduction of lameness in their herds using a willingness to pay approach. Vet. J. 2014, 199, 72–75.
27. Alsaaod, M.; Fadul, M.; Steiner, A. Automatic lameness detection in cattle. Vet. J. 2019, 246, 35–44.
28. Schlageter-Tello, A.; Bokkers, E.; Koerkamp, P.; Van Hertem, T.; Viazzi, S.; Romanini, C.; Halachmi, I.; Bahr, C.; Berckmans, D.; Lokhorst, K. Comparison of locomotion scoring for dairy cows by experienced and inexperienced raters using live or video observation methods. Anim. Welf. 2015, 24, 69–79.
29. Winckler, C.; Willen, S. The reliability and repeatability of a lameness scoring system for use as an indicator of welfare in dairy cattle. Acta Agric. Scand. Sect. A-Anim. Sci. 2001, 51, 103–107.
30. Thomsen, P.; Munksgaard, L.; Tøgersen, F. Evaluation of a lameness scoring system for dairy cows. J. Dairy Sci. 2008, 91, 119–126.
31. Renn, N.; Onyango, J.; McCormick, W. Digital infrared thermal imaging and manual lameness scoring as a means for lameness detection in cattle. Vet. Clin. Sci. 2014, 2, 16–23.
32. Gardenier, J.; Underwood, J.; Weary, D.; Clark, C. Pairwise comparison locomotion scoring for dairy cattle. J. Dairy Sci. 2021, 104, 6185–6193.
33. Song, X.; Leroy, T.; Vranken, E.; Maertens, W.; Sonck, B.; Berckmans, D. Automatic detection of lameness in dairy cattle—Vision-based trackway analysis in cow’s locomotion. Comput. Electron. Agric. 2008, 64, 39–44.
34. Viazzi, S.; Bahr, C.; Schlageter-Tello, A.; Van Hertem, T.; Romanini, C.; Pluk, A.; Halachmi, I.; Lokhorst, C.; Berckmans, D. Analysis of individual classification of lameness using automatic measurement of back posture in dairy cattle. J. Dairy Sci. 2013, 96, 257–266.
35. Van Hertem, T.; Viazzi, S.; Steensels, M.; Maltz, E.; Antler, A.; Alchanatis, V.; Schlageter-Tello, A.A.; Lokhorst, K.; Romanini, E.C.; Bahr, C.; et al. Automatic lameness detection based on consecutive 3D-video recordings. Biosyst. Eng. 2014, 119, 108–116.
36. Chapinal, N.; De Passillé, A.; Rushen, J.; Wagner, S. Automated methods for detecting lameness and measuring analgesia in dairy cattle. J. Dairy Sci. 2010, 93, 2007–2013.
37. Liu, J.; Dyer, R.M.; Neerchal, N.K.; Tasch, U.; Rajkondawar, P.G. Diversity in the magnitude of hind limb unloading occurs with similar forms of lameness in dairy cows. J. Dairy Res. 2011, 78, 168–177.
38. Viazzi, S.; Bahr, C.; Van Hertem, T.; Schlageter-Tello, A.; Romanini, C.; Halachmi, I.; Lokhorst, C.; Berckmans, D. Comparison of a three-dimensional and two-dimensional camera system for automated measurement of back posture in dairy cows. Comput. Electron. Agric. 2014, 100, 139–147.
39. Thorup, V.M.; Munksgaard, L.; Robert, P.E.; Erhard, H.; Thomsen, P.; Friggens, N. Lameness detection via leg-mounted accelerometers on dairy cows on four commercial farms. Animal 2015, 9, 1704–1712.
40. Pluk, A.; Bahr, C.; Leroy, T.; Poursaberi, A.; Song, X.; Vranken, E.; Maertens, W.; Van Nuffel, A.; Berckmans, D. Evaluation of step overlap as an automatic measure in dairy cow locomotion. Trans. ASABE 2010, 53, 1305–1312.
41. Dunthorn, J.; Dyer, R.M.; Neerchal, N.K.; McHenry, J.S.; Rajkondawar, P.G.; Steingraber, G.; Tasch, U. Predictive models of lameness in dairy cows achieve high sensitivity and specificity with force measurements in three dimensions. J. Dairy Res. 2015, 82, 391–399.
42. Wu, D.; Wu, Q.; Yin, X.; Jiang, B.; Wang, H.; He, D.; Song, H. Lameness detection of dairy cows based on the YOLOv3 deep learning algorithm and a relative step size characteristic vector. Biosyst. Eng. 2020, 189, 150–163.
43. Pastell, M.; Hänninen, L.; De Passillé, A.; Rushen, J. Measures of weight distribution of dairy cows to detect lameness and the presence of hoof lesions. J. Dairy Sci. 2010, 93, 954–960.
44. Chapinal, N.; De Passille, A.; Weary, D.; Von Keyserlingk, M.; Rushen, J. Using gait score, walking speed, and lying behavior to detect hoof lesions in dairy cows. J. Dairy Sci. 2009, 92, 4365–4374.
45. Nechanitzky, K.; Starke, A.; Vidondo, B.; Müller, H.; Reckardt, M.; Friedli, K.; Steiner, A. Analysis of behavioral changes in dairy cows associated with claw horn lesions. J. Dairy Sci. 2016, 99, 2904–2914.
46. Chapinal, N.; Tucker, C. Validation of an automated method to count steps while cows stand on a weighing platform and its application as a measure to detect lameness. J. Dairy Sci. 2012, 95, 6523–6528.
47. Zillner, J.C.; Tücking, N.; Plattes, S.; Heggemann, T.; Büscher, W. Using walking speed for lameness detection in lactating dairy cows. Livest. Sci. 2018, 218, 119–123.
48. Van Nuffel, A.; Saeys, W.; Sonck, B.; Vangeyte, J.; Mertens, K.C.; De Ketelaere, B.; Van Weyenberg, S. Variables of gait inconsistency outperform basic gait variables in detecting mildly lame cows. Livest. Sci. 2015, 177, 125–131.
49. Poursaberi, A.; Bahr, C.; Pluk, A.; Van Nuffel, A.; Berckmans, D. Real-time automatic lameness detection based on back posture extraction in dairy cattle: Shape analysis of cow with image processing techniques. Comput. Electron. Agric. 2010, 74, 110–119.
50. Poursaberi, A.; Bahr, C.; Pluk, A.; Berckmans, D.; Veermäe, I.; Kokin, E.; Pokalainen, V. Online lameness detection in dairy cattle using Body Movement Pattern (BMP). In Proceedings of the 2011 11th International Conference on Intelligent Systems Design and Applications, Cordoba, Spain, 22–24 November 2011; pp. 732–736.
51. Van Hertem, T.; Bahr, C.; Tello, A.S.; Viazzi, S.; Steensels, M.; Romanini, C.; Lokhorst, C.; Maltz, E.; Halachmi, I.; Berckmans, D. Lameness detection in dairy cattle: Single predictor v. multivariate analysis of image-based posture processing and behaviour and performance sensing. Animal 2016, 10, 1525–1532.
52. Beer, G.; Alsaaod, M.; Starke, A.; Schuepbach-Regula, G.; Müller, H.; Kohler, P.; Steiner, A. Use of extended characteristics of locomotion and feeding behavior for automated identification of lame dairy cows. PLoS ONE 2016, 11, e0155796.
53. Jiang, B.; Song, H.; He, D. Lameness detection of dairy cows based on a double normal background statistical model. Comput. Electron. Agric. 2019, 158, 140–149.
54. Jabbar, K.A.; Hansen, M.F.; Smith, M.L.; Smith, L.N. Early and non-intrusive lameness detection in dairy cows using 3-dimensional video. Biosyst. Eng. 2017, 153, 63–69.
55. Kang, X.; Zhang, X.; Liu, G. Accurate detection of lameness in dairy cattle with computer vision: A new and individualized detection strategy based on the analysis of the supporting phase. J. Dairy Sci. 2020, 103, 10628–10638.
56. Piette, D.; Norton, T.; Exadaktylos, V.; Berckmans, D. Individualised automated lameness detection in dairy cows and the impact of historical window length on algorithm performance. Animal 2020, 14, 409–417.
57. De Mol, R.; André, G.; Bleumer, E.; Van der Werf, J.; De Haas, Y.; Van Reenen, C. Applicability of day-to-day variation in behavior for the automated detection of lameness in dairy cows. J. Dairy Sci. 2013, 96, 3703–3712.
58. Kamphuis, C.; Frank, E.; Burke, J.; Verkerk, G.; Jago, J. Applying additive logistic regression to data derived from sensors monitoring behavioral and physiological characteristics of dairy cows to detect lameness. J. Dairy Sci. 2013, 96, 7043–7053.
59. Miekley, B.; Traulsen, I.; Krieter, J. Principal component analysis for the early detection of mastitis and lameness in dairy cows. J. Dairy Res. 2013, 80, 335–343.
60. Kramer, E.; Cavero, D.; Stamer, E.; Krieter, J. Mastitis and lameness detection in dairy cows by application of fuzzy logic. Livest. Sci. 2009, 125, 92–96.
61. Garcia, E.; Klaas, I.; Amigo, J.; Bro, R.; Enevoldsen, C. Lameness detection challenges in automated milking systems addressed with partial least squares discriminant analysis. J. Dairy Sci. 2014, 97, 7476–7486.
62. Wood, S.; Lin, Y.; Knowles, T.; Main, D.J. Infrared thermometry for lesion monitoring in cattle lameness. Vet. Rec. 2015, 176, 308.
63. Lin, Y.C.; Mullan, S.; Main, D.C. Optimising lameness detection in dairy cattle by using handheld infrared thermometers. Vet. Med. Sci. 2018, 4, 218–226.
64. Taneja, M.; Byabazaire, J.; Jalodia, N.; Davy, A.; Olariu, C.; Malone, P. Machine learning based fog computing assisted data-driven approach for early lameness detection in dairy cattle. Comput. Electron. Agric. 2020, 171, 105286.
65. Pluk, A.; Bahr, C.; Poursaberi, A.; Maertens, W.; Van Nuffel, A.; Berckmans, D. Automatic measurement of touch and release angles of the fetlock joint for lameness detection in dairy cattle using vision techniques. J. Dairy Sci. 2012, 95, 1738–1748.
66. Pastell, M.; Tiusanen, J.; Hakojärvi, M.; Hänninen, L. A wireless accelerometer system with wavelet analysis for assessing lameness in cattle. Biosyst. Eng. 2009, 104, 545–551.
67. Norring, M.; Häggman, J.; Simojoki, H.; Tamminen, P.; Winckler, C.; Pastell, M. Lameness impairs feeding behavior of dairy cows. J. Dairy Sci. 2014, 97, 4317–4321.
68. Jiang, B.; Yin, X.; Song, H. Single-stream long-term optical flow convolution network for action recognition of lameness dairy cow. Comput. Electron. Agric. 2020, 175, 105536.
69. Bonk, S.; Burfeind, O.; Suthar, V.; Heuwieser, W. Evaluation of data loggers for measuring lying behavior in dairy calves. J. Dairy Sci. 2013, 96, 3265–3271.
70. Mattachini, G.; Antler, A.; Riva, E.; Arbel, A.; Provolo, G. Automated measurement of lying behavior for monitoring the comfort and welfare of lactating dairy cows. Livest. Sci. 2013, 158, 145–150.
  71. Jónsson, R.; Blanke, M.; Poulsen, N.K.; Caponetti, F.; Højsgaard, S. Oestrus detection in dairy cows from activity and lying data using on-line individual models. Comput. Electron. Agric. 2011, 76, 6–15. [Google Scholar] [CrossRef]
  72. Ishihara, A.; Bertone, A.L.; Rajala-Schultz, P.J. Association between subjective lameness grade and kinetic gait parameters in horses with experimentally induced forelimb lameness. Am. J. Vet. Res. 2005, 66, 1805–1815. [Google Scholar] [CrossRef]
  73. Rodríguez, A.; Olivares, F.; Descouvieres, P.; Werner, M.; Tadich, N.; Bustamante, H. Thermographic assessment of hoof temperature in dairy cows with different mobility scores. Livest. Sci. 2016, 184, 92–96. [Google Scholar] [CrossRef]
  74. Alsaaod, M.; Römer, C.; Kleinmanns, J.; Hendriksen, K.; Rose-Meierhöfer, S.; Plümer, L.; Büscher, W. Electronic detection of lameness in dairy cows through measuring pedometric activity and lying behavior. Appl. Anim. Behav. Sci. 2012, 142, 134–141. [Google Scholar] [CrossRef]
  75. Harris-Bridge, G.; Young, L.; Handel, I.; Farish, M.; Mason, C.; Mitchell, M.A.; Haskell, M.J. The use of infrared thermography for detecting digital dermatitis in dairy cattle: What is the best measure of temperature and foot location to use? Vet. J. 2018, 237, 26–33. [Google Scholar] [CrossRef] [PubMed]
  76. Van De Gucht, T.; Saeys, W.; Van Meensel, J.; Van Nuffel, A.; Vangeyte, J.; Lauwers, L. Farm-specific economic value of automatic lameness detection systems in dairy cattle: From concepts to operational simulations. J. Dairy Sci. 2018, 101, 637–648. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  77. Fuentes, A.; Yoon, S.; Park, J.; Park, D.S. Deep learning-based hierarchical cattle behavior recognition with spatio-temporal information. Comput. Electron. Agric. 2020, 177, 105627. [Google Scholar] [CrossRef]
  78. Venter, Z.S.; Hawkins, H.J.; Cramer, M.D. Cattle don’t care: Animal behaviour is similar regardless of grazing management in grasslands. Agric. Ecosyst. Environ. 2019, 272, 175–187. [Google Scholar] [CrossRef]
  79. Moran, J.; Doyle, R. Cow Talk: Understanding Dairy Cow Behaviour to Improve Their Welfare on Asian Farms; CSIRO Publishing: Clayton, Victoria, Australia, 2015. [Google Scholar]
  80. González, L.; Bishop-Hurley, G.; Handcock, R.N.; Crossman, C. Behavioral classification of data from collars containing motion sensors in grazing cattle. Comput. Electron. Agric. 2015, 110, 91–102. [Google Scholar] [CrossRef]
  81. González, L.; Tolkamp, B.; Coffey, M.; Ferret, A.; Kyriazakis, I. Changes in feeding behavior as possible indicators for the automatic monitoring of health disorders in dairy cows. J. Dairy Sci. 2008, 91, 1017–1028. [Google Scholar] [CrossRef] [Green Version]
  82. Dutta, R.; Smith, D.; Rawnsley, R.; Bishop-Hurley, G.; Hills, J.; Timms, G.; Henry, D. Dynamic cattle behavioural classification using supervised ensemble classifiers. Comput. Electron. Agric. 2015, 111, 18–28. [Google Scholar] [CrossRef]
  83. Müller, R.; Schrader, L. A new method to measure behavioural activity levels in dairy cows. Appl. Anim. Behav. Sci. 2003, 83, 247–258. [Google Scholar] [CrossRef]
  84. McGowan, J.; Burke, C.; Jago, J. Validation of a technology for objectively measuring behaviour in dairy cows and its application for oestrous detection. In Proceedings-New Zealand Society of Animal Production; New Zealand Society of Animal Production: Palmerston North, New Zealand, 2007; Volume 67, p. 136. [Google Scholar]
  85. Ledgerwood, D.; Winckler, C.; Tucker, C. Evaluation of data loggers, sampling intervals, and editing techniques for measuring the lying behavior of dairy cattle. J. Dairy Sci. 2010, 93, 5129–5139. [Google Scholar] [CrossRef]
  86. Rydhmer, L.; Zamaratskaia, G.; Andersson, H.; Algers, B.; Guillemet, R.; Lundström, K. Aggressive and sexual behaviour of growing and finishing pigs reared in groups, without castration. Acta Agric. Scand Sect. A 2006, 56, 109–119. [Google Scholar] [CrossRef]
  87. Bozkurt, Y.; Ozkaya, S.; Ap Dewi, I. Association between aggressive behaviour and high-energy feeding level in beef cattle. Czech J. Anim. Sci. 2006, 51, 151. [Google Scholar] [CrossRef] [Green Version]
  88. Matthews, S.G.; Miller, A.L.; Clapp, J.; Plötz, T.; Kyriazakis, I. Early detection of health and welfare compromises through automated detection of behavioural changes in pigs. Vet. J. 2016, 217, 43–51. [Google Scholar] [CrossRef] [Green Version]
  89. Oczak, M.; Ismayilova, G.; Costa, A.; Viazzi, S.; Sonoda, L.T.; Fels, M.; Bahr, C.; Hartung, J.; Guarino, M.; Berckmans, D.; et al. Analysis of aggressive behaviours of pigs by automatic video recordings. Comput. Electron. Agric. 2013, 99, 209–217. [Google Scholar] [CrossRef]
  90. Geers, R.; Puers, B.; Goedseels, V.; Wouters, P. Electronic Identification, Monitoring and Tracking of Animals; CAB International: Wallingford, UK, 1997. [Google Scholar]
  91. Sambraus, H.H.; Brummer, H.; Putten, G.V.; Schäfer, M.; Wennrich, G. Nutztier Ethologie: Das Verhalten landwirtschaftlicher Nutztiere; eine Angewandte Verhaltenskunde fuer die Praxis; Verlag Paul Parey: Hamburg, Germany, 1978. [Google Scholar]
  92. Pereira, D.F.; Miyamoto, B.C.; Maia, G.D.; Sales, G.T.; Magalhães, M.M.; Gates, R.S. Machine vision to identify broiler breeder behavior. Comput. Electron. Agric. 2013, 99, 194–199. [Google Scholar] [CrossRef]
  93. Peng, Y.; Kondo, N.; Fujiura, T.; Suzuki, T.; Ouma, S.; Wulandari; Yoshioka, H.; Itoyama, E. Dam behavior patterns in Japanese black beef cattle prior to calving: Automated detection using LSTM-RNN. Comput. Electron. Agric. 2020, 169, 105178. [Google Scholar] [CrossRef]
  94. Liu, D.; Oczak, M.; Maschat, K.; Baumgartner, J.; Pletzer, B.; He, D.; Norton, T. A computer vision-based method for spatial-temporal action recognition of tail-biting behaviour in group-housed pigs. Biosyst. Eng. 2020, 195, 27–41. [Google Scholar] [CrossRef]
  95. Handcock, R.N.; Swain, D.L.; Bishop-Hurley, G.J.; Patison, K.P.; Wark, T.; Valencia, P.; Corke, P.; O’Neill, C.J. Monitoring animal behaviour and environmental interactions using wireless sensor networks, GPS collars and satellite remote sensing. Sensors 2009, 9, 3586–3603. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  96. Tsai, D.M.; Huang, C.Y. A motion and image analysis method for automatic detection of estrus and mating behavior in cattle. Comput. Electron. Agric. 2014, 104, 25–31. [Google Scholar] [CrossRef]
  97. Martiskainen, P.; Järvinen, M.; Skön, J.P.; Tiirikainen, J.; Kolehmainen, M.; Mononen, J. Cow behaviour pattern recognition using a three-dimensional accelerometer and support vector machines. Appl. Anim. Behav. Sci. 2009, 119, 32–38. [Google Scholar] [CrossRef]
  98. Tani, Y.; Yokota, Y.; Yayota, M.; Ohtani, S. Automatic recognition and classification of cattle chewing activity by an acoustic monitoring method with a single-axis acceleration sensor. Comput. Electron. Agric. 2013, 92, 54–65. [Google Scholar] [CrossRef]
  99. Smith, D.; Rahman, A.; Bishop-Hurley, G.J.; Hills, J.; Shahriar, S.; Henry, D.; Rawnsley, R. Behavior classification of cows fitted with motion collars: Decomposing multi-class classification into a set of binary problems. Comput. Electron. Agric. 2016, 131, 40–50. [Google Scholar] [CrossRef]
  100. Williams, M.; Mac Parthaláin, N.; Brewer, P.; James, W.; Rose, M. A novel behavioral model of the pasture-based dairy cow from GPS data using data mining and machine learning techniques. J. Dairy Sci. 2016, 99, 2063–2075. [Google Scholar] [CrossRef] [Green Version]
  101. Williams, M.; James, W.; Rose, M. Fixed-time data segmentation and behavior classification of pasture-based cattle: Enhancing performance using a hidden Markov model. Comput. Electron. Agric. 2017, 142, 585–596. [Google Scholar] [CrossRef]
  102. Andriamandroso, A.L.H.; Lebeau, F.; Beckers, Y.; Froidmont, E.; Dufrasne, I.; Heinesch, B.; Dumortier, P.; Blanchy, G.; Blaise, Y.; Bindelle, J. Development of an open-source algorithm based on inertial measurement units (IMU) of a smartphone to detect cattle grass intake and ruminating behaviors. Comput. Electron. Agric. 2017, 139, 126–137. [Google Scholar] [CrossRef]
  103. Wang, J.; Zhang, H.; Zhao, K.; Liu, G. Cow movement behavior classification based on optimal binary decision-tree classification model. Trans. Chin. Soc. Agric. Eng. 2018, 34, 202–210. [Google Scholar]
  104. Rahman, A.; Smith, D.; Little, B.; Ingham, A.; Greenwood, P.; Bishop-Hurley, G. Cattle behaviour classification from collar, halter, and ear tag sensors. Inf. Process. Agric. 2018, 5, 124–133. [Google Scholar] [CrossRef]
  105. Achour, B.; Belkadi, M.; Aoudjit, R.; Laghrouche, M. Unsupervised automated monitoring of dairy cows’ behavior based on Inertial Measurement Unit attached to their back. Comput. Electron. Agric. 2019, 167, 105068. [Google Scholar] [CrossRef]
  106. Peng, Y.; Kondo, N.; Fujiura, T.; Suzuki, T.; Yoshioka, H.; Itoyama, E. Classification of multiple cattle behavior patterns using a recurrent neural network with long short-term memory and inertial measurement units. Comput. Electron. Agric. 2019, 157, 247–253. [Google Scholar] [CrossRef]
  107. Riaboff, L.; Aubin, S.; Bedere, N.; Couvreur, S.; Madouasse, A.; Goumand, E.; Chauvin, A.; Plantier, G. Evaluation of pre-processing methods for the prediction of cattle behaviour from accelerometer data. Comput. Electron. Agric. 2019, 165, 104961. [Google Scholar] [CrossRef]
  108. Williams, L.R.; Moore, S.T.; Bishop-Hurley, G.J.; Swain, D.L. A sensor-based solution to monitor grazing cattle drinking behaviour and water intake. Comput. Electron. Agric. 2020, 168, 105141. [Google Scholar] [CrossRef]
  109. Shen, W.; Cheng, F.; Zhang, Y.; Wei, X.; Fu, Q.; Zhang, Y. Automatic recognition of ingestive-related behaviors of dairy cows based on triaxial acceleration. Inf. Process. Agric. 2020, 7, 427–443. [Google Scholar] [CrossRef]
110. Tran, D.N.; Nguyen, T.N.; Khanh, P.C.P.; Tran, D.T. An IoT-based Design Using Accelerometers in Animal Behavior Recognition Systems. IEEE Sens. J. 2021. [Google Scholar] [CrossRef]
  111. Porto, S.M.; Arcidiacono, C.; Anguzza, U.; Cascone, G. The automatic detection of dairy cow feeding and standing behaviours in free-stall barns by a computer vision-based system. Biosyst. Eng. 2015, 133, 46–55. [Google Scholar] [CrossRef]
  112. Gu, J.; Wang, Z.; Gao, R.; Wu, H. Cow behavior recognition based on image analysis and activities. Int. J. Agric. Biol. Eng. 2017, 10, 165–174. [Google Scholar]
  113. Ahn, S.J.; Ko, D.M.; Choi, K.S. Cow behavior recognition using motion history image feature. In International Conference Image Analysis and Recognition; Springer: Berlin/Heidelberg, Germany, 2017; pp. 626–633. [Google Scholar]
  114. Guo, Y.; Zhang, Z.; He, D.; Niu, J.; Tan, Y. Detection of cow mounting behavior using region geometry and optical flow characteristics. Comput. Electron. Agric. 2019, 163, 104828. [Google Scholar] [CrossRef]
  115. Yin, X.; Wu, D.; Shang, Y.; Jiang, B.; Song, H. Using an EfficientNet-LSTM for the recognition of single Cow’s motion behaviours in a complicated environment. Comput. Electron. Agric. 2020, 177, 105707. [Google Scholar] [CrossRef]
  116. Achour, B.; Belkadi, M.; Filali, I.; Laghrouche, M.; Lahdir, M. Image analysis for individual identification and feeding behaviour monitoring of dairy cows based on convolutional neural networks (cnn). Biosyst. Eng. 2020, 198, 31–49. [Google Scholar] [CrossRef]
  117. Guo, Y.; Qiao, Y.; Sukkarieh, S.; Chai, L.; He, D. BiGRU-Attention Based Cow Behavior Classification Using Video Data For Precision Livestock Farming. Trans. ASABE 2021. [Google Scholar] [CrossRef]
  118. Yin, L.; Liu, C.; Hong, T.; Zhou, H.; Kae Hsiang, K. Design of system for monitoring dairy cattle’s behavioral features based on wireless sensor networks. Trans. Chin. Soc. Agric. Eng. 2010, 26, 203–208. [Google Scholar]
  119. Barriuso, A.L.; Villarrubia González, G.; De Paz, J.F.; Lozano, Á.; Bajo, J. Combination of Multi-Agent Systems and Wireless Sensor Networks for the Monitoring of Cattle. Sensors 2018, 18, 108. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  120. Werner, J.; Leso, L.; Umstatter, C.; Niederhauser, J.; Kennedy, E.; Geoghegan, A.; Shalloo, L.; Schick, M.; O’Brien, B. Evaluation of the RumiWatchSystem for measuring grazing behaviour of cows. J. Neurosci. Methods 2018, 300, 138–146. [Google Scholar] [CrossRef] [PubMed]
  121. Roelofs, J.B.; van Eerdenburg, F.J.; Soede, N.M.; Kemp, B. Pedometer readings for estrous detection and as predictor for time of ovulation in dairy cattle. Theriogenology 2005, 64, 1690–1703. [Google Scholar] [CrossRef]
  122. Palmer, M.A.; Olmos, G.; Boyle, L.A.; Mee, J.F. Estrus detection and estrus characteristics in housed and pastured Holstein–Friesian cows. Theriogenology 2010, 74, 255–264. [Google Scholar] [CrossRef]
  123. Gibbons, J.M.; Lawrence, A.B.; Haskell, M.J. Consistency of aggressive feeding behaviour in dairy cows. Appl. Anim. Behav. Sci. 2009, 121, 1–7. [Google Scholar] [CrossRef]
  124. Šimić, R.; Matković, K.; Ostović, M.; Pavičić, Ž.; Mihaljević, Ž. Influence of an enriched environment on aggressive behaviour in beef cattle. Vet. Stanica 2018, 49, 239–245. [Google Scholar]
  125. Tscharke, M.; Banhazi, T.M. A brief review of the application of machine vision in livestock behaviour analysis. Agrárinformatika/J. Agric. Inform. 2016, 7, 23–42. [Google Scholar]
  126. Xue, T.; Qiao, Y.; Kong, H.; Su, D.; Pan, S.; Rafique, K.; Sukkarieh, S. One-shot Learning-based Animal Video Segmentation. IEEE Trans. Ind. Inform. 2021. [Google Scholar] [CrossRef]
  127. Huang, L.; Guo, H.; Rao, Q.; Hou, Z.; Li, S.; Qiu, S.; Fan, X.; Wang, H. Body Dimension Measurements of Qinchuan Cattle with Transfer Learning from LiDAR Sensing. Sensors 2019, 19, 5046. [Google Scholar] [CrossRef] [Green Version]
  128. Gao, R.; Gu, J.; Liang, J. Cow Behavioral Recognition Using Dynamic Analysis. In Proceedings of the 2017 International Conference on Smart Grid and Electrical Automation (ICSGEA), Changsha, China, 27–28 May 2017; pp. 335–338. [Google Scholar]
  129. Meunier, B.; Pradel, P.; Sloth, K.H.; Cirié, C.; Delval, E.; Mialon, M.M.; Veissier, I. Image analysis to refine measurements of dairy cow behaviour from a real-time location system. Biosyst. Eng. 2018, 173, 32–44. [Google Scholar] [CrossRef]
  130. Kashiha, M.; Pluk, A.; Bahr, C.; Vranken, E.; Berckmans, D. Development of an early warning system for a broiler house using computer vision. Biosyst. Eng. 2013, 116, 36–45. [Google Scholar] [CrossRef]
  131. Nunes, L.; Ampatzidis, Y.; Costa, L.; Wallau, M. Horse foraging behavior detection using sound recognition techniques and artificial intelligence. Comput. Electron. Agric. 2021, 183, 106080. [Google Scholar] [CrossRef]
  132. Jung, D.H.; Kim, N.Y.; Moon, S.H.; Jhin, C.; Kim, H.J.; Yang, J.S.; Kim, H.S.; Lee, T.S.; Lee, J.Y.; Park, S.H. Deep Learning-Based Cattle Vocal Classification Model and Real-Time Livestock Monitoring System with Noise Filtering. Animals 2021, 11, 357. [Google Scholar] [CrossRef] [PubMed]
  133. Meen, G.; Schellekens, M.; Slegers, M.; Leenders, N.; van Erp-van der Kooij, E.; Noldus, L.P. Sound analysis in dairy cattle vocalisation as a potential welfare monitor. Comput. Electron. Agric. 2015, 118, 111–115. [Google Scholar] [CrossRef]
  134. Röttgen, V.; Becker, F.; Tuchscherer, A.; Wrenzycki, C.; Düpjan, S.; Schön, P.C.; Puppe, B. Vocalization as an indicator of estrus climax in Holstein heifers during natural estrus and superovulation. J. Dairy Sci. 2018, 101, 2383–2394. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  135. Chelotti, J.O.; Vanrell, S.R.; Rau, L.S.M.; Galli, J.R.; Planisich, A.M.; Utsumi, S.A.; Milone, D.H.; Giovanini, L.L.; Rufiner, H.L. An online method for estimating grazing and rumination bouts using acoustic signals in grazing cattle. Comput. Electron. Agric. 2020, 173, 105443. [Google Scholar] [CrossRef]
  136. Nasirahmadi, A.; Edwards, S.A.; Sturm, B. Implementation of machine vision for detecting behaviour of cattle and pigs. Livest. Sci. 2017, 202, 25–38. [Google Scholar] [CrossRef] [Green Version]
  137. Overton, M.; Sischo, W.; Temple, G.; Moore, D. Using time-lapse video photography to assess dairy cattle lying behavior in a free-stall barn. J. Dairy Sci. 2002, 85, 2407–2413. [Google Scholar] [CrossRef]
  138. Butt, B. Seasonal space-time dynamics of cattle behavior and mobility among Maasai pastoralists in semi-arid Kenya. J. Arid. Environ. 2010, 74, 403–413. [Google Scholar] [CrossRef]
  139. MacKay, J.; Turner, S.; Hyslop, J.; Deag, J.; Haskell, M. Short-term temperament tests in beef cattle relate to long-term measures of behavior recorded in the home pen. J. Anim. Sci. 2013, 91, 4917–4924. [Google Scholar] [CrossRef] [PubMed]
  140. Norouzzadeh, M.S.; Nguyen, A.; Kosmala, M.; Swanson, A.; Palmer, M.S.; Packer, C.; Clune, J. Automatically identifying, counting, and describing wild animals in camera-trap images with deep learning. Proc. Natl. Acad. Sci. USA 2018, 115, E5716–E5725. [Google Scholar] [CrossRef] [Green Version]
  141. Xu, B.; Wang, W.; Falzon, G.; Kwan, P.; Guo, L.; Chen, G.; Tait, A.; Schneider, D. Automated cattle counting using Mask R-CNN in quadcopter vision system. Comput. Electron. Agric. 2020, 171, 105300. [Google Scholar] [CrossRef]
  142. Jones, J.W.; Antle, J.M.; Basso, B.; Boote, K.J.; Conant, R.T.; Foster, I.; Godfray, H.C.J.; Herrero, M.; Howitt, R.E.; Janssen, S.; et al. Toward a new generation of agricultural system data, models, and knowledge products: State of agricultural systems science. Agric. Syst. 2017, 155, 269–288. [Google Scholar] [CrossRef]
  143. Rojo-Gimeno, C.; van der Voort, M.; Niemi, J.K.; Lauwers, L.; Kristensen, A.R.; Wauters, E. Assessment of the value of information of precision livestock farming: A conceptual framework. NJAS-Wagening. J. Life Sci. 2019, 90, 100311. [Google Scholar] [CrossRef]
Figure 1. The framework of intelligent perception-based animal farming.
Figure 2. Examples of acquired images from an Intel RealSense D435 camera.
Table 2. Main cattle behaviour descriptions.

| Behaviour | Description |
|---|---|
| Grazing | Head is placed in or over feed or pasture while the cattle search, masticate, or sort the feed (silage) or pasture |
| Exploring | Head is in close proximity to, or in contact with, the ground, using the nose to detect smells or food |
| Grooming | Turns head towards the abdomen with a stretched neck, using the tongue to groom the body |
| Mounting | Climbs on any part of the body or head of another animal |
| Ruminating | The cow regurgitates feed, or swallows masticated feed and regurgitates it |
| Lying | The cow lies in any position except flat on its side |
| Walking | The position of the body and four legs changes, with the head and neck not moving |
| Standing | The cow stands on all four legs with its head erect, without swinging its head from side to side |
| Aggressive | Causes actual or potential harm (e.g., a threat) to other animals |
Table 3. Main studies on cattle behaviour recognition.

Contact sensor-based approaches:

| Work | Sensor | Behaviour Type | Feature | Model | Automation Level | Average Accuracy |
|---|---|---|---|---|---|---|
| Martiskainen et al. [97] | 3D accelerometer | standing, lying, ruminating, feeding, normal and lame walking, lying down, and standing up | statistical features | SVM | low | 94.50% |
| Tani et al. [98] | single-axis accelerometer | chewing | sound spectrogram | pattern matching | low | over 90.0% |
| González et al. [80] | GPS and 3D accelerometers | foraging, ruminating, traveling, resting, and others | statistical features | statistical analysis | medium | 90.5% |
| Smith et al. [99] | motion collars | grazing, walking, ruminating, resting, and others | head position and motion intensity | binary time-series classifiers | medium | 82.25% |
| Williams et al. [100] | GPS | grazing, resting, and walking | statistical features | machine learning | medium | 85.0% |
| Williams et al. [101] | GPS | grazing, resting, and walking | behaviour-labelled GPS data | hidden Markov model | medium | 94.0% |
| Andriamandroso et al. [102] | IMU | grass intake and ruminating | statistical features | two-step discrimination tree | low | 92.0% |
| Wang et al. [103] | 3D accelerometer | standing, lying, normal walking, active walking, standing up, and lying down | statistical features | binary decision tree | medium | 76.47% |
| Rahman et al. [104] | 3D accelerometer | grazing, standing, or ruminating | statistical features | stratified cross-validation | medium | 91.2% |
| Achour et al. [105] | IMU | lying, standing, lying down, standing up, walking, and stationary behaviours | statistical features | finite mixture models | medium | 99.0% |
| Peng et al. [106] | IMU | feeding, lying, ruminating, licking salt, moving, social licking, and head butting | motion data | LSTM-RNN | medium | 88.65% |
| Riaboff et al. [107] | 3D accelerometer | grazing, walking, lying, and standing | statistical features | decision tree | medium | 95.0% |
| Williams et al. [108] | 3D accelerometer | drinking | statistical features | accelerometer algorithm | medium | 95.0% |
| Peng et al. [93] | IMU | ruminating (lying), ruminating (standing), lying normal, standing normal, feeding, lying final, and standing final | deep learning features | LSTM-RNN | high | 77.56% |
| Shen et al. [109] | 3D accelerometer | eating, ruminating, and other behaviours | time/frequency-domain features | k-nearest neighbour | high | 93.25% |
| Tran et al. [110] | 3D accelerometer | walking, feeding, lying, and standing | statistical features | random forest | high | 94.75% |

Non-contact sensor-based approaches:

| Work | Sensor | Behaviour Type | Feature | Model | Automation Level | Average Accuracy |
|---|---|---|---|---|---|---|
| Tsai and Huang [96] | camera | estrus and mating behaviour | changes in moving-object lengths | motion analysis | medium | 99.67% |
| Dutta et al. [82] | camera | grazing, ruminating, resting, walking, and others | sensor data and behaviour observations | bagging ensemble classification | medium | 96% |
| Porto et al. [111] | camera | feeding and standing | image detectors | Viola–Jones algorithm | medium | 86.5% |
| Gu et al. [112] | camera | estrus and hoof-disease behaviours | minimum bounding-box area | dynamic analysis | medium | 83.40% |
| Ahn et al. [113] | camera | mounting, walking, running, tail wagging, and foot stamping | motion history image feature | SVM | medium | 82.83% |
| Guo et al. [114] | camera | mounting behaviour | geometric and optical flow characteristics | SVM | medium | 90.9% |
| Yin et al. [115] | camera | lying, standing, walking, drinking, and feeding | visual features | EfficientNet-LSTM | high | 97.87% |
| Achour et al. [116] | camera | standing and feeding | visual features | CNN | high | 92.00% |
| Fuentes et al. [77] | camera | 15 types: standing, lying, and others | 3D-CNN features | deep learning | high | 78.80% |
| Wu et al. [13] | camera | drinking, ruminating, walking, standing, and lying | visual features | CNN-LSTM | high | 97.60% |
| Guo et al. [117] | camera | exploring, feeding, grooming, standing, and walking | visual features | BiGRU-attention | high | over 82% |

Note: SVM means support vector machine; LSTM means Long Short-Term Memory network; RNN means Recurrent Neural Network; BiGRU is short for Bidirectional Gated Recurrent Unit; CNN means convolutional neural network; IMU means inertial measurement unit.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

Qiao, Y.; Kong, H.; Clark, C.; Lomax, S.; Su, D.; Eiffert, S.; Sukkarieh, S. Intelligent Perception-Based Cattle Lameness Detection and Behaviour Recognition: A Review. Animals 2021, 11, 3033. https://doi.org/10.3390/ani11113033