Review

Research Progress of Robotic Technologies and Applications in Smart Pig Farms

1 College of Smart Agriculture (Artificial Intelligence), Nanjing Agricultural University, Nanjing 210031, China
2 Key Laboratory of Livestock Farming Equipment, Ministry of Agriculture and Rural Affairs, Nanjing Agricultural University, Nanjing 210031, China
* Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Agriculture 2026, 16(3), 334; https://doi.org/10.3390/agriculture16030334
Submission received: 25 November 2025 / Revised: 25 December 2025 / Accepted: 26 January 2026 / Published: 29 January 2026

Abstract

With the rapid development of artificial intelligence (AI), the Internet of Things (IoT), and robotics, the intelligent transformation of pig farms has become an inevitable trend in the livestock industry. In today’s large-scale pig farms, traditional breeding methods are undergoing significant change due to the application of intelligent robotics. Robotic systems are capable of autonomous inspection, precise feeding, and environmental cleaning, which can effectively alleviate labor shortages on farms; they also show clear advantages in strengthening biosecurity, optimizing management processes, and ensuring animal welfare. This paper systematically constructs the key technical framework of pig farm robots, covering the basic support layer, the sensing-processing layer, the control and execution layer, and the coordination and interaction layer, together with an integrated application layer that combines them. On this basis, the current application of robots in intelligent pig farms is further analyzed, covering the functional characteristics and technical implementations of inspection robots, cleaning robots, and feeding robots.

1. Introduction

With the continuous growth of the global population, the demand for pork has risen significantly. According to forecasts by the Organisation for Economic Co-operation and Development (OECD), global pork consumption is expected to reach 130 million tons by 2034 [1]. The enormous market demand for pork and pork products has driven the livestock industry toward large-scale, intensive production. In the face of such demands, the traditional pig farming model has encountered multiple challenges: labor costs have risen, willingness to work in pig farming has declined, the pressure of animal disease prevention and control has increased, and environmental protection regulations have become stricter. Under such circumstances, intelligent pig farm robots are emerging as a key solution driving the transformation of pig farming toward automation and intelligence, and they have demonstrated high application value in inspection, cleaning, and feeding operations.
In the overall development of the livestock industry, robotic technology has proven effective in addressing critical problems. Livestock farming automation aims to minimize human intervention in the farming process, which requires integrating computer vision, localization and navigation, sensing, and actuation into autonomous robotic systems. Recent systematic studies have highlighted the progress made by robots and their key technologies in the automation of livestock farming, with the aim of alleviating labor shortages, enhancing production efficiency, and reducing environmental impact [2,3,4,5]. Researchers have developed a wide variety of robotic solutions for different stages of livestock farming, including automated barn inspection [6], livestock phenotypic analysis [7], and precision feeding [8]. These robotic systems also take different forms depending on the tasks they perform: some move on tracks [9], while others use wheels [10], and they can work independently or cooperate as part of a multi-robot team [11,12]. Figure 1 shows typical application scenarios of pig-house robots.
Pig farming is one of the key areas in the livestock industry where robotic automation can make a significant impact. With the increasing application of intelligent technologies in pig farming production, robots are fundamentally changing the management methods of modern pig farms. They are increasingly being used in essential tasks such as farm inspection, cleaning, feeding, and breeding. These robotic systems help address important challenges such as labor shortages, disease control, and the need for precise management [16]. Building on the current “perception-execution-interaction” model, this paper further proposes a four-layer technology system: “basic support—perception processing—control execution—collaborative interaction”. This system effectively addresses the actual needs of pig farms and provides a more systematic and targeted technological structure. The key advancements in this framework include multimodal sensor fusion and AI-based visual algorithms, which enhance the reliability of identifying each pig and analyzing their behavior. Meanwhile, the combination of edge computing and Internet of Things technologies has reduced response times to the millisecond level [17]. The effectiveness of these technologies has been confirmed in several key applications: inspection robots use thermal imaging and AI vision to enable round-the-clock health monitoring; cleaning robots accurately identify target areas using semantic segmentation; and feeding robots apply digital twin technology to deliver customized nutrition.
Large-scale pig farms are now facing practical challenges such as unprofessional construction, inadequate supporting facilities, and increasing demands for refined farming, which urgently require solutions enabled by intelligent robotic technologies [18,19]. In traditional farming, manual inspections are inefficient and may spread diseases. Environmental control mostly relies on personal experience, and the breeding methods lack precision. These limitations are the main driving forces behind the development of intelligent pig farming robots.
To help address these issues, this paper summarizes current research on robots in smart pig farms and also provides an overview of the progress of key robot technologies. The review also analyzes how robots are being used in tasks such as environmental monitoring, cleaning and disinfection, and precise feeding. Furthermore, it explores how advances in robotics can help address key issues in the daily operations on farms. From the perspectives of perception and recognition, decision-making and control, and multi-robot collaboration, this study examines how robotic technology can improve farming efficiency, lower labor costs, and enhance biosecurity. The aim is to provide technical insights and development guidance for the construction of future intelligent pig farms.

2. Research Methodology

To ensure a transparent and rigorous literature screening process, this review focuses on research related to intelligent pig farming robots and the digitalization of pig farm production. The online databases used for retrieval include Web of Science, IEEE Xplore, ScienceDirect, Scopus, Wiley, MDPI, and ResearchGate. In addition, Google Scholar searches were used to minimize omissions. The primary time span is from 2015 to 2025. Keywords encompass “pig”, “swine”, “intelligent”, and “monitoring/feeding/cleaning robots”, with Boolean operators (AND/OR) used to combine search terms.
Search terms primarily cover three categories:
  • Species: pig;
  • Technologies: precision livestock farming, mechanical systems, sensors, closed-loop control, AI and computer vision (CV), simultaneous localization and mapping (SLAM), and the IoT;
  • Applications: inspection robots, feeding robots, cleaning robots, automated semen delivery robots, bionic robots, and automated pig herding robots.
Inclusion criteria for the literature were peer-reviewed journal articles focusing on pigs, applicable to smart livestock farming scenarios, and involving relevant technologies or applications. Exclusion criteria were studies on non-porcine species, lack of practical validation, or absence of individual-level applications. In total, 101 references were cited. Approximately 19 of these references were cited in the introduction, about 52 discussed related technologies, and approximately 30 discussed current specific research examples. The thematic distribution of selected literature and case studies is illustrated in Figure 2.
This review outlines the key technical architecture and current research status of intelligent pig farm robots. It then analyzes the core bottlenecks in applying robotics technology to swine farming and explores innovative pathways for deep integration with AI and IoT. The aim is to provide a basis for decision-making in constructing digital platforms for intelligent pig farms.

3. Key Technology Analysis

As a core category of equipment driving the intelligent transformation of the pig farming industry, pig farm robots rely on a key technology system comprising the basic support layer, the sensing-processing layer, the control and execution layer, and the coordination and interaction layer, topped by an integrated application layer. The core of the basic support layer is a high-precision mechanical system and a real-time operating system, providing the robot with a stable hardware foundation and a fast-responding software environment. The sensing-processing layer, based on multimodal sensor fusion and AI-driven computer vision, achieves comprehensive perception of the environment and the pigs’ condition. The control and execution layer, through drive control technology and path planning algorithms, transforms decision commands into precise execution actions. The coordination and interaction layer, through the integration of IoT, edge computing, and human–robot interaction technologies, builds data channels and interaction interfaces between the robot and the management system, environmental elements, and operators. Finally, the integrated application layer relies on system integration to combine all underlying technologies into a complete intelligent pig farming robotic solution.
Compared to traditional robotic frameworks, our framework offers more efficient and applicable solutions in intelligent sensing, data processing, and multi-robot collaboration to adapt to the dynamic changes and complexities of pig farm environments. Most significantly, it establishes a solid foundation for future performance optimization and functional expansion of pig farming robotics. The key technological architecture of intelligent pig farm robots is illustrated in Figure 3.
The system’s data flow and interaction architecture follow a clear layered design, forming a complete closed loop from hardware actuation to collaborative decision-making. The basic support layer, acting as the hardware foundation, interacts with sensors via standardized protocols such as I2C, SPI, and CAN, and is encapsulated with customized APIs to provide standardized hardware access interfaces for the upper layers. The perception processing layer collects raw sensor data and, using real-time communication protocols such as ROS messages and MQTT, sends calibrated, compressed, and formatted (e.g., Protobuf) data directly to the control execution layer. Upon receiving commands, the control execution layer drives actuators via hardware interfaces such as PWM and CAN and feeds the resulting status back to the perception processing layer in real time, forming a local control loop. The coordination and interaction layer serves as the system’s collaborative hub: relying on edge computing devices, it aggregates and processes data from the perception and execution layers, performing preliminary computations such as image processing and path planning to reduce latency, while interacting with the cloud or external systems via protocols such as REST APIs and WebSocket to enable data uplink and command downlink. Overall, data originates in the perception layer, is processed and fed back by the control and execution layer, and is finally aggregated and coordinated by the coordination and interaction layer. Standardized interface protocols between the layers (such as MQTT, gRPC, and WebSocket) ensure the real-time performance, reliability, and scalability of data transmission.
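To make the inter-layer messaging concrete, the following is a minimal sketch of how a perception-layer node might publish a formatted sensor reading over MQTT. The broker address, topic name, and payload fields are illustrative assumptions rather than part of any cited system, and JSON stands in here for the Protobuf encoding mentioned above.

```python
import json
import time
import paho.mqtt.client as mqtt  # paho-mqtt 1.x style constructor used below

BROKER = "edge-gateway.local"         # hypothetical edge-node address
TOPIC = "barn01/robot01/environment"  # hypothetical topic naming scheme

client = mqtt.Client()  # paho-mqtt 2.x additionally requires a callback API version argument
client.connect(BROKER, 1883, keepalive=60)

# Calibrated reading from the perception layer; JSON is used in place of Protobuf.
reading = {
    "ts": time.time(),
    "temperature_c": 24.6,
    "humidity_pct": 68.0,
    "nh3_ppm": 9.2,
}
client.publish(TOPIC, json.dumps(reading), qos=1)  # QoS 1: at-least-once delivery
client.disconnect()
```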

3.1. Basic Support Layer

3.1.1. High-Precision Mechanical System

High-precision mechanical systems refer to mechatronic systems that maintain micron-level positioning accuracy and sub-arcsecond attitude stability under complex working conditions through rigid-flexible hybrid structure design, multi-axis coordinated motion control, and real-time error compensation techniques [20]. This type of system achieves exceptionally high positioning accuracy in livestock farming environments through the synergistic integration of precision mechanical design and high-resolution sensing systems, thereby enhancing data collection accuracy and advancing automation levels in animal husbandry. Table 1 outlines the core technologies of high-precision mechanical systems.
In smart pig farms, high-precision mechanical systems ensure the stable operation of automated equipment. Inspection robots rely on stable mechanical platforms to capture high-quality detection data. Mobile cleaning robots demand precise path-tracking capabilities. Precision feeding robots require millimeter-level control accuracy for feed dispensing. With the continuous development of smart manufacturing and precision agriculture fields, high-precision mechanical systems also continue to evolve towards higher accuracy, stronger adaptability, and greater levels of intelligence.

3.1.2. RTOS for Robots

A real-time operating system for robots (RTOS for Robots) provides deterministic task scheduling and microsecond-level interrupt response. Deterministic task scheduling ensures that critical tasks such as motion control and sensor data acquisition are completed within a preset time window, preventing system failures caused by task delays. Meanwhile, microsecond-level interrupt response enables the system to react promptly to external events, meeting the hard real-time requirements of mechanical control and environmental perception [31]. Together, these characteristics ensure that the robot can perform tasks accurately and quickly.
In terms of technical implementation, to ensure the robot’s intelligent control platform can accurately and promptly complete tasks and coordinate operations such as robotic arm movement and sensor responses, modern robot real-time operating systems adopt a layered architecture design, as shown in Figure 4. Furthermore, this system can run collaboratively with general-purpose operating systems such as Linux. The real-time kernel handles high-response tasks, while Linux handles non-real-time tasks such as data analysis and user interaction, balancing real-time performance with complex functional requirements [32].
In smart pig farms, real-time robotic operating systems are used in many operational processes. Such a system can collect and process sensor data in real time, including temperature, humidity, and harmful gas concentrations, and automatically control the on/off operation of equipment such as ventilation and manure removal systems. Simultaneously, it can direct feeding robots to complete feeding along set routes, ensuring accurate and timely feed delivery. Furthermore, it can uniformly schedule the various devices in a pigsty, supporting the simultaneous, coordinated execution of multiple tasks. Through these applications, an RTOS for robots can coordinate multiple operations at the same time, ensuring automated execution and immediate response in the breeding process, thereby improving production efficiency and management precision.
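As a worked illustration of what deterministic task scheduling guarantees, the snippet below checks the classic Liu–Layland utilization bound for rate-monotonic scheduling of periodic tasks; the task names, execution times, and periods are hypothetical values chosen only for the example.

```python
# Liu-Layland schedulability test for rate-monotonic scheduling:
# a periodic task set is guaranteed schedulable if total CPU utilization
# U = sum(C_i / T_i) <= n * (2**(1/n) - 1).
tasks = [
    ("motion_control",  0.002, 0.010),   # (name, execution time C in s, period T in s)
    ("gas_sensor_read", 0.005, 0.100),
    ("camera_trigger",  0.010, 0.200),
]

n = len(tasks)
utilization = sum(c / t for _, c, t in tasks)
bound = n * (2 ** (1 / n) - 1)
print(f"U = {utilization:.3f}, bound = {bound:.3f}, "
      f"{'guaranteed schedulable' if utilization <= bound else 'needs further analysis'}")
```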

3.2. Sensing-Processing Layer

3.2.1. Multi-Modal Sensor Fusion

Multimodal sensor fusion technology markedly improves system perception capabilities by integrating data from multiple types of sensors, such as inertial measurement units (IMUs), vision-based systems, and wireless sensing technologies like RFID. The core of this approach lies in deploying deep learning (DL) models, such as CNN-LSTM [33], to achieve automated cross-modal feature extraction and temporal relationship modeling, yielding higher recognition accuracy and stronger robustness than single-sensor systems [34]. Current research shows that the combined use of sensors based on different physical principles can effectively overcome the environmental limitations of unimodal sensing [35,36].
The main advantage of multimodal fusion is its ability to combine the complementary strengths of each modality. Inertial sensors such as accelerometers and gyroscopes are adept at capturing high-frequency temporal motion features, visual sensors such as cameras provide rich spatial information, and RFID supports precise identification and positioning in suitable scenarios. By leveraging the hierarchical feature extraction capabilities of DL models, the fusion system can analyze complex behavioral patterns more comprehensively, enhancing recognition accuracy and improving the robustness of the entire system.
A three-layer architecture is commonly adopted to achieve precise animal-behavior monitoring: a data acquisition layer, a feature fusion layer, and a decision-making output layer. In the data acquisition layer, inertial sensors continuously capture temporal features of movement, such as acceleration and angular velocity, visual sensors obtain spatial information on posture and scene, and RFID technology provides individual identification and precise positioning. In the feature fusion layer, CNNs are used to extract spatial features from visual data for posture recognition, LSTMs establish the temporal dependencies of motion data, and an adaptive weighting algorithm dynamically adjusts the weights of each sensor in response to environmental changes, for example, by reducing the contribution of visual data in occluded situations. In the decision-making output layer, classifiers such as SVMs or transformers output the recognition results of typical behaviors and continuously optimize classification accuracy with Gaussian mixture models and related methods [37].
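The feature-fusion idea can be illustrated in its simplest form with the toy PyTorch sketch below: a small CNN branch for image frames, an LSTM branch for IMU sequences, and a learnable weighting that merges the two embeddings. The input shapes, layer sizes, and number of behavior classes are assumptions for the example; this is not the CNN-LSTM architecture of [33] or the system of [37].

```python
import torch
import torch.nn as nn

class CnnLstmFusion(nn.Module):
    """Toy two-branch fusion: CNN for image frames, LSTM for IMU sequences."""
    def __init__(self, n_classes=4):
        super().__init__()
        # Visual branch: small CNN -> global pooling -> 64-d embedding
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.vis_fc = nn.Linear(32, 64)
        # Inertial branch: LSTM over (ax, ay, az, gx, gy, gz) -> 64-d embedding
        self.lstm = nn.LSTM(input_size=6, hidden_size=64, batch_first=True)
        # Learnable modality weights: adaptive weighting in its simplest form
        self.fusion_logits = nn.Parameter(torch.zeros(2))
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, image, imu_seq):
        v = self.vis_fc(self.cnn(image).flatten(1))   # (B, 64) visual embedding
        _, (h, _) = self.lstm(imu_seq)                # h: (1, B, 64)
        m = h.squeeze(0)                              # (B, 64) motion embedding
        w = torch.softmax(self.fusion_logits, dim=0)  # normalized modality weights
        fused = w[0] * v + w[1] * m
        return self.classifier(fused)

model = CnnLstmFusion()
logits = model(torch.randn(2, 3, 128, 128), torch.randn(2, 50, 6))
print(logits.shape)  # torch.Size([2, 4])
```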

3.2.2. AI and CV Technology

AI is a technological system that endows machines with the capability to emulate human intelligent decision-making through algorithms, with its core lying in the construction of computational models that possess learning, reasoning, and decision-making abilities. CV, as a crucial perceptual dimension of AI, is grounded in digital image processing, pattern recognition, and deep learning technologies to achieve the acquisition, analysis, and understanding of visual information. The synergistic effect of the two forms a complete intelligent closed-loop of “perception–cognition–decision-making”, wherein CV handles the collection of environmental information and feature extraction, while AI undertakes the higher-order cognitive and decision-making tasks.
In terms of technological advantages, the synergy between artificial intelligence and computer vision technology exhibits three key characteristics:
  • The non-invasive monitoring feature avoids the interference with organisms caused by traditional contact-based measurements;
  • The real-time processing capability enables data acquisition and analysis response at the millisecond level;
  • Data-driven decision-making establishes quantitative evaluation models through the accumulation of large-scale data.
Currently, deep learning (DL) algorithms have become the driving force behind technological transformation in this field. Convolutional neural networks (CNNs) are used for spatial feature extraction, generative adversarial networks (GANs [38]) for data augmentation, and vision transformers (ViTs [39]) for global information modeling, enabling the system to maintain high robustness in highly dynamic and dense complex scenarios [40]. These underlying visual perception and analysis techniques—from basic feature extraction to complex decision-making models—constitute the “digital neural system” of the various intelligent robots in pig farms described in Section 4. Specifically, they provide the core algorithmic framework for multimodal perception in inspection robots (Section 4.1), autonomous navigation in cleaning robots (Section 4.2), individual precision management in feeding robots (Section 4.3), and complex bio-interactions in specialized production robots (Section 4.4). The specific analysis is shown in Table 2.

3.3. Control and Execution Layer

3.3.1. Environmental Perception and Motion Planning Technology

Environmental perception and motion planning technology refers to the technical system whereby intelligent robots acquire real-time environmental information through multiple sensors and dynamically plan optimal motion paths based on perceived data. In smart pig farms, this technology enables real-time monitoring of pig distribution, obstacles, and terrain, dynamically planning obstacle-avoidance paths and operational trajectories. It ensures robots can safely and efficiently execute tasks within complex pig farm environments. Environmental perception and motion planning are mainly achieved through two core technologies: Simultaneous Localization and Mapping (SLAM) [41] and path planning.
Table 2. Technical implementation and algorithmic performance of AI and computer vision in swine breeding.

Core Algorithms | Task/Scenario | Technical Implementation Highlights | Algorithmic Performance Metrics | References
EfficientNetV2M | Static | Global feature extraction and identity vector matching for biometric recognition. | Rapid convergence and high inter-individual separability on high-resolution still images. | Compte A et al. [42]
CNN | Static | Convolutional processing of local facial features for biometric identification. | Strong local-feature discrimination, but sensitive to scale and viewing-distance variations. | Teng Guanghui et al. [43]
Dual-view CNN | Dynamic | Spatiotemporal feature fusion integrating data from multi-angle camera streams. | Stable spatiotemporal identity cues in video, capturing non-linear motion patterns. | Wu Shihai et al. [44]
AFCB + ReID | Dynamic | Anchor-free detection framework with global spatiotemporal association strategies. | Efficient tracking-by-detection with reduced identity switches (IDSW) in dense crossings. | Morann Mattina [45]
Multi-view Fusion | Static and Dynamic | Multi-dimensional mathematical modeling to compensate for single-view information loss. | Multi-view redundancy improves robustness to occlusion and non-uniform illumination. | Jiangong Li et al. [46]
3D Point Cloud | Dynamic | Volumetric parameterization based on binocular vision or depth imaging. | High-precision geometry estimation, constrained by point-cloud density and surface reflectance. | Du Xiaodong et al. [47]
SLAM technology is the core for achieving autonomous navigation and environmental perception of robots. This technology fuses data from multimodal sensors to build a high-precision map of the pig farm environment in real time and simultaneously determine the robot’s position, thereby supporting path planning, obstacle avoidance, and operational tasks. Visual SLAM based on deep learning (e.g., ORB-SLAM3 [48]) and LiDAR-based SLAM (e.g., LOAM [49] and LeGO-LOAM [50]) have significantly improved robustness in complex and dynamic environments. Furthermore, the integration of lightweight SLAM algorithms (e.g., Fast-LIO2 [51]) with edge computing (e.g., Edge-SLAM [52]) has also reduced the demand for computing resources, making it more suitable for agricultural scenarios. In smart pig farms, SLAM technology not only addresses challenges such as lighting changes and dynamic obstacles but also collaborates with IoT technologies to enable multi-robot coordination and data sharing, thereby providing a precise spatial perception foundation for intelligent livestock farming.
Path planning algorithms dynamically compute the optimal movement path by combining map information of the pig farm environment (such as fences, passages, and pig distribution) with real-time sensor data, while avoiding dynamic obstacles (such as pigs or staff). Traditional global path planning algorithms (e.g., A* [53] and Dijkstra's algorithm [54]) search for the shortest path on static grid graphs and are better suited to structured environments. However, in dynamic scenarios such as pig farms, where pigs, staff, and equipment move frequently, these algorithms must be recalculated often, resulting in low efficiency. For this reason, incremental algorithms (such as D* [55] and D* Lite [56]) are widely used; they significantly improve replanning speed by locally updating only the path segments affected by environmental changes. Driven by the growing scale and complexity of pig farms, deep reinforcement learning-based path planning (such as DQN [57] and PPO [58]) and SLAM that fuses multi-sensor data have gradually become research hotspots. These methods significantly improve obstacle avoidance efficiency and path smoothness in dynamic, unstructured pig farms through adaptive learning and real-time environment modeling.
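For readers unfamiliar with grid-based planners, the following is a minimal A* implementation on a 4-connected occupancy grid with a Manhattan heuristic. The toy grid stands in for a highly simplified pen layout and is purely illustrative; production planners would use the incremental or learning-based methods cited above.

```python
import heapq

def astar(grid, start, goal):
    """A* shortest path on a 4-connected occupancy grid (0 = free, 1 = occupied)."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])   # Manhattan heuristic
    open_set = [(h(start), 0, start, None)]
    came_from, g_cost = {}, {start: 0}
    while open_set:
        _, g, cur, parent = heapq.heappop(open_set)
        if cur in came_from:          # already expanded with an equal or better cost
            continue
        came_from[cur] = parent
        if cur == goal:               # reconstruct the path by walking parents back
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        r, c = cur
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_cost.get((nr, nc), float("inf")):
                    g_cost[(nr, nc)] = ng
                    heapq.heappush(open_set, (ng + h((nr, nc)), ng, (nr, nc), cur))
    return None  # no path found

pen = [[0, 0, 0, 0],
       [1, 1, 0, 1],
       [0, 0, 0, 0]]
print(astar(pen, (0, 0), (2, 0)))  # e.g. [(0,0), (0,1), (0,2), (1,2), (2,2), (2,1), (2,0)]
```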
Furthermore, multi-robot collaborative scheduling is an important technical direction for achieving efficient operations in large-scale and complex agricultural scenarios. Existing work has shown that task allocation strategies significantly affect the overall performance and robustness of multi-robot systems. For example, Miao et al. [59] proposed a multi-robot task allocation method based on deep reinforcement learning and multimodal multi-objective evolutionary algorithms; compared with three popular evolutionary algorithms across three multi-robot scheduling scenarios, the method generated multiple feasible solutions in uncertain environments and improved overall task execution efficiency and robustness. Methods based on multi-agent frameworks have also demonstrated performance improvements in multi-task scenarios; for example, one such method significantly reduced the average task completion time and improved the balance of resource utilization in various scheduling test scenarios, reflecting the potential of multi-robot scheduling algorithms to improve system efficiency and task response speed [60]. These results provide technical reference and performance benchmarks for multi-robot operations in agricultural automation scenarios such as smart pig farms.
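As a simple baseline for the allocation problem discussed above, the sketch below solves a one-shot robot-to-task assignment with the Hungarian algorithm via SciPy. The cost matrix values are illustrative placeholders, and this static formulation is deliberately far simpler than the deep reinforcement learning and evolutionary approaches in [59,60].

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Hypothetical cost matrix: cost[i][j] = estimated travel + handling time (s)
# for robot i to execute task j (values are illustrative placeholders).
cost = np.array([[40, 75, 120],
                 [55, 30,  90],
                 [95, 60,  35]])

robots, tasks = linear_sum_assignment(cost)   # minimizes the total assignment cost
for r, t in zip(robots, tasks):
    print(f"robot {r} -> task {t} (cost {cost[r, t]})")
print("total cost:", cost[robots, tasks].sum())
```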

3.3.2. Drive and Control Technology

The drive and control technology of intelligent robots achieves precise motion and compliant operation through servo, pneumatic, and hybrid drive systems, while closed-loop control enhances execution accuracy and adaptability. This technology system supports intelligent farming robots in completing complex tasks and is a core guarantee for automation and intelligence.
Table 3 outlines the three primary drive technologies employed in modern intelligent livestock farming robots. The combined application of these drive solutions effectively addresses the specific requirements of the diverse operational tasks encountered in livestock farming environments.
The control technologies employed in the execution layer of modern intelligent farming robots include closed-loop control, torque control, and distributed control. Among these, closed-loop control serves as the foundational technology. Nearly all intelligent robots rely on it, particularly for high-precision tasks in pig farms, where it is indispensable. Table 4 lists three common closed-loop control algorithms used in intelligent farming robots.
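Since closed-loop control is named here as the foundational technology, the following is a minimal discrete PID controller with a basic anti-windup clamp. The gains, output limits, and the toy first-order motor model are illustrative assumptions, not values from any cited system.

```python
class PID:
    """Discrete PID controller with a simple anti-windup clamp."""
    def __init__(self, kp, ki, kd, out_min=-1.0, out_max=1.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measurement, dt):
        error = setpoint - measurement
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        # Clamp the output and roll back the integral when saturated (anti-windup)
        if out > self.out_max or out < self.out_min:
            self.integral -= error * dt
            out = max(self.out_min, min(self.out_max, out))
        return out

# Example: regulate a feed-auger motor speed toward 120 rpm (illustrative gains).
pid = PID(kp=0.02, ki=0.01, kd=0.0, out_min=0.0, out_max=1.0)
speed = 0.0
for _ in range(50):
    duty = pid.update(setpoint=120.0, measurement=speed, dt=0.1)
    speed += (600.0 * duty - speed) * 0.1   # toy first-order motor response
print(round(speed, 1))
```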
Combined with closed-loop control, these drive technologies improve both the precision and adaptability of task execution, enabling smart farming robots to handle complex operations and forming the core foundation for automated and intelligent agricultural production.

3.4. Coordination and Interaction Layer

3.4.1. IoT and Edge Computing

The Internet of Things (IoT) is a distributed system that interconnects physical devices, sensors, and networks through standardized communication protocols to achieve data acquisition, transmission, and intelligent control [64]. Edge computing technology, on the other hand, significantly reduces data transmission latency by pushing computing, storage, and analysis capabilities down to the network edge to build a distributed architecture [65]. The synergistic application of IoT and edge computing technologies is driving the development of intelligent pig farm robots toward seamless multi-terminal collaboration, millisecond-level response, and adaptive decision-making. The application of these technologies in the context of smart pig farming robots is shown in Figure 5.
IoT technology uses a distributed networking architecture to build a high-precision sensing layer. It is based on a monitoring network of microcontrollers and low-power wireless communication protocols, enabling the real-time acquisition of multi-parameter data [66]. Meanwhile, the IoT adopts a "1 + N" networking mode, supports reliable access for large numbers of terminal devices, and optimizes data transmission efficiency through lightweight communication protocols [67,68]. In terms of collaborative device management, the modular design of the IoT allows flexible configuration of terminal nodes and communication protocols, while a tiered local caching mechanism maintains basic data storage in the event of network anomalies, ensuring system robustness.
Edge computing significantly enhances the real-time performance of monitoring systems through localized processing. Its optimization strategy involves filtering, compressing, and extracting features from the data locally, thereby reducing network transmission latency. In terms of computing performance, improved algorithms deployed on edge devices can increase the frame rate of visual analysis and reduce uplink bandwidth consumption through data filtering strategies [69]. Furthermore, through model compression and dynamic resource allocation, edge computing can improve recognition accuracy and enable real-time decision-making in closed scenarios, supporting efficient closed-loop control.
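The local filtering strategy described above can be illustrated with a small sketch that smooths a noisy gas reading and forwards it upstream only when it moves beyond a dead-band, cutting uplink traffic compared with streaming every sample. The window size, threshold, and NH3 values are arbitrary example figures.

```python
from collections import deque

class EdgeFilter:
    """Edge-side pre-processing: smooth a noisy reading and upload only on events."""
    def __init__(self, window=10, threshold=2.0):
        self.buf = deque(maxlen=window)
        self.threshold = threshold
        self.last_uploaded = None

    def process(self, reading):
        self.buf.append(reading)
        smoothed = sum(self.buf) / len(self.buf)   # moving-average filter
        # Upload only when the smoothed value leaves the dead-band around
        # the last uploaded value; otherwise suppress the sample locally.
        if self.last_uploaded is None or abs(smoothed - self.last_uploaded) >= self.threshold:
            self.last_uploaded = smoothed
            return smoothed
        return None

f = EdgeFilter(window=5, threshold=1.0)
for nh3_ppm in [8.0, 8.2, 8.1, 9.5, 11.0, 11.2, 11.1]:
    out = f.process(nh3_ppm)
    if out is not None:
        print(f"upload smoothed NH3 = {out:.2f} ppm")
```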

3.4.2. Human–Robot Interaction Technology

Human–robot interaction technology is key to enabling managers to efficiently monitor, command, and intervene in robot swarms. Its core is to establish a two-way information channel with low latency and high reliability that can adapt to context. The key technologies fall into two main categories: remote two-way control systems and natural multimodal interaction between humans and robots.
The first key technology is remote bilateral teleoperation, in which most human–robot interaction systems establish real-time data links through WebRTC [70] or proprietary protocols. The uplink transmits robot positioning, multi-sensor data, and system status, which are preprocessed at edge nodes to reduce bandwidth consumption, while the downlink transmits path settings, emergency stops, and other control commands. To cope with unstable network conditions in pig farms, adaptive bitrate transmission and local caching are used to ensure priority transmission of critical instructions and connection stability. The second key technology is multimodal natural interaction. For voice, the system employs an offline speech recognition engine, optimizes the acoustic model for the low-frequency noise of pig farm environments, and incorporates a domain-specific instruction set to achieve high-accuracy directional speech recognition. For vision, the system (Figure 6) uses AR technology to overlay analysis results onto real-world scenes, enhancing the visualization of key information [71]. The collaborative mechanism between voice and vision allows operators to combine gestures with voice commands to issue complex tasks, significantly improving the intuitiveness of interaction and operational efficiency.
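The downlink side of such a remote control channel can be sketched with the Python websockets package as below. The endpoint URL, message schema, and timeout are hypothetical, and a production system would typically run over WebRTC or a proprietary protocol as noted above; the sketch only illustrates bounded-latency command delivery with acknowledgement.

```python
import asyncio
import json
import websockets  # third-party "websockets" package

async def send_command(uri, command):
    """Send one JSON control command and wait for the robot's acknowledgement."""
    async with websockets.connect(uri) as ws:
        await ws.send(json.dumps(command))
        ack = await asyncio.wait_for(ws.recv(), timeout=2.0)  # keep downlink latency bounded
        return json.loads(ack)

# Hypothetical endpoint and message schema, for illustration only.
cmd = {"robot_id": "inspect-01", "type": "goto", "waypoint": [12.5, 3.0], "priority": "high"}
# asyncio.run(send_command("ws://edge-gateway.local:8765/control", cmd))
```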
The human–robot interaction technology of intelligent pig farming robots is developing from single command transmission towards context-aware, multimodal intelligent collaboration. With the deployment of lightweight large models on edge devices, these systems will better comprehend managers' ambiguous intentions and contextual cues, ultimately achieving truly efficient and natural human–robot cooperative operation.

3.5. Integrated Application Layer

Intelligent Pig Farm Robot System Integration

Intelligent pig farm robot system integration refers to the deep integration of various capabilities required by robots through standardized interfaces and collaborative control architecture, thereby forming the core technological foundation for unmanned operation of pig farms. The system can autonomously perform tasks such as inspection, cleaning, feeding, and health monitoring. Its core purpose is to integrate different technologies and achieve overall optimization through cross-level collaboration.
At the technical implementation level, system integration must address three key challenges:
  • Real-time coordinated control of multiple modules: Leveraging the robot operating system to dynamically schedule tasks for robotic arms, mobile platforms, and environmental sensors, ensuring rapid response and stability in complex scenarios;
  • Heterogeneous data fusion and decision optimization: By collecting pig growth data through multiple sensors, utilizing AI and CV technology for body condition assessment, and leveraging cloud-based historical records, a closed-loop decision-making model is established for actions such as feed rationing or health interventions (a minimal illustrative sketch follows this list);
  • Lightweight human–machine interaction design: Develop an intuitive and user-friendly visual interface enabling livestock personnel to effortlessly monitor pig conditions in real time, promptly receive anomaly alerts, and efficiently adjust task priorities.
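As flagged in the second item above, a closed-loop decision model can be reduced, in its simplest rule-based form, to a function that maps body condition and environment to a ration adjustment. Every threshold and coefficient below is a placeholder chosen only to illustrate the control-loop shape; none of them is a validated nutritional recommendation.

```python
def daily_ration_kg(weight_kg, body_condition_score, temperature_c):
    """Toy closed-loop ration rule; all numbers are illustrative placeholders,
    not validated nutritional recommendations."""
    base = 0.04 * weight_kg            # placeholder: roughly 4% of body weight
    if body_condition_score < 3.0:     # under-conditioned -> feed slightly more
        base *= 1.10
    elif body_condition_score > 3.5:   # over-conditioned -> feed slightly less
        base *= 0.90
    if temperature_c > 28.0:           # heat stress tends to reduce intake
        base *= 0.95
    return round(base, 2)

# Example: body condition assessed by the vision module, temperature from the IoT layer.
print(daily_ration_kg(weight_kg=85, body_condition_score=2.8, temperature_c=30))
```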
This integration process distinctly demonstrates all the engineering capabilities of the robot in the livestock industry scenario and also provides systematic support for improving reliability and operational efficiency in subsequent larger-scale applications.

4. Research Progress

Intelligent pig farm robots refer to intelligent equipment that integrates technologies such as artificial intelligence, the Internet of Things, sensors, and automated control. These devices can perform functions such as environmental monitoring, cleaning and disinfection, and precise feeding in pig farms. Their core objective is to enhance production efficiency, reduce labor costs, ensure animal welfare, and minimize environmental pollution. To more clearly demonstrate the functions and technologies of intelligent pig farm robots in different application scenarios, this section first categorizes pig farm robot systems into inspection robots, feeding robots, cleaning robots, and other specialized robots, such as automated semen delivery robots and bionic robots. Each robot system employs its own technical solutions to meet the different needs of pig farms, achieving intelligent management and automated operation. The functions and technical comparisons of robots in smart pig farms are shown in Table 5.

4.1. Inspection Robots

As a core piece of equipment for intelligent pig farm management, inspection robots have undergone notable technological development, moving from single-function environmental monitoring to multimodal intelligent perception and autonomous decision-making. Representative intelligent inspection robots for pig farms are summarized in Table 6. By systematically reviewing representative research findings, their key technological routes and breakthroughs can be summarized at the following three levels:
At the perception layer, the technology has evolved from single-sensor data collection to multimodal fusion. Early studies used a single ultrasonic sensor to automate the measurement of growth indicators such as backfat thickness, which initially verified the feasibility of precise sensing. With the development of the technology, the introduction of an eight-in-one environmental sensor array enabled synchronous acquisition of parameters such as temperature, humidity, NH3, and CO2 across the vertical space of the barn [72], laying the foundation for precise monitoring of the pig house environment. However, most studies are limited to short-term or controlled experiments, and evidence on the long-term performance of sensors onboard inspection robots in typical commercial pig farm environments remains limited, particularly under high humidity, heavy dust, corrosive gases, and frequent cleaning. Table 7 summarizes quantitative measurements of sensor degradation under these harsh conditions, including humidity drift, systematic underestimation by particulate matter sensors, and reduced lifespans of temperature and ammonia sensors. The findings indicate that sensor drift, systematic bias, and accelerated wear can restrict the reliability of inspection robots during long-term deployment. These results highlight the need for enhanced sensor protection and the integration of fault-tolerant, robust data fusion strategies to improve the sustained operational stability of inspection robots.
Meanwhile, the application of edge computing and lightweight AI algorithms has become a core breakthrough in decision-making and processing. Most research adopts a "cloud–edge–device" collaborative architecture, enabling the offloading of computationally intensive tasks to edge nodes. For example, YOLOv5/YOLOv8-based object detection models combined with Kalman filtering have achieved high-accuracy, low-latency pig counting and posture recognition on embedded platforms such as the Jetson Nano and STM32 [75,76]. These approaches primarily demonstrate the feasibility of achieving real-time responsiveness, reducing network dependency, and controlling hardware costs in constrained scenarios, although they do not yet fully resolve these bottlenecks at commercial scale.
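To show how detection outputs are typically smoothed before counting or posture statistics, the following is a minimal constant-velocity Kalman filter over detected centroids written with NumPy. It is a generic sketch, not the pipeline implemented in [75,76]; the noise parameters and example centroids are assumptions.

```python
import numpy as np

class ConstantVelocityKF:
    """Minimal 2-D constant-velocity Kalman filter for smoothing detected centroids."""
    def __init__(self, dt=0.1, process_var=1.0, meas_var=4.0):
        self.x = np.zeros(4)                        # state: [x, y, vx, vy]
        self.P = np.eye(4) * 100.0                  # large initial uncertainty
        self.F = np.eye(4); self.F[0, 2] = self.F[1, 3] = dt
        self.H = np.zeros((2, 4)); self.H[0, 0] = self.H[1, 1] = 1.0
        self.Q = np.eye(4) * process_var
        self.R = np.eye(2) * meas_var

    def step(self, z):
        # Predict with the constant-velocity motion model
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update with the detector's centroid measurement z = [x, y]
        y = np.asarray(z, dtype=float) - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]                           # smoothed position

kf = ConstantVelocityKF()
for z in [(100, 50), (103, 52), (101, 55), (106, 57)]:   # noisy pig centroids (px)
    print(kf.step(z).round(1))
```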
In terms of pig farm robotic system architecture, modular design and functional integration have become the main development trends. Modern inspection robots are no longer limited to single tasks: through modular design, they integrate environmental monitoring, health diagnosis, behavior analysis, and interfaces to feeding and disinfection systems in one unit [74]. This closed-loop "perception–decision–execution" mechanism reflects the evolution of inspection robots from mobile data acquisition terminals into intelligent platforms that combine monitoring, analysis, and adaptive control.
Based on the above, the technological evolution of inspection robots clearly follows the trajectory of "perceptual fusion, edge-intelligent decision-making, and functional integration". Core breakthroughs lie in overcoming the limitations of single-function designs through multi-sensor fusion and lightweight AI algorithm deployment, enabling all-weather dynamic perception of individual animals and their surroundings, together with autonomous decision-making, in complex farming environments. However, the overall maturity of inspection robot technologies remains constrained: scalability to large commercial pig farms, long-term operational stability, and adaptability to variable environmental conditions have not yet been fully validated. While current robots demonstrate feasibility in controlled or small-scale deployments, further research is required to ensure reliable performance at commercial scale.

4.2. Cleaning Robots

Currently, the navigation technology of pig farm cleaning robots has completed the transition from fixed-path guidance to autonomous environmental perception. As shown in Table 8, LiDAR SLAM has become the mainstream solution; it is generally combined with multiple sensors, including ultrasonic and infrared sensors, to form a composite obstacle avoidance system, which significantly enhances robot mobility and operational robustness in complex pen and corridor environments. The cleaning actuator has been specially optimized for the task of feces removal and disinfection, integrating functional modules such as high-pressure spraying, variable-rate spraying, and mechanical scraping.
Nevertheless, pig farm cleaning robots still face several common bottlenecks: the damp, dusty, and ammonia-laden environment in manure channels can degrade sensor accuracy; cleaning efficiency and water consumption for stubborn dirt remain unsatisfactory; and manufacturing costs and long-term system stability further restrict large-scale deployment. In addition, narrow passages such as corridors and gates impose higher demands on cleaning robot mobility, and failures in these constrained spaces often manifest as safety stops, collisions, or timeouts. Although the data in Table 9 are drawn from studies in other robotic navigation contexts, the quantitative evidence on success and failure rates in narrow passages can be transferred to pig farm cleaning robots, providing useful reference for evaluating perception, path planning, and recovery strategies in similarly confined farm environments.
To tackle these challenges, the industry is focusing on three key upgrades. First, for navigation, LiDAR SLAM combined with multi-sensor fusion enhances perception robustness in harsh environments. Second, the development of cleaning robots for pig farms will emphasize intelligence and systematization: through multimodal perception fusion and AI-based decision-making, robots will be upgraded from single devices to multifunctional mobile platforms. Finally, integration with IoT systems will enable cluster scheduling and digital operation and maintenance, thereby supporting the construction of unmanned, intelligent pig farms.

4.3. Feeding Robots

Feeding robots are driving the transformation of traditional farming methods towards precision and automation. By integrating the Internet of Things, computer vision, and automated control technologies, these intelligent robots have progressed from group feeding to precise individual nutritional regulation. In modern pig farms, feeding robots primarily perform three core functions: precise feeding, feed intake monitoring, and nutritional management. Technically, these functions rely on multi-sensor fusion, autonomous navigation, and intelligent decision-making.
A smart feeding solution based on miniaturized TMR mixed feeding technology and a multimodal environmental perception system (LiDAR and RFID) was proposed in [92], as shown in Figure 7, achieving adaptive coverage for farms of varying scales through differentiated feeding management and an intelligent platform architecture. In [93], an approach using UWB high-precision positioning and a dynamic look-ahead distance PID control algorithm significantly improved the navigation accuracy of feed-pushing robots in complex barn environments compared to traditional methods, offering a new paradigm for high-precision autonomous navigation in livestock robotics. These technological innovations provide important references for the development of feeding robots in animal husbandry. However, while feeding robots are widely applied in ruminant operations such as dairy and goat farming, comparable technological breakthroughs have not yet been achieved in automated feeding systems for fattening pigs or precision feeding systems for sows in large-scale pig farms. In the field of mobile feeding robots, leading international companies have made significant progress. For example, Lely's Vector self-propelled feeding robot is equipped with feed height detection and dynamic formula adjustment functions, while the ARANOM robot from Hetwin (Langkampfen, Austria) uses magnetic guidance and battery power for trackless mixed feeding [94]. These advancements provide valuable reference for the development of feeding robots for pig farms.
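Look-ahead-based path tracking of the kind referenced for feed-pushing robots can be illustrated with a plain pure-pursuit curvature rule, sketched below. The alley coordinates and look-ahead distance are assumptions for the example, and this is not the UWB-based dynamic look-ahead PID controller of [93]; it only shows the generic geometry behind look-ahead tracking, where a larger look-ahead yields smoother but less reactive paths.

```python
import math

def pure_pursuit_curvature(pose, path, lookahead):
    """Return the curvature command steering a robot toward the first path point
    at least `lookahead` metres away. pose = (x, y, heading_rad)."""
    x, y, theta = pose
    target = None
    for px, py in path:
        if math.hypot(px - x, py - y) >= lookahead:
            target = (px, py)
            break
    if target is None:
        target = path[-1]                      # near the end of the path
    # Transform the target point into the robot frame
    dx, dy = target[0] - x, target[1] - y
    lx = math.cos(-theta) * dx - math.sin(-theta) * dy
    ly = math.sin(-theta) * dx + math.cos(-theta) * dy
    ld = math.hypot(lx, ly)
    return 2.0 * ly / (ld ** 2)                # pure-pursuit curvature = 2*sin(alpha)/Ld

# Hypothetical feed-alley centre line (m); positive curvature turns back toward the line.
alley = [(i * 0.5, 0.0) for i in range(20)]
print(pure_pursuit_curvature((0.0, -0.4, 0.0), alley, lookahead=1.0))
```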
In addition, a critical challenge remains in practical pig farm operations: ensuring the closed-loop reliability of “correct individual → correct dose” under group-housing conditions. Current studies report only limited direct indicators, such as delivery hit rate or mis-/miss-feeding rate, making it difficult to comprehensively assess robot performance in real-world scenarios. To address this gap, Table 10 summarizes existing evidence, combining both direct failure statistics and proxy indicators to provide a structured basis for evaluating this key bottleneck.

4.4. Other Production Management Robots in Swine Farming

4.4.1. Automated Semen Delivery Robots

Traditional manual insemination relies on skilled technicians, but this method has relatively low operational efficiency and is particularly susceptible to human factors. Automatic insemination robots improve insemination success rate and sow conception rate by precisely controlling the insemination process. For example, in reference [98], an electronic identification system recorded the frequency and duration of sow visits to boars. By combining a dynamic formula (BVI) and time segmentation data to optimize the threshold, automatic estrus detection of group-housed sows was realized, significantly reducing the frequency of manual inspection. Muyuan Foods Co., Ltd. (Nanyang, China) proposed an automatic insemination robot, which includes a gripping mechanism, an insemination mechanism and a drive mechanism. The gripping mechanism is fixed to the sow’s tail, and the drive mechanism adjusts the position of the insemination mechanism to align it with the sow’s ovary. The control system coordinates all components to complete automatic insemination [99]. Due to significant individual differences among sows, the adaptability of this robot still needs further optimization. By integrating artificial intelligence and flexible robot technology, automatic insemination robots are expected to achieve higher accuracy and wider application in the future.

4.4.2. Bionic Robots

The application of bionic robots in smart pig farms is gradually demonstrating its unique advantages. Traditional equipment used to raise livestock often lacks biocompatibility and may cause stress responses in animals. By mimicking animal morphology and behavioral characteristics, bionic robots interact with pig herds more naturally, reducing stress and enhancing management efficiency. In one study [100], a bionic boar device simulated sensory stimuli from real boars. It combined computer vision and deep learning models such as SAE to analyze the interaction between sows, ultimately achieving efficient and contactless automatic estrus detection. However, challenges remain for large-scale deployment due to limitations in motion stability under complex environments and cost factors. With advancements in biomimetic materials and swarm intelligence algorithms, future biomimetic robots are expected to achieve more realistic biological simulations and more efficient swarm management capabilities. The device is shown in Figure 8.

4.4.3. Automated Pig Slaughtering Robots

Traditional pig slaughtering relies mainly on manual labor, which suffers from low efficiency, heavy physical workload, and uneven hygiene standards. Pig slaughtering robots use automation and intelligent technology to carry out precise operations, which can effectively improve production efficiency and meat quality. For example, the German company WESTFLEISCH (Münster, Germany) uses KUKA industrial robots to perform slaughtering tasks: through a three-dimensional laser measurement system, the robots accurately locate the pig carcass and complete automatic splitting and cutting operations, significantly improving slaughtering efficiency and product consistency [101]. At present, such robots must still cope with the complex physiological structure of pigs and the high hygiene standards required in practical applications. For example, different breeds and sizes of pigs require customized parameter settings during the stunning and bleeding processes, which places higher demands on the robot's intelligent recognition and decision-making capabilities. In the future, continued technological development is expected to enable slaughtering robots to perform more precise and personalized operations.

5. Discussion

5.1. Limitations in Perception and Recognition Capabilities

In smart pig farms, robots must accurately identify the characteristics and behavioral status of each pig, as well as the various facilities within the farm. Current robot perception and recognition technologies still have limitations, primarily because pigs look very similar and their body shape, coat color, and other characteristics change during rearing, making accurate identification difficult. Existing image recognition methods suffer a significant decline in accuracy when pigs occlude one another or when lighting conditions vary. Existing algorithms also struggle to make timely and accurate judgments about pig behavior, such as disease or estrus identification. This limits the application of robots in precision farming management, preventing them from promptly detecting abnormal conditions in pigs and from taking effective intervention measures.
Future research should focus on optimizing multimodal sensor fusion and deep learning algorithms and could introduce adaptive lighting compensation algorithms and anti-occlusion recognition technology. This technology can significantly improve the robot’s recognition accuracy in harsh environments, providing reliable data support for precise feeding and health monitoring.

5.2. Multi-Robot Coordination Challenges

In large-scale intelligent pig farms, multiple robots often need to collaborate to complete tasks. However, multi-robot collaborative operations currently face many challenges. On the one hand, the robots’ communication systems are easily interfered with in the complex electromagnetic environment of pig farms, leading to poor information transmission and affecting the efficiency of collaborative operations. On the other hand, how to rationally allocate tasks and plan paths to avoid collisions and task conflicts between robots is also an urgent problem to be solved. Current algorithms are not yet mature enough in multi-robot collaborative path planning and task allocation, which easily leads to task duplication or omission, and cannot fully utilize the advantages of multiple robots.
To meet the practical needs of large-scale pig farms, an efficient communication and task allocation mechanism is required. Research can leverage edge computing technology to construct a low-latency, highly reliable multi-robot collaborative framework. For example, distributed reinforcement learning algorithms can be used to optimize path planning and prevent task conflicts. Simultaneously, a dynamic priority scheduling system can be designed to ensure that robots can respond quickly and work collaboratively in the event of unexpected situations.

5.3. Inadequate Adaptation to Complex Environments

Pig farms have complex environments, often characterized by high dust levels, unpleasant odors, and high humidity. These adverse conditions can affect the performance of robot hardware and sensors. For example, high humidity can cause electronic components to become damp or even short-circuit. Dust can easily obscure sensor lenses or block ventilation openings, interfering with normal operation. In addition, pig farm floors often have obstacles such as feces and stagnant water, which also places higher demands on the robot’s mobility. Current robots need to be improved in terms of maneuverability and anti-slip properties, as they are prone to slipping or getting stuck, making it impossible to complete tasks such as inspection and feeding.
Future research and development should focus on corrosion-resistant, moisture-proof, and dust-proof hardware. Magnesium-aluminum alloys and fiber-reinforced composite materials can be used to improve the structural strength of robots. New mobile chassis designs, such as omnidirectional wheels and track composite structures, should be developed to address the issues of slippery floors and obstacles in pigsties.

5.4. Cost-Benefit Balance

The cost of developing and manufacturing intelligent pig farming robots is relatively high precisely because of the advanced technologies involved, such as sensors, control systems, and mechanical structures. At the same time, the maintenance costs of these robots are also considerable, including hardware replacement and software upgrades. These factors put the technology beyond the reach of many small and medium-sized pig farms. Furthermore, the current efficiency and service life of these robots still cannot fully meet the actual needs of pig farms, resulting in a relatively low return on investment. Some robots experience a decline in accuracy and an increased probability of malfunctions after long-term use, further driving up maintenance costs. Together, these factors make pig farms hesitant to adopt robots, hindering the widespread application of intelligent pig farming robots.
Achieving widespread adoption requires a multi-dimensional strategy centered on structural optimization and financial innovation. Manufacturers can significantly reduce costs by adopting modular architectures and integrating domestically produced core components, thereby decreasing reliance on expensive imported hardware. This technological optimization should be coupled with the deployment of computationally efficient, lightweight AI models to lower the hardware requirements of edge computing devices, thus reducing the barriers to entry for data-driven smart farming. To alleviate the financial pressure on small-scale farming operations, the industry must shift towards flexible service-oriented business models, such as equipment leasing or “robotics as a service” models. These models shift the financial burden from capital expenditures to operating expenditures, ensuring the technology’s widespread adoption within the industry.

6. Conclusions

In summary, the future development of intelligent pig farm robots should focus on technological breakthroughs while also considering cost and practicality. By combining technological achievements from different fields, we can jointly promote the upgrading of animal husbandry towards intelligence and precision. With the deep integration of artificial intelligence, the Internet of Things, and robotics, the application of intelligent pig farming robots in pig farms will play an increasingly important role in improving breeding efficiency, ensuring animal welfare, and achieving sustainable development.

Author Contributions

Conceptualization, L.Z. and L.H.; methodology, L.Z. and L.H.; investigation, L.Z. and L.H.; writing—original draft preparation, L.Z. and L.H.; writing—review and editing, L.Z., L.H., Y.X., H.Q., A.B. and Z.C.; visualization, L.H.; project administration, L.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Data Availability Statement

No new data were created or analyzed during this study.

Acknowledgments

The authors acknowledge Nanjing Agricultural University for its support of this research.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. OECD. Meat Consumption (Indicator). Available online: https://www.oecd.org/en/data/indicators/meat-consumption (accessed on 12 September 2025).
  2. Wang, T.; Chen, B.; Zhang, Z.; Li, H.; Zhang, M. Applications of machine vision in agricultural robot navigation: A review. Comput. Electron. Agric. 2022, 198, 107085. [Google Scholar] [CrossRef]
  3. Klerkx, L.; Jakku, E.; Labarthe, P. A review of social science on digital agriculture, smart farming and agriculture 4.0: New contributions and a future research agenda. NJAS-Wagening. J. Life Sci. 2019, 90, 100315. [Google Scholar] [CrossRef]
  4. Ariza-Sentís, M.; Vélez, S.; Martínez-Peña, R.; Baja, H.; Valente, J. Object detection and tracking in Precision Farming: A systematic review. Comput. Electron. Agric. 2024, 219, 108757. [Google Scholar] [CrossRef]
  5. Bao, J.; Xie, Q. Artificial intelligence in animal farming: A systematic literature review. J. Clean. Prod. 2022, 331, 129956. [Google Scholar] [CrossRef]
  6. Deqin, X.; Wu, G.; Huang, Y.; Feng, J.; Tan, Z.; Zhang, B.; Wang, H.; Yang, Q. Control Method and Equipment for Livestock and Poultry Health Inspection Robot for Multi-Index Collection. U.S. Patent 18,751,244, 22 June 2025. [Google Scholar]
  7. Guo, Y.; Yu, X.; Zhang, W.; Chen, C.; Yu, L. Structural design, modeling and simulation analysis of a cage broiler inspection robot. J. Agric. Eng. 2025, 56, 1806. [Google Scholar] [CrossRef]
  8. Detraversay, M.; Dreier, J.-J.; Karwacki, S.; Klaas, I. Method and Control Circuitry for Operating an Autonomous Feed Robot at a Feed Table in a Livestock Area. U.S. Patent 17,786,889, 11 December 2020. [Google Scholar]
  9. Nabokov, V.; Novopashin, L.; Denyozhko, L.; Sadov, A.; Ziablitckaia, N.; Volkova, S.; Speshilova, I. Applications of feed pusher robots on cattle farmings and its economic efficiency. Int. Trans. J. Eng. Manag. Appl. Sci. Technol. 2020, 11, 1–7. [Google Scholar] [CrossRef]
  10. Deng, H.; Zhang, T.; Li, K.; Yang, J. Visual navigation of caged chicken coop inspection robot based on road features. Animals 2024, 14, 2515. [Google Scholar] [CrossRef] [PubMed]
  11. Zhou, J.; Liu, L.; Jiang, T.; Tian, H.; Shen, M.; Liu, L. A novel behavior detection method for sows and piglets during lactation based on an inspection robot. Comput. Electron. Agric. 2024, 227, 109613. [Google Scholar] [CrossRef]
  12. Guo, H.; Miao, Z.; Ji, J.; Pan, Q. An effective collaboration evolutionary algorithm for multi-robot task allocation and scheduling in a smart farm. Knowl.-Based Syst. 2024, 289, 111474. [Google Scholar]
  13. Nanmu. Available online: https://www.nmjx.com.cn/ (accessed on 23 December 2025).
  14. Zhou, Y.; Hao, W.; Renli, Q.; Bin, H.; Xuemin, P.; Yaqiong, Z.; Zuohua, L. Research and application progress on intelligent cleaning robots in pigsties. Trans. Chin. Soc. Agric. Eng. 2025, 41, 1–11. Available online: https://link.cnki.net/urlid/11.2047.S.20250121.1857.015 (accessed on 25 January 2026).
  15. Envirologic. Available online: https://www.envirologic.se/en/washing-robot-for-pig-barns/ (accessed on 23 December 2025).
  16. Lokhorst, K.; Norton, T.; van Henten, E.; Edan, Y. (Eds.) Advances in the Use of Robotics in Livestock Production; Burleigh Dodds Science Publishing: Cambridge, UK, 2024. [Google Scholar] [CrossRef]
  17. Wang, S.; Jiang, H.; Qiao, Y.; Jiang, S.; Lin, H.; Sun, Q. The research progress of vision-based artificial intelligence in smart pig farming. Sensors 2022, 22, 6541. [Google Scholar] [CrossRef] [PubMed]
  18. Wang, L.; Li, D. Current status, challenges and prospects for pig production in Asia. Anim. Biosci. 2024, 37, 742. [Google Scholar] [CrossRef]
  19. Hasan, M.K.; Mun, H.-S.; Ampode, K.M.B.; Lagua, E.B.; Park, H.-R.; Kim, Y.-H.; Sharifuzzaman, M.; Yang, C.-J. Transformation toward precision large-scale operations for sustainable farming: A review based on China’s pig industry. J. Adv. Vet. Anim. Res. 2024, 11, 1076. [Google Scholar] [CrossRef]
  20. Tao, B.; Zhao, X.; Ding, H. Mobile-robotic machining for large complex components: A review study. Sci. China Technol. Sci. 2019, 62, 1388–1400. [Google Scholar] [CrossRef]
  21. Hu, Y.; Hu, K.; Wu, J.; Chen, X.; Huang, Z. A review of rigid-flexible coupled robots. J. Nanjing Univ. Inf. Sci. Technol. (Nat. Sci. Ed.) 2022, 14, 304–316. [Google Scholar] [CrossRef]
  22. Jing, Z.; Jun, Z.; Yi, X.; Aiguo, S. Design of a search and rescue robot with crab-inspired rigid-flexible coupling mechanisms. Chin. J. Sci. Instrum. 2023, 44, 11–20. [Google Scholar] [CrossRef]
  23. Xu, F.; Jiang, Q.; Jiang, F.; Shen, J.; Wang, X.; Jiang, G. Design and Testing of a Soft Robot with Variable Stiffness Based on Jamming Principles. J. Mech. Eng. 2021, 56, 67–77. [Google Scholar]
  24. Wang, Z.; Meng, Y.; Bing, L. Research on a New Type of Multifooted Robot. Intern. Combust. Engine Parts 2018, 8, 14–16. [Google Scholar] [CrossRef]
  25. Bhadani, S.; Dillikar, S.R.; Pradhan, O.N.; Cotrina de los Mozos, I.; Felicetti, L.; Upadhyay, S.; Tang, G. A ROS-Based Simulation and Control Framework for In-Orbit Multi-Arm Robot Assembly Operations. In Proceedings of the ASTRA 2023: 17th Symposium on Advanced Space Technologies in Robotics and Automation, Leiden, The Netherlands, 18–20 October 2023; Available online: https://dspace.lib.cranfield.ac.uk/handle/1826/20622 (accessed on 25 January 2026).
  26. Yuan, W.; Dong, S.; Adelson, E.H. Gelsight: High-resolution robot tactile sensors for estimating geometry and force. Sensors 2017, 17, 2762. [Google Scholar] [CrossRef] [PubMed]
  27. Cheney, M.; Isaacson, D.; Newell, J.C. Electrical impedance tomography. SIAM Rev. 1999, 41, 85–101. [Google Scholar] [CrossRef]
  28. Zhao, Z.; Li, W.; Li, Y.; Liu, T.; Li, B.; Wang, M.; Du, K.; Liu, H.; Zhu, Y.; Wang, Q. Embedding high-resolution touch across robotic hands enables adaptive human-like grasping. Nat. Mach. Intell. 2025, 7, 889–900. [Google Scholar] [CrossRef]
  29. Quigley, M.; Batra, S.; Gould, S.; Klingbeil, E.; Le, Q.; Wellman, A.; Ng, A.Y. High-accuracy 3D sensing for mobile manipulation: Improving object detection and door opening. In Proceedings of the 2009 IEEE International Conference on Robotics and Automation, Kobe, Japan, 12–17 May 2009; pp. 2816–2822. [Google Scholar]
  30. Hatakeyama, K.; Okubo, Y.; Nakagome, T.; Makino, M.; Takashima, H.; Akutsu, T.; Sawamoto, T.; Nagase, M.; Noguchi, T.; Kawahito, S. A hybrid ToF image sensor for long-range 3D depth measurement under high ambient light conditions. IEEE J. Solid-State Circuits 2023, 58, 983–992. [Google Scholar] [CrossRef]
  31. Zhang, M.; Zhang, Q.; Meng, Z.; Shi, Z.; Guan, Y. A Survey of Research on Real-Time Dual-OS Architecture for Embedded Platform. Acta Electron. Sin. 2018, 46, 11. [Google Scholar] [CrossRef]
  32. Wei, H.; Shao, Z.; Huang, Z.; Chen, R.; Guan, Y.; Tan, J.; Shao, Z. RT-ROS: A real-time ROS architecture on multi-core processors. Future Gener. Comput. Syst. 2016, 56, 171–178. [Google Scholar] [CrossRef]
  33. Shi, X.; Chen, Z.; Wang, H.; Yeung, D.-Y.; Wong, W.-K.; Woo, W.-C. Convolutional LSTM Network: A Machine Learning Approach for Precipitation Nowcasting. 2015. Available online: https://proceedings.neurips.cc/paper_files/paper/2015/file/07563a3fe3bbe7e3ba84431ad9d055af-Paper.pdf (accessed on 7 January 2026).
  34. Ordóñez, F.J.; Roggen, D. Deep convolutional and lstm recurrent neural networks for multimodal wearable activity recognition. Sensors 2016, 16, 115. [Google Scholar] [CrossRef]
  35. Chen, H.; Ying, L.; Yang, M.; Zhang, S.; Lin, J. Multimodal recognition of pig behavior using vision and sensors. Trans. Chin. Soc. Agric. Eng. 2025, 41, 194–203. [Google Scholar] [CrossRef]
  36. Yang, R.; Zhang, W.; Tiwari, N.; Yan, H.; Li, T.; Cheng, H. Multimodal sensors with decoupled sensing mechanisms. Adv. Sci. 2022, 9, 2202470. [Google Scholar] [CrossRef] [PubMed]
  37. He, J.; Yang, W.; Liu, T.; Zhuang, J. Research on pig behavior recognition based on the combination of wearable sensors. J. Chin. Agric. Mech. 2025, 46, 42–49. [Google Scholar] [CrossRef]
  38. Goodfellow, I.J.; Pouget-Abadie, J.; Mirza, M.; Xu, B.; Warde-Farley, D.; Ozair, S.; Courville, A.; Bengio, Y. Generative Adversarial Nets. 2014. Available online: https://proceedings.neurips.cc/paper_files/paper/2014/file/f033ed80deb0234979a61f95710dbe25-Paper.pdf (accessed on 7 January 2026).
  39. Dosovitskiy, A.; Beyer, L.; Kolesnikov, A.; Weissenborn, D.; Zhai, X.; Unterthiner, T.; Dehghani, M.; Minderer, M.; Heigold, G.; Gelly, S. An image is worth 16x16 words: Transformers for image recognition at scale. arXiv 2020, arXiv:2010.11929. Available online: https://files.ryancopley.com/Papers/2010.11929v2.pdf (accessed on 7 January 2026).
  40. Dhanya, V.; Subeesh, A.; Kushwaha, N.; Vishwakarma, D.K.; Kumar, T.N.; Ritika, G.; Singh, A. Deep learning based computer vision approaches for smart agricultural applications. Artif. Intell. Agric. 2022, 6, 211–229. [Google Scholar] [CrossRef]
  41. Cheeseman, P.; Smith, R.; Self, M. A stochastic map for uncertain spatial relationships. In Proceedings of the 4th International Symposium on Robotic Research, Santa Cruz, CA, USA, 9–14 August 1987; pp. 467–474. Available online: https://www.academia.edu/24059684/A_Stochastic_Map_For_Uncertain_Spatial_Relationships (accessed on 7 January 2026).
  42. Compte, A.; Yan, Y.; Cortés, X.; Escalera, S.; Jacques-Junior, J.C. Housed pig identification and tracking for precision livestock farming. Expert Syst. Appl. 2025, 293, 128466. [Google Scholar] [CrossRef]
  43. Teng, G.; Ji, H.; Zhuang, Y.; Liu, M. Research progress of deep learning in the process of pig feeding. Trans. Chin. Soc. Agric. Eng. 2022, 38, 235–249. [Google Scholar] [CrossRef]
  44. Wu, S.; Bao, Y.; Chen, G.; Chen, Q. Contactless Identification System for Pig Behavior Based on Machine Vision. Comput. Syst. Appl. 2020, 29, 113–117. [Google Scholar] [CrossRef]
  45. Mattina, M.; Benzinou, A.; Nasreddine, K.; Richard, F. An efficient center-based method for real-time pig posture recognition and tracking. Appl. Intell. 2024, 54, 5183–5196. [Google Scholar] [CrossRef]
  46. Li, J.; Green-Miller, A.R.; Hu, X.; Lucic, A.; Mohan, M.M.; Dilger, R.N.; Condotta, I.C.; Aldridge, B.; Hart, J.M.; Ahuja, N. Barriers to computer vision applications in pig production facilities. Comput. Electron. Agric. 2022, 200, 107227. [Google Scholar] [CrossRef]
  47. Du, X.; Li, X.; Fan, S.; Yan, Z.; Ding, X.; Yang, J.; Zhang, L. A Review of the Methods of Pig Body Size Measurement and Body Weight Estimation. Chin. J. Anim. Sci. 2023, 59, 41–46. [Google Scholar] [CrossRef]
  48. Campos, C.; Elvira, R.; Rodríguez, J.J.G.; Montiel, J.M.; Tardós, J.D. Orb-slam3: An accurate open-source library for visual, visual–inertial, and multimap slam. IEEE Trans. Robot. 2021, 37, 1874–1890. [Google Scholar] [CrossRef]
  49. Zhang, J.; Singh, S. LOAM: Lidar odometry and mapping in real-time. In Proceedings of the Robotics: Science and Systems, Berkeley, CA, USA, 12–16 July 2014; pp. 1–9. [Google Scholar]
  50. Shan, T.; Englot, B. Lego-loam: Lightweight and Ground-Optimized Lidar Odometry and Mapping on Variable Terrain. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018. [Google Scholar]
  51. Xu, W.; Cai, Y.; He, D.; Lin, J.; Zhang, F. Fast-lio2: Fast direct lidar-inertial odometry. IEEE Trans. Robot. 2022, 38, 2053–2073. [Google Scholar] [CrossRef]
  52. Ben Ali, A.J.; Kouroshli, M.; Semenova, S.; Hashemifar, Z.S.; Ko, S.Y.; Dantu, K. Edge-SLAM: Edge-assisted visual simultaneous localization and mapping. ACM Trans. Embed. Comput. Syst. 2022, 22, 1–31. [Google Scholar] [CrossRef]
  53. Hart, P.E.; Nilsson, N.J.; Raphael, B. A formal basis for the heuristic determination of minimum cost paths. IEEE Trans. Syst. Sci. Cybern. 1968, 4, 100–107. [Google Scholar] [CrossRef]
  54. Dijkstra, E.W. A note on two problems in connexion with graphs. In Edsger Wybe Dijkstra: His Life, Work, and Legacy; ACM Books: New York, NY, USA, 2022; pp. 287–290. [Google Scholar]
  55. Stentz, A. Optimal and efficient path planning for partially-known environments. In Proceedings of the 1994 IEEE International Conference on Robotics and Automation, San Diego, CA, USA, 8–13 May 1994; pp. 3310–3317. [Google Scholar]
  56. Koenig, S.; Likhachev, M. D* lite. In Proceedings of the Eighteenth National Conference on Artificial Intelligence, Edmonton, AL, Canada, 28 July–1 August 2002; pp. 476–483. Available online: https://dl.acm.org/doi/abs/10.5555/777092.777167 (accessed on 7 January 2026).
  57. Mnih, V.; Kavukcuoglu, K.; Silver, D.; Rusu, A.A.; Veness, J.; Bellemare, M.G.; Graves, A.; Riedmiller, M.; Fidjeland, A.K.; Ostrovski, G. Human-level control through deep reinforcement learning. Nature 2015, 518, 529–533. [Google Scholar] [CrossRef]
  58. Schulman, J.; Wolski, F.; Dhariwal, P.; Radford, A.; Klimov, O. Proximal policy optimization algorithms. arXiv 2017, arXiv:1707.06347. [Google Scholar] [CrossRef]
  59. Miao, Z.; Huang, W.; Zhang, Y.; Fan, Q. Multi-robot task allocation using multimodal multi-objective evolutionary algorithm based on deep reinforcement learning. J. Shanghai Jiaotong Univ. (Sci.) 2024, 29, 377–387. [Google Scholar] [CrossRef]
  60. Zhou, J.; Zheng, L.; Fan, W. Multirobot collaborative task dynamic scheduling based on multiagent reinforcement learning with heuristic graph convolution considering robot service performance. J. Manuf. Syst. 2024, 72, 122–141. [Google Scholar] [CrossRef]
  61. Visioli, A. Practical PID Control; Springer: Berlin/Heidelberg, Germany, 2006; Available online: https://link.springer.com/content/pdf/10.1007/1-84628-586-0_5.pdf (accessed on 7 January 2026).
  62. Slotine, J.-J.E.; Li, W. Applied Nonlinear Control; Prentice Hall: Englewood Cliffs, NJ, USA; Pearson: London, UK, 1991; Volume 199, Available online: https://www.academia.edu/6903196/Slotine_at_BULLET_Li_APPLIED_NONLINEAR_CONTROL (accessed on 7 January 2026).
  63. Passino, K.M.; Yurkovich, S.; Reinfrank, M. Fuzzy Control; Addison-Wesley: Reading, MA, USA, 1998; Volume 42, Available online: https://www.a-lab.ee/edu/system/files/eduard.petlenkov/courses/ISS0023/2016_Autumn/materials/FCbook.pdf (accessed on 7 January 2026).
  64. Li, S.; Xu, L.D.; Zhao, S. The internet of things: A survey. Inf. Syst. Front. 2015, 17, 243–259. [Google Scholar] [CrossRef]
  65. Khan, W.Z.; Ahmed, E.; Hakak, S.; Yaqoob, I.; Ahmed, A. Edge computing: A survey. Future Gener. Comput. Syst. 2019, 97, 219–235. [Google Scholar] [CrossRef]
  66. Shang, M.; Dong, G.; Mu, Y.; Wang, F.; Ruan, H. The Application of Internet of Things in Pig Breeding. In Proceedings of the International Conference on Computer and Computing Technologies in Agriculture, Beijing, China, 27–30 September 2015; pp. 548–556. [Google Scholar]
  67. Li, W.; Li, L. Pig Breeding Environment Information Collection System Based on Internet of Things. Agric. Eng. 2022, 12, 52–55. [Google Scholar] [CrossRef]
  68. Li, Y.; Fei, T. Design of pig intelligent feeding management system based on IoT. Mod. Electron. Tech. 2022, 45, 58–62. [Google Scholar] [CrossRef]
69. Gao, S.; Yang, X.; Chen, T.; Feng, Y. Research on intelligent pig inventory based on improved YOLOv5 and edge computing. Foreign Electron. Meas. Technol. 2023, 42, 169–177. [Google Scholar] [CrossRef]
  70. Sredojev, B.; Samardzija, D.; Posarac, D. WebRTC technology overview and signaling solution design and implementation. In Proceedings of the 2015 38th International Convention on Information and Communication Technology, Electronics and Microelectronics (MIPRO), Opatija, Croatia, 25–29 May 2015; pp. 1006–1009. [Google Scholar]
  71. Kawasue, K.; Wai, P.P.; Win, K.D.; Lee, G.; Iki, Y. Pig weight prediction system using RGB-D sensor and AR glasses: Analysis method with free camera capture direction. Artif. Life Robot. 2023, 28, 89–95. [Google Scholar] [CrossRef]
  72. Li, Y.; Fu, C.; Yang, H.; Li, H.; Zhang, R.; Zhang, Y.; Wang, Z. Design of a closed piggery environmental monitoring and control system based on a track inspection robot. Agriculture 2023, 13, 1501. [Google Scholar] [CrossRef]
  73. Sun, H.; Palaoag, T.D.; Quan, Q. Design of pig farm environment regulation and video monitoring system based on livestock internet of things orbital inspection robot. In Proceedings of the 2024 7th International Conference on Communication Engineering and Technology (ICCET), Tokyo, Japan, 22–24 February 2024; pp. 25–30. [Google Scholar]
  74. Attard, G. Robots in livestock management. In Encyclopedia of Smart Agriculture Technologies; Springer: Berlin/Heidelberg, Germany, 2023; pp. 1–12. [Google Scholar]
  75. Frost, A.; French, A.; Tillett, R.; Pridmore, T.; Welch, S. A vision guided robot for tracking a live, loosely constrained pig. Comput. Electron. Agric. 2004, 44, 93–106. [Google Scholar] [CrossRef]
  76. Sun, H.; Palaoag, T.D.; Quan, Q. Design of Pig Inventory and Abnormality Monitoring System Based on Livestock Internet of Things Orbital Inspection Robot. In Proceedings of the 2024 5th International Conference on Machine Learning and Human-Computer Interaction (MLHMI), Kawasaki, Japan, 14–16 March 2024; pp. 17–21. [Google Scholar]
  77. Yanchang, L.; Haisheng, Z.; Zexu, L.; Zhixia, Z.; Xiangang, Z.; Guohou, L. Design of intelligent monitoring system for pig healthy breeding based on robot. J. Chin. Agric. Mech. 2021, 42, 187. [Google Scholar] [CrossRef]
  78. Guo, H.; Lemay, S.; Barber, E.; Zyla, L. Performance of five commercial electronic humidity sensors in a swine building. Can. Biosyst. Eng. 2004, 46, 5. Available online: https://library.csbe-scgab.ca/docs/journal/46/c0322.pdf (accessed on 7 January 2026).
  79. Rowland, H. Performance of a Low-Cost Particulate Matter Sensor in a Swine Barn. Master’s Thesis, The University of Iowa, Iowa City, IA, USA, 2024. Available online: https://www.proquest.com/openview/0dfcab5c2eed2e7b91cb2a2bf255677c/ (accessed on 7 January 2026).
80. Impact of Finishing Facility Environment on Feed Efficiency; Report #PR-005316; 2024. Available online: https://www.porkcheckoff.org/wp-content/uploads/2025/10/Leonard_FinalReport.pdf (accessed on 25 January 2026).
  81. Ebertz, P.; Krommweh, M.S.; Büscher, W. Feasibility study: Improving floor cleanliness by using a robot scraper in group-housed pregnant sows and their reactions on the new device. Animals 2019, 9, 185. [Google Scholar] [CrossRef]
  82. Xiaoneng, L.; Zhenping, W.; Zhuowei, M.; Zhao, L.; Duankang, Z.; Shuisheng, T. Design and Research of Washing Robot in Piggery. Mod. Agric. Equip. 2021, 42, 55–59. [Google Scholar]
  83. Lei, L. Design and Path Planning of Pig House Cleaning Robot. Value Eng. 2022, 41, 137–139. [Google Scholar] [CrossRef]
  84. Tuyu, L.; Yanyu, G.; Kunle, Z. Design and experiment of control system of piggery excrement cleaning robot. Heilongjiang Anim. Sci. Vet. Med. 2022, 12, 30–34. [Google Scholar] [CrossRef]
  85. Qi, L.; Zhihua, X.; Dongmei, S.; Teng, L.; Yiyun, L. Field Operation Performance Test for New Type of Crawler Type Piggery Underfloor Manure Remover. Agric. Eng. 2023, 13, 97–102. [Google Scholar] [CrossRef]
  86. Yanyu, G.; Tuyu, L.; Feng, B.; Kunle, Z.; Rikai, Z. Design and Experiment of Intelligent Robot System for Excrement Cleaning. J. Agric. Mech. Res. 2024, 46, 103–107+112. [Google Scholar] [CrossRef]
87. Tuyu, L.; Yanyu, G.; Jiao, D.; Wei, G. Path Planning for a Pig House Manure Removal Robot Based on ROS. South Forum 2025, 56, 29–33. [Google Scholar] [CrossRef]
88. Chen, J.; Zhu, J.; Jia, N.; Yue, J.; Zhou, Y.; Li, B.; Wang, H. Design and Experimentation of a Variable-Flow Cleaning and Disinfectant Robot for Pig Housing. J. Agric. Mech. Res. 2026, 48, 90–99+106. Available online: https://link.cnki.net/urlid/23.1233.S.20250721.1426.004 (accessed on 7 January 2026).
  89. Moreno, F.-A.; Monroy, J.; Ruiz-Sarmiento, J.-R.; Galindo, C.; Gonzalez-Jimenez, J. Automatic waypoint generation to improve robot navigation through narrow spaces. Sensors 2019, 20, 240. [Google Scholar] [CrossRef]
  90. Asselmeier, M.; Ahuja, D.; Zaro, A.; Abuaish, A.; Zhao, Y.; Vela, P.A. Dynamic Gap: Safe Gap-based Navigation in Dynamic Environments. In Proceedings of the 2025 IEEE International Conference on Robotics and Automation (ICRA), Atlanta, GA, USA, 17–23 May 2025; pp. 12870–12876. [Google Scholar]
  91. Khatiri, S.; Barrientos, F.E.V.; Wulf, M.; Tonella, P.; Panichella, S. Bridging Research and Practice in Simulation-based Testing of Industrial Robot Navigation Systems. arXiv 2025, arXiv:2510.09396. [Google Scholar] [CrossRef]
  92. Bae, J.; Park, S.; Jeon, K.; Choi, J.Y. Autonomous system of TMR (total mixed ration) feed feeding robot for smart cattle farm. Int. J. Precis. Eng. Manuf. 2023, 24, 423–433. [Google Scholar] [CrossRef]
  93. Chen, Z.; Wang, H.; Zhou, M.; Zhu, J.; Chen, J.; Li, B. Design and Experiment of an Autonomous Navigation System for a Cattle Barn Feed-Pushing Robot Based on UWB Positioning. Agriculture 2024, 14, 694. [Google Scholar] [CrossRef]
  94. Yang, L.; Xiong, B.; Wang, H.; Chen, R.; Zhao, Y. Research Progress and Outlook of Livestock Feeding Robot. Smart Agric. 2022, 4, 86–98. [Google Scholar] [CrossRef]
  95. Funk, T.H.; Rohrer, G.A.; Brown-Brandl, T.M.; Keel, B.N. Online feeding behavior monitoring of individual group-housed grow-finish pigs using a low-frequency RFID electronic feeding system. Transl. Anim. Sci. 2024, 8, txae051. [Google Scholar] [CrossRef] [PubMed]
  96. Garrido-Izard, M.; Correa, E.C.; Requejo, J.M.; Villarroel, M.; Diezma, B. Cleansing data from an electronic feeding station to improve estimation of feed efficiency. Biosyst. Eng. 2022, 224, 361–369. [Google Scholar] [CrossRef]
  97. Luo, Y.; Xia, J.; Lu, H.; Luo, H.; Lv, E.; Zeng, Z.; Li, B.; Meng, F.; Yang, A. Automatic recognition and quantification feeding behaviors of nursery pigs using improved YOLOV5 and feeding functional area proposals. Animals 2024, 14, 569. [Google Scholar] [CrossRef] [PubMed]
  98. Bressers, H.; Te Brake, J.; Noordhuizen, J. Automated oestrus detection in group-housed sows by recording visits to the boar. Livest. Prod. Sci. 1995, 41, 183–191. [Google Scholar] [CrossRef]
  99. Muyuan. Available online: https://www.muyuanfoods.com (accessed on 12 September 2025).
  100. Lei, K.; Zong, C.; Du, X.; Teng, G.; Feng, F. Oestrus analysis of sows based on bionic boars and machine vision technology. Animals 2021, 11, 1485. [Google Scholar] [CrossRef] [PubMed]
  101. Westfleisch. Available online: https://www.westfleisch.de (accessed on 12 September 2025).
Figure 1. Practical application scenario diagram. (a) Nanmu Equipment & Technology Co., Ltd., Yunfu, China [13]; (b) Manure Cleaning Robot, Beijing Jiawo Tianhe Intelligent Technology Co., Ltd., Beijing, China [14]. (c) EVO Cleaner, Envirologic AB, Uppsala, Sweden [15]. (d) Temperature-measuring and manure-cleaning robot, Beijing Jiawo Tianhe Intelligent Technology Co., Ltd., Beijing, China [14].
Figure 2. The thematic distribution of the selected literature and cases.
Figure 3. Key technology architecture of smart pig farm robots.
Figure 4. Modern robot real-time operating system layered architecture design.
Figure 5. Internet of things and edge computing technology.
Figure 6. A Pig Weight Prediction System Using Augmented Reality Glasses [71].
Figure 7. A Cattle Barn Feed-Pushing Robot Based on UWB Positioning [93].
Figure 8. A rendering of the test platform [86], © 2021, licensed under CC BY 4.0.
Table 1. Core technologies of high-precision mechanical systems.
Technology | Common Technologies | Examples | Function
Precision mechanical structure design | Adoption of Rigid-Flex Hybrid Technology [21] | Rigid-Flexible Coupling Mechanisms [22]; Variable-Stiffness Soft Robot [23] | Thermal deformation and vibration impact reduction
Precision mechanical structure design | Structural optimization | Symmetric configuration [24]; Multi-arm coordinated motion [25] | Thermal deformation and vibration impact reduction
High-resolution sensing systems | Opto-pressure conversion design | GelSight technology [26] | Surface detail detection through light propagation and pressure variation
High-resolution sensing systems | Electrical impedance tomography [27] | Biomimetic visuotactile dexterous hand F-TAC Hand [28] | Synchronous recognition of multiple stimuli
High-resolution sensing systems | High-precision 3D sensing technology [29] | hToF technology [30] | Enhanced 3D sensing precision, dynamic range, and interference resistance
Table 3. Three main types of drive technologies adopted by intelligent breeding robots.
Technical Name | Principle | Function
Servo Drive System | Based on the closed-loop control principle utilizing high-precision position feedback | To achieve precise motion control
Pneumatic Drive System | Leveraging fluid power transmission characteristics integrated with force sensing | To achieve compliant operation
Hybrid Drive Technology | Integrating the precision of mechatronic systems with the structural advantages of new materials | To form a multifunctional operational platform
Table 4. Three common closed-loop control algorithms in intelligent breeding robots.
Algorithmic Control | Exemplar | Functionality
Proportional-Integral-Derivative (PID) Control | Engineering-oriented PID optimization methods [61] | Automatically adjusts the system output through a feedback mechanism to achieve the desired value, ensuring precise motion and position control of the mechanical structure.
Adaptive Control | Extension of adaptive control to robotic nonlinear systems [62] | Same as above.
Fuzzy Control | Application cases of fuzzy control [63] | Same as above.
Table 5. Functional and technological comparison of five robot categories in smart pig farms.
Robot Type | Core Functionality | Key Technology | Application Advantages | Application Limitations
Inspection Robots | Monitoring herd health and environmental parameters | Multimodal sensing, visual recognition, and intelligent algorithms | Real-time monitoring | Insufficient adaptability to complex environments
Cleaning Robots | Automated manure cleaning and disinfection | Path planning, high-pressure nozzles | Hygiene improvement | Limited cleaning capability in hard-to-reach areas
Feeding Robots | Precision feeding and feed management | Load cells, RFID identification, robotic arm control | Waste reduction | High initial cost
Automated Semen Delivery Robots | Automated reproduction assistance | Semen storage, precise positioning | Reproductive efficiency enhancement | High technical barriers
Bionic Robots | Simulating porcine behavior or structures | Bionic manipulation, bionic design, sound simulation, tactile feedback | Stress reduction | Limited functionality
Automated Pig Herding Robots | Carcass positioning, automated splitting, and segmentation | Automated control, precise cutting, quality inspection | Slaughter efficiency and product consistency improvement | Requires high-precision equipment and maintenance
Table 6. Representative intelligent inspection robots for pig farms.
Core Functions | Main Sensor Configuration | Key Technologies/Algorithms | References
Environmental monitoring, health inspection | Eight-in-one environmental sensors | Real-time synchronization of multi-source heterogeneous data | [72]
Environmental regulation, video surveillance | Environmental sensors, visible-light cameras | Seamless global perception | [73]
Environmental monitoring, health diagnosis and care | Environmental and health monitoring sensors | Modular architecture, standardized data interfaces | [74]
Visual tracking, site localization | Visual sensors | Four-point contour P2-site mapping, PID trajectory tracking | [75]
Pig counting, anomaly detection | Visible-light cameras | YOLOv5 object detection + Kalman filtering | [76]
Posture recognition, behavior monitoring | Visible-light cameras | YOLOv8n + TensorRT acceleration + TSM | [11]
Anomaly detection, environmental monitoring, integrated surveillance | Image and environmental sensors | FPGA controller + image processing + wireless communication | [77]
Table 7. Quantitative evidence of sensor performance degradation in harsh pig-house environments (high humidity, heavy dust, and frequent cleaning).
Scene/Environment | Metric | Key Data | Conclusion | References
Pig house exposure, high humidity and pollution | RH error | 4.8% at day 0 → 8.9% at day 365 | High humidity leads to increased RH sensor drift | [78]
Pig house, high dust, coarse particles | PM deviation | −62% to −87%; reference value is 5–10 times the sensor reading | Low-cost PM sensors severely underestimate PM concentrations | [79]
Commercial pig house, multi-cycle deployment | Tempco/NH3 lifetime | Replacement required after approximately 2 cycles; high-pressure water can cause failure | Environment and cleaning accelerate sensor lifespan decay | [80]
Table 8. Representative intelligent cleaning robots for pig farms.
Main Function | Applicable Scenario | Obstacle Avoidance and Navigation Mechanism | Cleaning Method | References
Manure removal | Slatted floors | Gyroscope + ultrasonic sensors | Rubber scraper + optional water spray | [81]
Manure removal | Pens + passages | Magnetic guidance + preset path planning + LiDAR obstacle avoidance | 5-DOF robotic arm + high-pressure water spray | [82]
Manure removal | Pens + passages | LiDAR | Flexible scraper head + electric push-dump mechanism | [83]
Manure removal | Slatted floors | LiDAR SLAM + ultrasonic obstacle avoidance | Rubber scraper + front/rear water spray nozzles | [84]
Manure removal | Under slatted floors | LiDAR SLAM + IoT and APP remote monitoring | Push-plate scraper | [85]
Manure removal | Slatted floors | LiDAR SLAM + ultrasonic sensors + infrared recharge alignment | Rubber scraper + front/rear water spray nozzles | [86]
Manure removal | - | LiDAR SLAM + ultrasonic/infrared sensors + A*/D* algorithm path planning | Scraping blade | [87]
Variable-rate washing and disinfection | Pens + passages | Magnetic strip navigation + RFID station recognition + laser ranging | Variable-rate spray boom + high-pressure water spray + disinfectant spray | [88]
Table 9. Quantitative statistics of navigation success and failure in narrow passages (applicable to cleaning robot evaluation).
Scene/Environment | Metric | Control/Baseline → Adverse Conditions | Conclusion | References
Narrow corridor/doorway | Passability rate | Giraff-X: 0.26 → 0.90; TIAGO: 0.16 → 0.56 | The assistive strategy significantly improves passability in narrow areas and reduces failure rates. | [89]
Tight corridor + dynamic obstacles | Success rate (%)/failure modes | Hospital-dynamic-gap: 88%; TEB: 80%; main failure mode: timeout | Failures in narrow corridors mainly manifest as getting stuck or timing out. | [90]
Narrow passage stress test | Success rate/safety-stop rate | L-corridor: success rate 23.1%; safety-stop rate 52.9% | Failures can be categorized into safety stops and timeouts, interpretable as engineering failure types. | [91]
Table 10. Quantitative indicators and key bottleneck conclusions for achieving correct-individual-to-correct-dose feeding under group-housing conditions.
Environment/Scale | Evidence Type | Closed-Loop Element | Key Quantitative Evidence | Key Bottleneck Conclusion | References
Fattening pig group housing, 2826 heads, 12 batches | Proxy; event identification/alerting | Alerting | Alert hit rate of 55.7%, leaving substantial room for missed detections and many potential false alarms. | Under group-housing conditions, event identification carries high risks of missed detections and false positives, making it difficult to support stable closed-loop decision-making and reliable feeding. | [95]
Commercial farm EFS, 2748 heads, 2,709,600 visits | Direct; event recording/data-chain measurement | Data chain | Sources of data anomalies: management 26.1%, maintenance 4.4%, weighing 10.6%; short-term weight curve fluctuation 7.3%, calibration error 3.3%, missing weight data 1.9%. | A high proportion of errors in the recording-measurement data chain can easily lead to over/underfeeding and dosage deviation. | [96]
Nursery pig group housing, 2404 heads, 24 h video | Proxy; individual/behavior recognition | Recognition | Confusion between FB/NFB is common; misjudgment and missed-detection rates not quantified. | Occlusion in group housing and behavioral similarity lead to a high risk of recognition misjudgment; critical failure-rate statistics are missing, making it difficult to reliably identify the correct individual. | [97]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
