Key Technology Research and Applications of Agricultural Inspection Robots Based on Machine Vision and Artificial Intelligence

A special issue of Agriculture (ISSN 2077-0472). This special issue belongs to the section "Artificial Intelligence and Digital Agriculture".

Deadline for manuscript submissions: 25 May 2026 | Viewed by 1488

Special Issue Editors

College of Smart Agriculture (College of Artificial Intelligence), Nanjing Agricultural University, Nanjing 210031, China
Interests: microclimate analytics of poultry houses; intelligent agricultural equipment; smart farming; non-destructive detection of meat quality; agricultural robots

Guest Editor
College of Smart Agriculture (College of Artificial Intelligence), Nanjing Agricultural University, Nanjing 210031, China
Interests: intelligent agricultural equipment; three-dimensional reconstruction
1. School of Ecology and Applied Meteorology, Nanjing University of Information Science and Technology, Nanjing 210044, China
2. Jiangsu Key Laboratory of Agricultural Meteorology, Nanjing University of Information Science and Technology, Nanjing 210044, China
Interests: lake-atmosphere exchange

Special Issue Information

Dear Colleagues,

In the rapidly evolving realm of modern agriculture, machine vision (MV) and artificial intelligence (AI) have emerged as pivotal technologies, revolutionizing multiple agricultural sectors. For planting, agricultural inspection robots, equipped with high-resolution cameras and sophisticated environmental sensors, leverage MV to extract plant phenotypes, automate plant and organ detection, and gather real-time environmental data. Subsequently, AI algorithms analyze these data to build models, including convolutional neural networks for identifying diseases and pests and time series models for predicting plant growth. In livestock and poultry farming, MV and AI enable the continuous monitoring of animal behaviors, health status, and growth, thus facilitating early disease detection and optimized feeding strategies. In agricultural research laboratories, MV-based visual inspection systems, coupled with AI-driven gas detection sensors, ensure safety by detecting hazards, monitoring equipment operations, and alerting researchers to abnormal gas concentrations. The integration of MV and AI in agriculture promotes intelligent automation, reduces costs, enhances productivity, and reveals significant potential for the development of smart farming.

We are delighted to announce a new Special Issue focused on the cutting-edge research and applications of MV and AI in agriculture. Our research scope covers diverse topics, such as optimizing the visual systems of agricultural inspection robots for precise plant phenotyping and early pest detection in the planting industry; utilizing robotic MV and AI for livestock behavior analysis and health management; and deploying inspection robots integrated with MV and AI gas sensors to improve safety in agricultural research laboratories. We sincerely invite original research papers and comprehensive reviews to further explore and innovate in this vibrant field.

Dr. Xiuguo Zou
Dr. Yan Qian
Dr. Yuhua Li
Dr. Yong Wang
Dr. Wentian Zhang
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 250 words) can be sent to the Editorial Office for assessment.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Agriculture is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • machine vision
  • artificial intelligence
  • agricultural inspection robots
  • agricultural sensors
  • agricultural big data analytics
  • behavior recognition of livestock and poultry
  • plant phenotyping and pest detection
  • agricultural laboratory safety and gas sensing

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • Reprint: MDPI Books provides the opportunity to republish successful Special Issues in book format, both online and in print.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (2 papers)


Research

19 pages, 7032 KB  
Article
Prediction Model for the Oscillation Trajectory of Trellised Tomatoes Based on ARIMA-EEMD-LSTM
by Yun Wu, Yongnian Zhang, Peilong Zhao, Xiaolei Zhang, Xiaochan Wang, Maohua Xiao and Yinlong Zhu
Agriculture 2025, 15(23), 2418; https://doi.org/10.3390/agriculture15232418 - 24 Nov 2025
Viewed by 159
Abstract
Second-order damping oscillation models cannot precisely predict superimposed and multi-fruit collision-induced oscillations. To address this problem, an ARIMA-EEMD-LSTM hybrid model for predicting the oscillation trajectories of trellised tomatoes was proposed in this study. First, the oscillation trajectories of trellised tomatoes under different picking forces were captured with the Nokov motion capture system, and the collected trajectory datasets were divided into training and test subsets. Afterwards, the ensemble empirical mode decomposition (EEMD) method was employed to decompose the oscillation signals into multiple intrinsic mode function (IMF) components, which were predicted by different models: high-frequency components by a long short-term memory (LSTM) model and low-frequency components by an autoregressive integrated moving average (ARIMA) model. The final oscillation trajectory prediction model for trellised tomatoes was constructed by integrating these component predictions. Finally, the model was experimentally validated and applied to an analysis of single-fruit and multi-fruit oscillations (including collision oscillations and superposition oscillations). Under single-fruit oscillation conditions, the prediction accuracy reached an RMSE of 0.1008–0.2429 mm, an MAE of 0.0751–0.1840 mm, and a MAPE of 0.01–0.06%; under multi-fruit oscillation conditions, an RMSE of 0.1521–0.6740 mm, an MAE of 0.1084–0.5323 mm, and a MAPE of 0.01–0.27%. These results serve as a reference for the dynamic harvesting prediction of tomato-picking robots and contribute to the improvement of harvesting efficiency and success rates.
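The hybrid scheme the abstract describes (decompose the signal into IMF components, forecast high- and low-frequency components with different models, then sum the component forecasts) can be sketched as follows. This is an illustrative outline only, not the authors' implementation: the EEMD decomposition and the LSTM/ARIMA predictors are replaced by simple stand-in functions to show the data flow.

```python
import numpy as np

def predict_high_freq(component):
    # placeholder for the LSTM forecast of a high-frequency IMF
    return float(component[-1])

def predict_low_freq(component):
    # placeholder for the ARIMA forecast of a low-frequency IMF or residue
    return float(np.mean(component[-5:]))

def hybrid_forecast(imfs, n_high):
    # imfs: list of IMF arrays ordered from highest to lowest frequency;
    # the first n_high components are routed to the high-frequency model
    preds = [predict_high_freq(c) if i < n_high else predict_low_freq(c)
             for i, c in enumerate(imfs)]
    # the trajectory forecast is the sum of the per-component forecasts
    return sum(preds)

# toy components standing in for an EEMD decomposition of one trajectory axis
t = np.linspace(0.0, 1.0, 50)
imfs = [np.sin(40 * t), np.sin(5 * t), 0.1 * t]  # high, mid, trend
print(hybrid_forecast(imfs, n_high=2))
```

The additive recombination works because EEMD is a complete decomposition: the IMFs and residue sum back to the original signal, so summing per-component forecasts yields a forecast of the full trajectory.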

29 pages, 5277 KB  
Article
DualHet-YOLO: A Dual-Backbone Heterogeneous YOLO Network for Inspection Robots to Recognize Yellow-Feathered Chicken Behavior in Floor-Raised House
by Yaobo Zhang, Linwei Chen, Hongfei Chen, Tao Liu, Jinlin Liu, Qiuhong Zhang, Mingduo Yan, Kaiyue Zhao, Shixiu Zhang and Xiuguo Zou
Agriculture 2025, 15(14), 1504; https://doi.org/10.3390/agriculture15141504 - 12 Jul 2025
Cited by 1 | Viewed by 746
Abstract
The behavior of floor-raised chickens is closely linked to their health status and environmental comfort. As a type of broiler with distinctive behaviors, yellow-feathered chickens require accurate monitoring of their daily actions for health assessment and improved breeding practices. To address the high computational complexity and insufficient detection accuracy of existing floor-raised chicken behavior recognition models, a lightweight behavior recognition model based on a Dual-Backbone Heterogeneous YOLO Network was proposed for floor-raised yellow-feathered chickens. First, DualHet-YOLO enhances feature extraction from floor-raised chicken images through a dual-path feature map extraction architecture and optimizes the localization and classification of multi-scale targets with a TriAxis Unified Detection Head. Second, a Proportional Scale IoU loss function is introduced to improve regression accuracy. Finally, a lightweight Eff-HetKConv structure was designed, significantly reducing model parameters and computational complexity. Experiments on a private floor-raised chicken behavior dataset show that, compared with the baseline YOLOv11 model, DualHet-YOLO increases the mAP for recognizing five behaviors (pecking, resting, walking, dead, and inactive) from 77.5% to 84.1%, while reducing model parameters by 14.6% and computational complexity by 29.2%, achieving a synergistic optimization of accuracy and efficiency. This approach provides an effective solution for lightweight object detection in poultry behavior recognition.
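The dual-path idea in the abstract (two heterogeneous backbones extract complementary feature maps from the same image, which are then fused before the detection head) can be illustrated with a minimal sketch. The two "backbones" below are hypothetical stand-ins (average and max pooling), not the paper's actual network; the point is only the two-path extraction and fusion pattern.

```python
import numpy as np

def backbone_a(img):
    # stand-in for one backbone path: 2x2 average pooling
    h, w = img.shape[0] // 2, img.shape[1] // 2
    return img[:2 * h, :2 * w].reshape(h, 2, w, 2).mean(axis=(1, 3))

def backbone_b(img):
    # stand-in for the heterogeneous second path: 2x2 max pooling
    h, w = img.shape[0] // 2, img.shape[1] // 2
    return img[:2 * h, :2 * w].reshape(h, 2, w, 2).max(axis=(1, 3))

def dual_path_features(img):
    # fuse the two paths along a channel axis, as a dual-backbone design
    # would concatenate feature maps before the detection head
    return np.stack([backbone_a(img), backbone_b(img)], axis=0)

img = np.arange(16.0).reshape(4, 4)  # toy single-channel "image"
feats = dual_path_features(img)
print(feats.shape)  # (2, 2, 2): two fused 2x2 feature maps
```

Running two cheap, complementary extractors and fusing their outputs is what lets such a design trade a single heavy backbone for richer features at lower parameter cost.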
