Editorial

In-Field Detection and Monitoring Technology in Precision Agriculture

1 College of Mechanical and Electronic Engineering, Northwest A&F University, Xianyang 712100, China
2 College of Information & Electrical Engineering, China Agricultural University, Beijing 100083, China
3 Shenzhen Branch, Guangdong Laboratory of Lingnan Modern Agriculture, Genome Analysis Laboratory of the Ministry of Agriculture and Rural Affairs, Agricultural Genomics Institute at Shenzhen, Chinese Academy of Agricultural Sciences, Shenzhen 518120, China
* Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Agronomy 2025, 15(4), 783; https://doi.org/10.3390/agronomy15040783
Submission received: 12 March 2025 / Accepted: 21 March 2025 / Published: 23 March 2025
(This article belongs to the Special Issue In-Field Detection and Monitoring Technology in Precision Agriculture)

1. Introduction

Precision agriculture was proposed in the 1990s [1] to address global food security issues caused by population growth and climate change [2,3], and it remains a relatively new agricultural management concept in use today [4]. It is based on the efficient acquisition and processing of field data and on the quantitative monitoring and management of the entire crop production process, aiming to achieve higher productivity, maintain long-term economic viability, improve product quality, and significantly reduce negative effects on the environment [5,6]. Enhanced, intelligent in-field inspection and monitoring technologies play a crucial role in the rapid advancement of precision agriculture [7]. With the development of equipment technology [4] and intelligent algorithms [8], precision agriculture requires researchers to provide more efficient field information acquisition equipment and more accurate information analysis models. Furthermore, during crop growth, sensing technologies for monitoring agricultural environments, including soil properties and water quality parameters, are needed to obtain key environmental parameters at high spatial and temporal resolution and minimum cost, making crop management more refined, reducing the application of fertilizers and pesticides, and saving costs [9]. Image acquisition based on UAVs, mobile phones, and other mobile terminals greatly facilitates the high-throughput, rapid capture of crop field images. In addition, thanks to new-generation intelligent learning models, image processing is becoming more robust and adaptable to environmental and spatiotemporal variation [10,11,12]. These technologies offer more precise and timely data to support agricultural management decisions and will help further advance precision agriculture.

2. Overview of the Special Issue

This Special Issue of Agronomy contains 12 research articles pertaining to deep learning, computer vision, and sensor technologies applied in precision agriculture, which have significantly improved the accuracy and efficiency of in-field detection systems. These original research papers can be grouped into four categories: environmental and physiological monitoring; the inversion of crop vegetation parameters; pest, disease, and weed detection; and seed detection and quality assessment.

2.1. Environmental and Physiological Monitoring

Guo et al. examined the factors influencing the deposition coverage of air-assisted electrostatic spraying applied to tomato foliage. Using Box–Behnken response surface methodology, they identified spray distance as the primary factor determining the effectiveness of droplet coverage. The optimal parameters for air-assisted electrostatic spraying were a negative charging voltage, a spray distance of 2.75 m, and a descending spray pattern. This study provides valuable insights for improving the efficiency of pesticide application in greenhouse environments.
Pan et al. discuss the design, construction, and refinement of a flow-through chamber system for measuring net ecosystem carbon exchange (NEE) in maize plants. The study addresses the issue of turbulence in the chamber by modifying the chamber design to include deflectors, which reduce turbulence and improve measurement accuracy. The improved chamber took less time to attain an equilibrium state, enabled higher-frequency measurements, and minimized sensitivity to varying environmental conditions. This study validates the instrument’s precision using the alkali solution absorption method and demonstrates the system’s ability to capture NEE fluxes with high accuracy, as well as highlighting the importance of reducing turbulence in chamber measurements for accurate carbon cycle assessments.
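The measurement principle behind such systems can be illustrated with a short calculation: in a generic flow-through chamber, NEE follows from the airflow rate and the CO2 concentration difference between inlet and outlet. This is a minimal sketch with hypothetical numbers, not the authors' instrument or data.

```python
# Illustrative only: a generic flow-through chamber flux estimate.

def chamber_nee(flow_lpm, c_in_ppm, c_out_ppm, area_m2,
                temp_k=298.15, pressure_pa=101325.0):
    """NEE in umol CO2 m^-2 s^-1 for a flow-through chamber."""
    R = 8.314  # universal gas constant, J mol^-1 K^-1
    # Convert airflow (L/min) to moles of air per second via the ideal gas law
    air_mol_per_s = (flow_lpm / 60.0 / 1000.0) * pressure_pa / (R * temp_k)
    # ppm = umol CO2 per mol of air, so the product is already in umol s^-1
    return air_mol_per_s * (c_out_ppm - c_in_ppm) / area_m2

# Hypothetical values: 60 L/min airflow, 10 ppm enrichment, 1 m^2 ground area
nee = chamber_nee(flow_lpm=60.0, c_in_ppm=400.0, c_out_ppm=410.0, area_m2=1.0)
```

A negative concentration difference (outlet below inlet) yields a negative flux, i.e., net CO2 uptake by the canopy.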

2.2. Inversion of Crop Vegetation Parameters

Lai et al. introduced an approach for determining the crown planar area and number of papaya plants from high-resolution UAV images. The method involves calculating vegetation indices, applying a low-pass filter, and using Otsu's method for image segmentation, and it employs the proposed Mean–Standard Deviation Threshold (MSDT) method to count the papaya plants. It achieved an average accuracy of 93.71% for crown planar area extraction and 95.54% for plant counting. With its excellent accuracy and minimal hardware demands, the method offers a practical solution for monitoring and managing papaya orchards, provides a basis for assessing growth status and estimating yield, and can also be applied to other fruit trees with similar crown morphology.
Zhang et al. describe a method for detecting wheat lodging direction in combine harvesters using an improved K-means algorithm and a bag of visual words (BOVW). The improved K-means algorithm uses a cluster validity evaluation function, maximum–minimum distances for selecting initial clustering centers, and a multichannel, multidimensional feature vector. The authors constructed a dataset using adaptive image grid division and developed a BOVW based on the improved K-means algorithm. The proposed method achieved an average accuracy of 95.75% in wheat lodging direction detection, outperforming traditional methods. This research shows the method's potential for the accurate and efficient identification of wheat lodging orientation during combine harvesting operations.
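The maximum–minimum distance rule for choosing initial cluster centers can be sketched as follows; this is a generic illustration, not the authors' full algorithm, which additionally uses a cluster validity evaluation function and task-specific features.

```python
import numpy as np

def max_min_init(X, k, seed=0):
    """Choose k initial K-means centers by the maximum-minimum distance rule:
    after the first (random) center, each new center is the point whose
    distance to its nearest already-chosen center is largest."""
    rng = np.random.default_rng(seed)
    centers = [X[rng.integers(len(X))]]
    for _ in range(k - 1):
        # For every point, distance to its nearest chosen center
        dists = np.min(
            np.linalg.norm(X[:, None, :] - np.asarray(centers)[None, :, :], axis=2),
            axis=1,
        )
        centers.append(X[np.argmax(dists)])
    return np.asarray(centers)

# Three well-separated 2-D blobs: each chosen center lands in a distinct blob
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(c, 0.1, size=(50, 2)) for c in ([0, 0], [5, 0], [0, 5])])
init = max_min_init(X, k=3)
```

Compared with purely random initialization, this spreads the starting centers apart and makes the subsequent K-means iterations less sensitive to the random seed.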

2.3. Pest, Disease, and Weed Detection

Wang et al. introduced a clustering and segmentation method for adhesive (mutually adhering) pests in apple orchards, employing a Gaussian Mixture Model enhanced with Density and Curvature weighting (GMM-DC). The method involves image preprocessing, adhesion region analysis, and point cloud reconstruction to segment the adhesive pests. The GMM-DC model achieved an average accurate segmentation rate of 95.75%, significantly outperforming traditional methods. The authors also integrated the segmentation method with an improved Mask R-CNN model, achieving a mean Average Precision (mAP) of 96.75% in pest recognition. This study demonstrates the proposed method's effectiveness in improving pest identification accuracy in orchards.
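The core splitting idea can be illustrated with a plain Gaussian Mixture Model over the foreground coordinates of an adhesion region; the density and curvature weighting that distinguishes GMM-DC is omitted here, and scikit-learn's `GaussianMixture` with simulated points is used purely as a sketch.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Two simulated overlapping pest blobs standing in for the foreground
# pixel coordinates of one adhesion region
rng = np.random.default_rng(0)
pest_a = rng.normal([10.0, 10.0], 2.0, size=(300, 2))
pest_b = rng.normal([20.0, 10.0], 2.0, size=(300, 2))
points = np.vstack([pest_a, pest_b])

# Fit a two-component mixture and assign each foreground point to one pest
gmm = GaussianMixture(n_components=2, random_state=0).fit(points)
labels = gmm.predict(points)
```

In practice, the number of components (pests per adhesion region) must itself be estimated, which is one place where the density and curvature cues of GMM-DC come in.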
Liu et al. present a study on the development of a GA-Mask R-CNN model for identifying stem-boring and leaf-rolling pests in rice crops. The model uses a generative adversarial network (GAN) to enhance the ability of the classification network to detect and recognize pest-specific features, improving its recognition accuracy. This study includes data collection, image preprocessing, and the construction of a multi-source dataset. The GA-Mask R-CNN model achieved an average precision (AP) of 92.71%, recall (R) of 89.28%, and a balanced F1 score of 90.96%, showing significant improvement over traditional models. This research demonstrates the model’s effectiveness in the remote intelligent monitoring of rice pests.
Wang et al. present an improved YOLOv7-tiny model for detecting small tomato pests in yellow sticky traps. The model incorporates a context information extraction (CIE) block based on a Transformer encoder, a Tiny-ELAN fusion network, and a P2 detection head. The proposed method enhances the detection accuracy for small pests by capturing the global context and improving feature fusion and achieves a mean average precision (mAP) of 90.4%, outperforming other advanced object detection models. This study provides a robust solution for pest detection in greenhouse environments.
Yu et al. introduce the YOLOv5s-ECCW model, a lightweight and efficient detection model for identifying sugarcane smut in natural environments. The model incorporates EfficientNetV2, CBAM, and the C3STR module to enhance feature extraction and reduce computational complexity. The YOLOv5s-ECCW model achieves a mean average precision (mAP) of 97.8% with only 4.9 G FLOPs and 3.25 M parameters. It outperforms existing models in terms of accuracy, efficiency, and model size, providing an effective solution for the real-time identification of sugarcane smut, which supports the timely control of sugarcane pests and diseases and thus reduces crop losses.
Chen et al. introduce IPMCNet, a lightweight deep learning model for the multiclassification of invasive plants. The model uses depth-wise separable convolutional kernels, channel splitting, and the elimination of fully connected layers to reduce the number of parameters. The study explores the impact of different loss functions and attention modules on model accuracy. IPMCNet achieved a Top-1 accuracy of 94.52% with the second-lowest number of parameters among all models tested. The model demonstrates strong generalization ability and high accuracy in identifying invasive plants, even with limited data. This research proposes a method for in-field detection using smartphones or drones, providing a cost-effective and efficient solution for invasive plant identification.
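The parameter savings from depth-wise separable convolutions, one of the lightweight ingredients mentioned above, can be verified with simple arithmetic; the layer sizes below are generic, not IPMCNet's actual configuration.

```python
def conv_params(cin, cout, k):
    """Parameter count of a standard k x k convolution (bias omitted)."""
    return cin * cout * k * k

def dw_separable_params(cin, cout, k):
    """Depth-wise k x k convolution (one filter per input channel)
    followed by a 1 x 1 point-wise convolution."""
    return cin * k * k + cin * cout

# Generic layer sizes: 64 input channels, 128 output channels, 3 x 3 kernel
cin, cout, k = 64, 128, 3
standard = conv_params(cin, cout, k)           # 73,728 parameters
separable = dw_separable_params(cin, cout, k)  # 8,768 parameters
print(f"parameter reduction: {standard / separable:.1f}x")  # prints 8.4x
```

The reduction factor approaches k*k for large channel counts, which is why such factorizations dominate lightweight architectures intended for smartphones and drones.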

2.4. Seed Detection and Quality Assessment

Zhang et al. propose a multiscale feature fusion deep residual network (MSFF-ResNet) for the classification of needle-shaped Bidens L. seeds. The model combines the advantages of deep residual networks with feature extraction at multiple scales and an attention-based mechanism. The MSFF-ResNet model achieves an average accuracy of 93.81% and an F1-score of 94.44% on the test set and significantly improves the recognition accuracy of Bidens L. seeds, providing an innovative approach for accurately classifying seeds with needle-like shapes in agriculture.
Gao et al. introduced an efficient approach for constructing a large-scale wheat grain instance segmentation dataset from high-definition imagery. The method integrates the Grounding DINO and Segment Anything models into a labeling workflow, enabling the rapid annotation of pixel-wise masks. It crops overlapping sub-images in preprocessing and merges them in postprocessing, aligning ultra-high-resolution images with the model's input requirements and improving prediction accuracy. The method reduces manual annotation time by approximately 74.53% while achieving annotation quality comparable to that of the traditional method, providing technical support for efficient dataset construction in customs quarantine and weed and insect screening.
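The crop-and-merge preprocessing can be sketched as tiling the ultra-high-resolution image into fixed-size overlapping crops; the tile size and overlap below are assumed values for illustration, not those of the paper.

```python
def tile_origins(size, tile, overlap):
    """1-D crop start positions; the final crop is shifted back so it
    ends exactly at the image border."""
    if size <= tile:
        return [0]
    step = tile - overlap
    origins = list(range(0, size - tile + 1, step))
    if origins[-1] != size - tile:
        origins.append(size - tile)  # cover the remainder at the border
    return origins

def crop_boxes(width, height, tile=1024, overlap=128):
    """Overlapping (x, y, w, h) crops covering the full image, so that
    per-crop predictions can later be merged back into one mask."""
    return [(x, y, tile, tile)
            for y in tile_origins(height, tile, overlap)
            for x in tile_origins(width, tile, overlap)]

boxes = crop_boxes(5000, 3000)   # hypothetical ultra-high-resolution image
```

The overlap ensures that grains lying on a crop boundary appear whole in at least one crop, which is what makes the postprocessing merge feasible.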
Liang et al. develop an intelligent detection system that evaluates the visual quality of wheat grains using advanced neural network technology. The system integrates high-performance hardware and software components, including a fine-grained classification model based on multi-grained convolutional neural networks. It achieves a recognition accuracy of 99.45% for wheat grain categories and a fivefold increase in detection efficiency compared to manual recognition. The system includes a user-friendly interface for sorting and weighing imperfect grains and automatically assesses the quality of the target grains, thereby reducing human error, increasing the speed and accuracy of food inspection, and improving agricultural productivity and food safety standards.

3. Conclusions

Precision agriculture relies heavily on advanced detection and monitoring technologies to optimize crop management, reduce resource waste, and enhance yield quality. Recently, significant progress has been made in in-field detection and monitoring technologies for precision agriculture. These technologies, ranging from deep learning models for pest and disease detection to advanced sensors and imaging techniques for crop quality and environmental monitoring, offer practical solutions for enhancing agricultural productivity and sustainability.
However, precision agriculture still faces multiple challenges: although lightweight models perform well in field recognition, their generalization ability in small-sample scenarios needs to be strengthened; the deployment cost and durability of data acquisition equipment hinder the widespread adoption of smart technologies; and the compatibility and data-sharing mechanisms of different technical modules need to be improved. In the future, emphasis should be placed on integrating field detection technologies into precision agriculture systems and on automating them.

Author Contributions

S.Z., C.W., and X.Q. conceived this Special Issue and processed the submissions. All authors have read and agreed to the published version of the manuscript.

Funding

This article was funded by the National Key Research and Development Program of China (2022YFC2601500 and 2022YFC2601504), the National Natural Science Foundation of China (32272633), the Shenzhen Science and Technology Program (KCXFZ20240903093859009), and the Agricultural Science and Technology Innovation Program.

Data Availability Statement

The raw datasets were made available by the authors of each of the twelve published articles (List of Contributions 1–12) in the Special Issue “In-Field Detection and Monitoring Technology in Precision Agriculture” (https://www.mdpi.com/journal/agronomy/special_issues/JA390X0REI#published; revised 16 December 2024).

Acknowledgments

S.Z., C.W., and X.Q., in their role as Guest Editors, thank all of the authors, reviewers, and topic editors for their interest and dedication. Without them, this Special Issue would not have been possible. We also thank Agronomy's editorial team, Editor-in-Chief, and section assistant managing editor for encouraging and enabling S.Z., C.W., and X.Q. to host this Special Issue.

Conflicts of Interest

The authors declare no conflicts of interest.

List of Contributions

  • Guo, J.; Dong, X.; Qiu, B. Analysis of the Factors Affecting the Deposition Coverage of Air-Assisted Electrostatic Spray on Tomato Leaves. Agronomy 2024, 14, 1108.
  • Pan, C.; Hu, J.; Cai, H.; Jiang, J.; Gu, K.; Zhu, C.; Mao, G. Development and Optimization of a Chamber System Applied to Maize Net Ecosystem Carbon Exchange Measurements. Agronomy 2023, 14, 68.
  • Lai, S.; Ming, H.; Huang, Q.; Qin, Z.; Duan, L.; Cheng, F.; Han, G. Remote Sensing Extraction of Crown Planar Area and Plant Number of Papayas Using UAV Images with Very High Spatial Resolution. Agronomy 2024, 14, 636.
  • Zhang, Q.; Chen, Q.; Xu, L.; Xu, X.; Liang, Z. Wheat Lodging Direction Detection for Combine Harvesters Based on Improved K-Means and Bag of Visual Words. Agronomy 2023, 13, 2227.
  • Wang, Y.; Liu, S.; Ren, Z.; Ma, B.; Mu, J.; Sun, L.; Zhang, H.; Wang, J. Clustering and Segmentation of Adhesive Pests in Apple Orchards Based on GMM-DC. Agronomy 2023, 13, 2806.
  • Liu, S.; Fu, S.; Hu, A.; Ma, P.; Hu, X.; Tian, X.; Zhang, H.; Liu, S. Research on Insect Pest Identification in Rice Canopy Based on GA-Mask R-CNN. Agronomy 2023, 13, 2155.
  • Wang, S.; Chen, D.; Xiang, J.; Zhang, C. A Deep-Learning-Based Detection Method for Small Target Tomato Pests in Insect Traps. Agronomy 2024, 14, 2887.
  • Yu, M.; Li, F.; Song, X.; Zhou, X.; Zhang, X.; Wang, Z.; Lei, J.; Huang, Q.; Zhu, G.; Huang, W.; Huang, H.; Chen, X.; Yang, Y.; Huang, D.; Li, Q.; Fang, H.; Yan, M. YOLOv5s-ECCW: A Lightweight Detection Model for Sugarcane Smut in Natural Environments. Agronomy 2024, 14, 2327.
  • Chen, Y.; Qiao, X.; Qin, F.; Huang, H.; Liu, B.; Li, Z.; Liu, C.; Wang, Q.; Wan, F.; Qian, W.; Huang, Y. IPMCNet: A Lightweight Algorithm for Invasive Plant Multiclassification. Agronomy 2024, 14, 333.
  • Zhang, Z.; Huang, Y.; Chen, Y.; Liu, Z.; Liu, B.; Liu, C.; Huang, C.; Qian, W.; Zhang, S.; Qiao, X. A Recognition Model Based on Multiscale Feature Fusion for Needle-Shaped Bidens L. Seeds. Agronomy 2024, 14, 2675.
  • Gao, Q.; Li, H.; Meng, T.; Xu, X.; Sun, T.; Yin, L.; Chai, X. A Rapid Construction Method for High-Throughput Wheat Grain Instance Segmentation Dataset Using High-Resolution Images. Agronomy 2024, 14, 1032.
  • Liang, J.; Chen, J.; Zhou, M.; Li, H.; Xu, Y.; Xu, F.; Yin, L.; Chai, X. An Intelligent Detection System for Wheat Appearance Quality. Agronomy 2024, 14, 1057.

References

  1. Lowenberg-DeBoer, J. Precision Farming and the New Information Technology: Implications for Farm Management, Policy, and Research: Discussion. Am. J. Agric. Econ. 1996, 78, 1281–1284.
  2. Swart, R. Security Risks of Global Environmental Changes. Glob. Environ. Change 1996, 6, 187–192.
  3. McLennon, E.; Dari, B.; Jha, G.; Sihi, D.; Kankarla, V. Regenerative Agriculture and Integrative Permaculture for Sustainable and Technology Driven Global Food Production and Security. Agron. J. 2021, 113, 4541–4559.
  4. Lohan, S.K.; Prakash, C.; Lohan, N.; Kansal, S.; Karkee, M. State-of-the-Art in Real-Time Virtual Interfaces for Tractors and Farm Machines: A Systematic Review. Comput. Electron. Agric. 2025, 231, 109947.
  5. Zhang, N.; Wang, M.; Wang, N. Precision Agriculture—A Worldwide Overview. Comput. Electron. Agric. 2002, 36, 113–132.
  6. Petrović, B.; Bumbálek, R.; Zoubek, T.; Kuneš, R.; Smutný, L.; Bartoš, P. Application of Precision Agriculture Technologies in Central Europe-Review. J. Agric. Food Res. 2024, 15, 101048.
  7. Dusadeerungsikul, P.O.; Nof, S.Y. Precision Agriculture with AI-Based Responsive Monitoring Algorithm. Int. J. Prod. Econ. 2024, 271, 109204.
  8. Darwin, B.; Dharmaraj, P.; Prince, S.; Popescu, D.E.; Hemanth, D.J. Recognition of Bloom/Yield in Crop Images Using Deep Learning Models for Smart Agriculture: A Review. Agronomy 2021, 11, 646.
  9. Munnaf, M.A.; Haesaert, G.; Van Meirvenne, M.; Mouazen, A.M. Site-Specific Seeding Using Multi-Sensor and Data Fusion Techniques: A Review. In Advances in Agronomy; Elsevier: Amsterdam, The Netherlands, 2020; Volume 161, pp. 241–323.
  10. Tulu, B.B.; Teshome, F.; Ampatzidis, Y.; Hailegnaw, N.S.; Bayabil, H.K. AgriSenAI: Automating UAV Thermal and Multispectral Image Processing for Precision Agriculture. SoftwareX 2025, 30, 102083.
  11. Liu, T.; Zhao, Y.; Sun, Y.; Wang, J.; Yao, Z.; Chen, C.; Zhong, X.; Liu, S.; Sun, C.; Li, T. High-Throughput Identification of Fusarium Head Blight Resistance in Wheat Varieties Using Field Robot-Assisted Imaging and Deep Learning Techniques. J. Clean. Prod. 2024, 480, 144024.
  12. Figorilli, S.; Moscovini, L.; Vasta, S.; Tocci, F.; Violino, S.; Abraham, D.; Pascal, S.; Benjamin, K.; Sandoval, R.; Spencer, R.; et al. Smart IoT Device for in Field Black Sigatoka Disease Recognition and Mapping. Smart Agric. Technol. 2025, 10, 100762.
