Topic Editors

Prof. Dr. Qingting Liu
College of Engineering, South China Agricultural University, Guangzhou 510642, China
Prof. Dr. Tao Wu
College of Engineering, South China Agricultural University, Guangzhou 510642, China
Dr. Zhigang Zhang
College of Engineering, South China Agricultural University, Guangzhou 510642, China

Digital Agriculture, Smart Farming and Crop Monitoring

Abstract submission deadline
28 February 2026
Manuscript submission deadline
30 April 2026

Topic Information

Dear Colleagues,

We are pleased to announce a Topic focusing on the rapidly evolving fields of Digital Agriculture, Smart Farming, and Crop Monitoring. This Topic aims to explore the latest advancements, challenges, and opportunities in leveraging digital technologies to transform agricultural practices, enhance productivity, and ensure sustainable farming systems.

Scope of the Topic:

This Topic invites original research articles, reviews, and case studies addressing themes including, but not limited to, the following:

  • Crop Monitoring and Management:
    1. Remote sensing and satellite imaging for crop health assessment;
    2. Early detection of pests, diseases, and abiotic stresses in crops;
    3. Real-time crop monitoring and yield prediction.
  • Smart Farming for Crop Production:
    1. Precision agriculture technologies for crop optimization;
    2. Smart irrigation and nutrient management systems;
    3. Decision support systems for crop management.
  • Digital Innovations in Crop Science:
    1. Big data analytics for crop modeling and prediction;
    2. Crop micro-phenotyping, from innovative imaging to computational analysis;
    3. IoT-based solutions for crop monitoring and management.

Prof. Dr. Qingting Liu
Prof. Dr. Tao Wu
Dr. Zhigang Zhang
Topic Editors

Keywords

  • precision farming
  • IoT in agriculture
  • AI in farming
  • drone technology
  • crop health monitoring

Participating Journals

Journal Name      Impact Factor  CiteScore  Launched Year  First Decision (median)  APC
Agriculture       3.3            4.9        2011           19.2 days                CHF 2600
AgriEngineering   3.0            4.7        2019           21.8 days                CHF 1600
Agronomy          3.3            6.2        2011           17.6 days                CHF 2600
Applied Sciences  2.5            5.3        2011           18.4 days                CHF 2400
Automation        -              2.9        2020           24.1 days                CHF 1000
Crops             -              -          2021           22.1 days                CHF 1000
Robotics          2.9            6.7        2012           21 days                  CHF 1800
Sensors           3.4            7.3        2001           18.6 days                CHF 2600

Preprints.org is a multidisciplinary platform offering a preprint service designed to facilitate the early sharing of your research. It supports and empowers your research journey from the very beginning.

MDPI Topics is collaborating with Preprints.org and has established a direct connection between MDPI journals and the platform. Authors are encouraged to take advantage of this opportunity by posting their preprints at Preprints.org prior to publication:

  1. Share your research immediately: disseminate your ideas prior to publication and establish priority for your work.
  2. Safeguard your intellectual contribution: protect your ideas with a time-stamped preprint that serves as proof of your research timeline.
  3. Boost visibility and impact: increase the reach and influence of your research by making it accessible to a global audience.
  4. Gain early feedback: receive valuable input and insights from peers before submitting to a journal.
  5. Ensure broad indexing: preprints are indexed in Web of Science (Preprint Citation Index), Google Scholar, Crossref, SHARE, PrePubMed, Scilit and Europe PMC.

Published Papers (3 papers)

20 pages, 29897 KiB  
Article
Accurate Parcel Extraction Combined with Multi-Resolution Remote Sensing Images Based on SAM
by Yong Dong, Hongyan Wang, Yuan Zhang, Xin Du, Qiangzi Li, Yueting Wang, Yunqi Shen, Sichen Zhang, Jing Xiao, Jingyuan Xu, Sifeng Yan, Shuguang Gong and Haoxuan Hu
Agriculture 2025, 15(9), 976; https://doi.org/10.3390/agriculture15090976 - 30 Apr 2025
Abstract
Accurately extracting parcels from satellite images is crucial in precision agriculture. Traditional edge detection fails in complex scenes and requires difficult post-processing, while deep learning models demand time-consuming sample preparation and transfer poorly. We therefore designed a parcel extraction method that combines multi-resolution remote sensing images based on the Segment Anything Model (SAM). Using cropland masking, overlap prediction and post-processing, we achieved 10 m-resolution parcel extraction with SAM, with performance in plain areas comparable to existing deep learning models (P: 0.89, R: 0.91, F1: 0.91, IoU: 0.87). Notably, in hilly regions with fragmented cultivated land, our approach even outperformed these models (P: 0.88, R: 0.76, F1: 0.81, IoU: 0.69). The 10 m parcel results were then used to crop the corresponding high-resolution image; histogram features and internal edge features of each parcel determined whether it should be segmented further, and by setting adaptive SAM parameters, sub-meter parcel extraction was finally realized. Farmland boundaries extracted from high-resolution images characterize the actual parcels more accurately, which is meaningful for farmland production and management. This study extended the application of large deep learning models in remote sensing and provided a simple and fast method for accurate extraction of parcel boundaries.
(This article belongs to the Topic Digital Agriculture, Smart Farming and Crop Monitoring)
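For readers curious how such a SAM-based step might look in practice, the sketch below shows automatic mask generation with Meta's segment-anything package followed by a simple cropland-overlap filter. It is a minimal illustration only: the checkpoint path, image input, parameter values, and the 0.5 overlap threshold are assumptions, and the paper's overlap prediction and post-processing stages are not reproduced.

```python
# Minimal sketch of SAM-based parcel mask generation (not the authors' full
# pipeline); checkpoint path and thresholds are assumed for illustration.
import numpy as np
from segment_anything import sam_model_registry, SamAutomaticMaskGenerator

sam = sam_model_registry["vit_h"](checkpoint="sam_vit_h.pth")  # hypothetical checkpoint file
mask_generator = SamAutomaticMaskGenerator(
    sam,
    points_per_side=32,          # denser prompt grid -> smaller parcels can be found
    pred_iou_thresh=0.86,        # drop low-confidence masks
    stability_score_thresh=0.90, # drop unstable masks
)

def extract_parcels(image: np.ndarray, cropland_mask: np.ndarray) -> list[np.ndarray]:
    """Generate candidate masks for an RGB composite and keep those mostly inside cropland."""
    masks = mask_generator.generate(image)  # list of dicts with a boolean 'segmentation' array
    parcels = []
    for m in masks:
        seg = m["segmentation"]
        inside = (seg & cropland_mask).sum() / max(seg.sum(), 1)
        if inside > 0.5:  # assumed cropland-overlap threshold
            parcels.append(seg)
    return parcels
```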

23 pages, 4175 KiB  
Article
Detection of Leaf Miner Infestation in Chickpea Plants Using Hyperspectral Imaging in Morocco
by Mohamed Arame, Issam Meftah Kadmiri, Francois Bourzeix, Yahya Zennayi, Rachid Boulamtat and Abdelghani Chehbouni
Agronomy 2025, 15(5), 1106; https://doi.org/10.3390/agronomy15051106 - 30 Apr 2025
Abstract
This study addresses the early detection of leaf miner infestations in chickpea crops, a significant agricultural challenge. It is motivated by the potential of hyperspectral imaging, when properly combined with machine learning, to enhance the accuracy of pest detection. Its originality lies in applying these techniques to chickpea plants under controlled laboratory conditions with a natural infestation protocol, something not previously explored. Two main methodologies were adopted: (1) spectral feature-based classification using hyperspectral data in the 400–1000 nm range, in which a random forest classifier was trained to classify a plant as healthy or infested with eggs or larvae; dimensionality reduction methods such as principal component analysis (PCA) and kernel principal component analysis (KPCA) were evaluated, and the best classification accuracies (over 80%) were achieved. (2) Vegetation index (VI)-based classification, leveraging indices associated with plant health such as NDVI, EVI, and GNDVI; support vector machine and random forest classifiers effectively classified healthy and infested plants from these indices, with classification accuracies over 81%. The main objective was to design an integrated early pest detection framework using advanced imaging and machine learning techniques. Both approaches achieved high classification accuracy, highlighting the potential of this framework in precision agriculture for timely pest management interventions.
(This article belongs to the Topic Digital Agriculture, Smart Farming and Crop Monitoring)
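As a rough illustration of the two classification routes described above, the scikit-learn sketch below pairs PCA-reduced spectra with a random forest and, separately, an NDVI feature with the same classifier. The placeholder data, band positions, and parameter values are assumptions, not the authors' protocol.

```python
# Illustrative sketch of spectral-feature and vegetation-index classification;
# data, band indices and parameters are invented placeholders.
import numpy as np
from sklearn.decomposition import PCA  # KernelPCA could be swapped in analogously
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# X: (n_plants, n_bands) mean reflectance spectra in 400-1000 nm
# y: labels {0: healthy, 1: eggs, 2: larvae}
rng = np.random.default_rng(0)
X = rng.random((90, 204))      # placeholder spectra
y = rng.integers(0, 3, 90)     # placeholder labels

# Route 1: dimensionality reduction on raw spectra, then a random forest
X_pca = PCA(n_components=20).fit_transform(X)
rf = RandomForestClassifier(n_estimators=300, random_state=0)
print("PCA + RF accuracy:", cross_val_score(rf, X_pca, y, cv=5).mean())

# Route 2: vegetation indices (NDVI shown; EVI and GNDVI analogous), then a classifier
red, nir = X[:, 60], X[:, 170]             # assumed band indices near 670 nm and 850 nm
ndvi = (nir - red) / (nir + red + 1e-9)
print("NDVI + RF accuracy:", cross_val_score(rf, ndvi.reshape(-1, 1), y, cv=5).mean())
```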

15 pages, 7102 KiB  
Article
Non-Contact Detection of Wine Grape Load Volume in Hopper During Mechanical Harvesting
by Haowei Liu, Xiu Wang, Jian Song, Mingzhou Chen, Cuiling Li and Changyuan Zhai
Agriculture 2025, 15(9), 918; https://doi.org/10.3390/agriculture15090918 - 23 Apr 2025
Abstract
Poor real-time performance and low accuracy in detecting the load volume in the hopper during the mechanized harvesting of wine grapes are addressed in this study through a volume detection method based on ultrasonic sensors. First, the ultrasonic sensor beamwidth and detection height were determined through calibration tests. Next, a test bench was used to explore the influence of the number of ultrasonic sensors and the conveying speed on the detected grape pile height. A data-based regression model and a geometric model based on the hopper configuration, each correlating grape load volume with detected pile height, were then constructed, and their accuracies were compared in test bench experiments to identify the optimal detection scheme. The regression model was more accurate than the geometric model at the considered conveying speeds, with a maximum relative error of 8.0%. Finally, field tests showed that the average grape load volume detection error during actual harvesting was 14.4%. This study therefore provides an effective solution for detecting grape load volume in the hopper during mechanized harvesting and establishes a theoretical basis for the development of intelligent grape harvesting methods.
(This article belongs to the Topic Digital Agriculture, Smart Farming and Crop Monitoring)
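The data-based regression route can be pictured with the toy sketch below, which fits an assumed quadratic relationship between detected pile height and load volume. The calibration pairs and the model form are invented for illustration and do not reproduce the authors' calibration.

```python
# Toy sketch of a height-to-volume regression; data and model form are assumed.
import numpy as np

# Hypothetical calibration pairs: mean ultrasonic pile height (m) vs. measured volume (L)
height = np.array([0.05, 0.10, 0.15, 0.20, 0.25, 0.30])
volume = np.array([18.0, 41.0, 70.0, 104.0, 143.0, 188.0])

# Fit a quadratic regression model and predict the volume from a new height reading
model = np.poly1d(np.polyfit(height, volume, deg=2))
h_new = 0.22
v_pred = model(h_new)
rel_err = abs(v_pred - 120.0) / 120.0  # error against a hypothetical reference volume
print(f"predicted volume: {v_pred:.1f} L, relative error: {rel_err:.1%}")
```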
