Crop Monitoring and Weed Management Based on Sensor-Actuation Systems

A special issue of Agriculture (ISSN 2077-0472). This special issue belongs to the section "Digital Agriculture".

Deadline for manuscript submissions: closed (15 December 2023) | Viewed by 7427

Special Issue Editors


Guest Editor
Dr. Victor Rueda-Ayala
Norwegian Institute of Bioeconomy Research, NIBIO Særheim, Postvegen 213, 4353 Klepp Stasjon, Norway
Interests: statistical analysis; AI and ML in agriculture; sensor-based agricultural management; plant-plant interactions; alternative weed control; allelopathy

Dr. Dionisio Andújar
Guest Editor

Special Issue Information

Dear Colleagues,

The need to produce more high-quality food contrasts with increasing social and legislative pressure to reduce chemical inputs in agriculture. Remote sensing, precision application technologies, artificial intelligence, and digital data management contribute to generating reliable agricultural research and improving food production. Sensor and actuation systems offer promising means of accomplishing precision farming and data digitalization. This smart farming approach helps farmers monitor crops more closely and optimize practices in response to changing environmental factors. Nevertheless, the current use of sensing systems for crop assessment and weed management, in both conventional and organic farming, remains below expectations. This Special Issue aims to bring together current research and developments concerning novel sensors and methodologies, together with their specific applications in crop monitoring and weed management. Papers demonstrating innovative implementations of sensor and actuation systems in precision agriculture are highly welcome, including plant stress identification, plant nutrition status assessment, fertilization and weed management, precision application, and environmentally friendly agricultural technologies.

Kind regards,

Dr. Victor Rueda-Ayala
Dr. Dionisio Andujar
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Agriculture is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • sensors
  • actuators
  • precision agriculture
  • online application
  • phenotyping
  • real-time actuation
  • artificial intelligence
  • environmental protection
  • cost-effective technology

Published Papers (2 papers)


Research

18 pages, 3774 KiB  
Article
3D Assessment of Vine Training Systems Derived from Ground-Based RGB-D Imagery
by Hugo Moreno, José Bengochea-Guevara, Angela Ribeiro and Dionisio Andújar
Agriculture 2022, 12(6), 798; https://doi.org/10.3390/agriculture12060798 - 31 May 2022
Cited by 3 | Viewed by 2565
Abstract
In the field of computer vision, 3D reconstruction of crops plays a crucially important role in agriculture. On-ground assessment of the geometrical features of vineyards generates valuable information that enables producers to take the optimum actions in terms of agricultural management. A training system of vines (Vitis vinifera L.), which involves pruning and a trellis system, results in a particular vine architecture that is vital throughout the phenological stages. Pruning is required to maintain the vine’s health and to keep its productivity under control. The creation of 3D models of vine shoots is therefore of crucial importance for management planning: volume and structural information can improve pruning systems, which can increase crop yield and improve crop management. In this experiment, an RGB-D camera system, namely Kinect v2, was used to reconstruct 3D vine models and determine shoot volume on eight differentiated vineyard training systems: Lyre, GDC (Geneva Double Curtain), Y-Trellis, Pergola, Single Curtain, Smart Dyson, VSP (Vertical Shoot Positioned), and the head-trained Gobelet. The results were compared with dry biomass ground-truth values. Dense point clouds were strongly related to the actual biomass measurements in four of the training systems (Pergola, Curtain, Smart Dyson and VSP). Comparing actual dry biomass with RGB-D volume and its associated 3D points yielded strong linear fits, with significant coefficients of determination (R2 = 0.72 to R2 = 0.88) for the number of points in each training system considered separately, and the results revealed good correlations between actual biomass and volume values. When comparing RGB-D volume to weight, Pearson’s correlation coefficient increased to 0.92. The results reveal that the RGB-D approach is also suitable for shoot reconstruction.
The research showed how an inexpensive optical sensor can be employed for rapid and reproducible 3D reconstruction of vine vegetation, improving cultural practices such as pruning, canopy management and harvest. Full article
(This article belongs to the Special Issue Crop Monitoring and Weed Management Based on Sensor-Actuation Systems)
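The volume-to-biomass comparison described in the abstract can be sketched in a few lines. This is a minimal illustration, not the authors' pipeline: the voxel size, the occupancy-counting approach to volume, and the synthetic inputs are assumptions; real Kinect v2 clouds would need filtering and registration first.

```python
import numpy as np

def voxel_volume(points, voxel_size=0.01):
    """Approximate shoot volume (m^3) from a 3D point cloud by counting
    occupied voxels of edge `voxel_size` (in metres)."""
    idx = np.floor(np.asarray(points, dtype=float) / voxel_size).astype(int)
    occupied = {tuple(v) for v in idx}      # unique occupied voxel cells
    return len(occupied) * voxel_size ** 3

def volume_biomass_fit(volumes, biomass):
    """Least-squares line and Pearson's r between sensor-derived volume
    and measured dry biomass, as in the abstract's linear comparison."""
    slope, intercept = np.polyfit(volumes, biomass, 1)
    r = np.corrcoef(volumes, biomass)[0, 1]
    return slope, intercept, r
```

For example, three points within two 1 cm voxels give a volume of 2 × 10⁻⁶ m³, and a perfectly proportional volume/biomass series returns r close to 1.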

17 pages, 4493 KiB  
Article
Impact of Camera Viewing Angle for Estimating Leaf Parameters of Wheat Plants from 3D Point Clouds
by Minhui Li, Redmond R. Shamshiri, Michael Schirrmann and Cornelia Weltzien
Agriculture 2021, 11(6), 563; https://doi.org/10.3390/agriculture11060563 - 20 Jun 2021
Cited by 10 | Viewed by 3751
Abstract
Estimation of the plant canopy from low-altitude imagery can help monitor the growth status of crops and is highly beneficial for various digital farming applications such as precision crop protection. However, extracting 3D canopy information from raw images requires studying the effect of the sensor viewing angle, taking into account the limitations of the mobile platform’s routes inside the field. The main objective of this research was to estimate wheat (Triticum aestivum L.) leaf parameters, including leaf length and width, from a 3D model representation of the plants. For this purpose, experiments with different camera viewing angles were conducted to find the optimum setup of a mono-camera system that would result in the best 3D point clouds. The angle-control analytical study was conducted on a four-row wheat plot with a row spacing of 0.17 m, with two seeding densities and growth stages as factors. Nadir and six oblique-view image datasets were acquired from the plot with 88% overlap and were then reconstructed into point clouds using Structure from Motion (SfM) and Multi-View Stereo (MVS) methods. Point clouds were first categorized into three classes: wheat canopy, soil background, and experimental plot. The wheat canopy class was then used to extract leaf parameters, which were compared with values from manual measurements. The comparison showed that (i) the multiple-view dataset provided the best estimation of leaf length and leaf width; (ii) among the single-view datasets, canopy and leaf parameters were best modeled with the camera angled vertically at −45° and horizontally at 0° (VA −45, HA 0); while (iii) in the nadir view, fewer underlying 3D points were obtained, with a missing-leaf rate of 70%. It was concluded that oblique imagery is a promising approach to effectively estimate the 3D representation of the wheat canopy with SfM-MVS using a single-camera platform for crop monitoring.
This study contributes to the improvement of proximal sensing platforms for crop health assessment. Full article
(This article belongs to the Special Issue Crop Monitoring and Weed Management Based on Sensor-Actuation Systems)
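As a rough illustration of the leaf-parameter extraction described in the abstract (not the authors' SfM-MVS pipeline; the simple height threshold and the assumption of ordered midrib points are hypothetical simplifications), canopy points can be separated from the soil background and leaf length approximated along the midrib:

```python
import numpy as np

def split_canopy_soil(points, ground_height=0.02):
    """Split a plot point cloud (x, y, z in metres) into wheat canopy
    and soil background using a height threshold above the ground."""
    pts = np.asarray(points, dtype=float)
    above = pts[:, 2] > ground_height
    return pts[above], pts[~above]          # (canopy, soil)

def leaf_length(midrib_points):
    """Leaf length approximated as the summed segment length along
    ordered points sampled on the leaf midrib."""
    pts = np.asarray(midrib_points, dtype=float)
    return float(np.linalg.norm(np.diff(pts, axis=0), axis=1).sum())
```

In practice, the classification step would use the reconstructed SfM-MVS cloud rather than a flat-ground threshold, and midrib points would come from skeletonizing each segmented leaf.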
