
Special Issue "Smart Agriculture Sensors"

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Remote Sensors".

Deadline for manuscript submissions: closed (20 April 2021).

Special Issue Editor

Prof. Dr. Emilio Gil
Guest Editor
Department of Agro Food Engineering and Biotechnology, Universitat Politècnica de Catalunya, Esteve Terradas, 8, 08860 Castelldefels, Spain
Interests: crop protection; precision farming; spray technology; training; specialty crops

Special Issue Information

Dear Colleagues,

Agriculture 4.0 is a term widely used to describe activities related to agriculture and food production in which technology plays a fundamental role, and where data management and the execution of activities form a holistic process based on the accuracy, efficiency, and friendliness of data acquisition. Sensors, and all the specific characteristics around them, are fundamental elements of this topic, with great influence on the success of the overall process. In recent decades, important changes have occurred in the sensor development process: great accuracy, robustness, and broad adaptability to different objectives allow almost any data acquisition process to be carried out with guaranteed quality. However, the agricultural sector differs enormously from other activities. Weather conditions, including high temperatures, intense sunlight, vibrations, dust, and other special circumstances, make the use of sensors in agriculture a great and exciting challenge. From close-range to remote sensors, light- or sound-based operation, sophisticated or simple data acquisition processes, and from the most popular to the most specific devices, the agricultural sector offers a wide range of potential applications of the latest technologies to improve the food production chain, with special attention to the primary phase of the chain.

The objective of this Special Issue is to create an updated state of the art on the application of sensor technology adapted to the special circumstances of the agricultural sector. From soil and/or leaf moisture detectors and chlorophyll-measuring devices to canopy detectors and early detection systems for pest/disease evaluation, this Special Issue represents a good opportunity to survey the substantial and interesting research activity around the world. Original contributions related to the described topics are especially encouraged.

Prof. Dr. Emilio Gil
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website; once registered, go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2200 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (8 papers)


Research


Article
Identification of Two Commercial Pesticides by a Nanoparticle Gas-Sensing Array
Sensors 2021, 21(17), 5803; https://doi.org/10.3390/s21175803 - 28 Aug 2021
Abstract
This study presents the experimental testing of a gas-sensing array for the detection of two commercially available pesticides (Chloract 48 EC and Nimrod), towards its eventual use within a commercial smart-farming system. The array comprises four distinctive sensing devices based on nanoparticles, each functionalized with a different gas-absorbing polymeric layer. As discussed herein, the sensing array is able to identify as well as quantify three gas analytes: two pesticide solutions and relative humidity, which acts as a reference analyte. All of the evaluation experiments were conducted in close-to-real-life conditions; specifically, the sensors' response to the three analytes was tested against three relative-humidity backgrounds, and the effect of temperature was also considered. The unique response patterns generated after exposing the sensing array to the two gas analytes were analyzed using the common statistical analysis tool principal component analysis (PCA). The sensing array, being compact, low-cost, and highly sensitive, can be easily integrated with pre-existing crop-monitoring solutions. Given the limited reports of effective pesticide gas-sensing solutions, the proposed gas-sensing technology would significantly increase the added value of the integrated system, providing it with unique advantages.
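The analysis step the abstract mentions, reducing multi-sensor response patterns with PCA, can be sketched in a few lines of NumPy. The response matrix below is synthetic (the paper's actual sensor values are not reproduced here); three hypothetical analyte classes each produce a distinct pattern across the four devices:

```python
import numpy as np

rng = np.random.default_rng(0)
n_per_class = 20
# Three analytes (two pesticides plus a humidity reference), each with a
# distinct mean response pattern across the four sensing devices.
patterns = np.array([[1.0, 0.2, 0.5, 0.1],
                     [0.3, 1.0, 0.2, 0.6],
                     [0.1, 0.4, 1.0, 0.8]])
# Rows = exposures, columns = the four nanoparticle sensors (synthetic).
X = np.vstack([p + 0.05 * rng.standard_normal((n_per_class, 4))
               for p in patterns])

# PCA via SVD of the mean-centred data.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T              # projection onto the first two PCs
explained = (S ** 2) / np.sum(S ** 2)  # variance fraction per component
```

Plotting `scores` would show the three analyte clusters separating in the PC1/PC2 plane, which is what a PCA score plot in this kind of study conveys.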
(This article belongs to the Special Issue Smart Agriculture Sensors)

Article
Sensor Fusion with NARX Neural Network to Predict the Mass Flow in a Sugarcane Harvester
Sensors 2021, 21(13), 4530; https://doi.org/10.3390/s21134530 - 01 Jul 2021
Abstract
Measuring the mass flow of sugarcane in real time is essential for harvester automation and crop monitoring. Integrating data from multiple sensors can provide more reliable, accurate, and valuable predictions than data delivered by a single sensor. The objective was therefore to evaluate whether fusing data from different sensors installed on a sugarcane harvester improves mass-flow prediction accuracy. A harvester was instrumented experimentally, and neural-network models integrated sensor data along the harvester to self-calibrate these sensors and estimate the mass flow. Nonlinear autoregressive networks with exogenous input (NARX) and multiple linear regression (MLR) models were compared for predicting the mass flow. The NARX predictions were significantly superior to those of MLR, which decreased the variability of the estimated mass flow in the harvester. NARX with multi-sensor data achieved an RMSE of 0.3 kg s−1, corresponding to a MAPE of 0.7%. Fusing the sensor signals improves prediction accuracy, outperforming approaches that used a single sensor. The multi-sensor mass-flow approach is a potential replacement for conventional yield monitors, generating accurate data with high sample density within sugarcane rows.
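A full NARX model is a nonlinear neural network, but its core idea, regressing the current output on lagged outputs and lagged exogenous inputs, can be sketched as a linear least-squares fit. Everything below is synthetic and illustrative; the harvester's actual signals, lag orders, and network architecture are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(1)
T = 500
u = rng.standard_normal(T)  # exogenous sensor signal (synthetic)
y = np.zeros(T)             # "mass flow" series to predict (synthetic)
for t in range(2, T):
    y[t] = (0.6 * y[t - 1] - 0.2 * y[t - 2] + 0.8 * u[t - 1]
            + 0.01 * rng.standard_normal())

# Lagged design matrix for the model  y(t) ~ [y(t-1), y(t-2), u(t-1), 1].
Phi = np.column_stack([y[1:-1], y[:-2], u[1:-1], np.ones(T - 2)])
target = y[2:]
theta, *_ = np.linalg.lstsq(Phi, target, rcond=None)
rmse = np.sqrt(np.mean((Phi @ theta - target) ** 2))
```

A NARX network replaces the linear map `Phi @ theta` with a feed-forward network over the same lagged regressors, which is what lets it outperform MLR on nonlinear sensor relationships.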

Article
Assessing the Performance of RGB-D Sensors for 3D Fruit Crop Canopy Characterization under Different Operating and Lighting Conditions
Sensors 2020, 20(24), 7072; https://doi.org/10.3390/s20247072 - 10 Dec 2020
Abstract
The use of 3D sensors combined with appropriate data processing and analysis has provided tools to optimise agricultural management through the application of precision agriculture. The recent development of low-cost RGB-depth cameras has presented an opportunity to introduce 3D sensors to the agricultural community. However, due to the sensitivity of these sensors to highly illuminated environments, it is necessary to know under which conditions RGB-D sensors are capable of operating. This work presents a methodology to evaluate the performance of RGB-D sensors under different lighting and distance conditions, considering both geometrical and spectral (colour and NIR) features. The methodology was applied to evaluate the Microsoft Kinect v2 sensor in an apple orchard. The results show that sensor resolution and precision decreased significantly under medium to high ambient illuminance (>2000 lx); this effect was minimised, however, when measurements were conducted closer to the target. In contrast, illuminance levels below 50 lx affected the quality of colour data and may require the use of artificial lighting. The methodology was useful for characterizing sensor performance throughout the full range of ambient conditions in commercial orchards. Although the Kinect v2 was originally developed for indoor use, it performed well under a range of outdoor conditions.

Article
Empirical Model of Radio Wave Propagation in the Presence of Vegetation inside Greenhouses Using Regularized Regressions
Sensors 2020, 20(22), 6621; https://doi.org/10.3390/s20226621 - 19 Nov 2020
Abstract
Spain is Europe's leading exporter of greenhouse-grown tomatoes. Tomato production should be maintained and increased, supported by precision agriculture, to meet food and commercial demand. The wireless sensor network (WSN) has proven to be a practical tool for providing farmers with useful information on the state of their plantations. However, in order to plan its deployment within a crop, it is necessary to know the communication coverage of the nodes that make up the network. Multipath propagation of radio waves between the transceivers of WSN nodes inside a greenhouse is degraded and attenuated by the intricate tangle of stems, branches, leaf twigs, and fruits, all randomly oriented, that blocks the line of sight, generating a signal power loss that grows with distance. Although the COST235 (European Cooperation in Science and Technology - COST), ITU-R (International Telecommunication Union - Radiocommunication Sector), FITU-R (Fitted ITU-R), and Weissberger models describe radio wave propagation in the presence of vegetation in the 2.4 GHz ISM band, significant discrepancies were found when they were applied to field tests in tomato greenhouses. In this paper, a novel method is proposed for determining an empirical model of radio wave attenuation by vegetation in the 2.4 GHz band that includes the vegetation height as a parameter in addition to the distance between the transceivers of WSN nodes. The empirical attenuation model was obtained by applying regularized regressions to a multiparametric equation, using experimental RSSI measurements collected with our own measurement system in field tests on four plantations. The evaluation gave R2 = 0.948 and adjusted R2 = 0.946 for a fifth-degree polynomial (20 parameters), and R2 = 0.942 and adjusted R2 = 0.940 after a cross-validated reduction of parameters (15 parameters). These results verify the rationality and reliability of the empirical model. Finally, the model was validated against experimental data from other plantations, reaching results similar to those of our proposed model.
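The regularized polynomial fit described in the abstract can be sketched as a ridge regression on monomials of node distance and vegetation height. The "true" attenuation surface, the noise level, the polynomial degree, and the penalty below are assumptions for illustration, not the paper's fitted model:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
d = rng.uniform(1, 30, n)     # node-to-node distance in metres (synthetic)
h = rng.uniform(0.5, 2.5, n)  # vegetation height in metres (synthetic)
# Hypothetical attenuation surface plus noise, in dB, for illustration only.
rssi = -40 - 10 * np.log10(d) - 3.0 * h - 0.2 * d * h + rng.normal(0, 0.5, n)

def poly_features(d, h, degree=3):
    """All monomials d**i * h**j with i + j <= degree (intercept included)."""
    cols = [d ** i * h ** j for i in range(degree + 1)
            for j in range(degree + 1 - i)]
    return np.column_stack(cols)

X = poly_features(d, h)
lam = 1e-3  # ridge penalty
# Ridge solution via the augmented least-squares system [X; sqrt(lam) I].
A = np.vstack([X, np.sqrt(lam) * np.eye(X.shape[1])])
b = np.concatenate([rssi, np.zeros(X.shape[1])])
theta, *_ = np.linalg.lstsq(A, b, rcond=None)
pred = X @ theta
r2 = 1 - np.sum((rssi - pred) ** 2) / np.sum((rssi - rssi.mean()) ** 2)
```

Solving the augmented system with `lstsq` avoids forming the normal equations, which matters here because raw monomials in distance are badly conditioned.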

Article
Monitoring the Growth and Yield of Fruit Vegetables in a Greenhouse Using a Three-Dimensional Scanner
Sensors 2020, 20(18), 5270; https://doi.org/10.3390/s20185270 - 15 Sep 2020
Abstract
Monitoring the growth of fruit vegetables is essential for the automation of cultivation management and harvest. The objective of this study is to demonstrate that current sensor technology can monitor the growth and yield of fruit vegetables such as tomato, cucumber, and paprika. We estimated leaf area, leaf area index (LAI), and plant height using the coordinates of polygon vertices from plant and canopy surface models constructed with a three-dimensional (3D) scanner. A significant correlation was observed between the measured and estimated leaf area, LAI, and plant height (R2 > 0.8, except for tomato LAI). The canopy structure of each fruit vegetable was predicted by integrating the estimated leaf area at each height of the canopy surface models. A linear relationship was observed between the measured total leaf area and the total dry weight of each fruit vegetable; thus, the dry weight of the plant can be predicted from the estimated leaf area. The fruit weights of tomato and paprika were estimated using a fruit solid model constructed from the fruit point-cloud data extracted by RGB value. A significant correlation was observed between the measured and estimated fruit weights (tomato: R2 = 0.739; paprika: R2 = 0.888). It was therefore possible to estimate the growth parameters (leaf area, plant height, canopy structure, and yield) of different fruit vegetables non-destructively using a 3D scanner.
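Estimating leaf area from polygon vertices reduces, at its core, to summing triangle areas over a surface mesh. A minimal sketch on synthetic vertices (the actual mesh construction from 3D-scanner point clouds is not shown):

```python
import numpy as np

def mesh_area(vertices, triangles):
    """Total surface area of a triangle mesh: sum of 0.5 * |AB x AC|."""
    v = np.asarray(vertices, dtype=float)
    t = np.asarray(triangles)
    ab = v[t[:, 1]] - v[t[:, 0]]
    ac = v[t[:, 2]] - v[t[:, 0]]
    return 0.5 * np.linalg.norm(np.cross(ab, ac), axis=1).sum()

# A unit square in the xy-plane split into two triangles: area should be 1.
verts = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
tris = [(0, 1, 2), (0, 2, 3)]
area = mesh_area(verts, tris)
```

Plant height falls out of the same data as the vertical extent of the vertex cloud (max minus min z-coordinate).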

Article
Classification Accuracy Improvement for Small-Size Citrus Pests and Diseases Using Bridge Connections in Deep Neural Networks
Sensors 2020, 20(17), 4992; https://doi.org/10.3390/s20174992 - 03 Sep 2020
Abstract
Due to the rich vitamin content of its fruit, citrus is an important crop around the world. However, the yield of citrus crops is often reduced by damage from various pests and diseases. To mitigate these problems, several convolutional neural networks were applied to detect them. Notably, the performance of the selected models degraded as the size of the target object in the image decreased. To adapt to scale changes, a new feature-reuse method named the bridge connection was developed. With the help of bridge connections, the accuracy of baseline networks was improved at little additional computational cost. The proposed BridgeNet-19 achieved the highest classification accuracy (95.47%), followed by the pre-trained VGG-19 (95.01%) and VGG-19 with bridge connections (94.73%). The use of bridge connections also increases the flexibility of sensors for image acquisition, since less attention needs to be paid to adjusting the distance between the camera and the pests and diseases.

Article
Estimation of a New Canopy Structure Parameter for Rice Using Smartphone Photography
Sensors 2020, 20(14), 4011; https://doi.org/10.3390/s20144011 - 19 Jul 2020
Abstract
The objective of this study was to develop a low-cost method for quickly obtaining rice growth information from digital images taken with a smartphone. A new canopy parameter, the canopy volume parameter (CVP), was proposed and developed for rice using the leaf area index (LAI) and plant height (PH). Among these parameters, the CVP was selected as the optimal parameter for characterizing rice yields during the growth period. Rice canopy images were acquired with a smartphone, and image feature parameters were extracted, including the canopy cover (CC) and numerous vegetation indices (VIs), before and after image segmentation. A rice CVP prediction model, in which the CC and VIs served as independent variables, was established using a random forest (RF) regression algorithm. The results revealed that the CVP was better than the LAI and PH for predicting the final yield, and that a CVP prediction model constructed with a local modelling method distinguishing different rice variety types was the most accurate (coefficient of determination (R2) = 0.92; root mean square error (RMSE) = 0.44). These findings indicate that digital images can be used to track the growth of crops over time and provide technical support for estimating rice yields.
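A random forest regression with canopy cover and a vegetation index as predictors can be sketched with scikit-learn. The data and the CVP relationship below are hypothetical stand-ins, not the paper's measurements or fitted model:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n = 300
cc = rng.uniform(0.1, 0.95, n)  # canopy cover (synthetic)
vi = rng.uniform(0.0, 0.4, n)   # one stand-in vegetation index (synthetic)
# Hypothetical CVP response plus noise, for illustration only.
cvp = 3.0 * cc + 5.0 * vi + 0.5 * cc * vi + rng.normal(0, 0.1, n)

X = np.column_stack([cc, vi])
X_tr, X_te, y_tr, y_te = train_test_split(X, cvp, test_size=0.3,
                                          random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
r2 = r2_score(y_te, model.predict(X_te))  # held-out goodness of fit
```

The paper's "local modelling" step amounts to fitting a model like this per rice variety type rather than one global model.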

Review


Review
Review of Weed Detection Methods Based on Computer Vision
Sensors 2021, 21(11), 3647; https://doi.org/10.3390/s21113647 - 24 May 2021
Abstract
Weeds are one of the most important factors affecting agricultural production. The waste and the pollution of the farmland ecological environment caused by full-coverage chemical herbicide spraying are becoming increasingly evident. With the continuous improvement of agricultural production, it is important to distinguish crops from weeds accurately and to spray only the weeds. Precise spraying, however, depends on accurately identifying and locating weeds and crops. In recent years, scholars have used various computer vision methods for this purpose. This review covers both traditional image-processing methods and deep-learning-based methods for solving weed detection problems. It provides an overview of weed detection methods in recent years, analyzes the advantages and disadvantages of existing methods, and introduces several related plant-leaf and weed datasets as well as weeding machinery. Lastly, the problems and difficulties of existing weed detection methods are analyzed and future research directions are outlined.
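Among the traditional image-processing methods such a review covers, colour-index segmentation is a common starting point. A minimal sketch of the classic excess green index (ExG = 2g - r - b on chromatic coordinates) on a tiny synthetic image; the 0.1 threshold is an arbitrary choice for illustration, and Otsu thresholding is often used instead:

```python
import numpy as np

def excess_green(rgb):
    """Excess green index ExG = 2g - r - b on chromatic coordinates."""
    rgb = rgb.astype(float)
    total = rgb.sum(axis=-1, keepdims=True)
    total[total == 0] = 1.0  # avoid division by zero on black pixels
    r, g, b = np.moveaxis(rgb / total, -1, 0)
    return 2 * g - r - b

# Tiny synthetic image: one "plant" (green) pixel, one "soil" (brown) pixel.
img = np.array([[[40, 120, 30], [120, 90, 60]]], dtype=np.uint8)
exg = excess_green(img)
mask = exg > 0.1  # vegetation mask: True where the pixel looks green
```

Separating vegetation from soil like this is only the first stage; distinguishing crop from weed within the mask is where the shape features or deep networks the review surveys come in.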
