Special Issue "Applications of Remote Image Capture System in Agriculture"

A special issue of Applied Sciences (ISSN 2076-3417). This special issue belongs to the section "Earth Sciences and Geography".

Deadline for manuscript submissions: 30 June 2020.

Special Issue Editors

Prof. Dr. José Miguel Molina Martínez
Guest Editor
Department of Agricultural Engineering, Technical University of Cartagena, 30202 Cartagena, Murcia, Spain
Interests: agricultural sciences; computer science in agronomy; decision sciences
Prof. Dr. Ginés García-Mateos
Guest Editor
Department of Computer Science, University of Murcia, 30100 Murcia, Spain
Interests: computer vision; image processing in agriculture; pattern recognition

Special Issue Information

Dear Colleagues,

In recent years, image capture systems have been increasingly used in agricultural engineering as a means to obtain information of interest about crops, soil, and the environment. Remote imaging systems are especially relevant, since they allow frequent, high-resolution information to be acquired over large areas. This Special Issue aims to address the applications of digital photography to the management of water resources, energy, pest and disease control, etc., in agriculture. The concept of remote image capture systems covers different types of devices (from satellites and drones to ground-based digital cameras integrated into wireless sensor networks), different types of spectral information (from standard RGB images to multispectral and hyperspectral images), different types of applications (water management, pest detection, yield estimation, plant monitoring, etc.), and different types of techniques (in the fields of image capture systems, image processing and analysis, computer vision and pattern recognition, decision support systems, etc.). Manuscripts covering these topics are invited to this Special Issue.

Prof. Dr. José Miguel Molina Martínez
Prof. Dr. Ginés García-Mateos
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website. Once registered, go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Applied Sciences is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1800 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • computer vision in agriculture
  • digital photography
  • agricultural engineering
  • mathematical models
  • hydrology
  • energy efficiency
  • multispectral and hyperspectral imaging systems
  • drones and satellites in agriculture

Published Papers (10 papers)


Research


Open Access Article
An Augmented Reality Tool for Teaching Application in the Agronomy Domain
Appl. Sci. 2020, 10(10), 3632; https://doi.org/10.3390/app10103632 - 24 May 2020
Abstract
Nowadays, the combination of new technologies and the use of mobile devices opens up a new range of teaching–learning strategies in different agricultural engineering degrees. This article presents an augmented reality tool that improves spatial viewing for students who have difficulty interpreting graphic representations of agronomic systems and devices. The tool, known as ARTID (Augmented Reality for Teaching, Innovation and Design), consists of a free-access mobile application for Android devices. The proposed method provides each exploded or overall drawing with a QR code that students can use to view its 3D model through augmented reality on their own mobile devices. An evaluation was carried out to assess the validity of the tool on different devices and the acceptance and satisfaction level of this kind of resource in graphic expression subjects in engineering. Finally, an example of application in the agronomic domain is provided by the 3D virtual model of portable ferticontrol equipment comprising different structures and tanks, which may be difficult to interpret in conventional graphical representations. Thanks to this tool, reality can be merged with the virtual world to favour the understanding of certain concepts and to increase student motivation in agronomy studies.
(This article belongs to the Special Issue Applications of Remote Image Capture System in Agriculture)

Open Access Article
Detecting Banana Plantations in the Wet Tropics, Australia, Using Aerial Photography and U-Net
Appl. Sci. 2020, 10(6), 2017; https://doi.org/10.3390/app10062017 - 16 Mar 2020
Abstract
Bananas are the world’s most popular fruit and an important staple food source. Recent outbreaks of Panama TR4 disease are threatening the global banana industry, which is worth an estimated $8 billion. Current methods to map land uses are time- and resource-intensive and result in delays in the timely release of data. We have used existing land use mapping to train a U-Net neural network to detect banana plantations in the Wet Tropics of Queensland, Australia, using high-resolution aerial photography. Accuracy assessments, based on a stratified random sample of points, revealed that the classification achieves a user’s accuracy of 98% and a producer’s accuracy of 96%. This is more accurate than existing (manual) methods, which achieved a user’s and producer’s accuracy of 86% and 92%, respectively. Using a neural network is substantially more efficient than manual methods and can inform a more rapid response to existing and new biosecurity threats. The method is robust and repeatable, and has potential for mapping other commodities and land uses, which is the focus of future work.
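The user’s and producer’s accuracies reported in the abstract above come from a confusion matrix built over the stratified random sample. A minimal sketch of both metrics, with illustrative counts (not the paper’s data):

```python
# User's and producer's accuracy from a 2x2 confusion matrix.
# Keys: (mapped class, reference class) -> number of sample points.
confusion = {
    ("banana", "banana"): 98,   # correctly mapped banana points
    ("banana", "other"): 2,     # mapped banana, actually other (commission)
    ("other", "banana"): 4,     # mapped other, actually banana (omission)
    ("other", "other"): 96,
}

def users_accuracy(cm, cls):
    """Fraction of points mapped as `cls` that really are `cls` (1 - commission error)."""
    row_total = sum(v for (pred, ref), v in cm.items() if pred == cls)
    return cm[(cls, cls)] / row_total

def producers_accuracy(cm, cls):
    """Fraction of reference `cls` points that were mapped as `cls` (1 - omission error)."""
    col_total = sum(v for (pred, ref), v in cm.items() if ref == cls)
    return cm[(cls, cls)] / col_total

ua = users_accuracy(confusion, "banana")      # 98 / 100 = 0.98
pa = producers_accuracy(confusion, "banana")  # 98 / 102 ≈ 0.96
```

With these illustrative counts, the two metrics match the 98%/96% figures quoted for the U-Net classification.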

Open Access Article
A Machine Learning Method to Estimate Reference Evapotranspiration Using Soil Moisture Sensors
Appl. Sci. 2020, 10(6), 1912; https://doi.org/10.3390/app10061912 - 11 Mar 2020
Abstract
One of the most important applications of remote imaging systems in agriculture, with the greatest impact on global sustainability, is the determination of optimal crop irrigation. The methodology proposed by the Food and Agriculture Organization (FAO) is based on estimating crop evapotranspiration (ETc), which is computed as the reference crop evapotranspiration (ETo) multiplied by a crop coefficient (Kc). Previous works have proposed methods to compute Kc using remote crop images. The present research aims to complement these systems by estimating ETo with soil moisture sensors. A crop of kikuyu grass (Pennisetum clandestinum) was used as the reference crop. Four frequency-domain reflectometry sensors were installed, gathering moisture information during the study period from May 2015 to September 2016. Different machine learning regression algorithms were analyzed for the estimation of ETo using moisture and climatic data. The values were compared with the ETo computed at an agroclimatic station using the Penman–Monteith method. The best method was the randomizable filtered classifier technique, based on the K* algorithm. This model achieved a correlation coefficient, R, of 0.9936, with a root-mean-squared error of 0.183 mm/day and a mean relative error of 6.52%; the second-best model used artificial neural networks, with an R of 0.9470 and an 11% relative error. Thus, this new methodology allows obtaining accurate and cost-efficient prediction models for ETo, as well as for the water balance of the crops.
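The FAO relation used above (ETc = Kc × ETo) and the error metrics quoted for the regression models (RMSE, mean relative error) can be sketched as follows; the ETo series is illustrative, not the study’s data:

```python
import math

def crop_et(kc, eto):
    """FAO-56 relation: crop evapotranspiration = crop coefficient * reference ET."""
    return kc * eto

def rmse(pred, ref):
    """Root-mean-squared error between a predicted and a reference series."""
    return math.sqrt(sum((p - r) ** 2 for p, r in zip(pred, ref)) / len(ref))

def mean_relative_error(pred, ref):
    """Mean of |pred - ref| / ref over the series."""
    return sum(abs(p - r) / r for p, r in zip(pred, ref)) / len(ref)

eto_ref = [3.1, 4.0, 5.2, 4.6]   # Penman–Monteith ETo (mm/day), illustrative
eto_est = [3.0, 4.2, 5.0, 4.7]   # model estimate (mm/day), illustrative

print(crop_et(0.85, 4.0))              # ETc for Kc = 0.85 and ETo = 4.0 -> 3.4
print(round(rmse(eto_est, eto_ref), 3))  # ≈ 0.158
```

In the paper these metrics compare each machine-learning estimate of ETo against the Penman–Monteith values from the agroclimatic station.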

Open Access Article
Prediction of Fracture Damage of Sandstone Using Digital Image Correlation
Appl. Sci. 2020, 10(4), 1280; https://doi.org/10.3390/app10041280 - 14 Feb 2020
Abstract
Investigating the deformation mechanism of sandstone is crucial to understanding the life cycle patterns of pertinent infrastructure systems, considering the extensive adoption of sandstone in infrastructure construction across various engineering systems, e.g., agricultural engineering systems. In this study, the state-of-the-art digital image correlation (DIC) method, which uses classical digital photography, is employed to explore the detailed failure course of sandstone in physical uniaxial compression tests. Four typical points are selected to characterize the global strain field by plotting their corresponding strain–time curves, from which the targeted failure thresholds are identified. The Hill–Tsai failure criterion and finite element simulation are then used to cross-check the DIC predictions. The results show that, although errors exist between the experimental and theoretical values, overall they are low enough to be ignored, indicating good agreement. Near-linear relationships between strain and time are detected before failure at the four chosen points, and the failure strain thresholds are almost the same, as low as 0.004. Failure thresholds of sandstone are thus reliably determined from the strain variation curve, allowing sandstone damage and failure to be forecast. Consequently, the proposed technology and the associated information generated in this study could assist in the safety and health monitoring of relevant infrastructure systems.
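The forecasting idea described above reduces to watching a strain–time curve for the first crossing of the failure threshold (about 0.004 in the study). A minimal sketch with a synthetic, near-linear strain history; in practice the strains would come from DIC at the monitored points:

```python
FAILURE_STRAIN = 0.004  # failure strain threshold reported in the study

def first_exceedance(times, strains, threshold=FAILURE_STRAIN):
    """Return the first time at which strain reaches the threshold, or None."""
    for t, eps in zip(times, strains):
        if eps >= threshold:
            return t
    return None

times = [0, 10, 20, 30, 40, 50]                      # seconds, illustrative
strains = [0.0, 0.001, 0.002, 0.003, 0.0041, 0.006]  # near-linear growth
print(first_exceedance(times, strains))  # -> 40
```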

Open Access Article
A Method for Detecting Coffee Leaf Rust through Wireless Sensor Networks, Remote Sensing, and Deep Learning: Case Study of the Caturra Variety in Colombia
Appl. Sci. 2020, 10(2), 697; https://doi.org/10.3390/app10020697 - 19 Jan 2020
Abstract
Agricultural activity has always been threatened by the presence of pests and diseases that prevent the proper development of crops and negatively affect the economy of farmers. One of these pests is Coffee Leaf Rust (CLR), a fungal epidemic disease that affects coffee trees and causes massive defoliation. As an example, this disease has been affecting coffee trees in Colombia (the third largest producer of coffee worldwide) since the 1980s, leading to devastating losses of between 70% and 80% of the harvest. Failure to detect pathogens at an early stage can result in infestations that cause massive destruction of plantations and significantly damage the commercial value of the products. The most common way to detect this disease is to walk through the crop and perform a human visual inspection. In response to this problem, different research studies have proven that technological methods can help to identify these pathogens. Our contribution is an experiment that includes a CLR development-stage diagnostic model for a small-scale Coffea arabica (Caturra variety) crop, through the technological integration of remote sensing (drones equipped with multispectral cameras), wireless sensor networks (a multisensor approach), and Deep Learning (DL) techniques. Our diagnostic model achieved an F1-score of 0.775. The analysis of the results revealed a p-value of 0.231, indicating that the difference between the disease diagnosis made by visual inspection and that made through the proposed technological integration was not statistically significant; the two methods performed similarly in diagnosing the disease.
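The F1-score of 0.775 quoted above is the harmonic mean of precision and recall. A minimal sketch with illustrative counts (not the paper’s data) chosen so the score works out to 0.775:

```python
def f1_score(tp, fp, fn):
    """F1 = harmonic mean of precision and recall, from raw counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# e.g. 31 true positives, 9 false positives, 9 false negatives:
# precision = recall = 31/40 = 0.775, so F1 = 0.775
print(round(f1_score(31, 9, 9), 3))  # -> 0.775
```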

Open Access Article
Designing a Fruit Identification Algorithm in Orchard Conditions to Develop Robots Using Video Processing and Majority Voting Based on Hybrid Artificial Neural Network
Appl. Sci. 2020, 10(1), 383; https://doi.org/10.3390/app10010383 - 4 Jan 2020
Abstract
Identifying fruits on trees is the first step in developing garden robots for different purposes, such as fruit harvesting and site-specific spraying. Due to the natural conditions of fruit orchards and the variety of objects throughout them, working under controlled conditions is very difficult. As a result, these operations should be performed in natural conditions of both light and background. Because other garden robot operations depend on the fruit identification stage, this step must be performed precisely. Therefore, the purpose of this paper was to design an identification algorithm for orchard conditions using a combination of video processing and majority voting based on different hybrid artificial neural networks. The steps in designing this algorithm were: (1) recording video of different plum orchards at different light intensities; (2) converting the videos into frames; (3) extracting different color properties from pixels; (4) selecting effective properties from the extracted color properties using a hybrid artificial neural network–harmony search (ANN-HS); and (5) classification using majority voting based on three classifiers: artificial neural network–bees algorithm (ANN-BA), artificial neural network–biogeography-based optimization (ANN-BBO), and artificial neural network–firefly algorithm (ANN-FA). The most effective features selected by the hybrid ANN-HS were the third channel in hue saturation lightness (HSL) color space, the second channel in lightness chroma hue (LCH) color space, the first channel in L*a*b* color space, and the first channel in hue saturation intensity (HSI) color space. The results showed that the accuracy of the majority voting method in the best execution and over 500 executions was 98.01% and 97.20%, respectively. Based on different performance evaluation criteria, the majority voting method had the highest performance.
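Step (5) above, majority voting over the three hybrid classifiers, can be sketched as follows; the class labels and votes are illustrative, not taken from the paper:

```python
from collections import Counter

def majority_vote(predictions):
    """Given one label per classifier, return the most common label."""
    return Counter(predictions).most_common(1)[0][0]

# One vote each from ANN-BA, ANN-BBO, and ANN-FA for a single pixel:
votes = ["fruit", "background", "fruit"]
print(majority_vote(votes))  # -> fruit
```

With three voters, any label chosen by at least two classifiers wins, which is why the ensemble can outperform each individual classifier.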

Open Access Article
Geometric and Radiometric Consistency of Parrot Sequoia Multispectral Imagery for Precision Agriculture Applications
Appl. Sci. 2019, 9(24), 5314; https://doi.org/10.3390/app9245314 - 5 Dec 2019
Abstract
This paper concerns the geometric and radiometric consistency of diverse and overlapping datasets acquired with the Parrot Sequoia camera. The multispectral imagery datasets were acquired above agricultural fields in Northern Italy, and radiometric calibration images were taken before each flight. Processing was performed with the Pix4Dmapper suite following a single-block approach: images acquired in different flight missions were processed in as many projects, where different block orientation strategies were adopted and compared. Results were assessed in terms of geometric and radiometric consistency in the overlapping areas. The geometric consistency was evaluated in terms of point cloud distance using iterative closest point (ICP), while the radiometric consistency was analyzed by computing the differences between the reflectance maps and vegetation indices produced according to the adopted processing strategies. For the normalized difference vegetation index (NDVI), a comparison with Sentinel-2 was also made. This paper presents the results obtained for two (out of several) overlapping blocks. The geometric consistency is good (root mean square error (RMSE) on the order of 0.1 m), except when direct georeferencing is considered. Radiometric consistency presents larger problems, especially in some bands and in vegetation indices, with differences above 20%. The comparison with Sentinel-2 products shows a general overestimation by the Sequoia data but similar spatial variations (Pearson’s correlation coefficient of about 0.7, p-value < 2.2 × 10⁻¹⁶).
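The NDVI compared between Sequoia and Sentinel-2 above is a simple ratio of near-infrared and red reflectances. A minimal sketch with illustrative reflectance values:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

# Dense vegetation reflects strongly in NIR and weakly in red,
# so NDVI approaches 1; bare soil gives values near 0.
print(round(ndvi(0.45, 0.08), 3))  # high NDVI, dense canopy
print(round(ndvi(0.20, 0.18), 3))  # low NDVI, sparse cover
```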

Open Access Article
Monitor Cotton Budding Using SVM and UAV Images
Appl. Sci. 2019, 9(20), 4312; https://doi.org/10.3390/app9204312 - 14 Oct 2019
Abstract
Monitoring the cotton budding rate is important for growers so that they can replant cotton in a timely fashion at locations where cotton density is sparse. In this study, a true-color camera was mounted on an unmanned aerial vehicle (UAV) and used to collect images of young cotton plants to estimate their germination. The collected images were preprocessed by stitching them together into a single orthomosaic image. The support vector machine (SVM) and maximum likelihood classification methods were applied to identify the cotton plants in the image. The accuracy evaluation indicated an overall classification accuracy of 96.65% for SVM with a Kappa coefficient of 93.99%, while for maximum likelihood classification the accuracy was 87.85% with a Kappa coefficient of 80.67%. A method based on the morphological characteristics of cotton plants is proposed to identify and count overlapping cotton plants. The analysis showed that this method improves the detection accuracy by 6.3% compared with omitting it. Validation based on visual interpretation indicated that the method achieved an accuracy of 91.13%. The study showed that, in practice, a resolution of no less than 1.2 cm/pixel is necessary for image collection in order to recognize cotton plants accurately.
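The Kappa coefficients quoted above correct the overall accuracy for chance agreement. A minimal sketch of Cohen’s Kappa for a two-class confusion matrix, with illustrative counts (not the paper’s data):

```python
def kappa(cm):
    """Cohen's Kappa from a square confusion matrix.

    cm[i][j]: number of points predicted as class i with reference class j.
    """
    n = sum(sum(row) for row in cm)
    p_observed = sum(cm[i][i] for i in range(len(cm))) / n
    p_chance = sum(                       # expected agreement by chance
        sum(cm[i]) * sum(row[i] for row in cm) for i in range(len(cm))
    ) / (n * n)
    return (p_observed - p_chance) / (1 - p_chance)

cm = [[90, 5], [5, 100]]   # predicted (rows) x reference (cols), illustrative
print(round(kappa(cm), 3))
```

A Kappa near 1 means the agreement is far above what random labeling with the same class proportions would produce, which is why the paper reports it alongside overall accuracy.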

Review


Open Access Review
Systematic Mapping Study on Remote Sensing in Agriculture
Appl. Sci. 2020, 10(10), 3456; https://doi.org/10.3390/app10103456 - 17 May 2020
Abstract
The area of remote sensing techniques in agriculture has reached a significant degree of development and maturity, with numerous journals, conferences, and organizations specialized in it. Moreover, many review papers are available in the literature. The present work describes a literature review that adopts the form of a systematic mapping study, following a formal methodology. Eight mapping questions were defined, analyzing the main types of research, techniques, platforms, topics, and spectral information. A predefined search string was applied in the Scopus database, obtaining 1590 candidate papers. Afterwards, the 106 most relevant papers were selected, considering those with more than six citations per year. These are analyzed in more detail, answering the mapping questions for each paper. In this way, current trends and new opportunities are discovered. As a result, increasing interest in the area has been observed since 2000; the most frequently addressed problems are those related to parameter estimation, growth vigor, and water usage, using classification techniques that are mostly applied to RGB and hyperspectral images captured from drones and satellites. A general recommendation that emerges from this study is to build on existing resources, such as agricultural image datasets, public satellite imagery, and deep learning toolkits.

Other


Open Access Technical Note
Effect of Missing Vines on Total Leaf Area Determined by NDVI Calculated from Sentinel Satellite Data: Progressive Vine Removal Experiments
Appl. Sci. 2020, 10(10), 3612; https://doi.org/10.3390/app10103612 - 23 May 2020
Abstract
Remote Sensing (RS) allows the estimation of some important vineyard parameters, and several platforms are available for obtaining RS information. In this context, Sentinel satellites are a valuable tool for RS, since they provide free and regular images of the Earth’s surface. However, several problems arise from the low resolution of the imagery, such as handling mixed pixels that include vegetation, soil, and shadows. Under this condition, the Normalized Difference Vegetation Index (NDVI) value in a particular pixel is an indicator of the amount of vegetation (canopy area) rather than the NDVI of the canopy (as an expression of vigour), but its reliability varies depending on several factors, such as the presence of mixed pixels or the effect of missing vines (once established, a vineyard generally loses grapevines each year due to diseases, abiotic stress, etc.). In this study, a vine removal simulation (greenhouse experiment) and an actual vine removal (field experiment) were carried out. In the field experiment, the position of the Sentinel-2 pixels was marked using high-precision GPS, and controlled removal of vines from a block of cv. Cabernet Sauvignon was done in four steps during the summer of 2019, coinciding with the start of maximum vegetative growth. The Total Leaf Area (TLA) of each pixel was calculated using destructive field measurements, and the operations were planned so that two satellite images were available between each removal step. As a result, a strong linear relationship (R² = 0.986 and R² = 0.72) was obtained between the TLA and NDVI reductions, which quantitatively indicates the effect of the missing vines on the NDVI values.
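The linear TLA–NDVI relationship reported above comes from an ordinary least-squares fit between the measured leaf-area and NDVI reductions. A minimal sketch with synthetic, perfectly linear data (the real measurements scatter around such a line):

```python
def linear_fit(xs, ys):
    """Slope and intercept of the ordinary least-squares line y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
        (x - mx) ** 2 for x in xs
    )
    return a, my - a * mx

tla = [100, 75, 50, 25, 0]              # % of original leaf area per pixel
ndvi_vals = [0.80, 0.65, 0.50, 0.35, 0.20]  # synthetic, exactly linear

a, b = linear_fit(tla, ndvi_vals)
print(round(a, 4), round(b, 4))  # -> 0.006 0.2
```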
