Computer Vision for Agriculture and Smart Farming

A special issue of AgriEngineering (ISSN 2624-7402).

Deadline for manuscript submissions: closed (31 October 2024) | Viewed by 12963

Special Issue Editor


Dr. Mariano Crimaldi
Guest Editor
Department of Agricultural Sciences, University of Naples Federico II, Portici, Italy
Interests: precision agriculture; AI in agriculture; sensors; UAVs; remote sensing

Special Issue Information

Dear Colleagues,

In recent years, computer vision, as a non-destructive, non-contact image analysis technique, has been applied in a wide variety of fields. Deep learning algorithms and artificial intelligence have enhanced these techniques and adapted them to agriculture and smart farming. Computer vision improves the management, planning, prediction, and decision-making of every stage of agricultural processes and smart farming. It also enables automated platforms such as rovers and UAVs to recognize their targets and operate autonomously.

This Special Issue aims to bring together recent developments and applications of computer vision and artificial intelligence in agriculture and smart farming, as evidence that these techniques can improve the management, prediction, planning, and execution of all phases of agricultural and farming practices. Submissions are open for original scientific articles, reviews, and technical reports on the use of computer vision and artificial intelligence in disease, pest, and weed detection; crop growth monitoring; automatic crop harvesting; automated pesticide spraying; product inspection and quality testing; plant phenotyping; species recognition; yield prediction; water management; soil management; and livestock, poultry, and fish farming. We invite you to share with the broad audience of the journal AgriEngineering your experience in the research and development of computer vision applications and techniques for agriculture and smart farming. Papers presented in this Special Issue can build on the growing body of work already published in this field and will expand the knowledge and skills of the scientific community in this expanding area of agricultural research and development.

Dr. Mariano Crimaldi
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website. Once registered, submissions can be made through the online submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. AgriEngineering is an international peer-reviewed open access quarterly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • computer vision
  • artificial intelligence
  • deep learning
  • machine learning
  • smart agriculture
  • agriculture 4.0
  • neural networks
  • sensors
  • UAV
  • drones
  • autonomous agricultural operators
  • rovers
  • UTVs
  • livestock farming
  • IoT
  • remote sensing

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies is available on the MDPI website.

Published Papers (6 papers)


Research

19 pages, 53371 KiB  
Article
Efficient UAV-Based Automatic Classification of Cassava Fields Using K-Means and Spectral Trend Analysis
by Apinya Boonrang, Pantip Piyatadsananon and Tanakorn Sritarapipat
AgriEngineering 2024, 6(4), 4406-4424; https://doi.org/10.3390/agriengineering6040250 - 22 Nov 2024
Viewed by 447
Abstract
High-resolution images captured by Unmanned Aerial Vehicles (UAVs) play a vital role in precision agriculture, particularly in evaluating crop health and detecting weeds. However, the detailed pixel information in these images makes classification a time-consuming and resource-intensive process. Despite these challenges, UAV imagery is increasingly utilized for various agricultural classification tasks. This study introduces an automatic classification method designed to streamline the process, specifically targeting cassava plants, weeds, and soil classification. The approach combines K-means unsupervised classification with spectral trend-based labeling, significantly reducing the need for manual intervention. The method ensures reliable and accurate classification results by leveraging color indices derived from RGB data and applying mean-shift filtering parameters. Key findings reveal that the combination of the blue (B) channel, Visible Atmospherically Resistant Index (VARI), and color index (CI) with filtering parameters, including a spatial radius (sp) = 5 and a color radius (sr) = 10, effectively differentiates soil from vegetation. Notably, using the green (G) channel, excess red (ExR), and excess green (ExG) with filtering parameters (sp = 10, sr = 20) successfully distinguishes cassava from weeds. The classification maps generated by this method achieved high kappa coefficients of 0.96, with accuracy levels comparable to supervised methods like Random Forest classification. This technique offers significant reductions in processing time compared to traditional methods and does not require training data, making it adaptable to different cassava fields captured by various UAV-mounted optical sensors. Ultimately, the proposed classification process minimizes manual intervention by incorporating efficient pre-processing steps into the classification workflow, making it a valuable tool for precision agriculture. Full article
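
A minimal illustration of the kind of index-based, unsupervised workflow described above is sketched below. This is not the authors' implementation: the input file name is hypothetical, the index formulations are the commonly used ones, and the cluster count and filter radii (drawn from, or assumed alongside, the values quoted in the abstract) would need tuning per field.

```python
# Illustrative sketch only -- not the paper's code. Index definitions follow the
# common formulations; the filter radii (sp, sr) and cluster count k are assumed.
import cv2
import numpy as np
from sklearn.cluster import KMeans

def color_indices(rgb):
    """Return ExG, ExR and VARI layers from a float RGB image scaled to [0, 1]."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    exg = 2 * g - r - b                    # excess green
    exr = 1.4 * r - g                      # excess red
    vari = (g - r) / (g + r - b + 1e-6)    # visible atmospherically resistant index
    return exg, exr, vari

img_bgr = cv2.imread("cassava_orthomosaic.tif")   # hypothetical input file
# Mean-shift filtering smooths homogeneous regions before clustering
# (sp = spatial radius, sr = color radius, values as quoted in the abstract).
filtered = cv2.pyrMeanShiftFiltering(img_bgr, sp=10, sr=20)

rgb = cv2.cvtColor(filtered, cv2.COLOR_BGR2RGB).astype(np.float32) / 255.0
exg, exr, vari = color_indices(rgb)

# Stack the chosen features and run unsupervised K-means (k = 3 assumed:
# soil, cassava, weeds); cluster labels would then be assigned by spectral trend.
features = np.stack([rgb[..., 1], exr, exg], axis=-1).reshape(-1, 3)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)
label_map = labels.reshape(rgb.shape[:2])
```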

20 pages, 4375 KiB  
Article
Differentiating Growth Patterns in Winter Wheat Cultivars via Unmanned Aerial Vehicle Imaging
by Asparuh I. Atanasov, Hristo P. Stoyanov and Atanas Z. Atanasov
AgriEngineering 2024, 6(4), 3652-3671; https://doi.org/10.3390/agriengineering6040208 - 7 Oct 2024
Viewed by 728
Abstract
Wheat is one of the most widely grown cereal crops, serving as a key factor in sustaining the nutritional and food balance in numerous countries. The use of non-contact methods for wheat monitoring allows for the rapid diagnosis of vegetation density, crop growth, and the presence of weeds and diseases in the investigated fields. This study aims to assess the potential for differentiating growth patterns in winter wheat cultivars by examining them with two unmanned aerial vehicles (UAVs), the Mavic 2 Pro and Phantom 4 Pro, equipped with a multispectral camera from the MAPIR™ brand. Based on an experimental study conducted in the Southern Dobruja region (Bulgaria), vegetation reflectance indices, such as the Normalized-Difference Vegetation Index (NDVI), Soil-Adjusted Vegetation Index (SAVI), and Enhanced Vegetation Index 2 (EVI2), were generated, and a database was created to track their changing trends. The obtained results showed that the values of the NDVI, EVI2, and SAVI can be used to predict the productive potential of wheat, but only after accounting for the meteorological conditions of the respective growing season. The proposed methodology provides accurate results in small areas, with a resolution of 0.40 cm/pixel when flying at an altitude of 12 m and 2.3 cm/pixel when flying at an altitude of 100 m. The achieved precision in small and ultra-small agricultural areas, at a width of 1.2 m, will help wheat breeders conduct precise diagnostics of individual wheat varieties. Full article
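
For reference, the three vegetation indices named above are computed from red and near-infrared reflectance with the standard formulations shown in this illustrative sketch; the soil factor L = 0.5 for SAVI and the sample arrays are assumptions, not values from the study.

```python
# Standard formulations of the indices named in the abstract (illustrative only).
import numpy as np

def ndvi(nir, red):
    """Normalized-Difference Vegetation Index."""
    return (nir - red) / (nir + red + 1e-6)

def savi(nir, red, L=0.5):
    """Soil-Adjusted Vegetation Index with the conventional soil factor L."""
    return (1 + L) * (nir - red) / (nir + red + L + 1e-6)

def evi2(nir, red):
    """Two-band Enhanced Vegetation Index."""
    return 2.5 * (nir - red) / (nir + 2.4 * red + 1 + 1e-6)

# Example with small reflectance arrays standing in for raster bands (hypothetical data):
nir = np.array([[0.45, 0.50], [0.40, 0.55]])
red = np.array([[0.10, 0.12], [0.15, 0.08]])
print(ndvi(nir, red), savi(nir, red), evi2(nir, red), sep="\n")
```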

21 pages, 28695 KiB  
Article
Augmented Reality Applied to Identify Aromatic Herbs Using Mobile Devices
by William Aparecido Celestino Lopes, João Carlos Lopes Fernandes, Samira Nascimento Antunes, Marcelo Eloy Fernandes, Irenilza de Alencar Nääs, Oduvaldo Vendrametto and Marcelo Tsuguio Okano
AgriEngineering 2024, 6(3), 2824-2844; https://doi.org/10.3390/agriengineering6030164 - 13 Aug 2024
Viewed by 695
Abstract
Correctly identifying and classifying food is decisive for food safety. The food sector is constantly evolving, and one technology that stands out is augmented reality (AR). During practical studies at Companhia de Entreposto e Armazéns Gerais de São Paulo (CEAGESP), responsible for the largest food storage facility in South America, difficulties were identified in classifying aromatic herbs due to the large number of species. The project aimed to create an innovative AR application called ARomaticLens to address the challenges of identifying and classifying aromatic herbs, following the design science research (DSR) methodology. The research was divided into five stages according to the DSR methodology, from surveying the problem situation at CEAGESP to validating the application through practical tests and an experience questionnaire completed by CEAGESP specialists. The application achieved 100% accuracy in identifying the 18 types of aromatic herbs studied using its local database, without an Internet connection, and users rated the usability of the interface 8 on a scale of 0 to 10. An advantage of the applied method is that the app can be used offline. Full article

18 pages, 9158 KiB  
Article
A Novel Algorithm to Detect White Flowering Honey Trees in Mixed Forest Ecosystems Using UAV-Based RGB Imaging
by Atanas Z. Atanasov, Boris I. Evstatiev, Valentin N. Vladut and Sorin-Stefan Biris
AgriEngineering 2024, 6(1), 95-112; https://doi.org/10.3390/agriengineering6010007 - 11 Jan 2024
Cited by 3 | Viewed by 1492
Abstract
Determining the productive potential of flowering vegetation is crucial in obtaining bee products. The application of a remote sensing approach to terrestrial objects can provide accurate information for the preparation of maps of the potential bee pasture in a given region. The study aimed to create a novel algorithm to identify and distinguish white flowering honey plants, such as black locust (Robinia pseudo-acacia), and to determine the areas occupied by this forest species in mixed forest ecosystems using UAV-based RGB imaging. In our study, to determine the plant cover of black locust in mixed forest ecosystems, we used a DJI (Da-Jiang Innovations, Shenzhen, China) Phantom 4 Multispectral drone with six multispectral cameras with 1600 × 1300 image resolution. The monitoring was conducted in the May 2023 growing season in the village of Yuper, Northeast Bulgaria. The geographical location of the experimental region is 43°32′4.02″ N and 25°45′14.10″ E at an altitude of 223 m. The UAV was used to make RGB and multispectral images of the investigated forest massifs, which were thereafter analyzed with the software product QGIS 3.0. The spectral images of the observed plants were evaluated using the newly created criteria for distinguishing white from non-white colors. The results obtained for the scanned area showed that approximately 14–15% of the area is categorized as white-flowered trees and the remaining 85–86% as non-white-flowered. The comparison of the developed algorithm with the Enhanced Bloom Index (EBI) approach and with supervised Support Vector Machine (SVM) classification showed that the suggested criterion is easy to understand for users with little technical experience, is very accurate in identifying white blooming trees, and reduces the number of false positives and false negatives. The proposed approach of detecting and mapping the areas occupied by white flowering honey plants, such as black locust (Robinia pseudo-acacia), in mixed forest ecosystems is of great importance for beekeepers in determining the productive potential of the region and choosing a place for an apiary. Full article
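
The abstract does not state the exact white/non-white criterion, so the sketch below only illustrates the general idea of a pixel-wise whiteness test on an RGB orthomosaic followed by an area-fraction estimate; the brightness and saturation thresholds and the file name are assumptions, not the authors' values.

```python
# Illustrative white-flower criterion (thresholds are assumptions, not the paper's).
import cv2
import numpy as np

img = cv2.imread("forest_orthomosaic.tif")      # hypothetical RGB orthomosaic
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
h, s, v = cv2.split(hsv)

# "White" here means bright, low-saturation pixels; tune thresholds per scene.
white_mask = (v > 200) & (s < 40)

white_fraction = white_mask.mean()
print(f"white-flowering cover: {white_fraction:.1%}, "
      f"non-white: {1 - white_fraction:.1%}")
```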

19 pages, 17534 KiB  
Article
Detection of Varroa destructor Infestation of Honeybees Based on Segmentation and Object Detection Convolutional Neural Networks
by Mochen Liu, Mingshi Cui, Baohua Xu, Zhenguo Liu, Zhenghao Li, Zhenyuan Chu, Xinshan Zhang, Guanlu Liu, Xiaoli Xu and Yinfa Yan
AgriEngineering 2023, 5(4), 1644-1662; https://doi.org/10.3390/agriengineering5040102 - 26 Sep 2023
Cited by 5 | Viewed by 2380
Abstract
Varroa destructor infestation is a major factor leading to the global decline of honeybee populations. Monitoring the level of Varroa mite infestation in order to take timely control measures is crucial for the protection of bee colonies. Machine vision systems can achieve non-invasive Varroa mite detection on bee colonies, but they are challenged by two factors: the complex, dynamic scenes of honeybee colonies and the small scale and limited amount of data on Varroa destructor. We design a convolutional neural network integrated with machine vision to solve these problems. To address the first challenge, we separate the image of the honeybee from its surroundings using a segmentation network, and the object-detection network YOLOX detects Varroa mites within the segmented regions. This collaboration between segmentation and object detection allows for more precise detection and reduces false positives. To handle the second challenge, we add a Coordinate Attention (CA) mechanism to YOLOX to extract a more discriminative representation of Varroa destructor and improve the confidence loss function to alleviate the problem of class imbalance. The experimental results from the bee farm showed that the evaluation metrics of our model are better than those of other models. Our network's detection value for the percentage of honeybees infested with Varroa mites is 1.13%, which is the closest to the true value of 1.19% among all the detection values. Full article
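
The Coordinate Attention block the authors add to YOLOX is a published, general-purpose module (Hou et al., 2021). The PyTorch sketch below is a generic re-implementation of that block for illustration only, not the code used in the paper; the reduction ratio, activation choice, and feature-map size are assumed.

```python
# Generic Coordinate Attention block (after Hou et al., 2021) -- illustrative only.
# The original uses a hard-swish nonlinearity; plain ReLU is used here for simplicity.
import torch
import torch.nn as nn

class CoordinateAttention(nn.Module):
    def __init__(self, channels, reduction=32):
        super().__init__()
        mid = max(8, channels // reduction)
        self.conv1 = nn.Conv2d(channels, mid, kernel_size=1)
        self.bn = nn.BatchNorm2d(mid)
        self.act = nn.ReLU(inplace=True)
        self.conv_h = nn.Conv2d(mid, channels, kernel_size=1)
        self.conv_w = nn.Conv2d(mid, channels, kernel_size=1)

    def forward(self, x):
        n, c, h, w = x.shape
        # Direction-aware pooling: average along width and along height.
        x_h = x.mean(dim=3, keepdim=True)                       # (n, c, h, 1)
        x_w = x.mean(dim=2, keepdim=True).permute(0, 1, 3, 2)   # (n, c, w, 1)
        y = torch.cat([x_h, x_w], dim=2)                        # (n, c, h+w, 1)
        y = self.act(self.bn(self.conv1(y)))
        y_h, y_w = torch.split(y, [h, w], dim=2)
        a_h = torch.sigmoid(self.conv_h(y_h))                      # (n, c, h, 1)
        a_w = torch.sigmoid(self.conv_w(y_w.permute(0, 1, 3, 2)))  # (n, c, 1, w)
        return x * a_h * a_w        # coordinate-wise reweighting of the feature map

# Usage: insert after a backbone/neck feature map inside the detector.
feat = torch.randn(1, 256, 40, 40)
print(CoordinateAttention(256)(feat).shape)   # torch.Size([1, 256, 40, 40])
```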

Review

20 pages, 666 KiB  
Review
Trends and Prospect of Machine Vision Technology for Stresses and Diseases Detection in Precision Agriculture
by Jaemyung Shin, Md. Sultan Mahmud, Tanzeel U. Rehman, Prabahar Ravichandran, Brandon Heung and Young K. Chang
AgriEngineering 2023, 5(1), 20-39; https://doi.org/10.3390/agriengineering5010003 - 24 Dec 2022
Cited by 26 | Viewed by 4869
Abstract
Introducing machine vision-based automation to the agricultural sector is essential to meet the food demand of a rapidly growing population. Furthermore, extensive labor and time are required in agriculture; hence, agriculture automation is a major concern and an emerging subject. Machine vision-based automation can improve productivity and quality by reducing errors and adding flexibility to the work process. Primarily, machine vision technology has been used to develop crop production systems by detecting diseases more efficiently. This review provides a comprehensive overview of machine vision applications for stress/disease detection on crops, leaves, fruits, and vegetables with an exploration of new technology trends as well as the future expectation in precision agriculture. In conclusion, research on the advanced machine vision system is expected to develop the overall agricultural management system and provide rich recommendations and insights into decision-making for farmers. Full article
