Article

EyesOnTraps: AI-Powered Mobile-Based Solution for Pest Monitoring in Viticulture

1 Fraunhofer Portugal AICOS, 4200-135 Porto, Portugal
2 GeoDouro—Consultoria e Topografia, Lda., 5100-196 Lamego, Portugal
3 Associação para o Desenvolvimento da Viticultura Duriense, 5000-033 Vila Real, Portugal
4 Centro de Investigação e de Tecnologias Agro-Ambientais e Biológicas, Universidade de Trás-os-Montes e Alto Douro, 5000-801 Vila Real, Portugal
* Author to whom correspondence should be addressed.
Sustainability 2022, 14(15), 9729; https://doi.org/10.3390/su14159729
Submission received: 2 July 2022 / Revised: 1 August 2022 / Accepted: 5 August 2022 / Published: 8 August 2022

Abstract
Due to the increasingly alarming consequences of climate change, pests are becoming a growing threat to grape quality and viticulture yields. Estimating the quantity and type of treatments to control these diseases is particularly challenging due to the unpredictability of insect dynamics and intrinsic difficulties in performing pest monitoring. Conventional pest monitoring programs consist of deploying sticky traps in vineyards, which attract key insects and allow human operators to identify and count them manually. However, this is a time-consuming process that usually requires in-depth taxonomic knowledge. This scenario motivated the development of EyesOnTraps, a novel AI-powered mobile solution for pest monitoring in viticulture. The methodology behind the development of the proposed system merges multidisciplinary research efforts by specialists from different fields, including informatics, electronics, machine learning, computer vision, human-centered design, agronomy and viticulture. This research work resulted in a decision support tool that allows winegrowers and taxonomy specialists to: (i) ensure the adequacy and quality of mobile-acquired sticky trap images; (ii) provide automated detection and counting of key insects; (iii) register the local temperature near traps; and (iv) improve and anticipate treatment recommendations for the detected pests. By merging mobile computing and AI, we believe that broader technology acceptance for pest management in viticulture can be achieved via solutions that work on regular sticky traps and avoid the need for proprietary instrumented traps.

1. Introduction

The usage of edge artificial intelligence (AI) is reshaping the process of farming. Common pain points are being alleviated through digitalization and process automation via AI-powered agricultural systems, such as the prevention and monitoring of pest outbreaks in viticulture. Accurate identification and seasonal tracking of key insect dynamics are essential to optimize current pest management programs, particularly in terms of effectiveness, cost, and insecticide usage.
Pest monitoring in viticulture may involve different strategies, including key insect counting in sticky traps, ambient temperature tracking, damage assessment, and risk estimation in vineyards. These procedures make it possible to maximize the effectiveness of protection methods against key insects and, consequently, promote the sustainable use of pesticides, as required by international directives [1]. Conventional pest monitoring programs have mainly consisted of deploying yellow sticky traps in vineyards, which attract key insects by color, or delta traps, which attract them with pheromones. Monitoring consists of manually identifying and counting the key species, sometimes with the help of tweezers or a magnifying glass. These time-consuming processes are mainly performed through visual inspection by winegrowers, who often lack the taxonomic knowledge to identify key insects and properly determine the most appropriate protection and prevention measures.
This work aims to improve pest monitoring and prevention procedures via AI algorithms that support intelligent data acquisition and provide automated detection and counting of key insects in viticulture. We also seek to enhance the supervision and assistance procedures of taxonomy specialists by registering the local temperature near traps, allowing remote consultation of each trap's history, and enabling the automatic sending of warnings and recommendations to winegrowers. This paper arises from substantial multidisciplinary research work, benefiting from a wide range of scientific competencies, such as informatics, electronics, machine learning, computer vision, human-centered design, agronomy, and viticulture. As output technology, it resulted in mobile and web applications that meet winegrowers' and taxonomy specialists' needs regarding pest management procedures. Given the evolving dynamics of insect populations, the system architecture is modular and scalable, allowing future expansions to support the detection of new insect species.
The paper is structured as follows: Section 1 summarizes the motivation and objectives of the work; Section 2 presents the related work found in the literature; Section 3 describes the proposed methodology; Section 4 details the results and the respective discussion; and finally, the conclusions and future work are drawn in Section 5.

2. Related Work

The usage of insect traps for pest management often involves relevant costs in terms of labor, transport, and logistics. In recent years, thanks to the widespread accessibility of information and communication technologies, innovative solutions have been developed to address the traditional constraints of pest monitoring procedures. In particular, several commercially available solutions have enabled a transition from traditional, labor-intensive pest monitoring methods to intelligent digital methods that integrate computer vision techniques, pest occurrence models, and meteorological data with remote sensing to monitor outbreaks.
In Table 1, we present the most relevant solutions found and their respective functionalities. Among the most common functionalities of these solutions, we highlight the following: (i) image acquisition with automatic insect identification and counting [2,3,4,5,6,7]; (ii) viewing trap history and reviewing/editing automatically detected pests [2,3,4,6,8,9]; (iii) mapping georeferenced traps [2,3,4,5,7,10]; (iv) communicating with temperature and humidity sensors [2,3,5,7,10]; and (v) providing pest infestation alerts [2,3,7,8,9,10].
While a high degree of trap automation allows for continuous and remote data collection and processing, these benefits can be overshadowed by the need to establish a costly hardware infrastructure in the field. In particular, each pest monitoring point needs to be equipped with trap hardware components for image acquisition, data communication (e.g., GSM), processing power, and power supply via batteries or solar panels. If the producer needs to create more individual pest surveillance points to increase monitoring granularity, the equipment and maintenance costs will increase accordingly and may eventually become financially prohibitive. It should also be noted that the use of technologies based on instrumented traps does not fully replace the need for trained personnel to visit each monitoring point. Routines such as collecting and replacing traps, cleaning the trap of debris, changing the pest attractant (e.g., pheromone), or recording the phenological state of the surrounding crop still need to be performed on a regular basis [11].
From this perspective, mobile devices appear as an exciting alternative that avoids the identified infrastructure restrictions. Smartphones are an easy-to-use and widely disseminated technological tool that simultaneously provides communication, portability, and the local processing power to execute AI algorithms in edge scenarios without GSM coverage. Only one of the referred commercial solutions [7] allows the usage of mobile-acquired images for pest detection, but its AI algorithms do not seem customized for sticky trap images, and it misses critical features, such as communication with temperature sensors or processing the images locally on the device (offline usage). Furthermore, the reviewed commercial solutions that allow hand-held image acquisition only provide fully manual processes for image capture. Thus, the currently available alternatives can be considered prone to human error, since they do not include software layers that automatically assess key quality and adequacy requirements, such as image focus, the absence of illumination artifacts, or the presence of the whole trap in the image.
In summary, current commercial solutions already bring exciting efficiency gains for pest monitoring in agricultural production. Nevertheless, generating profit in agriculture is quite challenging, and significant increases in crop maintenance costs should be avoided whenever possible. Given this, new affordable solutions that enable widespread pest monitoring without the aforementioned financial and logistics constraints should be explored. Specifically for the viticulture use case, portable and sustainable solutions fully customized for monitoring vineyards in edge scenarios are still missing. By merging mobile computing and AI, we believe that broader technology acceptance can be achieved among winegrowers and taxonomy specialists, particularly via solutions that work on regular sticky traps and avoid the need for proprietary instrumented traps.

3. Methodology

The main objective of this work is to develop a novel AI-powered mobile solution for pest management in viticulture. This system (hereinafter referred to as the EyesOnTraps system) is envisioned to work on regular sticky traps and to be fully customized for monitoring vineyards in edge scenarios, such as open-field productions. Solutions that operate in edge scenarios are conditioned by the scenario itself (e.g., lack of GSM coverage or power supply) and by the capabilities of end devices (e.g., limited processing power). In our use case, open-field grape production represents the edge scenario, while mobile devices are the end devices.
To develop such a system, we started by performing user-centered research for requirements and functionalities definition (see Section 3.1). This knowledge enabled us to delineate the system architecture and its main components, namely: (i) Image Acquisition Module; (ii) Insects Detection Module; (iii) Sensorization Module; (iv) Mobile Application; (v) Online Image Annotator (ground truth generator); and (vi) Web Portal.
It should be noted that the methods behind the first two components, namely the Image Acquisition Module (see the methods detailed in Section 3.2) and the Insects Detection Module (see the methods detailed in Section 3.3), represent prior work from our team, presented and validated in previous publications [11,12]. Thus, these two methods do not represent direct contributions of this work. Nevertheless, given the key importance of these components in the proposed system, and to enhance the readability and understanding of the present work, we include in this section a summarized description of these previously presented methods and their respective results.
Regarding the remaining system components (i.e., the Sensorization Module, the Mobile Application, the Online Image Annotator and the Web Portal), they were developed and implemented according to the guidelines that emerged from the interviews performed during the user research phase. To evaluate the overall usability of the mobile application and users’ satisfaction, we also performed usability tests.

3.1. User Research

The development of the EyesOnTraps system followed a user-centered approach in order to guarantee that it would effectively improve the trap management process and cater to the needs and expectations of winegrowers and taxonomy specialists. This meant that the first step to the digitalization of this process was gaining a deep understanding of their current procedures.
To do so, we planned interviews with winegrowers and taxonomy specialists. The questions were set by both the human-centered design and the machine learning researchers on our team. We used this method to ensure that the script would not only address end-users' needs but also answer technical doubts that would influence the developers' work. Together, we designed a semi-structured interview script, with some questions tailored to the winegrowers and others aimed at the taxonomy specialists.
Of the questions set, nine were for both profiles. These were general questions about traps and pests, intended to give a global understanding of the monitoring process. Six questions were concerned with eliciting, from the taxonomy specialists, specific requirements for both the web portal and the annotator. Finally, 10 questions were aimed at winegrowers' practices and challenges when visiting and monitoring the traps. For the interview setup, we invited a taxonomy specialist to be present in each individual interview. We used this strategy so that any required translation of domain-specific matters could be ensured immediately. This way, although we, as interviewers, were not experts in the field, we were able to fully understand the issues at hand, enabling us to ask follow-up questions relevant to identifying the solution's requirements.
Questions were both open- and closed-ended. When starting the interview, we would ask the winegrowers to describe a regular visiting day to us. We would then build up questions as the interviewees described their practices. In total, we conducted one-hour-long interviews with four winegrowers and two taxonomy specialists. Interviews were audio-recorded and later transcribed and analyzed, focusing on existing practices and challenges. With the knowledge acquired, the team identified practices that could be digitalized into the proposed solution and designed how to combine them with the remaining modules.
Following the feedback received from the user research phase, a functional prototype was developed, consisting of a mobile application that includes a user interface flow for the trap management process. This interactive prototype was tested with final users, with the goal of evaluating the overall usability of the mobile application and users' satisfaction with the prototype. Tests were conducted both in the laboratory and in real contexts. Laboratory tests were conducted at home or at our offices, focused on testing the user interfaces, and served as pre-tests for the real-context tests. Usability tests with final users were then conducted in three different locations: Quinta do Bom Retiro (Ramos Pinto), Quinta do Seixo (Sogrape) and Quinta S. Luiz (Sogevinus).
In total, we conducted tests with nine participants: three in laboratory conditions and six in real-context conditions. Participants in the latter included one winegrower technician, one quality control specialist, one taxonomy expert, one agricultural technician, one farm technician and one agronomist. All participants had more than 2 years of previous experience using smartphones. All tests were recorded and later analyzed, and a quantitative analysis of the collected data was performed to measure efficacy and subjective user satisfaction. Efficacy—the ability to complete the proposed tasks—was measured via the task completion rate, the number of errors and the assistance required by users. To assess user satisfaction with system usability, we administered the Portuguese version of the Post-Study System Usability Questionnaire (PSSUQ) [13].

3.2. Image Quality & Adequacy Assessment

Hand-held image acquisition of sticky traps in the open field is quite challenging due to the variability and heterogeneity of light exposure, as well as the adequacy of the trap images (e.g., trap clearly visible, properly focused and in an acceptable perspective). Since the EyesOnTraps system relies on sticky trap images exclusively acquired with mobile devices, we regarded the precise control of image quality and adequacy as a crucial step. Thus, we embedded an AI-based module to support trap image acquisition via mobile devices; the methods behind this module were presented and validated in a previous work [11].
The main goal of this module is to help winegrowers effortlessly capture adequate, high-quality images of sticky traps in real time during the mobile image acquisition step. In particular, this previously reported method automatically assesses the focus and illumination of the image, while ensuring that an insect trap is present in the camera preview via the automated detection of the trap type, i.e., chromotropic trap (CT) or delta trap (DT), and its segmentation. The method is divided into two main steps: the preview analysis and the acquisition analysis, responsible for processing each camera preview frame and the acquired trap image, respectively (see Figure 1).
In order to automatically acquire both DT and CT images, each preview frame must meet the requirements of four different modules: (i) focus validation; (ii) trap-type classification; (iii) shadows and reflections validation; and (iv) trap segmentation. When a set of consecutive frames passes these conditions, an image is automatically acquired, and a perspective correction step that relies on the corners of the detected trap segmentation mask is applied. To develop and validate these algorithms, we collected and manually annotated a dataset of 516 DT and CT images, divided into three subsets: image focus, reflections/shadows, and trap segmentation. The proposed approaches achieved an accuracy of 84% for focus assessment, accuracies of 96% and 80% for shadows and/or reflections on CT and DT traps, respectively, and a Jaccard index of 97% for the segmentation approach [11].
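The exact validation algorithms are those of [11] and are not reproduced here. Purely as an illustration of the kind of checks involved, the following Python/OpenCV sketch gates a frame on a simple sharpness score and applies a perspective correction once the four corners of the trap mask are known; the focus threshold, output size and corner ordering are illustrative assumptions rather than the tuned values from [11].

```python
import cv2
import numpy as np

FOCUS_THRESHOLD = 100.0  # illustrative threshold, not the tuned value from [11]

def is_focused(frame_bgr: np.ndarray) -> bool:
    """Crude focus check: variance of the Laplacian on the grayscale frame."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    return cv2.Laplacian(gray, cv2.CV_64F).var() > FOCUS_THRESHOLD

def correct_perspective(image: np.ndarray, corners: np.ndarray,
                        out_w: int = 1000, out_h: int = 1500) -> np.ndarray:
    """Warp the trap region to a fronto-parallel rectangle.

    `corners` holds the four corners of the trap segmentation mask,
    assumed ordered top-left, top-right, bottom-right, bottom-left.
    """
    src = corners.astype(np.float32)
    dst = np.float32([[0, 0], [out_w, 0], [out_w, out_h], [0, out_h]])
    matrix = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(image, matrix, (out_w, out_h))
```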

3.3. Automated Insects Detection

Conventional sticky trap monitoring is performed through visual inspection by winegrowers via the manual identification and counting of key insects for viticulture. However, winegrowers often lack the taxonomic knowledge and availability for a thorough identification of these key insects. Thus, we embedded an AI-based module in the EyesOnTraps system to provide the automated detection of key insects. The methods behind this module were presented and validated in a previous work [12].
The main goal of this module is to support winegrowers by automatically detecting and counting key insects on conventional sticky traps (CT or DT) via mobile devices. In particular, the previously developed computer vision approach is based on lightweight deep object detection networks, suitable to be deployed and run locally on smartphones. The methodology behind the development of this model involved the exploration of five different object detection deep learning models, namely the following: (i) SSD ResNet50 (RetinaNet50); (ii) Faster R-CNN ResNet101; (iii) EfficientDet-D0; (iv) SSD MobileNet V2; and (v) CenterNet ResNet50. For benchmarking purposes, three groups of optimization strategies were considered: model-centric, data-centric, and deployment-centric. Each group comprised different types of optimizations, executed and assessed iteratively, as detailed in Figure 2.
The automated detection comprised the identification and counting of three key insects for viticulture, specifically the European Grapevine Moth, Green Leafhopper and “Flavescence Dorée” Leafhopper. Moreover, the Tomato Moth and Morphotype C (Idaea degeneraria) were also included, as they are usually caught in delta traps and can be confused with the Grapevine Moth (Figure 3).
Given the lack of freely available image datasets of key insects for viticulture, a dataset of 168 images of yellow sticky and delta traps was acquired using different mobile devices, resulting in a total of 8966 insects manually annotated by experienced taxonomy specialists. The best model for deployment on edge devices was an SSD ResNet50 model, with accuracies per class ranging from 82% to 99%, F1 scores from 58% to 84%, and inference speeds per trap image of 19.4 s and 62.7 s on high-end and low-end smartphones, respectively. The confusion matrix for the final SSD Resnet50 model is provided in Table 2, and illustrative examples of predicted detections are shown in Figure 4.
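The trained models and their optimizations are detailed in [12]. As a rough sketch of how such a lightweight detector can be run on-device, the snippet below feeds a trap image to a TensorFlow Lite detection model and tallies per-species counts above a confidence threshold. The model file name, label order, input handling and output ordering (the common boxes/classes/scores/count layout of TFLite detection exports) are all assumptions for illustration and may differ from the deployment in [12].

```python
import numpy as np
import tensorflow as tf
from PIL import Image

# Assumed label order; this must match the training label map.
LABELS = ["european_grapevine_moth", "green_leafhopper",
          "flavescence_doree_leafhopper", "tomato_moth", "morphotype_c"]
SCORE_THRESHOLD = 0.5  # assumed operating point

interpreter = tf.lite.Interpreter(model_path="insect_detector.tflite")  # hypothetical file
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
outs = interpreter.get_output_details()

# Resize the (perspective-corrected) trap image to the model input size.
_, height, width, _ = inp["shape"]
image = Image.open("trap.jpg").convert("RGB").resize((width, height))
interpreter.set_tensor(inp["index"], np.expand_dims(np.asarray(image, inp["dtype"]), 0))
interpreter.invoke()

# Assumes the common TFLite detection output layout: boxes, classes, scores, count.
boxes, classes, scores, count = (interpreter.get_tensor(o["index"])[0] for o in outs)

counts = {label: 0 for label in LABELS}
for cls, score in zip(classes[: int(count)], scores[: int(count)]):
    if score >= SCORE_THRESHOLD:
        counts[LABELS[int(cls)]] += 1
print(counts)  # per-species tallies for the trap image
```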

3.4. Ambient Temperature Monitoring

Several exploratory experiments were conducted to support the development of a sensorization module (hereinafter referred to as the EoT sensor) that allows ambient temperature monitoring near traps. We aimed to develop an affordable and reliable temperature sensor that communicates with mobile devices via Bluetooth Low Energy (BLE), thus allowing its widespread deployment in vineyards.
With the envisioned sensorization module, it will become economically and logistically viable to record the ambient temperature near each trap. Thus, our system will be able to apply phenology prediction models based on degree-days (DD) to predict main pest flights at every monitoring point (e.g., the model developed for the European Grapevine Moth in the Douro Demarcated Region [14]). The Douro Demarcated Region is characterized by large valleys, irregularities and steep slopes, resulting in a multiplicity of microclimates with very different climatic characteristics. Temperature can change drastically between parcels, and feeding DD-based phenology prediction models for distant parcels with weather station readings might not be adequate. The ambient temperature near each trap was therefore considered a crucial climatic parameter, while relative humidity and barometric pressure were also considered of interest; all three parameters are recorded by the EoT sensor.
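DD-based phenology models accumulate heat units above a species-specific base temperature and trigger predictions when thresholds are crossed. The sketch below is a minimal illustration using the simple averaging method, not the calibrated model from [14]; the base temperature and flight-onset threshold are placeholder assumptions.

```python
from datetime import date

BASE_TEMP_C = 10.0  # assumed base temperature; the model in [14] defines its own

def daily_degree_days(t_min: float, t_max: float, base: float = BASE_TEMP_C) -> float:
    """Simple averaging method: DD = max(0, (Tmin + Tmax)/2 - Tbase)."""
    return max(0.0, (t_min + t_max) / 2.0 - base)

def predicted_flight_date(daily_min_max: list[tuple[date, float, float]],
                          threshold: float) -> date | None:
    """Return the first date on which accumulated DD reaches `threshold`,
    e.g. a hypothetical flight-onset threshold for the Grapevine Moth."""
    total = 0.0
    for day, t_min, t_max in daily_min_max:
        total += daily_degree_days(t_min, t_max)
        if total >= threshold:
            return day
    return None
```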
Regarding electronics, we selected the Kallisto® platform (generation 2) [15], a commercial solution that allows integration with different sensors. In this platform, we integrated the Olimex BME280 environmental sensor, which consists of a printed circuit board (PCB) with an integrated Bosch BME280 sensor [16]. We also developed an expansion board that provides access to the I2C pins and battery power pins and adds a real-time clock (RTC) chip to the Kallisto (see Figure 5a,b). The EoT sensor uses a 2000 mAh battery, which theoretically provides enough autonomy to cover a complete viticultural season (about 10 months) on a single charge.
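On the application side, collecting such readings over BLE essentially amounts to connecting to the sensor and reading a GATT characteristic. The sketch below uses the bleak Python library for illustration; the device name and characteristic UUID are hypothetical (the actual Kallisto firmware defines its own services), and the payload format is assumed to follow the standard Bluetooth temperature characteristic.

```python
import asyncio
import struct
from bleak import BleakClient, BleakScanner

# Hypothetical characteristic; the real Kallisto firmware exposes its own
# services. 0x2A6E is the standard Bluetooth Temperature characteristic.
TEMPERATURE_CHAR_UUID = "00002a6e-0000-1000-8000-00805f9b34fb"

async def read_temperature(device_name: str = "EoT-Sensor") -> float:
    device = await BleakScanner.find_device_by_name(device_name)
    if device is None:
        raise RuntimeError(f"No BLE device named {device_name!r} in range")
    async with BleakClient(device) as client:
        raw = await client.read_gatt_char(TEMPERATURE_CHAR_UUID)
        # Assume a little-endian int16 in hundredths of a degree Celsius,
        # as in the standard temperature characteristic.
        return struct.unpack("<h", raw[:2])[0] / 100.0

print(asyncio.run(read_temperature()))
```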
Finally, we also conducted experiments regarding the casing of the sensor. To decrease the impact of environmental conditions (e.g., solar radiation, wind or dust) on the precision and replicability of the readings, we concluded that the best cost–benefit option was to use a radiation shield to encapsulate the sensor [17] and an IP66 protective enclosure for the PCB (see Figure 5c).

4. Results and Discussion

4.1. Requirements and System Architecture Definition

The applied user-centered approach firstly enabled us to define a set of requirements that meet the needs and expectations of winegrowers and taxonomy specialists. Subsequently, the identified requirements served as the basis to define the architecture of the EyesOnTraps system.

4.1.1. Requirements

As reported by winegrowers, traps are usually scattered around the vineyards, sometimes far from each other: “(...) we have a parcel with 6 acres with two traps (...) and we have estates with 30 acres that have 3 traps”, which means that simply visiting them can already be a time-consuming task. As such, including trap georeferencing in the system could help winegrowers plan their visits to the traps by aiding them in defining an optimal route. In addition, reminders of the best time to visit a trap could also improve this process.
Through the interviews, we understood that trap monitoring is a time-consuming and inefficient task. It takes place during spring and summer, when winegrowers need to closely monitor traps in their vineyards by counting all insects stuck on them, usually with the help of tweezers or even a magnifying glass. According to winegrowers, these traps can accumulate up to 600 insects, which is a challenge since the insects are counted manually in the field, usually under high temperatures. Sometimes this situation is so uncomfortable, or their workload so high, that they cannot count all the insects during the vineyard visit, so they end up collecting the traps to count later at the office. Moreover, counting key insects implies that winegrowers have the in-depth taxonomic expertise needed to correctly identify the species, which is usually not the case. Thus, the regular involvement of taxonomy specialists in this process is crucial for the effective and timely prevention of pests. Currently, when in doubt, winegrowers take pictures of parts of the traps and share them with taxonomy specialists to ask what species it is and whether it poses a risk to the vine. Considering the above, a top priority of our system is to reduce the time spent counting insects and to aid in their identification. This should be achieved by automatically counting and identifying insects from a trap picture collected by winegrowers.
Winegrowers also need to keep track of trap maintenance, namely when to change the pheromone (delta traps) or the glue base (yellow sticky traps). This tracking is currently performed manually, in a paper notebook with all the required information (trap location, insect counts, when the last pheromone/glue base change was made, etc.). By digitalizing the field notebook, EyesOnTraps can generate alerts warning users of the correct time to maintain and monitor each trap.
Additionally, taxonomy specialists should be informed by winegrowers about the phenological stage of the surrounding vines, which is usually reported according to the Baggiolini scale [18]—a standard nomenclature for the plants’ development that classifies their evolution into different stages (organized alphabetically, from stage A to N). Although this is a standard nomenclature, the taxonomy specialists raised the concern that stage selection is not very objective, with some stages being more difficult to identify than others (e.g., stages I and J). Winegrowers, in turn, mentioned that they usually carry the Baggiolini scale on paper when visiting the vineyards, which helps them select the stage by consulting illustrative images of each one. To mitigate the subjectivity of classifying vines by growth stage, taxonomy specialists said that having access to different photographs of the surrounding vines would be important to ensure that the actual predominant stage is selected. As such, our system should allow winegrowers to select and photograph the phenological status, with their decision later validated by taxonomy experts.
Temperature information around traps is also vital for advisers with taxonomic expertise. These data are used to choose the best time of year to place traps in the vines, as well as to predict when pests are most likely to attack, and thus the best time to apply the necessary products to prevent these attacks. By including temperature sensors near each trap, our system can provide this relevant information to advisers. In sum, Table 3 summarizes the key tasks in the winegrower’s journey and the respective requirements (opportunities for improvement).

4.1.2. System Architecture

The system architecture of the proposed AI-powered mobile-based solution for pest monitoring in viticulture can be divided into six main components (see Figure 6): (i) Image Acquisition Module; (ii) Insects Detection Module; (iii) Sensorization Module; (iv) Mobile Application; (v) Online Image Annotator (ground truth generator); and (vi) Web Portal. It should be noted that the methods behind the Image Acquisition Module and Insects Detection Module represent prior work from our team that was already presented and validated in previous publications [11,12]; a summarized description of the explored methods and respective results is provided in Section 3.

4.2. Image Acquisition Module

To embed the methodology detailed in Section 3.2 for automated image quality and adequacy assessment in the EyesOnTraps system, an Android library module was created and compiled into an Android Archive (AAR) file used as a dependency of the EyesOnTraps mobile application. In particular, the developed Android module integrates all the aforementioned algorithms in the sequence depicted in Figure 1 and guides the user during trap image acquisition by providing real-time feedback in terms of: (i) focus assessment; (ii) shadow and reflections assessment; (iii) trap type detection; and (iv) trap segmentation. When a set of consecutive frames passes these conditions (to ensure stability), an image is automatically acquired (see Figure 7A–D), and the perspective correction procedure is further applied (see Figure 7E).

4.3. Insects Detection Module

The methodology detailed in Section 3.3 for the automated detection of key insects in viticulture was also integrated in the EyesOnTraps Mobile Application via an Android library module compiled into an AAR file. This Android module receives the images acquired by the Image Acquisition Module, and the developed computer vision model automatically detects and counts the five considered species of insects. This model is suitable to run on both low-end and high-end mobile devices on the edge (offline), using mobile-acquired images of conventional sticky traps (DT and CT), thus avoiding instrumented traps and complex hardware infrastructure. As output, this module provides an image with the location of the detected insects and respective counting (see Figure 8).

4.4. Sensorization Module

To test the precision and replicability of the EoT sensor readings, we deployed six sensors on three different farms in the Douro Demarcated Region (Quinta de S. Luiz, Quinta do Bairro and Quinta do Seixo). All these farms already have at least one commercial weather station installed in their vineyards, so we started by deploying three EoT sensors near a weather station on each farm, a scenario that allowed us to make a direct comparative analysis and assess the precision of the EoT sensor temperature readings. We also installed three EoT sensors distant from weather stations. With this second scenario, we wanted to evaluate how much the temperature changes in outbreak areas that are distant from weather stations. To support a detailed analysis of the results, Table 4 presents the different types of errors between the hourly measurements of the EoT sensors and the respective weather stations, for both the near and distant scenarios. For each scenario, we also provide an illustrative example of temperature readings over a period of approximately one month in Figure 9.
The average absolute error for EoT sensors near weather stations was only 1.07 °C, with the hourly temperature readings of both devices showing very similar behavior. Given the clear differences between the compared equipment (e.g., price and specifications), this small discrepancy was expected. These results show that the precision and replicability of the EoT sensor temperature readings are suitable for pest monitoring in vineyards, particularly for feeding DD-based phenology prediction models to predict pest flights near each trap. For the EoT sensors placed distant from weather stations, the average absolute error was 2.74 °C. This marked difference in average absolute error between the near and distant scenarios reinforces the relevance of using EoT sensors in parcels distant from weather stations to obtain more precise temperature readings.
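The metrics in Table 4 can be computed directly from paired hourly series. The sketch below assumes two pandas Series indexed by timestamp, and reads “mean error deviation” as the standard deviation of the (absolute) error, which is our interpretation of the table’s column names.

```python
import pandas as pd

def comparison_metrics(eot: pd.Series, station: pd.Series) -> dict[str, float]:
    """Error metrics between hourly EoT sensor and weather station readings,
    matching the columns of Table 4 (all in degrees Celsius)."""
    # Align on common timestamps so gaps in either series are ignored.
    eot, station = eot.align(station, join="inner")
    error = eot - station
    return {
        "average_error": error.mean(),
        "mean_error_deviation": error.std(),
        "mean_absolute_error": error.abs().mean(),
        "absolute_mean_error_deviation": error.abs().std(),
    }
```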

4.5. Mobile Application

The EyesOnTraps mobile application was developed to guide winegrowers in taking photographs of sticky traps using the image acquisition module, reading and interpreting the results provided by the insects detection module, and sharing this information with the taxonomy experts. The application also allows users to collect temperature data from EoT sensors via BLE. Upon entering the application (Figure 10a), winegrowers have direct access to notifications, including reminders and alerts sent by taxonomy experts regarding crop protection (Figure 10b). The main screen also includes a map with the location of each trap and the option to start an acquisition.
The first step in a new acquisition is taking photographs of the traps using the image acquisition module (see Figure 7). Users can log information regarding pheromone/glue base changes for each trap, or whether any insects were manually removed (Figure 10c). After completing these steps, there is a list of tasks that can (optionally) be completed to provide more information for the acquisition. In particular, the user can collect temperature readings from EoT sensors via BLE, record the phenological status or include additional comments, as depicted in Figure 10d.
To collect temperature information, users must be within the BLE range of the sensor installed near the trap and simply select the temperature option in the application. Data are then automatically downloaded and shown to the user (Figure 11a). Recording the phenological state requires winegrowers to take pictures of the surrounding vines (Figure 11b) and then select the most appropriate status according to the Baggiolini scale (Figure 11c). As this information is sent to taxonomy experts, including these images is expected to help them validate the chosen status. As soon as the winegrowers complete the data acquisition procedure, the collected trap image is automatically analyzed by the insects detection module, and the results are presented to the user (Figure 11d).
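Conceptually, each completed acquisition bundles the trap image, the automated counts and the optional readings into a single record that is later synchronized with the web portal. The following dataclass is a hypothetical illustration of such a record; the field names are ours, not the actual EyesOnTraps data model.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class TrapAcquisition:
    """Hypothetical acquisition record; fields are illustrative only."""
    trap_id: str
    captured_at: datetime
    image_path: str                        # perspective-corrected trap image
    insect_counts: dict[str, int]          # species name -> automated count
    temperature_c: float | None = None     # reading synced from the EoT sensor
    phenological_stage: str | None = None  # Baggiolini stage letter (A..N)
    stage_photos: list[str] = field(default_factory=list)
    comments: str = ""
```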

Usability Tests

During usability tests, participants were asked to perform the following tasks using the interactive prototype for the mobile application: (i) choose a trap and initiate an acquisition; (ii) get data from the temperature sensor; (iii) register the phenological stage; (iv) add comments to the acquisition; (v) check results and end acquisition; and (vi) check app notifications. These tasks include the main flow of the application and the necessary steps to gather the required data for a successful acquisition process and to access important information coming from the web portal through the notifications.
Results from the usability tests indicate a good performance of the application with, on average, less than one error or assistance per task and an average completion rate of 100% in 5 out of 6 tasks. Table 5 presents the mean results of the PSSUQ and its subscales. PSSUQ scores range from 1 (strongly agree) to 7 (strongly disagree); the lower the score, the better the performance and satisfaction. To put these scores in perspective, we compared our results with the means determined by Sauro and Lewis [19] across 21 studies and 210 participants. Results are positive for the overall scale as well as for each subscale, indicating a high perceived usability of the application by test participants.
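PSSUQ scores are plain means over item subsets. As a sketch, assuming the 16-item PSSUQ Version 3 grouping (items 1–6 for SYSUSE, 7–12 for INFOQUAL, 13–15 for INTERQUAL, and 1–16 for the overall score):

```python
from statistics import mean

# Item groupings for the 16-item PSSUQ Version 3 (1-based item numbers).
SUBSCALES = {
    "Overall": range(1, 17),
    "SYSUSE": range(1, 7),
    "INFOQUAL": range(7, 13),
    "INTERQUAL": range(13, 16),
}

def pssuq_scores(responses: list[dict[int, int]]) -> dict[str, float]:
    """Mean score per subscale over all participants.
    Each response maps item number -> rating on the 1-7 scale;
    skipped (N/A) items are simply absent from the mapping."""
    return {
        name: mean(r[i] for r in responses for i in items if i in r)
        for name, items in SUBSCALES.items()
    }
```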
Overall, these results mean that the acquisition process is easily completed by users, with the application fully supporting and guiding this process. These tests also confirm that the results from the user research phase were successfully translated into the application, as users perceived the system as useful, including the required features to ease the trap management process.

4.6. Online Image Annotator

To develop the insects detection module via supervised machine learning approaches, it was necessary to train computer vision algorithms with data manually labeled by taxonomy specialists. Therefore, an online image annotator was developed to generate the ground truth data required to train the AI algorithms. It consists of a web platform accessible via the internet, protected by user authentication, where each user has one of two roles: (i) annotator, usually a taxonomy specialist who accesses the platform for image annotation purposes; or (ii) supervisor, who administers the platform and is able to create/delete users, attribute specific traps to annotators, import/export data, etc. With this solution, it is easy for personnel with the required expertise to contribute to and support the development of the EyesOnTraps system by marking key insects on trap images and specifying the respective species (Figure 12a), as well as easily reviewing the attributed annotations by species (Figure 12b).
This tool is also fully integrated with the EyesOnTraps back-end: all images acquired in the field via the mobile application are automatically synchronized with this platform, together with the automated results generated on the mobile application by the insects detection module. Thus, the human annotator has the option to load and edit these automated results as a pre-annotation of the image (Figure 13). This accelerated and optimized the annotation process, especially for trap images with hundreds of key insects.
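Loading the on-device detections as pre-annotations amounts to converting the detector output into whatever record format the annotator stores. A hypothetical converter is sketched below; the output dictionary layout is illustrative, as the actual EyesOnTraps annotation format is not published.

```python
def detections_to_preannotations(boxes, classes, scores, labels,
                                 min_score: float = 0.5) -> list[dict]:
    """Convert raw detector output into editable pre-annotations.

    `boxes` are [ymin, xmin, ymax, xmax] in normalized coordinates, as
    produced by common TFLite detection models; the dictionary layout is
    a hypothetical annotator format, not the EyesOnTraps one.
    """
    preannotations = []
    for box, cls, score in zip(boxes, classes, scores):
        if score < min_score:
            continue
        ymin, xmin, ymax, xmax = box
        preannotations.append({
            "species": labels[int(cls)],
            "bbox": {"xmin": float(xmin), "ymin": float(ymin),
                     "xmax": float(xmax), "ymax": float(ymax)},
            "source": "model",   # flags it as machine-generated and editable
            "confidence": float(score),
        })
    return preannotations
```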

4.7. Web Portal

The Web Portal is the core platform of the EyesOnTraps solution. It is where the information is stored, creating a history organized by agricultural years (the time intervals of each cultural cycle), and where a set of features is made available that aggregates the information obtained by the various components of the EyesOnTraps solution. It is also in the Web Portal that, using the API and the respective web services created for this purpose, the communication mechanisms between the various components of the system are implemented, allowing data transfer with the Mobile Application, EoT sensor and Online Image Annotator. This promotes systems integration and enables future improvements of the embedded AI algorithms, as well as the expansion of the EyesOnTraps solution to other pests and other crops.
The Web Portal is made available through the internet and is integrated in the Geodouro Agricultural Management Software—SIGP [20], functioning as an application module independent from the other modules. Still, it can be integrated with some of the other SIGP modules, allowing it to take advantage of the entire base structure already implemented in this information system. The Web Portal is protected by user authentication and supports various types of users: winegrower, taxonomy specialist, entity administrator, system administrator and query user. Each type of user has different permissions and sets of functionalities, according to the previously defined specifications. The implemented features allow actors to manage the entire pest monitoring process in crops, among which we highlight the following:
Geographic representation of elements relevant to the management of open-field grape production (Figure 14a), such as (i) geographic delimitation of grape parcels; (ii) location of deployed sticky traps; or (iii) location of EoT sensors and weather stations. It is also possible to interact with these elements and consult additional information, such as grape varieties present in each parcel, history of trap images collected via the mobile application for each sticky trap location (Figure 14b), or temperature readings for each sensor location.
Visualization of the automated detection and counting of insects (Figure 15a), including a graphical consultation of the evolution of pest counts over time by trap and species (Figure 15b).
The Web Portal also enables the visualization of the phenological stage of the vineyard registered via the mobile application, as well as the editing/correction of the information associated with the acquisitions. It is also devised to incorporate and display the outputs of DD-based phenology prediction models of pest flights, such as the model developed for the European Grapevine Moth [14], which is already integrated in the system (Figure 16a). The temperature data that feed these models are collected via the EoT sensors or commercial weather stations; communication with both types of equipment is already available in the system.
We also developed a dashboard that exhibits a set of statistics related to pest flight predictions and automated pest counts, with the possibility of comparison with previous years (Figure 16b). To support the communication between winegrowers and taxonomy specialists, the latter can also send warnings, alerts and recommendations based on the available information, which allows them to remotely monitor pest counts on winegrowers’ holdings.

5. Conclusions and Future Work

In this paper, we present the EyesOnTraps system, a novel AI-powered mobile solution for pest management in viticulture. This solution aims to improve pest monitoring and prevention procedures through AI algorithms for the intelligent data acquisition and automated detection of key insects in sticky traps. We also seek to enhance supervision and assistance provided by advisers with taxonomic expertise to winegrowers, by registering the local temperature near traps, allowing remote access to each trap history, and enabling the automatic sending of warnings and recommendations to winegrowers. The system is also prepared to incorporate different phenology prediction models based on DD to predict pest flights near every monitoring point, such as the model developed for the European Grapevine Moth already integrated in the system.
As output technology, this multidisciplinary research work resulted in mobile and web applications that meet winegrowers’ and taxonomy specialists’ needs regarding pest management procedures. Given the evolving dynamics of insect populations, the system architecture is modular and scalable, allowing future expansions to support the detection of new insect species. Future evolution of the system might also involve grouping the detected insects according to their taxonomic order or even family, which would make it possible to flag and track unusual (and potentially damaging) bycatches. Additionally, a wider detection of insect species could support further system improvements, such as taking into account the density of captured insects to signal traps for clearing and replacement, since heavily loaded traps might be less effective at attracting other relevant species.
Regarding future work, we plan to validate the solution in real-life scenarios by performing field trials with stakeholders. We will directly evaluate the EyesOnTraps web and mobile applications with winegrowers and taxonomy specialists to assess the correct functioning of the currently available features. We also aim to perform a direct comparative analysis between conventional and EyesOnTraps pest management methods by collecting key performance indicators. In particular, this analysis will help us quantify the efficiency gains regarding aspects such as the time spent on pest monitoring tasks, performance improvements in insect detection and counting, and a more optimized and sustainable usage of phytosanitary treatments.

Author Contributions

Conceptualization, L.R., T.N., A.F. and C.C.; methodology, L.R., P.F., J.G., E.S., A.V., C.B., J.O., R.G., T.N., A.F. and C.C.; software, L.R., P.F., J.G., E.S., A.V., J.O., R.G., T.B., D.R. and T.N.; validation, L.R., P.F., J.G., E.S., A.V., C.B., J.O., R.G., T.N., A.F. and C.C.; formal analysis, L.R., P.F., J.G., E.S., A.V., C.B., J.O. and R.G.; investigation, L.R., P.F., J.G., E.S., A.V., C.B., J.O. and R.G.; resources, T.N., A.F. and C.C.; data curation, L.R., P.F., J.G., E.S., A.V., C.B., J.O., R.G. and T.N.; writing—original draft preparation, L.R.; writing—review and editing, L.R., P.F., J.G., E.S., A.V., C.B., J.O., R.G., T.B., D.R., T.N., A.F. and C.C.; visualization, L.R., P.F., J.G., E.S., A.V., C.B., J.O., R.G. and T.N.; supervision, L.R., T.N., A.F. and C.C.; project administration, L.R., T.N., A.F. and C.C.; funding acquisition, L.R., T.N. and C.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the European Regional Development Fund (ERDF) in the frame of Norte 2020 (Programa Operacional Regional do Norte), through the project EyesOnTraps+ - Smart Learning Trap and Vineyard Health Monitoring, NORTE-01-0247-FEDER-039912.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Not applicable.

Acknowledgments

The authors would like to thank the stakeholders from Douro Wine Region that collaborated in this work, namely Sogevinus Quintas SA, Adriano Ramos Pinto—Vinhos SA and Sogrape Vinhos, SA.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
AI  Artificial Intelligence
ML  Machine Learning
CT  Chromotropic Trap
DT  Delta Trap
BLE  Bluetooth Low Energy
PCB  Printed Circuit Board
RTC  Real-Time Clock
DD  Degree-Days
AAR  Android Archive
PSSUQ  Post-Study System Usability Questionnaire

References

  1. European Union. Directive 2009/128/EC of the European Parliament and of the Council of 21 October 2009 establishing a framework for Community action to achieve the sustainable use of pesticides. Off. J. Eur. Union 2009, 309, 71–86. Available online: https://eur-lex.europa.eu/legal-content/EN/ALL/?uri=celex%3A32009L0128 (accessed on 19 June 2022).
  2. Semios. Available online: https://semios.com/ (accessed on 19 June 2022).
  3. TrapView. Available online: https://www.trapview.com/ (accessed on 19 June 2022).
  4. DTN Smart Trap. Available online: https://www.dtn.com/agriculture/producer/dtn-smart-trap/ (accessed on 19 June 2022).
  5. FieldClimate. Available online: https://metos.at/fieldclimate/ (accessed on 19 June 2022).
  6. SnapTrap. Available online: https://snaptrap.com.au/ (accessed on 19 June 2022).
  7. Agrio. Available online: https://agrio.app (accessed on 19 June 2022).
  8. RapidAIM. Available online: https://rapidaim.io/ (accessed on 19 June 2022).
  9. CapTrap. Available online: https://www.sival-innovation.com/en/captrap-service/ (accessed on 19 June 2022).
  10. TarvosView. Available online: https://tarvos.ag/tarvos-view/ (accessed on 19 June 2022).
  11. Faria, P.; Nogueira, T.; Ferreira, A.; Carlos, C.; Rosado, L. AI-Powered Mobile Image Acquisition of Vineyard Insect Traps with Automatic Quality and Adequacy Assessment. Agronomy 2021, 11, 731.
  12. Gonçalves, J.; Silva, E.; Faria, P.; Nogueira, T.; Ferreira, A.; Carlos, C.; Rosado, L. Edge-compatible Deep Learning Models for Detection of Pest Outbreaks in Viticulture. IEEE Access 2022, submitted.
  13. Rosa, A.F.; Martins, A.I.; Costa, V.; Queirós, A.; Silva, A.; Rocha, N.P. European Portuguese validation of the Post-Study System Usability Questionnaire (PSSUQ). In Proceedings of the 2015 10th Iberian Conference on Information Systems and Technologies (CISTI), Aveiro, Portugal, 17–20 June 2015; pp. 1–5.
  14. Carlos, C.; Gonçalves, F.; Oliveira, I.; Torres, L. Is a biofix necessary for predicting the flight phenology of Lobesia botrana in Douro Demarcated Region vineyards? Crop Prot. 2018, 110, 57–64.
  15. Kallisto. Available online: https://docs.sensry.net/Overview/Kallisto/ (accessed on 21 June 2022).
  16. Humidity Sensor BME280—Bosch Sensortec. Available online: https://www.bosch-sensortec.com/products/environmental-sensors/humidity-sensors-bme280/ (accessed on 21 June 2022).
  17. TFA Dostmann Protective Cover for Outdoor Transmitter 98.1114. Available online: https://www.tfa-dostmann.de/en/product/protective-cover-for-outdoor-transmitter-98-1114/ (accessed on 21 June 2022).
  18. Baillod, M.; Baggiolini, M. Les stades repères de la vigne. Rev. Suisse Vitic. Arboric. Hortic. 1993, 25, 7–9.
  19. Sauro, J.; Lewis, J.R. Quantifying the User Experience: Practical Statistics for User Research; Morgan Kaufmann: Burlington, MA, USA, 2016.
  20. SIGP—Geodouro Agricultural Management Software. Available online: https://www.sigp.pt (accessed on 23 June 2022).
Figure 1. Process flow diagram for trap image quality and adequacy validation [11].
Figure 2. Methodology used to develop the insects detection computer vision model [12].
Figure 3. Illustrative examples of detected insects (in scale): (A) European Grapevine Moth; (B) Green Leafhopper; (C) "Flavescence Dorée" Leafhopper; (D) Tomato Moth; and (E) Morphotype C [12].
Figure 4. Examples of detections predicted by SSD Resnet50 model on test set images of (a) delta trap; (b) yellow sticky trap. The groundtruth annotations are marked in white, while GM predictions are in blue, MC in beige and GL in green [12].
Figure 5. EoT sensor: (a) PCB layout; (b) Kallisto and expansion board; (c) field installation.
Figure 6. Architecture of the EyesOnTraps system.
Figure 7. Image Acquisition Module screenshots: (A) unfocused image; (B) focused and good illumination; (C) trap detected; (D) automatic image capture; (E) perspective correction; and (F) manual mode [11].
Figure 8. Insects Detection Module screenshots: (a) automatic image capture; (b) perspective correction; (c) image processing; and (d) output image with insects location and respective counting.
Figure 9. Temperature readings for EoT sensors: (a) near a weather station (EoT Sensor 1); (b) distant from a weather station (EoT Sensor 6). The EoT sensor data are depicted in blue, the weather station data in orange and the absolute error between them in green.
Figure 10. Mobile application screens: (a) main screen; (b) notifications screen; (c) trap information screen; (d) tasks available for each acquisition.
Figure 11. Mobile application screens: (a) temperature synchronization; (b) phenological status image acquisition; (c) phenological status selection; (d) automated insects detection.
Figure 12. Online image annotator: (a) insect annotation; (b) annotations review by species.
Figure 13. Automated image pre-annotation: (a) preview; (b) load automated pre-annotations.
Figure 14. Web portal screens: (a) the geographic representation of elements relevant to the management of open-field grape production; and (b) trap images collected via the mobile application.
Figure 15. Web portal screens: (a) automated detection and counting of insects; and (b) visualization of the evolution of pest counts over time by trap and by species.
Figure 16. Web portal screens with: (a) European Grapevine Moth flights predicted by the phenology prediction models based on DD; and (b) statistics dashboard.
Table 1. Commercial solutions for pest monitoring and respective functionalities.
Solutions compared: Semios [2], TrapView [3], SmartTrap [4], FieldClimate [5], SnapTrap [6], Agrio [7], TarvosView [10], RapidAIM [8], CapTrap [9].
Functionalities assessed: image acquisition and visualization; automatic insect identification and counting; review and editing of automated results; pest forecasting; requirement for a proprietary instrumented trap; pest detection via mobile-acquired images; trap georeferencing; infestation/pest alerts; communication with temperature sensors; offline usage.
Table 2. Confusion matrix for the final SSD Resnet50 model on the test set (true positives on the diagonal) [12].
Groundtruth \ Predicted | Green Leafhopper | Morphotype C | “Flavescence Dorée” Leafhopper | European Grapevine Moth | Tomato Moth | Only Groundtruth (FN)
Green Leafhopper | 378 | 0 | 0 | 0 | 0 | 138
Morphotype C | 0 | 31 | 0 | 0 | 0 | 13
“Flavescence Dorée” Leafhopper | 0 | 0 | 0 | 0 | 10 | 16
European Grapevine Moth | 0 | 2 | 0 | 1013 | 3 | 200
Tomato Moth | 0 | 0 | 0 | 16 | 44 | 13
Only Detection (FP) | 29 | 31 | 30 | 171 | 23 | –
Table 3. Identified requirements for the EyesOnTraps system.
Task in the Winegrower’s Journey | Requirement (Opportunity for Improvement)
Travel to the trap | Traps georeferencing
Trap monitoring and maintenance | Reminders to monitor the trap, change the pheromone or change the glue base
Manual identification and counting of insects | Automatic identification and counting of insects
Field notebook (on paper) | Digitization of the field notebook
Identification of the phenological state | Recording of phenological status (with image capture)
Track temperature near traps | Recording of temperature history near traps
Application of preventive phytosanitary treatments | Send warnings and recommendations to promote the effective management of phytosanitary treatments
Table 4. Hourly measurement errors between EoT sensors and the respective weather stations (°C).
Sensor | Average Error | Mean Error Deviation | Mean Absolute Error | Absolute Mean Error Deviation
Near weather station:
EoT Sensor 1 | −0.31 | 1.61 | 1.38 | 0.89
EoT Sensor 2 | −0.22 | 1.05 | 0.81 | 0.71
EoT Sensor 3 | −0.53 | 1.20 | 1.03 | 0.80
Average | −0.38 | 1.29 | 1.07 | 0.80
Distant from weather station:
EoT Sensor 4 | −0.87 | 3.11 | 2.64 | 1.86
EoT Sensor 5 | −1.56 | 2.50 | 2.26 | 1.89
EoT Sensor 6 | −0.91 | 3.63 | 3.33 | 1.71
Average | −1.11 | 3.08 | 2.74 | 1.82
Table 5. Post-Study System Usability Questionnaire (PSSUQ) results.
Scale | Result | Benchmark Mean [19]
Overall | 1.60 | 2.82
System Usefulness (SYSUSE) | 1.57 | 2.80
Information Quality (INFOQUAL) | 1.56 | 3.02
Interface Quality (INTERQUAL) | 1.74 | 2.49
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
