Article

EfficientDet-4 Deep Neural Network-Based Remote Monitoring of Codling Moth Population for Early Damage Detection in Apple Orchard

1 Department of Agricultural Zoology, Faculty of Agriculture, University of Zagreb, Svetošimunska cesta 25, 10000 Zagreb, Croatia
2 Department of Computer Engineering and Automation, Faculty of Electrical Engineering, Computer Science and Information Technology, Josip Juraj Strossmayer University of Osijek, Kneza Trpimira 2B, 31000 Osijek, Croatia
* Author to whom correspondence should be addressed.
Agriculture 2023, 13(5), 961; https://doi.org/10.3390/agriculture13050961
Submission received: 7 March 2023 / Revised: 4 April 2023 / Accepted: 24 April 2023 / Published: 26 April 2023
(This article belongs to the Special Issue Hardware and Software Support for Insect Pest Management)

Abstract

Deep neural networks (DNNs) have recently been applied in many areas of agriculture, including pest monitoring. The codling moth is the most damaging apple pest, and the currently available methods for its monitoring are outdated and time-consuming. Therefore, the aim of this study was to develop an automatic monitoring system for codling moth based on DNNs. The system consists of a smart trap and an analytical model. The smart trap processes data on-site and sends the user only the detection results rather than the whole image. Therefore, it does not consume much energy and is suitable for rural areas. For model development, a dataset of 430 sticky pad photos of codling moth was collected in three apple orchards. The photos were labelled, resulting in 8142 annotations of codling moths, 5458 of other insects, and 8177 of other objects. The results were statistically evaluated using the confusion matrix, and the developed model showed an accuracy of >99% in detecting codling moths. The developed system contributes to automatic pest monitoring and sustainable apple production.

1. Introduction

The codling moth (Cydia pomonella (Linnaeus, 1758)) (Lepidoptera: Tortricidae) is among the most important and well-studied insects. Its genetics [1,2,3,4,5,6,7], resistance to chemical insecticides [8,9,10,11,12,13], monitoring and control strategies [14,15,16,17,18], effects of climate change on its biology and ecology [2,19], and many other phenomena have been studied in detail. It is the most harmful and widespread pest in apple orchards. It occurs in all apple-growing regions and has reached an almost cosmopolitan distribution, making it one of the most successful pest species currently known [20].
Its larvae directly damage apple fruits and make them unmarketable [21]. Adult codling moths lay their eggs on apple fruit in late spring. After hatching, the larvae begin feeding by penetrating the fruit tissue and forming tunnels to the seed chamber [22]. By the end of the growing season, a single larva can damage several fruits before reaching full growth and development. An infested fruit can be identified by the entrance hole, which contains larval droppings. Fruit damaged early ripens faster and falls off the tree prematurely. Fruit infested later remains on the tree, but its inside is damaged, i.e., “wormy” [23,24]. Therefore, targeted control measures must be implemented to prevent the spread of codling moth into warehouses and the occurrence of total damage. Without proper management, this pest can cause a yield reduction of 30–50%, and in certain years even up to 80%. In intensive apple production, 1% damaged fruit is tolerated, although growers try to reduce this number to below 0.5% [2,23].
Numerous Integrated Pest Management (IPM) techniques, including mating disruption [25,26,27,28], attract-and-kill [29], and the sterile insect technique [30,31], along with many other environmentally friendly strategies, can be implemented in modern codling moth management [32,33]. Although all these techniques are available, growers mostly resort to chemical measures [18,34], which has led to the emergence of insecticide resistance, making codling moth control much more difficult [2,34,35,36,37]. Codling moth control accounts for more than 70 percent of insecticide applications in apple orchards [38], which is both economically and environmentally challenging. The timing of insecticide use depends strongly on critical numbers; therefore, precise monitoring of codling moth is crucial [39].
Accurate monitoring using traps can predict several important stages of pest development, such as adult emergence (beginning of flight) and larval emergence, and, more importantly, can establish empirical economic thresholds for insecticide use. Due to climate change, most insect pests, including codling moth, are subject to various irregularities and changes in biology and ecology [40], resulting in a rise in the number of generations, greater viability, and unanticipated insect outbreaks [2,41]. In fact, changes in codling moth biology have been observed in field trials [42]. The codling moth now emerges earlier in spring, the flight duration of moths in autumn is longer, and the total number of moths captured during the growing season has increased [19]. Pajač et al. [19] found that the codling moth in Croatia has developed an additional third generation as a consequence of global warming. As a result, current monitoring methods using classical traps are becoming increasingly unreliable and time-consuming. To reduce the impact of climate change on pest management, shrink the ecological footprint, and improve fruit production, reliable and modern monitoring techniques are necessary [43].
In addition, the apple (Malus domestica Borkh.) (Rosales: Rosaceae) is one of the world’s most important fruit crops and has great economic significance. It is easy to grow and manipulate and can be used year-round for fresh consumption or for processing into products [44]. In 2021, global apple production was 93 million tons, and the cultivated area was 4.8 million hectares [45].
Considering these aspects, improved monitoring techniques for apple production and management, namely automatic pest monitoring methods, have been researched intensively in recent years. In particular, deep neural networks (DNNs) and their variants are very effective tools for object detection [46]. Therefore, DNN-based detection methods have become increasingly important for automatic pest monitoring in agriculture [47,48,49]. Due to the diverse positions, sizes, and textures of the insects found on the adhesive pads, however, it is difficult to develop these solutions [50]. Suto [50] emphasised that there is no universally accepted automatic method for monitoring codling moth. Considering that codling moth control is mostly based on chemical measures, it is very important to apply monitoring methods as accurately as possible to limit the use of chemical insecticides to targeted, necessary, and effective applications. Therefore, it is important to develop a generally accepted technique for the automatic monitoring of codling moth that has high detection accuracy and is thus effective.
Recent studies have focused on developing DNN-based methods for automated monitoring of codling moth as well. For example, Ding and Taylor [51] were among the first to develop an analytical model for codling moth monitoring. DNNs were used as an image classification tool to identify and count pests in images taken in the field. They collected approximately 200 red–green–blue (RGB) photographs. This codling moth detection model proved to be effective, but a larger dataset was needed to make the whole system more functional. Suto [52] proposed a trap with an integrated deep-learning insect identification model that can be used anywhere in the field. Lacking a sufficient set of codling moth photos, the author developed a model for detecting moths in general. Considering that the codling moth is the most economically important apple pest, the development of such a system is also crucial for its monitoring and apple production in general. In the same year, Preti et al. [53] developed a prototype smart trap and model for codling moth detection based on deep learning. Due to an insufficient dataset, it could not provide reliable count data remotely; training a model usually requires a larger dataset to achieve a sufficient level of reliability. This model had a high number of false positive (FP) detections, which affects accuracy by overestimating the true number of codling moths. Suárez et al. [54] developed a codling moth detection model using DNNs, with an overall detection accuracy of 94.8%. The amount of data for their system was also insufficient. Therefore, larger datasets need to be collected to develop a reliable and accurate model for pest detection.
In addition, the work of Albanese et al. [55] opened the potential for reliable automatic monitoring of codling moth. The authors proposed a smart trap for codling moth monitoring using various deep-learning algorithms. They collected 4400 photos for model development and achieved 95–98% accuracy. In that work, the photo processing was done inside the trap, and only the detection results were sent to the user. Čirjak et al. [56] proposed a similar system for monitoring another important lepidopteran apple pest (Leucoptera malifoliella (O. Costa, 1836)). The proposed system was also based on DNNs and achieved high detection accuracy (>98%), and it could be adapted for monitoring other pests. Therefore, it is necessary to develop such a comprehensive system, with a larger dataset and even higher accuracy, for other important pests in apple production.
Consequently, the objective of this study was to build an analytical model and a technology for an automatic monitoring system for early identification of the codling moth population (smart trap). The core of this system would be the EfficientDet-4 Deep Neural Network (DNN)-based model for early detection of codling moth, which is fast, reliable, accurate, based on a large dataset, and requires minimal data pre-processing. Therefore, the device built with the RGB camera (smart trap) for monitoring codling moth individuals would be economically viable and have optimised energy usage, making it ideal for rural locations. This method would utilise images taken in the field to detect pests. Data processing would be performed on the node, resulting in decreased energy consumption and a longer lifespan for the entire system, requiring less human interaction. It would be an innovative solution that can be used in rural areas for more reliable and faster pest monitoring.
This system is designed to monitor the most important apple pest, the codling moth, in order to prevent economically significant damage from occurring. The proposed system contributes to the advanced monitoring of codling moth and improves its complete management and apple production in general. The hypothesis of this study is that the proposed method might be a reliable tool for codling moth monitoring if the detection accuracy is greater than 90 percent when compared to entomologist visual inspection.

2. Materials and Methods

The methodology for developing an automatic monitoring system for codling moth consists of two parts: the development of a smart trap prototype and the development of an analytical model for object detection based on DNNs.

2.1. Smart Trap Prototype

The smart trap prototype consists of a housing with a built-in red–green–blue (RGB) camera (Figure 1), which is placed on a customised stand attached to the orchard’s stakes. On top of the housing are the antenna and the solar panel. The housing shields the camera, single-board computer (SBC), and pheromone lure from rain and sunlight. The rectangular openings on opposing sides measure 16 × 10 cm, while the overall size of the housing is 25 × 24 cm (Figure 1). The Raspberry Pi 4B SBC captures images through the camera, whose resolution is 12.5 megapixels (4056 × 3040 pixels). An onboard SIM card provides 4G mobile network connectivity. Three rechargeable 18650 Li-ion batteries power the SBC with a maximum current of 5 A. An integrated Battery Management System (BMS) maintains long battery life by managing charge and discharge cycles. The solar panel charges the batteries, so the system is self-sufficient and does not need to be connected to the power grid.
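As an illustration of the capture path, the following is a minimal sketch of how a full-resolution still might be taken on the Raspberry Pi, assuming the commonly used picamera library; the file-naming scheme is hypothetical, not the trap's actual firmware.

```python
# Minimal capture sketch for a Raspberry Pi with a 12.5 MP camera.
# Assumes the legacy "picamera" library; the file name is hypothetical.
from datetime import datetime
from picamera import PiCamera

camera = PiCamera()
camera.resolution = (4056, 3040)  # full-resolution still, as specified above
filename = datetime.utcnow().strftime("trap_%Y%m%dT%H%M%SZ.jpg")
camera.capture(filename)          # JPEG written to the working directory
camera.close()
```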
To ensure effective monitoring, the farmer must insert an adhesive pad and pheromone lure into the smart trap and replace them as needed. One smart trap is sufficient for 1 to 3 ha of orchard. It is recommended to install the trap at a height of 2 m, the same as the classic delta trap used for codling moth monitoring.
The Camera Control Server (CCS) is an important part of smart trap operation, providing the user with remote monitoring and image processing. The camera communicates with the CCS over the 4G cellular network (Enterprise APN). Different types of antennas can be used in the camera configuration to ensure adequate cellular communication; this option was chosen because some agricultural sites are far from mobile network access points. The main purpose of the control server is to receive the work results and camera status and to allow the system to update the device software. Therefore, the CCS has several basic functions: monitoring camera status, performing software updates, and receiving work results and analyses. Monitoring the operational status of the camera involves the continuous transmission of basic operational parameters, including the correctness of UTC time; the correctness of software upgrades; the correctness of operation according to the intended analytical model; the temperature; the state and type of the cellular network to which the device is connected; the local time received from the cellular network; etc. Every time an error occurs, or a message is not received within a certain period, an alarm is triggered on the control server. The smart trap was installed in the orchard from August to November to test its operability.
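To illustrate the kind of heartbeat described, a status report might be serialised and posted as follows; the endpoint URL and field names are hypothetical stand-ins for the parameters listed above, not the actual CCS protocol.

```python
# Hypothetical status heartbeat from the trap to the CCS.
# The URL and JSON field names are illustrative assumptions.
import json
import time
import urllib.request

STATUS_URL = "https://ccs.example.com/api/status"  # hypothetical endpoint

payload = {
    "device_id": "trap-01",
    "utc_time": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
    "software_version": "1.0.0",
    "temperature_c": 21.5,
    "network_type": "4G",
    "local_time": time.strftime("%Y-%m-%dT%H:%M:%S"),
}
req = urllib.request.Request(
    STATUS_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# The CCS raises an alarm if no report arrives within the expected period.
urllib.request.urlopen(req, timeout=30)
```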

2.2. Analytical Model

For effective automatic monitoring of codling moth, a new model (detection algorithm) for codling moth detection is required in addition to the smart trap prototype. Therefore, the analytical model for codling moth monitoring was developed using a deep neural network (DNN).
The model was created using photos taken between May and September 2022 under field conditions. To build a large training dataset for the DNN, 10 standard delta traps with codling moth pheromone lures were placed in three apple orchards in Zagreb County, Croatia (Mičevec—N 45.736278, E 16.047883; Petrovina Turopoljska—N 45.689562, E 16.021316; and Staro Čiče—N 45.696737, E 16.108477) (Figure 2). The orchard in Staro Čiče was managed organically, and the two others (Mičevec and Petrovina Turopoljska) were managed according to the principles of IPM. Delta traps were placed at a height of 2 m in the rows of the orchard with a minimum spacing of 10 m and inspected weekly. The adhesive pads containing adult codling moths were changed in the field and photographed manually with a camera mounted in a housing modelled after the smart trap prototype (Figure 3) to obtain the most diverse capturing conditions. The camera was connected to the SBC, from which capture was controlled.
To train the DNN, it was crucial to define the classes or the target objects on the images for their detection. The first class is the most significant and represents the targeted pest species, the codling moth. It is referred to as MOTH throughout the rest of the paper (Figure 4). The second class, referred to as INSECTS, identifies all non-target insects captured by the adhesive pad (Figure 5). The third class, referred to as OTHER, represents all other objects identified on the adhesive pad, such as dirt, twigs, leaves, etc. (Figure 6).
The captured images were then processed, and target objects were identified and labelled (annotated) with bounding boxes in the LabelImg program (Figure 7). To establish the ground truth, image labelling was performed by an expert entomologist.
After the photos were collected and annotated, the final number of annotations was 8142 for the class MOTH (the target insect, codling moth), 5458 for the class INSECTS (other non-target insects), and 8177 for the class OTHER (other objects). This exhaustive dataset was deemed adequate for DNN training.
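LabelImg stores annotations in Pascal VOC XML by default; assuming that format, the per-class counts above can be tallied with a short script like the following (the annotations directory is a hypothetical path).

```python
# Count per-class bounding-box annotations from LabelImg's Pascal VOC XML files.
import xml.etree.ElementTree as ET
from collections import Counter
from pathlib import Path

counts = Counter()
for xml_file in Path("annotations").glob("*.xml"):  # hypothetical directory
    root = ET.parse(xml_file).getroot()
    for obj in root.iter("object"):
        counts[obj.findtext("name")] += 1           # MOTH, INSECTS, or OTHER

print(counts)  # expected here: MOTH 8142, INSECTS 5458, OTHER 8177
```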
Because learning (or training) a DNN requires a large amount of data [54], the prepared data were then augmented. DNN learning begins with raw photos, from which significant features are extracted, including blobs, edges, and the bounding boxes of the codling moth; with the application of annotations and modelling techniques, these were adapted for use in image processing. In total, 430 photos were used as training data. The photos were flipped and rotated along the vertical and horizontal axes, so that twelve pictures were derived from each source image. Each of these images (4000 × 3900 pixels) was then broken into smaller images (640 × 640 pixels), generating 30 crops per image. Hence, 154,800 photos were obtained for building the DNN for monitoring the codling moth.
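The augmentation code itself is not published; a plausible sketch that reproduces these counts (12 flip/rotation variants per photo, each tiled into 640 × 640 crops) is shown below, with the crop stride as our own assumption.

```python
# One plausible flip/rotate-and-tile scheme (an assumption, not the authors' code).
from PIL import Image

def variants(img):
    """Yield 4 rotations x 3 flip states = 12 augmented variants."""
    for angle in (0, 90, 180, 270):
        rotated = img.rotate(angle, expand=True)
        yield rotated
        yield rotated.transpose(Image.FLIP_LEFT_RIGHT)
        yield rotated.transpose(Image.FLIP_TOP_BOTTOM)

def tiles(img, size=640, stride=600):
    """Slide a size x size window across the image; the stride is assumed."""
    w, h = img.size
    for top in range(0, h - size + 1, stride):
        for left in range(0, w - size + 1, stride):
            yield img.crop((left, top, left + size, top + size))

src = Image.open("pad_photo.jpg")  # hypothetical source photo
crops = [t for v in variants(src) for t in tiles(v)]
```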
An artificial intelligence algorithm accomplished the automatic detection of objects. The labelled photos were used to train a deep neural network (DNN) to develop the analytical model. There are three phases of model development, and each phase has its own dataset. The photos were randomly divided into three groups: a training set, a validation set, and a test set. Ninety percent of the photos were used for model training and ten percent for model validation. Subsequently, 30 new and complex images, not included in the data used for training or validation, were selected for the test set, so the model was tested on images it had never seen before. Table 1 depicts the number of images used in each phase.
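A minimal sketch of this random split, with the 30 test images held out entirely, might look as follows (file names are placeholders).

```python
# Random 90/10 train/validation split; the test set is a separate hold-out.
import random

random.seed(42)                                  # arbitrary seed for reproducibility
photos = [f"photo_{i}.jpg" for i in range(430)]  # placeholder file names
random.shuffle(photos)

cut = int(0.9 * len(photos))
train_set, val_set = photos[:cut], photos[cut:]  # 387 / 43 photos
```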
The validation phase, in which the model is checked during and after the creation process, provides the most important information about model quality and performance. It is also important to track the learning and validation losses during validation in order to avoid overfitting [57]. The validation loss was measured after each epoch, indicating whether the model needed to be adjusted. Validation is performed at each epoch on the 10% validation set, and once every epoch has been validated, the complete model performance is evaluated on the same 10% validation set.
In addition, during the validation phase, a number of statistical elements that indicate the accuracy of the object recognition are calculated, including Average Precision (AP) and Average Recall (AR) and statistical indicators for the model and its classes. Table 2 depicts and explains these metrics in their entirety.
AP and AR are commonly used evaluation metrics for object detection models in DNNs such as TensorFlow (Table 2).
AP is a measure of the model’s accuracy in terms of how well it can correctly identify objects of interest in an image. Specifically, AP is the area under the precision–recall curve (PR curve), where precision is the number of true positive (TP) detections divided by the total number of detections, and recall is the number of TP detections divided by the total number of ground truth objects (Table 3). The PR curve plots the precision values against the corresponding recall values for various confidence thresholds. A perfect model would have a PR curve that rises rapidly and settles at a high precision level, resulting in a high AP score.
AR is a measure of the completeness of the model, i.e., how well it can detect all objects of interest in an image. More specifically, AR is the average recall value across different confidence thresholds. In other words, AR measures the percentage of objects that the model recognises at a given confidence threshold. A perfect model would have an AR value of 1.0, meaning that it recognises all objects of interest in the image.
Overall, AP and AR are important metrics for evaluating the performance of object detection models and are widely used in research and the industry to compare and improve different models [58] (Table 2).
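As a minimal illustration (not the COCO evaluation code [59] actually used), AP can be computed from confidence-sorted detections as the area under the PR curve:

```python
# Sketch: AP from detections sorted by descending confidence.
# is_tp flags each detection as a true positive (1) or false positive (0).
import numpy as np

def average_precision(is_tp, n_ground_truth):
    is_tp = np.asarray(is_tp, dtype=float)
    tp_cum = np.cumsum(is_tp)
    precision = tp_cum / np.arange(1, len(is_tp) + 1)
    # step-wise integration of precision over recall (step height 1/n_ground_truth)
    return float(np.sum(precision * is_tp) / n_ground_truth)

print(average_precision([1, 1, 0, 1, 0], n_ground_truth=4))  # 0.6875
```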
Table 2. Explanation of model validation parameters.
Parameter | Explanation *
AP | AP @ IoU = 50% to 95%, in steps of 5%
AP_IoU=0.50 | AP @ IoU = 50%
AP_IoU=0.75 | AP @ IoU = 75%
AP_s | AP for objects with small size: area < 32 × 32
AP_m | AP for objects with medium size: 32 × 32 < area < 96 × 96
AP_l | AP for objects with large size: area > 96 × 96
AR_max1 | AR given 1 detection per image
AR_max10 | AR given 10 detections per image
AR_max100 | AR given 100 detections per image
AR_s | AR for objects with small size: area < 32 × 32
AR_m | AR for objects with medium size: 32 × 32 < area < 96 × 96
AR_l | AR for objects with large size: area > 96 × 96
AP_MOTH | AP for the class MOTH
Model validation accuracy **
Learning loss | The error on the training dataset; indicates how well the deep learning model fits the training data.
Validation loss | The error on the validation dataset; indicates how well the deep learning model performs on unseen validation data.
* [59] and ** [60]. IoU (Intersection over Union): a metric used to evaluate deep learning algorithms by calculating the overlap between the predicted bounding box and the ground-truth box [61].
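For reference, the IoU used throughout Table 2 can be computed as follows for axis-aligned boxes; this is a generic sketch, not code from this study.

```python
# IoU of two boxes given as (x1, y1, x2, y2) corner coordinates.
def iou(box_a, box_b):
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # intersection rectangle (empty if the boxes do not overlap)
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    return inter / union if union else 0.0

print(iou((0, 0, 100, 100), (50, 50, 150, 150)))  # ~0.143
```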
After model quantisation and compilation, the test phase follows. The test phase is an important part of model creation, in which various quality parameters of the model are calculated. The model was evaluated by comparing the automatic counts generated by the model to the expert entomologist’s observations (ground truth).
The outcomes of the model testing were analysed using the statistical method of the confusion matrix. TP stands for “true positive”, TN for “true negative”, FP for “false positive”, and FN for “false negative” [62]. TP refers to a sample that was correctly classified as belonging to the positive class, whereas TN refers to a sample that was correctly classified as belonging to the negative class. FP refers to a sample that was wrongly classified as positive when it belonged to the negative class, while FN refers to a sample that was incorrectly classified as negative when it belonged to the positive class [63]. The confusion matrix can be utilised to determine whether the model is “confused” in distinguishing between two classes, comparing predicted values against actual values. In this work, the confusion matrix was applied to calculate the model’s accuracy, precision, recall, and F1 score. Table 3 depicts the equations and descriptions for each metric (see also the sketch after Table 3). The measures were computed over the full dataset and do not represent per-image averages.
Table 3. Description of used metrics.
Metric and Formula * | Explanation **
Accuracy = (TP + TN) / (TP + FP + TN + FN) × 100 | General model performance across all classes: the proportion of accurate predictions among the total number of predictions.
Precision = TP / (TP + FP) | The model’s ability to correctly categorise a sample as positive: the ratio between the number of TP detections and the total number of positive detections (either correct or incorrect).
Recall = TP / (TP + FN) | The model’s capacity to identify positive samples: the proportion of TP detections relative to the total number of positive samples. As recall increases, more positive samples are identified.
F1 score = 2TP / (2TP + FP + FN) | The harmonic mean of precision and recall, combining both measures into one metric.
* [64] and ** [65,66,67].
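The sketch below implements the Table 3 formulas and checks them against the MOTH-class counts reported in Section 3.1 (TP = 180, FN = 9, FP = 0); the TN count is an illustrative stand-in for the correctly handled non-MOTH objects, not a figure from the paper.

```python
# Confusion-matrix metrics per Table 3.
def metrics(tp, tn, fp, fn):
    accuracy = (tp + tn) / (tp + fp + tn + fn) * 100
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * tp / (2 * tp + fp + fn)
    return accuracy, precision, recall, f1

# MOTH class: TP = 180, FN = 9, FP = 0 (TN = 1100 is illustrative only).
acc, p, r, f1 = metrics(tp=180, tn=1100, fp=0, fn=9)
print(round(p, 2), round(r, 2), round(acc, 1))  # 1.0 0.95 99.3
```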
In addition to conventional object detection techniques, powerful deep learning models can perform exceptional object detection: they receive an image as input and return the coordinates of each identified object’s bounding box [65].
This analytical model was created through a series of algorithm evaluations and parameter adjustments to achieve optimal performance for the intended use. The model was implemented in Python 3.6 using the TensorFlow artificial intelligence framework. The production application utilised TensorFlow Lite Model Maker, allowing the created model to be deployed on the SBC [57]. After the model has been created, it must be evaluated and adjusted as necessary; when necessary, the analytical model is updated remotely over the mobile network.
This work employs a novel class of object detectors called EfficientDet: Scalable and Efficient Object Detection (ED), created and explained in detail by Tan et al. [68]. ED detectors are more precise and need fewer computational resources (working memory, power consumption, and floating-point operations per second, or FLOPS) than their predecessors. The authors presented eight models of ED, numbered ED0 through ED7. Based on the concepts described in [68], ED4 is a compromise between FLOPS and detection accuracy. Models ED0 through ED3 are 5 to 15% less precise than ED4 without offering a proportionate reduction in FLOPS, while each succeeding model after ED4 doubles the required FLOPS and improves detection accuracy by less than 5%. Thus, the ED4 detector model has relatively low FLOPS while maintaining a high level of accuracy, making it appropriate for implementation on SBCs [57].
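A hedged sketch of this training pipeline using TensorFlow Lite Model Maker's object-detection API is shown below; directory paths, label-map ordering, and hyperparameters are placeholders, not the authors' exact settings.

```python
# Training sketch with TensorFlow Lite Model Maker (paths/hyperparameters assumed).
from tflite_model_maker import model_spec, object_detector

spec = model_spec.get("efficientdet_lite4")  # EfficientDet-Lite4 backbone

labels = {1: "MOTH", 2: "INSECTS", 3: "OTHER"}
train_data = object_detector.DataLoader.from_pascal_voc(
    images_dir="data/train", annotations_dir="data/train", label_map=labels)
val_data = object_detector.DataLoader.from_pascal_voc(
    images_dir="data/val", annotations_dir="data/val", label_map=labels)

model = object_detector.create(
    train_data, model_spec=spec, validation_data=val_data,
    epochs=50, batch_size=8, train_whole_model=True)

model.evaluate(val_data)            # COCO-style AP/AR, as in Table 2
model.export(export_dir="export/")  # quantised .tflite model for the SBC
```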

3. Results and Discussion

After completion of all phases, a DNN-based smart trap with a camera and SBC, together with an associated analytical model for automatic monitoring of the codling moth, was created.

3.1. Analytical Model Performance

The validation parameters describing the analytical model performance are listed in Table 4. The Average Precision (AP) for the class MOTH was 0.79, which is a very satisfactory AP value during model validation. The overall model had a slightly lower value (0.66) due to the other, less important classes in the model (INSECT and OTHER). For the class INSECT, the AP was 0.62, and for the class OTHER, the AP was 0.55. These values are lower than the AP for the class MOTH but still acceptable, because these classes exist only to better distinguish the important class MOTH (Table 4).
The statistical indicators calculated during the learning and validation of the model are shown in Figure 8. The learning and validation losses reduced and stabilised at a specific point (Figure 8), suggesting that the model is optimally fitted and that no further adjustments are required.
To assure the system’s dependability, it is essential to establish whether the model works in practice. Thus, the model was tested after development to evaluate its quality. Figure 9 illustrates an example of automatic counts on test photos generated by the model.
The test results (Figure 9), which were statistically analysed using the confusion matrix, are shown in Figure 10. This figure demonstrates that the number of recognised objects belonging to the class OTHER was relatively high (>900) and the number of recognised objects from the classes INSECT and MOTH was >100, indicating a large number of correctly recognised objects and thus a fairly accurate model.
Taking into account the number of actual values and those predicted by the model, there were no FP detections (Figure 10), indicating that the model detected the class MOTH with complete precision (1.0) (Table 5). The recall for the class MOTH was 0.95, indicating a high proportion of correctly predicted moths relative to all positive observations. Given that the F1 score is a weighted average of precision and recall [69], the F1 score was, as expected, slightly below 1.0 (Table 5). The detection accuracy for the other two classes (INSECT and OTHER) was also very high (>99%). In addition, all other detection parameters (precision, recall, and F1 score) for the class INSECT were 0.97. Thanks to its higher number of TP detections (Figure 10), the class OTHER had slightly higher parameters: recall was 1.0, while precision and F1 score were 0.99 (Table 5).
The several FN detections (Figure 10) were primarily the result of colour changes and moth disintegration over time. Ding and Taylor [51] analysed the errors generated by numerous causes and stressed that a significant number of errors are time-related; decaying insects can lose contrast with the background. If the adhesive pads in smart traps were replaced every ten days or so to prevent insect disintegration and dirt accumulation, and hence false detections, time-related errors could be substantially reduced in real production systems. The problem is that changing the adhesive pads is time-consuming and must be done manually, especially over large areas. Therefore, Teixeira et al. [70] proposed implementing an automatic pad-changing system, which could improve the efficiency of the monitoring process. This proposal is one of the main challenges from the industry point of view and can be addressed through collaboration between researchers and farmers.
The number of FN detections was negligible (9) compared to the 180 TP detections (Figure 10). Based on these numbers, the model showed a very high accuracy for the class MOTH (99.3%), indicating that the model is almost completely accurate (less than 1% deviation from the ground truth) in detecting codling moth (Table 5). Albanese et al. [55] developed an embedded codling moth monitoring system using different deep learning algorithms for model development; compared to that work, all detection parameters of our proposed model are slightly improved.
Suárez et al. [54] developed a detection and classification algorithm for codling moth detection. The authors also used a confusion matrix as a tool for the statistical evaluation of model accuracy and all other parameters. The overall accuracy measures the extent to which the model correctly predicts the entire dataset and does not distinguish classes [65]. In the work of Suárez et al. [54], the overall accuracy was 94.8% due to an insufficient amount of data, much lower than the overall accuracy of the model proposed in our work, which was 98.92%.
Suto [52] developed a similar tool for codling moth monitoring that included a prototype smart trap and an insect counting algorithm. In other work, Suto [71] presented a Raspberry Pi-based plug-in board for pest detection. The systems from both works overcome limitations of earlier systems, but the dataset used was not reliable enough. Namely, because there is no freely available dataset on the codling moth, the author used various images from the Internet and images of other moth species, which makes qualitative monitoring difficult. In the present work, the whole dataset was collected personally and manually in the field, using an appropriate image capture tool that mimicked the target environmental conditions, to obtain the most reliable dataset and develop an accurate and precise model. Wen and Guyer [72] trained a neural network for pest detection on images taken under laboratory conditions, which provided significantly lower recognition accuracy than images captured under field conditions. Thanks to the large and high-quality dataset, the detection accuracy and all other moth detection parameters in this work reached high values (Table 5). Hong et al. [73] also manually collected images from pheromone traps to build detection models for three moth species. However, their dataset did not consist of real-time images from insect traps in the field; such images, despite the light interference they involve, would have made the training dataset more reliable and applicable under field conditions. In our work, this shortcoming was overcome: our dataset was captured using a camera that mimics a smart trap, and the training images were acquired in the field under different natural lighting conditions to ensure natural and maximally reliable acquisition conditions. Thanks to this kind of data acquisition, our system works more reliably under field conditions. Sun et al. [47] performed image acquisition in the trap using a digital camera integrated into the pheromone trap, ensuring real capture conditions and a high-quality training dataset. In addition, Ahmad et al. [74] collected 7046 images in the field under different background and lighting conditions to detect 23 different agricultural pests. That work also achieved excellent results and once again confirmed that image acquisition under field conditions provides more reliable models for pest monitoring. This was also confirmed by Xia et al. [75], who collected datasets in the field rather than under controlled conditions to develop a model with higher noise immunity.
In this study, EfficientDet object detectors were used as a tool for codling moth detection. This object detector showed high accuracy in monitoring codling moth, which was expected since it has already been experimentally demonstrated to be a reliable insect detection tool. Hong et al. [76] used several deep learning-based object detection methods to automatically monitor the forest pest Matsucoccus thunbergianae, including EfficientDet D4. This deep learning-based object detector showed great potential for insect detection with a detection accuracy of 95–98%. Therefore, in our work, we used the same family of object detectors for monitoring an important agricultural pest, codling moth, which also showed high accuracy in codling moth detection (>99%) (Table 5). In addition, Popescu et al. [77] tested different neural networks for monitoring the invasive and very important agricultural pest Halyomorpha halys, where the neural network EfficientDet showed the best results in terms of detection time. In the last three years, several smart devices for codling moth detection have been developed [52,53,54,55,78], but in this work, the proposed model showed the highest accuracy in codling moth detection (Table 5). Namely, Segalla et al. [79] developed a model with 98.3% accuracy and Albanese et al. [55] with the highest detection accuracy of 97.9%, while the detection accuracy in this work was >99% (Table 5).
In addition, the developed system consumes little energy because it uses an SBC, a very efficient computer that consumes less power than conventional computers. Such computers are very power-efficient [78] and are therefore used in many similar research projects for automatic pest monitoring [80,81,82], including automatic codling moth monitoring [83].
To use the model on SBC devices in rural areas, it is important to include model optimisation as the final phase of model development and adaptation. Bigger models demand more memory and more time to run, making them more challenging to operate. By optimising the model, its size can be minimised with little loss of accuracy and performance, achieving an optimal balance; such an optimised model enables faster evaluation and detection. At the same time, the developed model is constantly improved and updated. Thanks to these comprehensive improvements, our work represents a step forward in the automatic monitoring of codling moth.

3.2. Smart Trap Operation

The functionality of the hardware was tested for three months in two apple orchards and proved effective under field conditions. The CCS regularly reported camera status, software updates, work results, and analyses that included basic operating parameters such as time accuracy, temperature, system status, and the type of cellular network to which the device was connected, as well as many other parameters. The CCS reported regularly every day at a specific time throughout the test period. The optimal time of day (between 9:00 a.m. and 1:00 p.m.) for the CCS to report is still being researched, as it presents a very important challenge related to lighting and branch shadows. This problem in object detection was also pointed out by Wosner et al. [84]. Therefore, choosing the optimal time for capturing images and reporting to the CCS is an important element of quality pest monitoring.
The developed system has a built-in mechanism for on-site detection, which allows data processing directly in the orchard, making it suitable for rural areas. After the model detects the targeted objects (codling moths), the system does not send the entire image to the end user, as is usually the case [53], but only the detection results; a detection run takes an average of 5 min to execute. Accordingly, data processing is performed on-site within a 15 MB storage limit. This kind of system enables faster data transmission and reduces energy costs, as confirmed by Albanese et al. [55].
The SBC draws 5 W of power and is turned on once a day for 5 min. The total daily energy consumption is therefore 0.417 Wh, corresponding to an average power draw of about 17 mW. Each battery in the battery system has a nominal voltage of 3.7 V, and the capacity of the battery system is 9 Ah, giving a total energy capacity of 33.3 Wh. The total available energy to power the SBC is 119.16 Wh. Thus, the energy needed to run the SBC for 5 min daily is significantly less than the energy the battery system and solar cell can provide; if the batteries and solar cell can be fully charged daily, this solution can power the SBC for a long time. The SBC examined in this study has restricted computing resources: a maximum storage capacity of 15 MB for results, a power consumption of 5 W, and an average daily power usage of 17 mW. The complete system therefore consumes a small amount of energy, is completely autonomous, and has a long lifespan.
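The energy arithmetic can be checked directly; the sketch below uses only the values quoted above, and the battery-only runtime is our own derived figure.

```python
# Energy-budget check using the figures quoted in the text.
power_w = 5.0                  # SBC power draw while active
on_hours = 5 / 60              # active 5 minutes per day
daily_wh = power_w * on_hours  # = 0.417 Wh per day

avg_power_mw = daily_wh / 24 * 1000  # ~17 mW average draw

battery_wh = 3.7 * 9                     # 3.7 V nominal x 9 Ah = 33.3 Wh
days_on_battery = battery_wh / daily_wh  # ~80 days with no solar input

print(round(daily_wh, 3), round(avg_power_mw, 1), round(days_on_battery))
```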
Zhong et al. [81] and Rustia et al. [82] developed automatic pest monitoring systems also based on an SBC with a Raspberry Pi as the control unit, an edge computing design that ensures lower power consumption and avoids transmission of high-resolution insect images. However, those systems require Wi-Fi for communication and are thus only suitable for controlled conditions (greenhouses), while our proposed system can be deployed in rural areas at any location in orchards, even where network connectivity is very low, and does not require controlled conditions. Our prototype smart trap is weather resistant and can be deployed in the field without human intervention (except when the adhesive pad or pheromone lure needs to be replaced). In the work of Segalla et al. [78], the system's alarms are triggered when a codling moth is positively detected. The peculiarity of our system is that, during monitoring, appropriate alarms are triggered when certain phenomena occur, not only when a single adult moth appears, which supports growers in making decisions about timely control measures.
The camera forwards the analysis and associated images to the control server. It receives basic data, such as the results of the performed analyses of all classes. Performing a software update is primarily related to the programming code of the camera and the analytical model used. The updates are placed on the control server while the camera downloads the latest software updates and analysis models and applies them in its work. The analytical models, especially in the development area, are subject to frequent changes as new datasets are constantly collected and new models are developed.
The developed smart trap prototype is a portable device that does not require additional infrastructure. The trap operates mostly in sleep mode and therefore consumes less energy, making it last longer and requiring less human intervention. Schrader et al. [85] developed a plug-in imaging system that can be incorporated into the Delta trap for codling moth monitoring. It is also a device with optimised power consumption that operates in sleep mode. This device has many advantages, including low price, but the problem is that it is not able to process captured images, which must then be collected manually.
In cases where the entire image is sent to the user, the battery can drain extremely quickly, making remote monitoring impossible for a period [53]. Since the model output comprises only a small quantity of data, it can be transmitted rapidly over a cellular network. The detection results are transmitted to a web portal or user interface, where all subsequent analysis and reporting are performed. The web portal is intended as the primary place for system configuration and management of the entire system, both from the reporting side and from the expert side, which monitors and customises the system for the user. It consists of a backend, a frontend, a database, a system for advanced analysis and data processing, and storage for files and server systems.
This system is a comprehensive and reliable tool that has proven its great potential in practice. It consists of a model with high detection accuracy and precision, a smart trap prototype with an external structure that can be adapted to all weather conditions, and a functional trap management system.

4. Conclusions

The developed model showed 99.3% accuracy in detecting codling moth. Hence, the stated hypothesis is accepted, and the model is deemed effective for monitoring the codling moth. In addition to the codling moth detection model, a smart trap was also developed. Thanks to its external structure, the trap is operational in field conditions, consumes little energy, and requires little human intervention, making it suitable for rural areas. This work contributes to a broader application of precision pest monitoring that can significantly improve both sustainable apple production and agricultural adaptation to climate change. Fewer insecticide applications are needed, and this method enables targeted and effective pest control. This strategy reduces the negative influence on the environment and improves the quality and profitability of apple production. It is proposed that future studies improve this type of system by establishing models for the detection of other key agricultural pests.

Author Contributions

Conceptualisation, D.Č., I.P.Ž. and I.A.; methodology, D.Č., I.P.Ž. and I.A.; software, I.A.; validation, I.P.Ž., I.A. and D.L.; formal analysis, D.Č. and I.A.; investigation, D.Č. and I.P.Ž.; resources, I.P.Ž. and D.L.; data curation, D.Č., I.P.Ž. and I.A.; writing—original draft preparation, D.Č., I.P.Ž. and I.A.; writing—review and editing, I.P.Ž., I.A. and D.L.; visualisation, I.A.; supervision, I.P.Ž. and I.A.; project administration, I.P.Ž. and D.L.; and funding acquisition, I.P.Ž., I.A. and D.L. All authors have read and agreed to the published version of the manuscript.

Funding

The publication was supported by the Open Access Publication Fund of the University of Zagreb Faculty of Agriculture and the European Regional Development Fund through the project AgriART, a comprehensive management system in the field of precision agriculture (KK.01.2.1.02.0290).

Institutional Review Board Statement

Not applicable.

Data Availability Statement

No new data were created or analyzed in this study. Data sharing is not applicable to this article.

Acknowledgments

The authors thank the European Union, which supported the project AgriART comprehensive management system in the field of precision agriculture (KK.01.2.1.02.0290) through the European Regional Development Fund within the Operational Programme Competitiveness and Cohesion (OPCC) 2014–2020. Our thanks also to Ivana Miklečić, who helped with capturing the images and labelling the objects.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Franck, P.; Timm, A.E. Population genetic structure of Cydia pomonella: A review and case study comparing spatiotemporal variation. J. Appl. Entomol. 2010, 134, 191–200. [Google Scholar] [CrossRef]
  2. Pajač, I.; Barić, B.; Šimon, S.; Mikac, K.M.; Pejić, I. An initial examination of the population genetic structure of Cydia pomonella (Lepidoptera: Tortricidae) in Croatian apple orchards. J. Food Agric. Environ. 2011, 9, 459–464. [Google Scholar]
  3. Men, Q.L.; Chen, M.H.; Zhang, Y.L.; Feng, J.N. Genetic structure and diversity of a newly invasive species, the codling moth, Cydia pomonella (L.) (Lepidoptera: Tortricidae) in China. Biol. Invasions 2013, 15, 447–458. [Google Scholar] [CrossRef]
  4. Basoalto, A.; Ramírez, C.C.; Lavandero, B.; Devotto, L.; Curkovic, T.; Franck, P.; Fuentes-Contreras, E. Population genetic structure of codling moth, Cydia pomonella (L.) (Lepidoptera: Tortricidae), in different localities and host plants in Chile. Insects 2020, 11, 285. [Google Scholar] [CrossRef] [PubMed]
  5. Kuyulu, A.; Genç, H. Genetic diversity of codling moth Cydia pomonella L. (Lepidoptera: Tortricidae) populations in Turkey. Turk. J. Zool. 2020, 44, 462–471. [Google Scholar] [CrossRef]
  6. Cichón, L.I.; Soleño, J.; Garrido, S.A.; Guiñazú, N.; Montagna, C.M.; Franck, P.; Olivares, J.; Musleh, S.; Rodríguez, M.A.; Fuentes-Contreras, E. Genetic structure of Cydia pomonella populations in Argentina and Chile implies isolating barriers exist between populations. J. Appl. Entomol. 2021, 145, 911–921. [Google Scholar] [CrossRef]
  7. Kadoić Balaško, M.; Bažok, R.; Mikac, K.M.; Benítez, H.A.; Suazo, M.J.; Viana, J.P.G.; Lemic, D.; Živković, I.P. Population Genetic Structure and Geometric Morphology of Codling Moth Populations from Different Management Systems. Agronomy 2022, 12, 1278. [Google Scholar] [CrossRef]
  8. Sauphanor, B.; Brosse, V.; Bouvier, J.C.; Speich, P.; Micoud, A.; Martinet, C. Monitoring resistance to diflubenzuron and deltamethrin in French codling moth populations (Cydia pomonella). Pest Manag. Sci. 2000, 56, 74–82. [Google Scholar] [CrossRef]
  9. Reyes, M.; Barros-Parada, W.; Ramírez, C.C.; Fuentes-Contreras, E. Organophosphate resistance and its main mechanism in populations of codling moth (Lepidoptera: Tortricidae) from Central Chile. J. Econ. Entomol. 2015, 108, 277–285. [Google Scholar] [CrossRef]
  10. Yang, X.Q.; Zhang, Y.L. Investigation of insecticide-resistance status of Cydia pomonella in Chinese populations. Bull. Entomol. Res. 2015, 105, 316–325. [Google Scholar] [CrossRef]
  11. Pajač Živković, I.; Barić, B. Rezistentnost jabukova savijača na insekticidne pripravke. Glas. Biljn. Zast. 2017, 17, 469–479. [Google Scholar]
  12. Bosch, D.; Rodríguez, M.A.; Avilla, J. Monitoring resistance of Cydia pomonella (L.) Spanish field populations to new chemical insecticides and the mechanisms involved. Pest Manag. Sci. 2018, 74, 933–943. [Google Scholar] [CrossRef]
  13. Pajač Živković, I.; Benitez, H.A.; Barić, B.; Drmić, Z.; Kadoić Balaško, M.; Lemic, D.; Dominguez Davila, J.H.; Mikac, K.M.; Bažok, R. Codling Moth Wing Morphology Changes Due to Insecticide Resistance. Insects 2019, 10, 310. [Google Scholar] [CrossRef]
  14. Knight, A.L.; Hilton, R.; Light, D.M. Monitoring codling moth (Lepidoptera: Tortricidae) in apple with blends of ethyl (E,Z)-2,4-decadienoate and codlemone. Environ. Entomol. 2005, 34, 598–603. [Google Scholar] [CrossRef]
  15. Lacey, L.A.; Unruh, T.R. Biological control of codling moth (Cydia pomonella, Lepidoptera: Tortricidae) and its role in integrated pest management, with emphasis on entomopathogens. Vedalia 2005, 12, 33–60. [Google Scholar]
  16. Knight, A.L. Codling moth areawide integrated pest management. In Areawide Pest Management: Theory and Implementation, 1st ed.; Koul, O., Cuperus, G., Elliott, N., Eds.; CAB International: Oxfordshire, UK, 2008; pp. 159–190. [Google Scholar] [CrossRef]
  17. Mitchell, V.J.; Manning, L.A.; Cole, L.; Suckling, D.M.; El-Sayed, A.M. Efficacy of the pear ester as a monitoring tool for codling moth Cydia pomonella (Lepidoptera: Tortricidae) in New Zealand apple orchards. Pest Manag. Sci. 2008, 64, 209–214. [Google Scholar] [CrossRef]
  18. Kadoić Balaško, M.; Bažok, R.; Mikac, K.M.; Lemic, D.; Pajač Živković, I. Pest management challenges and control practices in codling moth: A review. Insects 2020, 11, 38. [Google Scholar] [CrossRef]
  19. Pajač, I.; Barić, B.; Mikac, K.M.; Pejić, I. New insights into the biology and ecology of Cydia pomonella from apple orchards in Croatia. Bull. Insectol. 2012, 65, 185–193. [Google Scholar]
  20. Thaler, R.; Brandstätter, A.; Meraner, A.; Chabicovski, M.; Parson, W.; Zelger, R.; Dalla Via, J.; Dallinger, R. Molecular phylogeny and population structure of the codling moth (Cydia pomonella) in Central Europe: II. AFLP analysis reflects human-aided local adaptation of a global pest species. Mol. Phylogenet. Evol. 2008, 48, 838–849. [Google Scholar] [CrossRef]
  21. Lacey, L.A.; Thomson, D.; Vincent, C.; Arthurs, S.P. Codling moth granulovirus: A comprehensive review. Biocontrol Sci. Technol. 2008, 18, 639–663. [Google Scholar] [CrossRef]
  22. Maceljski, M. Jabučni savijač (Cydia/Laspeyresia, Carpocapsa, Grapholita/pomonella L.). In Poljoprivredna Entomologija, 2nd ed.; Zrinski: Čakovec, Croatia, 2002; pp. 302–309. [Google Scholar]
  23. Ciglar, I. Integrirana Zaštita Voćaka i Vinove Loze, 1st ed.; Zrinski: Čakovec, Croatia, 1998; pp. 82–87. [Google Scholar]
  24. Šubić, M.; Braggio, A.; Bassanetti, C.; Aljinović, S.; Tomšić, A.; Tomšić, T. Suzbijanje jabučnog savijača (Cydia pomonella L.) metodom konfuzije ShinEtsu® (Isomate C/OFM i Isomate CTT + OFM rosso FLEX) u Međimurju tijekom 2014. Glas. Biljn. Zast. 2015, 15, 277–290. [Google Scholar]
  25. Witzgall, P.; Stelinski, L.; Gut, L.; Thomson, D. Codling moth management and chemical ecology. Annu. Rev. Entomol. 2008, 53, 503–522. [Google Scholar] [CrossRef]
  26. Fernández, D.E.; Cichón, L.; Garrido, S.; Ribes-Dasi, M.; Avilla, J. Comparison of lures loaded with codlemone and pear ester for capturing codling moths, Cydia pomonella, in apple and pear orchards using mating disruption. J. Insect Sci. 2010, 10, 139. [Google Scholar] [CrossRef]
  27. Barić, B.; Pajač Živković, I. Učinkovitost konfuzije u suzbijanju jabukova savijača u Hrvatskoj s posebnim osvrtom na troškove zaštite. Pomol. Croat. Glas. Hrvat. Agron. Drus. 2017, 21, 125–132. [Google Scholar] [CrossRef]
  28. Miller, J.R.; Gut, L.J. Mating disruption for the 21st century: Matching technology with mechanism. Environ. Entomol. 2015, 44, 427–453. [Google Scholar] [CrossRef]
  29. Charmillot, P.J.; Hofer, D.; Pasquier, D. Attract and kill: A new method for control of the codling moth Cydia pomonella. Entomol. Exp. Appl. 2000, 94, 211–216. [Google Scholar] [CrossRef]
  30. Vreysen, M.J.B.; Carpenter, J.E.; Marec, F. Improvement of the sterile insect technique for codling moth Cydia pomonella (Linnaeus) (Lepidoptera Tortricidae) to facilitate expansion of field application. J. Appl. Entomol. 2010, 134, 165–181. [Google Scholar] [CrossRef]
  31. Thistlewood, H.M.A.; Judd, G.J.R. Twenty-five Years of Research Experience with the Sterile Insect Technique and Area-Wide Management of Codling Moth, Cydia pomonella (L.), in Canada. Insects 2019, 10, 292. [Google Scholar] [CrossRef]
  32. Gümüssoy, A.; Yüksel, E.; Özer, G.; Imren, M.; Canhilal, R.; Amer, M.; Dababat, A.A. Identification and Biocontrol Potential of Entomopathogenic Nematodes and Their Endosymbiotic Bacteria in Apple Orchards against the Codling Moth, Cydia pomonella (L.) (Lepidoptera: Tortricidae). Insects 2022, 13, 1085. [Google Scholar] [CrossRef]
  33. Laffon, L.; Bischoff, A.; Gautier, H.; Gilles, F.; Gomez, L.; Lescourret, F.; Franck, P. Conservation Biological Control of Codling Moth (Cydia pomonella): Effects of Two Aromatic Plants, Basil (Ocimum basilicum) and French Marigolds (Tagetes patula). Insects 2022, 13, 908. [Google Scholar] [CrossRef]
  34. Ju, D.; Mota-Sanchez, D.; Fuentes-Contreras, E.; Zhang, Y.L.; Wang, X.Q.; Yang, X.Q. Insecticide resistance in the Cydia pomonella (L.): Global status, mechanisms, and research directions. Pest. Biochem. Physiol. 2021, 178, 104925. [Google Scholar] [CrossRef]
  35. Reyes, M.; Franck, P.; Charmillot, P.J.; Ioriatti, C.; Olivares, J.; Pasqualini, E.; Sauphanor, B. Diversity of insecticide resistance mechanisms and spectrum in European populations of the codling moth, Cydia pomonella. Pest Manag. Sci. 2007, 63, 890–902. [Google Scholar] [CrossRef]
  36. Schulze-Bopp, S.; Jehle, J.A. Development of a direct test of baculovirus resistance in wild codling moth populations. J. Appl. Entomol. 2013, 137, 153–160. [Google Scholar] [CrossRef]
  37. Fan, J.; Jehle, J.A.; Rucker, A.; Nielsen, A.L. First Evidence of CpGV Resistance of Codling Moth in the USA. Insects 2022, 13, 533. [Google Scholar] [CrossRef]
  38. Franck, P.; Reyes, M.; Olivares, J.; Sauphanor, B. Genetic architecture in codling moth populations: Comparison between microsatellite and insecticide resistance markers. Mol. Ecol. 2007, 16, 3554–3564. [Google Scholar] [CrossRef]
  39. Beers, E.H.; Horton, D.R.; Miliczky, E. Pesticides used against Cydia pomonella disrupt biological control of secondary pests of apple. Biol. Control 2016, 102, 35–43. [Google Scholar] [CrossRef]
  40. Skendžić, S.; Zovko, M.; Pajač Živković, I.; Lešić, V.; Lemic, D. The impact of climate change on agricultural insect pests. Insects 2021, 12, 440. [Google Scholar] [CrossRef]
  41. Stoeckli, S.; Hirschi, M.; Spirig, C.; Calanca, P.; Rotach, M.W.; Samietz, J. Impact of climate change on voltinism and prospective diapause induction of a global pest insect—Cydia pomonella (L.). PLoS ONE 2012, 7, e35723. [Google Scholar] [CrossRef]
  42. Juszczak, R.; Kuchar, L.; Leśny, J.; Olejnik, J. Climate change impact on development rates of the codling moth (Cydia pomonella L.) in the Wielkopolska region, Poland. Int. J. Biometeorol. 2013, 57, 31–44. [Google Scholar] [CrossRef]
  43. Čirjak, D.; Miklečić, I.; Lemic, D.; Kos, T.; Pajač Živković, I. Automatic Pest Monitoring Systems in Apple Production under Changing Climatic Conditions. Horticulturae 2022, 8, 520. [Google Scholar] [CrossRef]
  44. Brown, S. Apple. In Fruit Breeding, Handbook of Plant Breeding; Badenes, M.L., Byrne, D.H., Eds.; Springer: Boston, MA, USA, 2012; pp. 329–367. [Google Scholar] [CrossRef]
  45. FAOSTAT. Food and Agriculture Organization of the United Nations. 2022. Available online: https://www.fao.org/faostat/en/#home (accessed on 10 January 2023).
  46. O’Shea, K.; Nash, R. An introduction to convolutional neural networks. arXiv 2015, arXiv:1511.08458. [Google Scholar]
  47. Sun, Y.; Liu, X.; Yuan, M.; Ren, L.; Wang, J.; Chen, Z. Automatic in-trap pest detection using deep learning for pheromone-based Dendroctonus valens monitoring. Biosyst. Eng. 2018, 176, 140–150. [Google Scholar] [CrossRef]
  48. Bjerge, K.; Nielsen, J.B.; Sepstrup, M.V.; Helsing-Nielsen, F.; Høye, T.T. An Automated Light Trap to Monitor Moths (Lepidoptera) Using Computer Vision-Based Tracking and Deep Learning. Sensors 2021, 21, 343. [Google Scholar] [CrossRef] [PubMed]
  49. Rustia, D.J.A.; Wu, Y.F.; Shih, P.Y.; Chen, S.K.; Chung, J.Y.; Lin, T.T. Tree-based Deep Convolutional Neural Network for Hierarchical Identification of Low-resolution Insect Images. In Proceedings of the 2021 ASABE Annual International Virtual Meeting, Virtual, 12–16 July 2021; American Society of Agricultural and Biological Engineers: St. Joseph, MI, USA, 2021; p. 1. [Google Scholar]
  50. Suto, J. Codling Moth Monitoring with Camera-Equipped Automated Traps: A Review. Agriculture 2022, 12, 1721. [Google Scholar] [CrossRef]
  51. Ding, W.; Taylor, G. Automatic moth detection from trap images for pest management. Comput. Electron. Agric. 2016, 123, 17–28. [Google Scholar] [CrossRef]
  52. Suto, J. Embedded System-Based Sticky Paper Trap with Deep Learning-Based Insect-Counting Algorithm. Electronics 2021, 10, 1754. [Google Scholar] [CrossRef]
  53. Preti, M.; Moretti, C.; Scarton, G.; Giannotta, G.; Angeli, S. Developing a smart trap prototype equipped with camera for tortricid pests remote monitoring. Bull. Insectol. 2021, 74, 147–160. [Google Scholar]
  54. Suárez, A.; Molina, R.S.; Ramponi, G.; Petrino, R.; Bollati, L.; Sequeiros, D. Pest detection and classification to reduce pesticide use in fruit crops based on deep neural networks and image processing. In Proceedings of the 2021 XIX Workshop on Information Processing and Control (RPIC), San Juan, Argentina, 3–5 November 2021; pp. 1–6. [Google Scholar] [CrossRef]
  55. Albanese, A.; Nardello, M.; Brunelli, D. Automated Pest Detection with DNN on the Edge for Precision Agriculture. IEEE J. Emerg. Sel. Top. Circuits Syst. 2021, 11, 458–467. [Google Scholar] [CrossRef]
  56. Čirjak, D.; Aleksi, I.; Miklečić, I.; Antolković, A.M.; Vrtodušić, R.; Viduka, A.; Lemic, D.; Kos, T.; Pajač Živković, I. Monitoring System for Leucoptera malifoliella (O. Costa, 1836) and Its Damage Based on Artificial Neural Networks. Agriculture 2023, 13, 67. [Google Scholar] [CrossRef]
57. Object Detection with TensorFlow Lite Model Maker. Available online: https://www.tensorflow.org/lite/models/modify/model_maker/object_detection#run_object_detection_and_show_the_detection_results (accessed on 10 February 2023).
58. Module: tf.keras.metrics. Available online: https://www.tensorflow.org/api_docs/python/tf/keras/metrics (accessed on 28 March 2023).
  59. COCO. Common Objects in Context. Available online: https://cocodataset.org/#detection-eval (accessed on 13 February 2023).
  60. Baeldung. Available online: https://www.baeldung.com/cs/training-validation-loss-deep-learning (accessed on 13 February 2023).
  61. Hasty GmbH. Available online: https://hasty.ai/docs/mp-wiki/metrics/iou-intersection-over-union (accessed on 26 March 2023).
  62. Kulkarni, A.; Chong, D.; Batarseh, F.A. Foundations of data imbalance and solutions for a data democracy. In Data Democracy, 1st ed.; Batarseh, A., Yang, R., Eds.; Academic Press: Cambridge, MA, USA, 2020; pp. 83–106. [Google Scholar] [CrossRef]
  63. V7Labs. Available online: https://www.v7labs.com/blog/confusion-matrix-guide (accessed on 9 February 2023).
  64. Aslan, M.F.; Sabanci, K.; Durdu, A. A CNN-based novel solution for determining the survival status of heart failure patients with clinical record data: Numeric to image. Biomed. Signal Process. Control 2021, 68, 102716. [Google Scholar] [CrossRef]
  65. PaperspaceBlog. Available online: https://blog.paperspace.com/deep-learning-metrics-precision-recall-accuracy/ (accessed on 12 February 2023).
  66. Towards Data Science. Available online: https://towardsdatascience.com/the-f1-score-bec2bbc38aa6 (accessed on 12 February 2023).
67. Grandini, M.; Bagli, E.; Visani, G. Metrics for multi-class classification: An overview. arXiv 2020, arXiv:2008.05756. [Google Scholar] [CrossRef]
68. Tan, M.; Pang, R.; Le, Q.V. EfficientDet: Scalable and Efficient Object Detection. arXiv 2019, arXiv:1911.09070. [Google Scholar] [CrossRef]
  69. Sasaki, Y. The Truth of the F-Measure; University of Manchester: Manchester, UK, 2007. [Google Scholar]
  70. Teixeira, A.C.; Ribeiro, J.; Morais, R.; Sousa, J.J.; Cunha, A. A Systematic Review on Automatic Insect Detection Using Deep Learning. Agriculture 2023, 13, 713. [Google Scholar] [CrossRef]
  71. Suto, J. A Novel Plug-in Board for Remote Insect Monitoring. Agriculture 2022, 12, 1897. [Google Scholar] [CrossRef]
  72. Wen, C.; Guyer, D. Image-based orchard insect automated identification and classification method. Comput. Electron. Agric. 2012, 89, 110–115. [Google Scholar] [CrossRef]
  73. Hong, S.-J.; Kim, S.-Y.; Kim, E.; Lee, C.-H.; Lee, J.-S.; Lee, D.-S.; Bang, J.; Kim, G. Moth Detection from Pheromone Trap Images Using Deep Learning Object Detectors. Agriculture 2020, 10, 170. [Google Scholar] [CrossRef]
  74. Ahmad, I.; Yang, Y.; Yue, Y.; Ye, C.; Hassan, M.; Cheng, X.; Wu, Y.; Zhang, Y. Deep Learning Based Detector YOLOv5 for Identifying Insect Pests. Appl. Sci. 2022, 12, 10167. [Google Scholar] [CrossRef]
  75. Xia, D.; Chen, P.; Wang, B.; Zhang, J.; Xie, C. Insect Detection and Classification Based on an Improved Convolutional Neural Network. Sensors 2018, 18, 4169. [Google Scholar] [CrossRef]
  76. Hong, S.-J.; Nam, I.; Kim, S.-Y.; Kim, E.; Lee, C.-H.; Ahn, S.; Park, I.-K.; Kim, G. Automatic Pest Counting from Pheromone Trap Images Using Deep Learning Object Detectors for Matsucoccus thunbergianae Monitoring. Insects 2021, 12, 342. [Google Scholar] [CrossRef]
77. Popescu, D.; Ichim, L.; Dimoiu, M.; Trufelea, R. Comparative Study of Neural Networks Used in Halyomorpha halys Detection. In Proceedings of the 2022 IEEE 30th Mediterranean Conference on Control and Automation (MED), Vouliagmeni, Greece, 28 June–1 July 2022; pp. 182–187. [Google Scholar] [CrossRef]
  78. RS Components Ltd. Available online: https://uk.rs-online.com/web/generalDisplay.html?id=solutions/single-board-computers-overview (accessed on 27 March 2023).
  79. Segalla, A.; Fiacco, G.; Tramarin, L.; Nardello, M.; Brunelli, D. Neural networks for pest detection in precision agriculture. In Proceedings of the 2020 IEEE International Workshop on Metrology for Agriculture and Forestry (MetroAgriFor), Trento, Italy, 4–6 November 2020; pp. 7–12. [Google Scholar] [CrossRef]
  80. Mendoza, Q.A.; Pordesimo, L.; Neilsen, M.; Armstrong, P.; Campbell, J.; Mendoza, P.T. Application of Machine Learning for Insect Monitoring in Grain Facilities. AI 2023, 4, 348–360. [Google Scholar] [CrossRef]
  81. Zhong, Y.; Gao, J.; Lei, Q.; Zhou, Y. A vision-based counting and recognition system for flying insects in intelligent agriculture. Sensors 2018, 18, 1489. [Google Scholar] [CrossRef] [PubMed]
82. Rustia, D.J.A.; Lin, C.E.; Chung, J.Y.; Zhuang, Y.J.; Hsu, J.C.; Lin, T.T. Application of image and environmental sensor network for automated greenhouse insect pest monitoring. J. Asia Pac. Entomol. 2020, 23, 17–28. [Google Scholar] [CrossRef]
83. Brunelli, D.; Albanese, A.; d’Acunto, D.; Nardello, M. Energy neutral machine learning based IoT device for pest detection in precision agriculture. IEEE Internet Things Mag. 2019, 2, 10–13. [Google Scholar] [CrossRef]
  84. Wosner, O.; Farjon, G.; Bar-Hillel, A. Object detection in agricultural contexts: A multiple resolution benchmark and comparison to human. Comput. Electron. Agric. 2021, 189, 106404. [Google Scholar] [CrossRef]
  85. Schrader, M.J.; Smytheman, P.; Beers, E.H.; Khot, L.R. An Open-Source Low-Cost Imaging System Plug-In for Pheromone Traps Aiding Remote Insect Pest Population Monitoring in Fruit Crops. Machines 2022, 10, 52. [Google Scholar] [CrossRef]
Figure 1. The smart trap prototype installed in the orchard.
Figure 2. Sampling sites of codling moth images in Croatian orchards.
Figure 3. Camera for data acquisition in the field.
Figure 4. Class MOTH (codling moth in different poses of target insects on the adhesive pad).
Figure 5. Class INSECTS (other non-target insects on the adhesive pad).
Figure 6. Class OTHER (other objects on the adhesive pad, i.e., leaves, dirt, branches).
Figure 7. Annotating targeted objects in the LabelImg program. Class MOTH (target insect codling moth) is labelled with green bounding boxes, class INSECTS (other non-target insects) is labelled with blue bounding boxes, and class OTHER (other objects) is labelled with purple bounding boxes.
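The LabelImg annotations in Figure 7 are stored as one annotation file per image. Assuming LabelImg's default Pascal VOC XML export (the paper does not state the export format), the following is a minimal sketch of reading one such file into (class, bounding box) pairs; the file name is hypothetical.

```python
import xml.etree.ElementTree as ET

def read_voc_boxes(xml_path: str) -> list[tuple[str, tuple[int, int, int, int]]]:
    """Parse a LabelImg Pascal VOC file into (label, (xmin, ymin, xmax, ymax)) pairs."""
    root = ET.parse(xml_path).getroot()
    boxes = []
    for obj in root.iter('object'):
        label = obj.find('name').text  # e.g. 'MOTH', 'INSECTS', or 'OTHER'
        bb = obj.find('bndbox')
        box = tuple(int(bb.find(tag).text) for tag in ('xmin', 'ymin', 'xmax', 'ymax'))
        boxes.append((label, box))
    return boxes

# Hypothetical file name, for illustration only:
print(read_voc_boxes('sticky_pad_001.xml'))
```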
Figure 8. Process of model learning and validation across epochs.
Figure 9. Automatic counts produced by the model: blue bounding boxes mark detections of class MOTH; green bounding boxes mark detections of classes INSECTS and OTHER.
Figure 10. Confusion matrix showing the number of detected objects.
Table 1. Image dataset divided by phase of model development.
Phase of Creating the Analytical Model | Number of Images
Training | 139,320 (90%)
Validation | 15,480 (10%)
Test | 30 (additional new images)
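For context, the split in Table 1 maps naturally onto the TensorFlow Lite Model Maker object-detection workflow referenced in [57]. The sketch below is a hedged illustration of that workflow rather than the authors' exact training script: the directory layout, label map, epoch count, and batch size are assumptions, and the EfficientDetLite4 model spec stands in for the EfficientDet-4 architecture named in the title.

```python
# Minimal sketch of an EfficientDet training run with TensorFlow Lite Model Maker;
# paths, label map, and hyperparameters are illustrative assumptions.
from tflite_model_maker import object_detector

label_map = {1: 'MOTH', 2: 'INSECTS', 3: 'OTHER'}

# Training and validation images placed in separate folders (90%/10%, as in Table 1).
train_data = object_detector.DataLoader.from_pascal_voc(
    'data/train/images', 'data/train/annotations', label_map=label_map)
validation_data = object_detector.DataLoader.from_pascal_voc(
    'data/val/images', 'data/val/annotations', label_map=label_map)

spec = object_detector.EfficientDetLite4Spec()  # EfficientDet-Lite4 backbone
model = object_detector.create(
    train_data,
    model_spec=spec,
    validation_data=validation_data,
    epochs=50,             # assumed, not reported here
    batch_size=8,          # assumed, not reported here
    train_whole_model=True,
)
model.export(export_dir='exported_model')  # TFLite model for the on-trap device
```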
Table 4. Validation parameters of the analytical model.
Parameter | Value
AP | 0.66
AP (IoU = 0.50) | 0.93
AP (IoU = 0.75) | 0.80
AP (small objects) | 0.42
AP (medium objects) | 0.65
AP (large objects) | 0.57
AR (max = 1) | 0.30
AR (max = 10) | 0.71
AR (max = 100) | 0.75
AR (small objects) | 0.59
AR (medium objects) | 0.74
AR (large objects) | 0.63
AP_MOTH | 0.79
AP_INSECT | 0.62
AP_OTHER | 0.55
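Table 4 reports COCO-style evaluation metrics [59]: AP is the average precision over IoU thresholds from 0.50 to 0.95, the fixed-threshold variants relax or tighten the required overlap, and the object-size and max-detection rows break the same scores down by box size and by the number of detections retained per image. All of these rest on the intersection over union (IoU) between a predicted and a ground-truth box [61]. A minimal sketch of that computation follows; the example boxes are illustrative only.

```python
def iou(box_a, box_b):
    """Intersection over union of two (xmin, ymin, xmax, ymax) boxes."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

# A detection counts towards AP (IoU = 0.50) only if it overlaps a same-class
# ground-truth box with IoU >= 0.50.
print(iou((10, 10, 50, 50), (30, 30, 70, 70)))  # 0.142..., not a match at 0.50
```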
Table 5. Final parameters of the model testing.
Class | MOTH | INSECT | OTHER
(n) truth | 189 | 120 | 983
(n) classified | 180 | 199 | 993
Accuracy | 99.3% | 99.46% | 99.07%
Precision | 1.0 | 0.97 | 0.99
Recall | 0.95 | 0.97 | 1.0
F1 score | 0.98 | 0.97 | 0.99
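The per-class scores in Table 5 follow from the confusion-matrix counts in the usual way [63,65,66]. As a worked check on the MOTH column, a short sketch, assuming that a precision of 1.0 means all 180 classified moths were correct (no false positives) and that the 189 − 180 = 9 moths left unclassified are false negatives:

```python
def precision_recall_f1(tp: int, fp: int, fn: int) -> tuple[float, float, float]:
    """Detection metrics from true positive, false positive, and false negative counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# MOTH column of Table 5, with assumed counts tp = 180, fp = 0, fn = 9:
p, r, f1 = precision_recall_f1(tp=180, fp=0, fn=9)
print(round(p, 2), round(r, 2), round(f1, 2))  # 1.0 0.95 0.98
```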
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
