Article

Deep Learning for Real-Time Detection of Brassicogethes aeneus in Oilseed Rape Using the YOLOv4 Architecture

by Ziemowit Malecha 1,*, Kajetan Ożarowski 1,†, Rafał Siemasz 1, Maciej Chorowski 1, Krzysztof Tomczuk 2, Bernadeta Strochalska 3 and Anna Wondołowska-Grabowska 3

1 Department of Cryogenics and Aerospace Engineering, Wroclaw University of Science and Technology, 50-370 Wroclaw, Poland
2 Faculty of Fundamental Problems of Technology, Wroclaw University of Science and Technology, 50-370 Wroclaw, Poland
3 Institute of Agroecology and Plant Production, Wroclaw University of Environmental and Life Sciences, 50-370 Wroclaw, Poland
* Author to whom correspondence should be addressed.
† Deceased author.
Appl. Sci. 2026, 16(2), 1075; https://doi.org/10.3390/app16021075
Submission received: 27 November 2025 / Revised: 7 January 2026 / Accepted: 13 January 2026 / Published: 21 January 2026

Abstract

The growing global population and increasing food demand highlight the need for sustainable agricultural practices that balance productivity with environmental protection. Traditional blanket pesticide spraying leads to overuse of chemicals, environmental pollution, and biodiversity loss. This study aims to develop an innovative approach to precision pest management using mobile computing, computer vision, and deep learning techniques. A mobile measurement platform equipped with cameras and an onboard computer was designed to collect real-time field data and detect pest infestations. The system uses an advanced object detection algorithm based on the YOLOv4 architecture, trained on a custom dataset of rapeseed pest images. Modifications were made to enhance detection accuracy, especially for small objects. Field tests demonstrated the system’s ability to identify and count pests, such as the pollen beetle (Brassicogethes aeneus), in rapeseed crops. The collected data, combined with GPS information, generated pest density maps, which can guide site-specific pesticide applications. The results show that the proposed method achieved a mean average precision (mAP) of 83.7% on the test dataset. Field measurements conducted during the traversal of rapeseed fields enabled the creation of density maps illustrating the distribution of pollen beetles. Based on these maps, the potential for pesticide savings was demonstrated, and the migration dynamics of the pollen beetle were discussed.

1. Introduction

This article is dedicated to the memory of the late Kajetan Ozarowski, who passed away unexpectedly and far too soon. May his work and contribution to this research serve as a lasting benefit to the scientific community and beyond.
The world population is currently around 8.1 billion and is projected to continue growing, reaching 9 billion by 2037 and 9.7 billion by 2050. It is expected to peak at 10.4 billion by 2080 [1]. This continuous growth in the human population drives an increasing demand for agricultural products. Among these, oilseed crops play a crucial role in human nutrition by providing edible oils and raw materials for fuel production.
During the past 50 years, the cultivation area of cruciferous oilseed crops has expanded significantly. This expansion has led to the spread of pests that attack these crops, particularly insects [2]. The most important oilseed crop grown in temperate climates is rapeseed (Brassica napus L.). Rapeseed is susceptible to attacks by numerous pests. These include cabbage stem flea beetle (Psylliodes chrysocephala), cabbage aphid (Brevicoryne brassicae syn. Aphis brassicae), cabbage gall midge (Dasineura brassicae syn. Contarinia brassicae), and pollen beetles (Brassicogethes aeneus syn. Meligethes aeneus) [3]. Among these, the pollen beetle (Brassicogethes aeneus, formerly known as Meligethes aeneus) poses a significant threat to rapeseed cultivation, being a major pest that causes substantial economic losses [2,4]. Due to the increased incidence of pests in rapeseed crops, the use of pesticides has become an almost unavoidable part of the cultivation of these plants.
Rapeseed (Brassica napus L.) is highly attractive to pollinators. In conventional agriculture, where pesticides are widely used, this high attractiveness can increase the risk of pesticide poisoning of bees [2,3]. Although pesticides are commonly used to control pests, their effectiveness can vary due to several factors, including environmental ones [5]. Global warming is a significant factor that can increase pest pressure on crops by enhancing insect reproduction rates [6]. Temperature during the growing season can influence insect flight activity, while wind direction [4], the proximity of forested areas [7,8], and the widespread resistance that pollen beetles have developed to various insecticides, including pyrethroids, also play crucial roles [9].
Assessing rapeseed yield loss due to pest feeding is neither straightforward nor clear-cut. It is generally accepted that rapeseed insect pests account for an average yield loss of about 13% worldwide and 15% in Europe [10]. Other estimates suggest that key rapeseed pests cause an approximately 20% [11] or even 25% [12] yield loss. A spring or winter rapeseed crop left unprotected against the pollen beetle can suffer yield losses ranging from a few percent to 50% [13], and in some cases, losses may exceed 80% [14,15].
Any action to protect a plantation must be preceded by ongoing analysis of its condition. Traditional procedures for monitoring winter rapeseed plantations for pests include direct inspection of the plantation, catching pests with an entomological net (net sampling) or shaking them off plants (shake method), the use of yellow water traps (yellow pots), indirect assessment of plant damage, the use of pheromone, scent, and other attractant traps, and soil sample analysis [16]. Most of these methods are low-cost, with the exception of pheromone traps and soil sampling [17]. Despite their many advantages, traditional methods also have drawbacks. Direct inspection involves no equipment costs and allows the developmental stages of plants and pests to be assessed simultaneously, but it is time-consuming over large areas, highly dependent on the observer’s experience, and not very effective against small, active pests [18]. Net sampling allows a quick assessment of the number of active insects, but it performs poorly at low temperatures, in strong winds, and in humid conditions, and its results depend heavily on the operator’s technique [18,19]. Yellow pots are very effective in detecting early pest infestations, but they do not directly reflect the number of pests on plants, lack a precise link to economic damage thresholds, and are selective, working mainly on insects that react to color [9,17]. Moreover, these traditional methods provide information that is unrepresentative across different locations within a plantation and are of no use when a map of the degree of infestation is required. This is where methods based on optical detection and deep learning algorithms become particularly valuable, especially where selective plant protection is possible.
The dynamics of pollen beetle (Brassicogethes aeneus) immigration suggest significant potential for precision agriculture to reduce insecticide use. This potential lies in the ability to spatially map beetle clusters within crops where their density is high. Optical detection of the pollen beetle allows for a more effective assessment and determination of damage thresholds, both temporally and spatially [20]. Studies have shown that selective spraying with insecticides can effectively control pests and diseases in various crops, reducing the use of pesticides and foliar fertilizers by an average of 30% to 65% [21], while also protecting beneficial pollinators [22]. Furthermore, the application of insecticides on flowering rapeseed (Brassica napus L.) to combat pests such as the brassica pod midge (Dasineura brassicae Winn.) often conflicts with the protection of pollinating and beneficial insects [3].
To address the challenges of pest management in fields of rapeseed (Brassica napus L.), various innovative approaches have been explored. One such approach involves the use of drones to detect pest infestations and precisely spray pesticides, allowing targeted and selective pest control [20,23]. Furthermore, various techniques and methods that use machine learning (ML) show great promise in solving this problem [24,25].
Machine learning has become a transformative tool in agriculture, offering various applications in precision agriculture, such as the detection and identification of diseases and pests [26,27,28,29]. For instance, smartphone cameras and appropriate ML models have been used to monitor and visualize pests in real time. As demonstrated in [25], the combination of optical sensors and machine learning algorithms can effectively detect rapeseed pests in real time.
In studies [5,15], machine learning was used to accurately detect pollen beetle (Brassicogethes aeneus), a significant pest of oilseed rape (Brassica napus), achieving high accuracy rates of approximately 91%. The above literature review suggests that the use of machine learning for pest detection can significantly improve crop management, reduce dependency on pesticides, and improve economic benefits for the agricultural industry. Furthermore, it indicates that the application of machine learning in agriculture will play an increasingly vital role in ensuring food security, promoting sustainability, and increasing productivity [30,31,32].
Several deep learning architectures have been developed for object detection, including Faster R-CNN [33], the Single Shot Detector (SSD) [34], and the YOLO (You Only Look Once) family [35,36,37,38,39]. Among these, YOLO stands out due to its balance of speed and accuracy, especially in real-time applications. In the context of this study, where the recognition of small objects and fast processing in a mobile, resource-constrained environment was crucial, YOLO was deemed the most suitable. Specifically, the YOLOv4 architecture was selected and further adapted to meet the demands of this research, given its improved performance for detecting small-scale features [40,41,42,43,44].
Despite significant advances in applying machine learning to detect rapeseed pests, there is a clear research gap in utilizing these algorithms on mobile platforms for dynamic, real-time pest detection across multiple channels simultaneously. Previous studies have primarily focused on static systems or limited laboratory conditions, not fully accounting for the complexity and variability of field conditions. There is a lack of comprehensive research that combines advanced AI algorithms with mobile measurement platforms capable of effectively monitoring large crop areas under varying environmental conditions.
This study addresses a key research gap by introducing an innovative approach to rapeseed pest detection. The novelty of our work lies in the development and implementation of a mobile measurement system capable of detecting pests in real time, across multiple image channels and under varying field conditions. The system integrates advanced machine learning algorithms optimized for mobile platforms, precise GPS-based spatial mapping, and a flexible methodology adaptable to different pest species and environmental conditions.
A major agricultural innovation of the proposed system is its potential ability to perform selective spraying, applying pesticides only in areas where pest density exceeds established harmfulness thresholds. Crucially, all operations, including pest recognition, density assessment, and decision-making for spraying, are executed in real time during a single field pass. This represents a significant step toward the automated control of sprayer actuators (e.g., nozzles) on commercially available agricultural equipment.
Importantly, the system is universal in its design and can be adapted for use in monitoring other crops, pests, weeds, and plant diseases. While rapeseed is the focus of this study, it serves as a case study for broader applications.

2. Materials and Methods

2.1. System Architecture and Hardware Configuration

Figure 1 illustrates the architecture of the mobile measurement system used in this study to evaluate the feasibility of using artificial intelligence for real-time pest detection in agricultural fields. The system comprises the following components:
  • An onboard Jetson Orin AGX computer (NVIDIA Corporation, Santa Clara, CA, USA) with a DC power supply housed in a protective enclosure.
  • Four GoPro Hero 11 Black cameras (GoPro, Inc., San Mateo, CA, USA) equipped with integrated GPS.
  • Input/output peripherals (LCD display, mouse, keyboard).
  • A 12 VDC power connection from the tractor.
  • A mounting structure installed on the tractor’s front linkage (Figure 2 and Figure 3).
The components were integrated into a dedicated mounting structure installed on the tractor’s front linkage (Figure 2 and Figure 3). The camera system, along with LED lamps, was mounted at the front, while the onboard computer and power systems were placed in a secure electric enclosure (Figure 3) to protect against dust and vibrations.
To facilitate the data collection process, a dedicated program was developed for the Jetson Orin AGX computer. This software enabled synchronous image and GPS data acquisition at a rate of approximately one image per second. Connectivity with the cameras was established using the Open GoPro API, and GoPro Labs firmware with custom settings was used to enhance functionality. The main advantages of the GoPro cameras used in the study are the automatic selection of image parameters (such as gain and exposure time, adjusted to scene brightness), shake correction, and weather resistance.
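A minimal sketch of such an acquisition loop is shown below (Python). The trigger_cameras helper is hypothetical: in the actual system this role is played by the Open GoPro API and the custom GoPro Labs firmware settings, which are not reproduced here.

```python
import json
import time
from dataclasses import dataclass

@dataclass
class Frame:
    image_path: str   # image file reported by the camera
    lat: float        # GPS latitude from the camera metadata
    lon: float        # GPS longitude
    timestamp: float

def trigger_cameras(cameras):
    """Hypothetical stand-in for the Open GoPro API call that triggers a photo
    on each camera and returns the frame together with its GPS fix."""
    now = time.time()
    return [Frame(f"cam{i}_{int(now)}.jpg", 51.10, 17.03, now)
            for i, _ in enumerate(cameras)]

def acquisition_loop(cameras, period_s=1.0, log_path="frames.jsonl"):
    """Acquire one synchronized image/GPS record per second from all cameras,
    mirroring the ~1 frame/s rate described above."""
    with open(log_path, "a") as log:
        while True:
            t0 = time.time()
            for frame in trigger_cameras(cameras):
                log.write(json.dumps(vars(frame)) + "\n")
            # Sleep for the remainder of the period to hold the 1 Hz cadence.
            time.sleep(max(0.0, period_s - (time.time() - t0)))
```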

2.2. Field Data Acquisition and Site Description

Field data were collected from several winter rapeseed plantations with a total area of approximately 80 hectares. During the migration period of pollen beetles (late April and May), a series of passes were made along the technological paths using an agricultural tractor equipped with the front-mounted support frame carrying cameras with a total field of view of approximately 6 m.
Prior to the field testing, a survey of field insect species was conducted. This was essential to ensure that the detection system could distinguish the target pest from morphologically similar species, such as cabbage stem weevils (Ceutorhynchus spp.), which were also present in the fields. Although these insects share a similar dark coloration, their distinct body morphology and movement patterns allowed for precise manual annotation of only the target species (Brassicogethes aeneus).
The images were recorded under various lighting conditions, including sunny and overcast weather as well as under artificial illumination—particularly in the late afternoon and evening hours, which correspond to the optimal time for pesticide application due to the absence of pollinators. To avoid contamination of the training dataset with test data, the routes were carefully planned so that the tractor did not pass through the same area more than once. For testing, randomly selected continuous segments were chosen, with additional margins added to ensure that the observation area did not overlap previously surveyed sections, both in the experimental and the control fields.

2.3. Dataset Preparation and Annotation

The dataset included 600 labeled images in 5K resolution (5312 × 4648 pixels), in an 8:7 aspect ratio. Labeling focused exclusively on pollen beetles, the primary pest at the BBCH 50–59 growth stage. Due to the specific conditions of the implemented project, the data were collected starting from the later growth stages.
Pest objects in the dataset were annotated manually in the YOLO format [35]. The small size of the insects (often only a few pixels wide) presented a challenge. Only images with pests in close proximity were labeled to maintain accuracy. Consequently, during detection, pests appeared smaller than during training. To address this mismatch, modifications suggested in [36,45], including adjusting the convolutional layers during inference, were implemented. Data augmentation techniques such as occlusion simulation were employed to enhance model robustness, while real-world variability was retained through diverse environmental settings. To streamline the labeling process, active learning was used. After each iteration of training, the model pre-labeled new images, and only incorrect annotations were manually corrected.
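For reference, a YOLO-format label file stores one object per line as a class index followed by the box center and size, normalized to the image dimensions. The snippet below is a generic illustration of that conversion; the 5K frame size is taken from the text, while the box coordinates are made up.

```python
def to_yolo_line(cls_id, x_min, y_min, x_max, y_max,
                 img_w=5312, img_h=4648):
    """Convert a pixel-space bounding box into a YOLO annotation line:
    '<class> <x_center> <y_center> <width> <height>', all in [0, 1]."""
    xc = (x_min + x_max) / 2.0 / img_w
    yc = (y_min + y_max) / 2.0 / img_h
    w = (x_max - x_min) / img_w
    h = (y_max - y_min) / img_h
    return f"{cls_id} {xc:.6f} {yc:.6f} {w:.6f} {h:.6f}"

# Example: a pollen beetle (class 0) occupying a ~20-pixel box.
print(to_yolo_line(0, 2650, 2310, 2670, 2330))
```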

2.4. Object Detection Model and Training Configuration

The YOLOv4 algorithm [45], selected for its speed and effectiveness with small objects [40,41,42], was used to train the object detection model. While newer versions of the YOLO architecture (up to YOLOv13) have since been released, YOLOv4 was chosen because it was the most mature and stable framework at the time this study was initiated. Its seamless integration with the Darknet framework and the NVIDIA Jetson hardware provided a proven, reliable environment for real-time inference, which was the primary goal of our field application. Compared to SSD [34] and Faster R-CNN [33], YOLOv4 better met the demands of the application.
Neural network training was performed on two NVIDIA RTX 4090 GPUs, using the following parameters: batch size = 64, subdivisions = 32, momentum = 0.949, decay = 0.0005, learning rate = 0.001, maximum number of training batches (max_batches) = 6000, and learning-rate decay steps = 4800 and 5400. The Mish activation function originally used in YOLOv4 was replaced with LeakyReLU due to performance and compatibility issues on the Jetson platform. This change ensured stable and efficient operation without degrading recognition performance; in fact, a slight improvement was observed.
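These hyperparameters map directly onto a Darknet configuration file. The excerpt below is a hedged reconstruction: the [net] values are those reported above, the scales line follows the usual Darknet convention (it is not stated in the text), and the activation swap is shown on one representative convolutional block rather than the full YOLOv4 layer list.

```
[net]
batch=64
subdivisions=32
momentum=0.949
decay=0.0005
learning_rate=0.001
max_batches=6000
# Learning-rate drops at the reported steps; the 0.1 scale factors
# are the standard Darknet convention (assumed, not stated in the text).
policy=steps
steps=4800,5400
scales=.1,.1

# One of many convolutional blocks; in the adapted model every
# "activation=mish" was replaced with LeakyReLU for Jetson compatibility.
[convolutional]
batch_normalize=1
filters=32
size=3
stride=1
pad=1
activation=leaky
```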

2.5. Model Evaluation and Ground Truth Correlation

A 10-fold cross-validation was used for model evaluation. In each iteration, 90% of the dataset was used for training and 10% for testing, ensuring comprehensive model assessment. Since detecting pests solely based on recorded images does not allow for determining their actual number, manual counts were performed at randomly selected locations within the plantation.
First, beetles visible to the naked eye on inflorescences and stems were counted, and then all insects were shaken off into a container and counted again. In total, 100 representative samples were collected. It was calculated that the average share of beetles visible to the naked eye was approximately 21%, with a wide variability ranging from 12% to 89%. The average actual number of pollen beetles per plant was 28 individuals. Visual counting detected only 15–25% of the beetles actually present, with an overall mean of 20.87%. The Pearson correlation coefficient between the number of visible and actual beetles was 0.80. Based on these findings, it was assumed that the pests detected by the cameras represented approximately 21% of the actual pest population. This value was subsequently used to estimate the economic threshold of pest occurrence.
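The correction from camera counts to population estimates is a simple rescaling. The sketch below illustrates the procedure with made-up sample counts (the study's 100 raw samples are not published here, so the toy data will not reproduce the reported 21% share or r = 0.80 exactly):

```python
import numpy as np

# Hypothetical (visible, actual) beetle counts per sampled plant; the real
# study used 100 samples collected by shaking beetles into a container.
visible = np.array([6.0, 4.0, 9.0, 5.0, 7.0])
actual = np.array([30.0, 22.0, 41.0, 25.0, 33.0])

visible_fraction = float((visible / actual).mean())   # ~0.21 in the study
r = float(np.corrcoef(visible, actual)[0, 1])         # study reports r = 0.80

def estimate_population(detected, fraction=0.21):
    """Scale camera detections up to an estimated true count using the
    ~21% visibility share established from manual field sampling."""
    return detected / fraction

print(f"visible fraction ≈ {visible_fraction:.2f}, Pearson r = {r:.2f}")
print(f"12 detected beetles ≈ {estimate_population(12):.0f} beetles present")
```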
The primary system requirements for the object detection algorithm were as follows:
  • Effective detection of small objects: Given the pests’ minimal size, often only a few pixels wide.
  • Fast object recognition: Required for near-real-time inference on a mobile, tractor-mounted computer.
  • Moderate accuracy: Correlation between detected and actual pest count is more critical than individual detection errors for guiding site-specific pesticide application.

3. Results

3.1. Detection Accuracy and Performance Metrics

The dataset obtained during field passes was of high quality, with clearly visible objects (Figure 4 and Figure 5). To evaluate the algorithm’s performance, the mean Average Precision (mAP) at a given Intersection over Union (IoU) threshold was used [46]. In this study, AP at IoU = 0.50 was chosen as the primary metric, yielding a result of AP@0.50 = 0.837.
While [47] reported a higher prediction value of 92.42% using low-resolution (480 × 480) online images, our study utilized high-resolution 5K images where pests occupied only a few pixels. Given these challenging field conditions, a result of 0.837 is satisfactorily high. Furthermore, in the context of pest detection, the system was tuned to prioritize Recall (sensitivity). By using a lower confidence threshold, we ensured that the majority of pests were detected (minimizing false negatives), which is a safer approach for agricultural intervention, even if it slightly increases the rate of false positives (lower Precision).
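For clarity, the snippet below shows the IoU computation that underlies the AP@0.50 metric: a detection counts as a true positive only if it overlaps an unmatched ground-truth box with IoU ≥ 0.5. This is a generic illustration, not the evaluation code used in the study.

```python
def iou(a, b):
    """Intersection over Union for boxes given as (x_min, y_min, x_max, y_max)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    union = ((a[2] - a[0]) * (a[3] - a[1])
             + (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union if union > 0 else 0.0

# Two 10x10 boxes offset by 5 px overlap with IoU = 25 / 175 ≈ 0.143,
# i.e., below the 0.5 threshold: this pair would count as a false positive.
print(iou((0, 0, 10, 10), (5, 5, 15, 15)))
```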

3.2. Real-Time Inference and Field Testing

Inference ran smoothly on the Jetson platform, achieving a processing speed of approximately 2 FPS for 5K images. This meets the operational requirements for real-time triggering of pest control actions. Figure 6 and Figure 7 illustrate the system’s ability to detect pests even when partially obscured by petals.
The collected data, combined with GPS coordinates, were used to visualize pest density via heatmaps (Figure 8, Figure 9 and Figure 10), generated with the open-source tool Heatmapper [48]. We compared confidence thresholds of 0.05, 0.1, and 0.5. At the 0.05 threshold, the system captures the maximum extent of the infestation. Although this lower threshold introduces some statistical noise (false positives), the core areas of pest concentration remain consistent across all thresholds, confirming the stability of the detection zones.
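A minimal sketch of the density-mapping step, assuming standard NumPy/Matplotlib in place of the Heatmapper web tool actually used:

```python
import numpy as np
import matplotlib.pyplot as plt

def plot_density(lats, lons, counts, bins=50, threshold=0.05):
    """Bin GPS-tagged detection counts into a 2D grid and render a heatmap,
    analogous to Figures 8-10. counts[i] is the number of beetles detected
    above the chosen confidence threshold at position (lats[i], lons[i])."""
    grid, xedges, yedges = np.histogram2d(lons, lats, bins=bins, weights=counts)
    plt.imshow(grid.T, origin="lower", cmap="hot", aspect="auto",
               extent=(xedges[0], xedges[-1], yedges[0], yedges[-1]))
    plt.colorbar(label=f"detections (confidence > {threshold})")
    plt.xlabel("longitude")
    plt.ylabel("latitude")
    plt.show()
```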

3.3. Derivation of Harmfulness Thresholds and Mapping

To translate visual detections into actionable agricultural data, we established a correlation between standard harmfulness thresholds [9] and the system’s output. Based on the finding that only 21% of pests are visible on the inflorescences, and factoring in the 83.7% system accuracy, we converted the “pests per plant” metric into “detected beetles per m²”.
As a result, harmfulness thresholds of 1, 2, 3, 4, and 5 pests per plant were found to correspond to approximately 8, 16, 23, 32, and 49 detected beetles per m², respectively. These calculated values allowed for the generation of the intervention map in Figure 11, created using spatial interpolation of GPS-tagged data. This map categorizes the field into three zones: threshold not exceeded (0–2 pests per plant), treatment recommended (2–3), and treatment required (3–5).
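The conversion itself is essentially a product of the factors introduced above. The sketch below shows the arithmetic under an assumed stand density of about 45 plants per m² (a typical winter rapeseed value, not stated in the paper); it reproduces the published mapping only approximately, since the per-threshold values above also reflect field calibration.

```python
def detections_per_m2(pests_per_plant, plants_per_m2=45,
                      visible_fraction=0.21, detector_ap=0.837):
    """Expected camera detections per m² for a given agronomic threshold.
    plants_per_m2 is an assumption; visible_fraction and detector_ap are
    the values reported in the text."""
    return pests_per_plant * plants_per_m2 * visible_fraction * detector_ap

for t in (1, 2, 3, 4, 5):
    print(t, "pests/plant ->", round(detections_per_m2(t)), "detections/m²")
# Prints roughly 8, 16, 24, 32, 40 versus the published 8, 16, 23, 32, 49.
```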

4. Discussion

4.1. Pest Migration and Distribution Patterns

Contrary to some literature suggesting colonization begins at field edges [49], our heatmaps (Figure 8, Figure 9 and Figure 10) showed uneven distribution not limited to borders. This aligns with [5], who noted that after flowering begins, beetles migrate toward the central parts of the field. Additionally, the number of pollen beetles can vary significantly from day to day [50].
For the final analysis (Figure 11), we adopted the 0.05 detection threshold. This is a conservative, high-recall approach designed to account for pests that might be hidden or outside the immediate field of view, ensuring that no critical infestation zones are overlooked.

4.2. Spatial Correlation and Environmental Proxies

Figure 12 and Figure 13 illustrate the relationship between pest aggregation and geographical coordinates. We observed a moderate correlation (r = 0.44 for latitude and r = 0.29 for longitude) at a significance level of p < 0.05. It is important to clarify that latitude and longitude serve as proxies for environmental determinants. The actual drivers of pest occurrence are microclimatic factors, such as temperature and wind exposure, which vary spatially across the coordinates recorded during our trials.
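The reported coefficients can be reproduced with a standard Pearson test; the snippet below illustrates the procedure on synthetic data (the field coordinates themselves are not published, so the printed values will differ from the study's):

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
lat = rng.uniform(51.10, 51.12, 200)          # synthetic GPS latitudes
# Synthetic harmfulness-threshold values with a weak latitude trend.
harm = 2.5 + 100.0 * (lat - lat.mean()) + rng.normal(0.0, 0.5, 200)

r, p = pearsonr(lat, harm)   # study: r = 0.44 (latitude), 0.29 (longitude), p < 0.05
print(f"r = {r:.2f}, p = {p:.3g}")
```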

4.3. Technical Optimization and System Scalability

A potential challenge for the future deployment of this system is maintaining high operational speed on embedded devices, especially if the number of cameras is increased to cover wider field spans. While the current processing speed of 2 FPS is sufficient for real-time mapping at standard tractor speeds, further scaling will require advanced optimization. To address this, the implementation of TensorRT (NVIDIA’s high-performance deep learning inference optimizer) is planned. Furthermore, model compression techniques such as quantization (converting weights from FP32 to INT8) and pruning (removing redundant neurons) could significantly reduce the computational load without substantially compromising the detection accuracy of small objects like Brassicogethes aeneus.
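A sketch of the planned TensorRT path is given below, assuming the Darknet weights have first been exported to ONNX (a common route for YOLOv4); both the export step and the INT8 calibrator that feeds representative field images are assumed to exist and are not shown.

```python
import tensorrt as trt

def build_int8_engine(onnx_path, calibrator):
    """Parse an ONNX export of the YOLOv4 network and build a serialized
    INT8 TensorRT engine for deployment on the Jetson."""
    logger = trt.Logger(trt.Logger.WARNING)
    builder = trt.Builder(logger)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    parser = trt.OnnxParser(network, logger)
    with open(onnx_path, "rb") as f:
        if not parser.parse(f.read()):
            raise RuntimeError("ONNX parse failed")
    config = builder.create_builder_config()
    config.set_flag(trt.BuilderFlag.INT8)   # quantize weights FP32 -> INT8
    config.int8_calibrator = calibrator     # fed with representative field images
    return builder.build_serialized_network(network, config)
```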

4.4. Economic and Environmental Impact

Based on the mapping results, insecticide treatment was required on only 6.24 hectares (34% of the field). On the remaining 66% (12.06 ha), no treatment was necessary. Using Inazuma 130 WG at a rate of 0.2 kg/ha (approx. 12 USD/ha), this selective approach would save approximately 140 USD per 18.3 ha field. Beyond the financial gain, this 66% reduction in pesticide use significantly minimizes the environmental footprint and protects pollinator biodiversity.
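The saving follows directly from the treated/untreated split reported above:

```python
total_ha = 18.30
treated_ha = 6.24           # zones above the harmfulness threshold (34%)
cost_usd_per_ha = 12.0      # Inazuma 130 WG at 0.2 kg/ha

untreated_ha = total_ha - treated_ha
print(f"{untreated_ha:.2f} ha untreated -> "
      f"≈ {untreated_ha * cost_usd_per_ha:.0f} USD saved")
# 12.06 ha * 12 USD/ha ≈ 145 USD, i.e., the ~140 USD quoted in the text.
```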

5. Conclusions

The study confirmed that the integration of a mobile measurement platform with the YOLOv4 deep learning algorithm provides a viable proof of concept for real-time pest detection in winter rapeseed. The system achieved a mean Average Precision of AP@0.50 = 0.837 and maintained a processing speed of 2 FPS on the NVIDIA Jetson Orin AGX, meeting the operational requirements for real-time agricultural applications. Quantitatively, the implementation of this technology allowed for a 66% reduction in the area requiring insecticide treatment (only 6.24 ha out of 18.30 ha required intervention). This selective approach results in direct financial savings of approximately 140 USD per field and significantly minimizes the environmental impact by reducing chemical runoff and protecting pollinator biodiversity.
For practical field deployment, the trained model can be integrated with an IFM mobile controller and the tractor’s electronic system via the ISOBUS protocol. This enables site-specific spraying in which the application intensity is regulated in real time based on the established threshold of 23 detected pests per m². Such an architecture allows for a “plug-and-play” setup that requires no additional hardware inside the tractor cabin, facilitating adoption by commercial farms. While this study focused on Brassicogethes aeneus, the methodology is adaptable to other pests and crops. Future research will focus on expanding the dataset to earlier phenological stages and employing model compression techniques, such as quantization and pruning, to maintain high-speed performance as the sensor array expands.
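The decision logic driving such nozzle control reduces to a threshold test on the live detection density. A schematic example is shown below; the actual IFM/ISOBUS controller interface is not part of this sketch.

```python
TREATMENT_THRESHOLD = 23   # detected beetles per m², ≈ 3 pests per plant

def nozzle_state(detections_per_m2: float) -> str:
    """Illustrative real-time decision step for section-controlled nozzles."""
    return "ON" if detections_per_m2 >= TREATMENT_THRESHOLD else "OFF"

print(nozzle_state(30.0))  # ON  -> treatment required in this zone
print(nozzle_state(10.0))  # OFF -> threshold not exceeded, skip spraying
```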

Author Contributions

Conceptualization, Z.M. and M.C.; methodology, B.S. and A.W.-G.; software, K.O.; validation, K.O., R.S. and K.T.; formal analysis, A.W.-G.; investigation, K.O., R.S., B.S. and K.T.; data curation, A.W.-G. and B.S.; writing—original draft preparation, Z.M., K.O., R.S. and A.W.-G.; writing—review and editing, Z.M. and R.S.; supervision, Z.M. and M.C.; funding acquisition, Z.M. All authors have read and agreed to the published version of the manuscript.

Funding

The research was funded by the Agency for Restructuring and Modernisation of Agriculture (Poland) under the “Cooperation” Action within the Rural Development Program for 2014–2020, co-financed by the European Agricultural Fund for Rural Development (EAFRD), Agreement No. 00073.DDD.6509.00274.2022.01.

Data Availability Statement

Data are available upon reasonable request. Please contact the corresponding author.

Conflicts of Interest

The authors declare that they have no conflict of interest.

References

  1. World Population Review. 2024. Available online: https://worldpopulationreview.com/ (accessed on 3 July 2024).
  2. Cook, S.; Jędryczka, M. Integrated pest control in oilseed crops—New advances from the rapeseed research community. Pest Manag. Sci. 2024, 80, 2217–2219. [Google Scholar] [CrossRef] [PubMed]
  3. Hausmann, J. Challenges for integrated pest management of Dasineura brassicae in oilseed rape. Arthropod-Plant Interact. 2021, 15, 645–656. [Google Scholar] [CrossRef]
  4. Mauchline, A.L.; Cook, S.M.; Powell, W.; Chapman, J.W.; Osborne, J.L. Migratory flight behaviour of the pollen beetle Meligethes aeneus. Pest Manag. Sci. 2017, 73, 1076–1082. [Google Scholar] [CrossRef] [PubMed]
  5. Bick, E.; Sigsgaard, L.; Torrance, M.T.; Helmreich, S.; Still, L.; Beck, B.; El Rashid, R.; Lemmich, J.; Nikolajsen, T.; Cook, S.M. Dynamics of pollen beetle (Brassicogethes aeneus) immigration and colonization of oilseed rape (Brassica napus) in Europe. Pest Manag. Sci. 2024, 80, 2306–2313. [Google Scholar] [CrossRef]
  6. Fricke, U.; Redlich, S.; Zhang, J.; Benjamin, C.; Englmeier, J.; Ganuza, C.; Haensel, M.; Riebl, R.; Rojas-Botero, S.; Tobisch, C.; et al. Earlier flowering of winter oilseed rape compensates for higher pest pressure in warmer climates. J. Appl. Ecol. 2023, 60, 365–375. [Google Scholar] [CrossRef]
  7. Zaller, J.G.; Moser, D.; Drapela, T.; Schmöger, C.; Frank, T. Effect of within-field and landscape factors on insect damage in winter oilseed rape. Agric. Ecosyst. Environ. 2008, 123, 233–238. [Google Scholar] [CrossRef]
  8. Juhel, A.S.; Barbu, C.M.; Franck, P.; Roger-Estrade, J.; Butier, A.; Bazot, M.; Valantin-Morison, M. Characterization of the pollen beetle, Brassicogethes aeneus, dispersal from woodlands to winter oilseed rape fields. PLoS ONE 2017, 12, e0183878. [Google Scholar] [CrossRef]
  9. Williams, I.H. The major insect pests of oilseed Rape in Europe and their management: An overview. In Biocontrol-Based Integrated Management of Oilseed Rape Pests, 1st ed.; Williams, I.H., Ed.; Springer: Dordrecht, The Netherlands, 2010; pp. 1–43. [Google Scholar] [CrossRef]
  10. Ortega-Ramos, P.A.; Cook, S.M.; Mauchline, A.L. How contradictory EU policies led to the development of a pest: The story of oilseed rape and the cabbage stem flea beetle. GCB Bioenergy 2022, 14, 258–266. [Google Scholar] [CrossRef]
  11. Zhang, H.; Breeze, T.; Bailey, A.; Garthwaite, D.; Harrington, R.; Potts, S.G. Arthropod pest control for UK oilseed rape—Comparing insecticide efficacies, side effects and alternatives. PLoS ONE 2017, 12, e0169475. [Google Scholar] [CrossRef]
  12. Nils, C.; Brandes, M.; Ulber, B.; Heimbach, U. Effect of immigration time and beetle density on development of the cabbage stem flea beetle, (Psylliodes chrysocephala L.) and damage potential in winter oilseed rape. J. Plant Dis. Prot. 2021, 128, 1081–1090. [Google Scholar] [CrossRef]
  13. Agarwal, M.; Bohat, V.K.; Ansari, M.D.; Sinha, A.; Gupta, S.K.; Garg, D. A convolution neural network based approach to detect disease in corn crop. In Proceedings of the 2019 IEEE 9th International Conference on Advanced Computing (IACC), Tiruchirappalli, India, 13–14 December 2019. [Google Scholar]
  14. Hansen, L.M. Economic damage threshold model for pollen beetles (Meligethes aeneus F.) in spring oilseed rape (Brassica napus L.) crops. Crop Prot. 2004, 23, 43–46. [Google Scholar] [CrossRef]
  15. Tratwal, A.; Witaszak, W.; Trzciński, P.; Baran, M.; Bocianowski, J. Analysis of winter oilseed rape damages caused by Brassicogethes aeneus (Fabricius, 1775) in various regions of Poland, in 2009–2019. Prog. Plant Prot. 2022, 62, 128–133. [Google Scholar] [CrossRef]
  16. Mrówczyński, M.; Pruszyński, G.; Walczak, F. Monitoring i progi szkodliwości najważniejszych szkodników rzepaku. Prog. Plant Prot. 2013, 53, 1–10. Available online: https://www.researchgate.net/publication/292315247_Integrowana_Ochrona_Upraw_Rolniczych_Tom_II_Zastosowanie_Integrowanej_Ochrony (accessed on 1 September 2025).
  17. Witzgall, P.; Kirsch, P.; Cork, A. Sex pheromones and their impact on pest management. J. Chem. Ecol. 2010, 36, 80–100. [Google Scholar] [CrossRef] [PubMed]
  18. Southwood, T.R.E.; Henderson, P.A. Ecological Methods, 3rd ed.; Blackwell Science: Oxford, UK, 2000; Available online: https://www.researchgate.net/publication/260051655_Ecological_Methods_3rd_edition (accessed on 1 September 2025).
  19. Walczak, F. Monitoring agrofagów dla potrzeb integrowanej ochrony roślin uprawnych. Fragm. Agron. 2010, 27, 147–154. Available online: https://pta.up.poznan.pl/pdf/2010/FA%2027(4)%202010%20Walczak.pdf (accessed on 1 July 2024).
  20. Albanese, A.; Nardello, M.; Brunelli, D. Automated pest detection with DNN on the edge for precision agriculture. IEEE J. Emerg. Sel. Top. Circuits Syst. 2021, 11, 458–467. [Google Scholar] [CrossRef]
  21. Chen, L.; Zhu, H.; Horst, L.; Wallhead, M.; Reding, M.; Fulcher, A. Management of pest insects and plant diseases in fruit and nursery production with laser-guided variable-rate sprayers. HortScience 2020, 56, 94–100. [Google Scholar] [CrossRef]
  22. Karise, R.; Viik, E.; Mänd, M. Impact of alpha-cypermethrin on honey bees foraging on spring oilseed rape (Brassica napus) flowers in field conditions. Pest Manag. Sci. 2007, 63, 1085–1089. [Google Scholar] [CrossRef]
  23. Chin, R.; Catal, C.; Kassahun, A. Plant disease detection using drones in precision agriculture. Precis. Agric. 2023, 24, 1663–1682. [Google Scholar] [CrossRef]
  24. Guo, Q.; Wang, C.; Xiao, D.; Huang, Q. Automatic monitoring of flying vegetable insect pests using an RGB camera and YOLO-SIP detector. Precis. Agric. 2023, 24, 436–457. [Google Scholar] [CrossRef]
  25. Kirkeby, C.; Rydhmer, K.; Cook, S.M.; Strand, A.; Torrance, M.T.; Swain, J.L.; Prangsma, J.; Johnen, A.; Jensen, M.; Brydegaard, M.; et al. Advances in automatic identification of flying insects using optical sensors and machine learning. Sci. Rep. 2021, 11, 1555. [Google Scholar] [CrossRef] [PubMed]
  26. Garcia, A.A.; Caceres Campana, J.W. Identification of pathogens in corn using near-infrared UAV imagery and deep learning. Precis. Agric. 2023, 24, 783–806. [Google Scholar] [CrossRef]
  27. Cho, O.-H. Machine learning algorithms for early detection of legume crop disease. Legume Res. 2024, 47, 463–469. [Google Scholar] [CrossRef]
  28. Mamun, A.L. Iot-based agriculture and smart farming: Machine learning applications: A commentary. Open Access J. Data Sci. Artif. Intell. 2024, 2, 000110. [Google Scholar] [CrossRef]
  29. Wu, H.; Wang, Y.; Zhao, P.; Qian, M. Small-target weed-detection model based on YOLO-V4 with improved backbone and neck structures. Precis. Agric. 2023, 24, 2149–2170. [Google Scholar] [CrossRef]
  30. Das, M.; Bais, A. Deepveg: Deep learning model for segmentation of weed, canola, and canola flea beetle damage. IEEE Access 2021, 9, 119367–119380. [Google Scholar] [CrossRef]
  31. Gupta, N.; Gupta, B.; Passi, K.; Jain, C.K. Applications of artificial intelligence based technologies in weed and pest detection. J. Comput. Sci. 2022, 18, 520–529. [Google Scholar] [CrossRef]
  32. Xie, C.; Wang, R.; Zhang, J.; Chen, P.; Dong, W.; Li, R.; Chen, T.; Chen, H. Multi-level learning features for automatic classification of field crop pests. Comput. Electron. Agric. 2018, 152, 233–241. [Google Scholar] [CrossRef]
  33. Ren, S.; He, K.; Girshick, R.; Sun, J. Faster R-CNN: Towards real-time object detection with region proposal networks. In Proceedings of the Advances in Neural Information Processing Systems 28 (NIPS 2015), Montreal, QC, Canada, 7–12 December 2015; pp. 91–99. [Google Scholar]
  34. Liu, W.; Anguelov, D.; Erhan, D.; Szegedy, C.; Reed, S.; Fu, C.Y.; Berg, A.C. SSD: Single shot multibox detector. In Computer Vision–ECCV 2016, Proceedings of the 14th European Conference, Amsterdam, The Netherlands, 11–14 October 2016; Lecture Notes in Computer Science; Leibe, B., Matas, J., Sebe, N., Welling, M., Eds.; Springer: Cham, Switzerland, 2016; Volume 9905, pp. 21–37. [Google Scholar] [CrossRef]
  35. Redmon, J.; Divvala, S.; Girshick, R.; Farhadi, A. You only look once: Unified, real-time object detection. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 779–788. [Google Scholar] [CrossRef]
  36. Wang, C.-Y.; Bochkovskiy, A.; Liao, H.-Y.M. Scaled-YOLOv4: Scaling cross stage partial network. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Nashville, TN, USA, 20–25 June 2021; pp. 13029–13038. [Google Scholar] [CrossRef]
  37. Patil, S.; Kharade, A.; Kesarkar, A.; Bankarmali, U. Object Detection Using YOLO. Int. J. Multidiscip. Res. 2025, 7. [Google Scholar] [CrossRef]
  38. Hao, S.; Gao, E.; Ji, Z.; Ganchev, I. BCS_YOLO: Research on Corn Leaf Disease and Pest Detection Based on YOLOv11n. Appl. Sci. 2025, 15, 8231. [Google Scholar] [CrossRef]
  39. Wang, C.; Liang, G. Overview of Research on Object Detection Based on YOLO. In Proceedings of the 4th International Conference on Artificial Intelligence and Computer Engineering, Dalian, China, 17–19 November 2023; ACM: New York, NY, USA, 2024; pp. 257–265. [Google Scholar] [CrossRef]
  40. Kaur, R.; Singh, S. A comprehensive review of object detection with deep learning. Digit. Signal Process. 2023, 132, 103812. [Google Scholar] [CrossRef]
  41. Xiao, Y.; Tian, Z.; Yu, J.; Zhang, Y.; Liu, S.; Du, S.; Lan, X. A review of object detection based on deep learning. Multimed. Tools Appl. 2020, 79, 23729–23791. [Google Scholar] [CrossRef]
  42. Zhao, Z.Q.; Zheng, P.; Xu, S.T.; Wu, X. Object detection with deep learning: A review. IEEE Trans. Neural Netw. Learn. Syst. 2019, 30, 3212–3232. [Google Scholar] [CrossRef] [PubMed]
  43. Aldubaikhi, A.; Patel, S. Advancements in Small-Object Detection 2023–2025: Approaches, Datasets, Benchmarks, Applications, and Practical Guidance. Appl. Sci. 2025, 15, 11882. [Google Scholar] [CrossRef]
  44. Gomez-Canales, A.; Gomez-Avila, J.; Hernandez-Barragan, J.; Lopez-Franco, C.; Villaseñor, C.; Arana-Daniel, N. Improving Moving Insect Detection with Difference of Features Maps in YOLO Architecture. Appl. Sci. 2025, 15, 7697. [Google Scholar] [CrossRef]
  45. Bochkovskiy, A.; Wang, C.-Y.; Liao, H.-Y.M. YOLOv4: Optimal speed and accuracy of object detection. arXiv 2020. [Google Scholar] [CrossRef]
  46. Rezatofighi, H.; Tsoi, N.; Gwak, J.Y.; Sadeghian, A.; Reid, I.; Savarese, S. Generalized intersection over union: A metric and a loss for bounding box regression. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Long Beach, CA, USA, 15–20 June 2019; pp. 658–666. [Google Scholar] [CrossRef]
  47. Luo, Y.; Ni, L.; Cai, F.; Wang, D.; Luo, Y.; Li, X.; Fu, N.; Tang, J.; Xue, L. Detection of agricultural pests based on YOLO. J. Phys. Conf. Ser. 2023, 2560, 012013. [Google Scholar] [CrossRef]
  48. Babicki, S.; Arndt, D.; Marcu, A.; Liang, Y.; Grant, J.R.; Maciejewski, A.; Wishart, D.S. Heatmapper: Web-enabled heat mapping for all. Nucleic Acids Res. 2016, 44, W147–W153. [Google Scholar] [CrossRef]
  49. Ferguson, A.W.; Nevard, L.M.; Clark, S.J.; Cook, S.M. Temperature–activity relationships in Meligethes aeneus: Implications for pest management. Pest Manag. Sci. 2015, 71, 459–466. [Google Scholar] [CrossRef]
  50. Seimandi-Corda, G.; Jenkins, T.; Cook, S.M. Sampling pollen beetle (Brassicogethes aeneus) pressure in oilseed rape: Which method is best? Pest Manag. Sci. 2021, 77, 2785–2794. [Google Scholar] [CrossRef]
Figure 1. Architecture of the mobile measurement system.
Figure 2. Cameras and LED lamps visible on a dedicated mounting system attached to the tractor’s front linkage.
Figure 3. Electric enclosure with onboard computer and DC power supply.
Figure 4. Result of the operation of the neural network on the sample photo from the dataset—correctly recognized pollen beetle.
Figure 5. Result of the operation of the neural network on the sample photo from the dataset—correctly recognized pollen beetle.
Figure 6. Rape pollen beetle recognized by Jetson in real time.
Figure 7. Rape pollen beetle partially obscured by the flower petals.
Figure 8. Heatmap of pest density for a detection threshold of 0.05. A warmer color indicates greater density.
Figure 9. Heatmap of pest density for a detection threshold of 0.1. A warmer color indicates greater density.
Figure 10. Heatmap of pest density for a detection threshold of 0.5. A warmer color indicates greater density.
Figure 11. The thresholds of harmfulness for pollen beetle occurrence in a winter oilseed rape field, mapped according to longitude and latitude. Confidence levels: 0–2—threshold of harmfulness not exceeded; 2–3—insecticidal treatment recommended; 3–5—insecticidal treatment required.
Figure 12. Pollen beetle aggregation in oilseed rape depending on latitude. The blue circles denote the detected pests, the solid line is the linear regression with r = 0.44, and the dashed lines show the confidence interval for p < 0.05. The vertical axis corresponds to the threshold of harmfulness.
Figure 13. Pollen beetle aggregation in oilseed rape depending on longitude. Linear regression: r = 0.29; confidence interval: p < 0.05.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
