AgriEngineering
  • Review
  • Open Access

15 January 2026

Ambrosia artemisiifolia in Hungary: A Review of Challenges, Impacts, and Precision Agriculture Approaches for Sustainable Site-Specific Weed Management Using UAV Technologies

1 Doctoral School of Agricultural and Food Sciences (Plant Science Program), Szent István Campus, Hungarian University of Agriculture and Life Sciences, Páter Károly u. 1, 2100 Gödöllő, Hungary
2 Department of Forestry, College of Agricultural Engineering Sciences, Salahaddin University-Erbil, Erbil 44003, Kurdistan Region, Iraq
* Author to whom correspondence should be addressed.

Abstract

Weed management has become a critical agricultural practice, as weeds compete with crops for nutrients, host pests and diseases, and cause major economic losses. The invasive weed Ambrosia artemisiifolia (common ragweed) is particularly problematic in Hungary, endangering crop productivity and public health through its rapid proliferation and allergenic pollen. This review examines the current challenges and impacts of A. artemisiifolia while exploring sustainable approaches to its management through precision agriculture. Recent advances in unmanned aerial vehicles (UAVs) equipped with advanced imaging systems, remote sensing, and artificial intelligence, particularly machine learning and deep learning models such as Support Vector Machines (SVMs) and convolutional neural networks (CNNs), enable accurate detection, mapping, and classification of weed infestations. These technologies facilitate site-specific weed management (SSWM) by optimizing herbicide application, reducing chemical inputs, and minimizing environmental impacts. Recent studies demonstrate the high potential of UAV-based monitoring for real-time, data-driven weed management. The review concludes that integrating UAV and AI technologies into weed management offers a sustainable, cost-effective, and environmentally responsible solution that also mitigates socioeconomic impacts, and it emphasizes the need for collaboration between agricultural researchers and technology developers to enhance precision agriculture practices in Hungary.

1. Introduction

The Food and Agriculture Organization (FAO) reported that the global population is expected to reach nine billion people by 2050, roughly doubling food demand, while agricultural resources become more limited, degraded, and increasingly vulnerable to climate change [1]. It is crucial to observe, measure, and study the physical aspects and phenomena within complex, multifunctional agricultural ecosystems to better understand these challenges [2]. To this end, the use of unmanned aerial vehicles (UAVs), satellite imagery, and autonomous field robots in agriculture has developed significantly [3,4].
In recent years, remote sensing has undergone rapid development, particularly in data gathering and computer vision technologies, which have enabled a wide range of precision agriculture applications. These applications include abiotic stress evaluation, growth monitoring, crop yield prediction, weed detection and mapping, and disease and pest identification [2]. The objective of precision agriculture (PA) is to improve the resource efficiency, productivity, quality, and profitability of agricultural production by using data that are collected, processed, and integrated with additional information to make decisions based on temporal and spatial variability [5,6].
Because of this, farmers use advanced technologies, including unmanned aerial vehicles (UAVs), for weed detection and management to enhance crop productivity [7]. Weed management has become crucial for maintaining crop quality, particularly in vegetable crops and cereals, reducing yield loss and preventing economic loss [8], because weeds compete with the main crop for essential resources, including nutrients, water, light, CO2, and space, causing a significant reduction in crop yield.
Additionally, weeds serve as reservoirs for a wide range of pests, including insects, bacteria, viruses, and fungi, which can subsequently infest neighboring crops [9,10]. Several economically important insect pests, such as whiteflies and aphids, use weeds as alternative hosts before spreading to vegetable and field crops [11,12]. Similarly, weeds can harbor plant pathogens, including viruses such as Tobacco rattle virus, thereby increasing disease pressure and complicating integrated pest management strategies [13].
Beyond its effects on society and the economy, the introduction of alien species has a major impact on biological diversity and nature conservation. Invasive alien species are now acknowledged as one of the primary threats to biodiversity, along with habitat loss and fragmentation. European nations have highlighted these risks and gaps: projects have been launched, and invasive species have been systematically categorized and listed in several countries, including Hungary and the United Kingdom. In Hungary, measures to halt the spread of common ragweed (Ambrosia artemisiifolia), which poses a growing hazard to human health, receive top priority. Approximately 30% of the Hungarian population experiences allergies, about 65% of these individuals are sensitive to pollen, and at least 60% of this pollen sensitivity is caused by common ragweed [14]. Additionally, the world’s landscapes and ecosystems are changing due to ongoing climate change and biological invasions [15,16]. Invasive species are directly affected by changing rainfall and temperature patterns, which alter their physiological constraints. Owing to their adaptability to changing climatic conditions, many of these species are better able to expand into new areas within their non-native range [17]; more extreme rainfall events and drier summer conditions have likewise influenced the spread of A. artemisiifolia.
Direct weed identification in croplands is difficult due to the presence of over 200 harmful plant species. The effects of weeds on various crops have been demonstrated in several studies [18]. For instance, in Californian lettuce fields, season-long weed competition was observed to reduce lettuce production by 50% [19]. Another study examined the effect of different weed management methods on wheat yield and found that uncontrolled weed growth led to yield reductions ranging from 57.6% to 73.2%; effective strategies such as post-emergence herbicides and hand weeding were highlighted as crucial for mitigating these losses [20].
The aims of this review are to assess the effects of invasive weed species, specifically Ambrosia artemisiifolia, on agricultural output and public health in Hungary and to investigate contemporary precision agriculture technologies, particularly unmanned aerial vehicles (UAVs) equipped with machine learning-driven image processing, for site-specific weed management. These technologies provide precise weed detection and classification, enabling targeted herbicide application and thereby minimizing environmental damage and enhancing resource efficiency.
Hungary is presented in this review as a representative case study for regions in Central and Eastern Europe facing severe Ambrosia artemisiifolia infestation, where agricultural impacts, public health concerns, and the need for advanced monitoring technologies converge.

2. Literature Search Approach

To support the topics addressed in this study, appropriate articles were identified by searches in Scopus, Web of Science, and Google Scholar. The search concentrated on research related to Ambrosia artemisiifolia, precision agriculture, machine learning and deep learning, UAV-based remote sensing, and site-specific weed management (SSWM).

2.1. Search Strategy and Keywords

The search used combinations of the following keywords: “Ambrosia artemisiifolia”, “common ragweed”, “UAV”, “unmanned aerial vehicle”, “drone”, “remote sensing”, “weed detection”, “hyperspectral”, “multispectral”, “RGB imaging”, “sensor fusion”, “multi-modal data”, “precision agriculture”, “machine learning”, “deep learning”, “site-specific weed management”.

2.2. Timeframe and Scope

The literature review primarily concentrates on papers published from 2000 to 2026, highlighting advancements in contemporary remote sensing, machine learning, and unmanned aerial vehicle (UAV) technology for weed detection and management. Earlier foundational publications concerning the history, biology, and ecological effects of Ambrosia artemisiifolia were incorporated to provide critical background and context. These works span previous decades and support the initial sections of the review.

2.3. Selection Approach

While this is not a systematic review, preference was given to peer-reviewed studies, research involving UAV-based sensors (RGB, multispectral, hyperspectral), and studies addressing weed detection, mapping, classification, or management, with particular attention to work focusing on A. artemisiifolia or similar broadleaf weeds. This literature search strategy ensured broad coverage of key technological developments while maintaining a clear focus on sustainable UAV-based site-specific weed management for Ambrosia artemisiifolia.

3. Common Ragweed or Annual Ragweed (Ambrosia artemisiifolia)

Common ragweed, or annual ragweed (Ambrosia artemisiifolia), is an invasive species belonging to the Asteraceae family. It is native to North America but is now widespread across Europe, where it causes significant agricultural and public-health impacts [21,22,23,24,25]. Two independent introductions from eastern and western North America have contributed to its establishment in Europe [26]. Figure 1 shows field photographs of Ambrosia artemisiifolia taken at the experimental field of the Hungarian University of Agriculture and Life Sciences, illustrating its characteristic growth form under local field conditions.
Figure 1. Morphological appearance of Ambrosia artemisiifolia observed in experimental fields at the Agronomy Institute field, Hungarian University of Agriculture and Life Sciences (MATE), Gödöllő 2100, Hungary: (a) individual plants grown in pots; (b) dense stand of A. artemisiifolia in the field—photographs collected by Sherwan Yassin Hammad, July 2024.

3.1. History of Common Ragweed in Europe, Specifically in Hungary

Common ragweed began spreading across Europe after World War I, partly through contaminated clover seed shipments [27,28]. It became increasingly common in Central and Eastern Europe, with major entry points at ports such as Rijeka, Marseille, Genoa, and Trieste [28,29]. Based on data obtained from the European Aeroallergen Network (EAN) and the European Pollen Information (EPI), the pollen distribution map was created, indicating elevated Ambrosia artemisiifolia pollen concentrations across Eastern and Central Europe during the period 1–10 September 2025 (Figure 2) [30,31].
Figure 2. Distribution of Ambrosia artemisiifolia pollen in Europe during 1–10 September 2025, based on data from the European Pollen Information (EPI). The map was created by the authors using QGIS.
In Hungary, ragweed was first recorded in Budapest in 1888. It expanded rapidly through the 20th century and became one of the country’s most dominant arable weeds, rising from 21st place in the 1950s to 4th place by the 1980s [27]. By 2003, it had infested 5.4 million hectares, with 700,000 hectares strongly infested [32]. According to the data obtained from the Central Office of Soil and Plant Protection of Hungary, counties such as Somogy, Baranya, and Szabolcs-Szatmár-Bereg remained the most affected in 2005 (Figure 3). Along the Austrian–Hungarian border, ragweed abundance was found to depend more on land-use factors (crop cover, previous crop, farming type) than on climatic or soil variables [33].
Figure 3. Spatial distribution of Ambrosia artemisiifolia infestation across Hungary in 2005, based on data from the Central Office of Soil and Plant Protection of Hungary and mapped by the authors using QGIS.
Hungary has implemented a comprehensive framework for controlling Ambrosia artemisiifolia that combines legal regulation, systematic monitoring, and integrated weed management approaches [34,35,36,37,38]. Since 2005, national legislation has required landowners to prevent ragweed spread, supported by centralized monitoring systems and spatial infestation maps developed by national geodesy and remote sensing institutions [38]. These national efforts align with broader European initiatives, such as the EU COST Action “SMARTER,” which promotes sustainable and biological control strategies for ragweed across Europe [39]. Biological control programs, including the use of Ophraella communa, and citizen science-based monitoring platforms, further highlight that ragweed management is a shared European challenge rather than a country-specific issue [34,40,41]. Together, these national and EU-level actions position Hungary as a representative regional case within wider European strategies, where UAV-based remote sensing and site-specific weed management technologies offer scalable and transferable solutions for invasive weed control [27,36,42].

3.2. Botanical and Biological Characteristics Relevant to Weed Detection

Ragweed is an erect annual plant, typically up to 250 cm tall, with a hairy, branched or unbranched stem and deeply lobed leaves [41]. Male flowers occur in spike-like clusters at the top of the plant, producing large amounts of pollen, while female flowers are located in the leaf axils. Ref. [43] demonstrates that Ambrosia artemisiifolia exhibits strong biological adaptability and herbicide resistance, particularly across early growth stages, highlighting its vigorous growth and survival capacity. Such biological traits contribute to persistent infestations and emphasize the importance of early-stage identification in weed detection systems. Seed predation influences the seed bank and emergence patterns of Ambrosia artemisiifolia, with implications for weed detection [43]. The species is further characterized by prolonged emergence, rapid vegetative growth, high seed longevity, and strong sensitivity to environmental conditions. These traits drive spatial and temporal variability in infestations, which is critical for effective image-based and sensor-driven weed detection [44,45].

3.3. Agricultural and Health Impacts

Ragweed is the most important arable weed in Hungary [46] and also invades natural ecosystems, reducing species diversity and forage value [47]. Its allelopathic effects can suppress crop germination and growth [48,49]. From a public-health perspective, ragweed is among the most allergenic plants in Europe and North America, ranking 9th among major broadleaf weeds and 7th in soybean fields according to the Weed Science Society of America (WSSA) [50]. Its pollen is a major cause of asthma and allergic rhinitis, contributing substantially to healthcare costs and lost productivity across Europe [46].
The biological, ecological, and impact-related features of common ragweed determine its spatial distribution, spectral appearance, and detectability in UAV imagery, thereby influencing detection accuracy and site-specific weed management strategies discussed in the following sections.

4. Precision Agriculture (PA) and Site-Specific Weed Management (SSWM)

Non-chemical weed control techniques are classified into three categories: mechanical, physical, and biological. Physical weed control techniques include mulching, solarization, flaming, and steaming [51]. Because chemical herbicides are easy to use, spraying weeds using tractors or unmanned aerial vehicles (UAVs) has become a common way to manage weeds [52]. Nevertheless, beyond their direct toxicity to the plants they are intended to kill, chemical herbicides have caused significant environmental damage [53]. Pesticide residues from the overuse of herbicides have also been found in agricultural products, affecting their quality and yield as well as the efficiency of agricultural production [54].
The first stage in an automated weed management system is accurately identifying and detecting weeds, which benefits both the environment and the economy [55]. Diverse weed management techniques must be used in SSWM and adapted to the location, density, and population of the weeds [56]. The vision-based image processing system, which functions as the machine’s brain, guides the real-time application of herbicide through the use of variable rate technology (VRT) [57].
Currently, there are two different kinds of VRT-based applications: (1) sensor-based and (2) map-based. One popular method is map-based, where a region’s map is created using georeferenced soil or plant samples. This procedure is costly and time-consuming as it requires manually collecting soil samples for further investigation [58]. In contrast, sensor-based approaches enable real-time data acquisition and processing, allowing immediate weed detection and treatment. When combined with machine learning (ML) or deep learning (DL) methodologies deployed on ground-based or aerial platforms, sensor-based VRT systems support real-time decision-making and precise herbicide application [57].
Growth monitoring is a critical measure for estimating agricultural production and is vital for decision-making [59]. Traditional methods, such as expert visual evaluations or chemical laboratory analyses, are used to evaluate the crop’s growth and nutritional condition. These approaches are time-consuming and impractical for monitoring large areas of a site. In contrast, machine learning based image processing technology has emerged as an effective solution, providing real-time data on crop health and nutrient status, and can be used for continual monitoring throughout the crop life cycle [60].
The rapid development of automated precision agriculture has been driven by advances in artificial intelligence (AI), machine learning (ML), and deep learning (DL). These technologies play a central role in transforming remotely sensed data into actionable information for site-specific weed management (SSWM). Among AI approaches, deep learning has emerged as a particularly powerful tool for weed identification and discrimination due to its ability to automatically extract complex features from image data [57].

4.1. Machine Learning Approaches for Weed Detection and Mapping

Conventional machine learning (ML) techniques, such as Support Vector Machines (SVMs), Random Forests, and k-nearest neighbors (KNN), have been extensively utilized for weed-crop classification using features derived from RGB, multispectral, and hyperspectral imaging. Commonly used features include spectral reflectance values, vegetation indices, texture descriptors, and shape-based metrics. ML classifiers are especially useful when training datasets are limited and computational efficiency is required. The outputs of these models are generally used to create weed distribution maps, which serve as the basis for site-specific weed management strategies.
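As a concrete illustration of this workflow, the following minimal sketch trains SVM and Random Forest classifiers on a handcrafted feature table; the feature values and labels are synthetic placeholders standing in for features extracted from UAV imagery, not data from the studies cited here.

```python
# Minimal sketch: crop-vs-weed classification from handcrafted features
# (e.g., vegetation indices, texture statistics). All values are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 500
# Hypothetical feature matrix: [NDVI, ExG, local texture variance, object area]
X = rng.random((n, 4))
y = rng.integers(0, 2, n)            # 0 = crop, 1 = weed (placeholder labels)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

svm = SVC(kernel="rbf", C=10, gamma="scale").fit(X_tr, y_tr)
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

print("SVM accuracy:", svm.score(X_te, y_te))
print("RF accuracy:", rf.score(X_te, y_te))
```

In practice, the predicted labels would be mapped back to pixel or object locations to produce the weed distribution maps described above.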
Recent studies have demonstrated the effectiveness of ML-driven approaches for large-scale and real-time weed detection. For example, hybrid architectures integrating convolutional neural networks with vision transformers, combined with adaptive multispectral feature fusion techniques, have achieved detection accuracies of up to 98% in real-time weed monitoring conditions [61]. Similarly, Random Forest models applied to high-resolution multispectral imagery have shown promising performance in distinguishing crops from weeds, achieving an overall accuracy of approximately 76%, although further refinement is required to reduce false-negative detections [62].
In addition to pixel-based classification, image-processing techniques such as contour detection and Histogram of Oriented Gradients (HOG) feature extraction have been successfully combined with classifiers such as SVM and KNN for accurate weed identification and management [63]. More recently, deep-learning-based object detection models, including YOLOv8 and YOLOv9, have demonstrated strong performance, with mean average precision (mAP50) values ranging from 80.8% to 98% across different weed and crop species [64].
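The HOG-plus-classifier idea mentioned above can be sketched as follows; the plant patches here are random arrays used only to show the feature-extraction and training steps, under the assumption that real, pre-cropped RGB patches of crops and weeds would be supplied.

```python
# Minimal sketch: HOG texture descriptors fed to a linear SVM.
# Patches are placeholders for real 64x64 RGB crops/weeds.
import numpy as np
from skimage.color import rgb2gray
from skimage.feature import hog
from sklearn.svm import LinearSVC

rng = np.random.default_rng(1)
patches = rng.random((40, 64, 64, 3))      # placeholder RGB patches
labels = rng.integers(0, 2, 40)            # 0 = crop, 1 = weed

features = np.array([
    hog(rgb2gray(p), orientations=9, pixels_per_cell=(8, 8),
        cells_per_block=(2, 2), block_norm="L2-Hys")
    for p in patches
])

clf = LinearSVC().fit(features, labels)
print("Training accuracy:", clf.score(features, labels))
```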
UAV-based imaging systems further enhance ML-driven weed mapping by providing ultra-high spatial resolution data suitable for early-stage weed detection. Approaches such as object-based image analysis (OBIA) and supervised machine learning applied to UAV imagery have shown promising results in delineating weed patches and supporting site-specific interventions [65]. These techniques contribute to sustainable farming practices by reducing manual labor requirements and minimizing unnecessary herbicide application [63,66,67].
Despite these advances, several challenges remain, including spectral similarity between crops and weeds, canopy occlusion, and variable illumination conditions, which can negatively affect classification accuracy. Addressing these limitations requires continued research and methodological refinement, as well as integration with advanced sensing technologies and deep learning approaches [68,69].

4.2. Deep Learning for Weed Crop Discrimination in Complex Field Environments

Deep learning (DL) approaches represent a significant advancement beyond classical machine learning methods for weed crop discrimination, particularly in complex field environments. Unlike traditional ML techniques that rely on handcrafted features, DL models automatically learn hierarchical and task-specific representations directly from image data, enabling improved robustness under variable illumination, canopy overlap, and strong visual similarity between weeds and crops [70,71,72,73].
Convolutional neural networks (CNNs), including architectures such as VGG, ResNet, YOLO-based object detectors, and semantic segmentation models such as U-Net, have demonstrated high accuracy in detecting and localizing weeds in real agricultural settings. YOLO-based models, in particular, have shown strong performance in cluttered field conditions, with reported detection accuracies of approximately 83%, while CNN-based classification and segmentation approaches have achieved accuracies of up to 98% in weed crop discrimination tasks [71,72].
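For orientation, the snippet below shows how a YOLO detector is typically run on a UAV image with the ultralytics Python package; the weights file and image path are hypothetical placeholders, and the sketch is not the exact pipeline of the cited studies.

```python
# Illustrative sketch only: running a YOLO detector on a UAV image.
# "weeds_yolov8n.pt" stands for a hypothetical custom model trained on
# annotated weed/crop images; it is not provided by the cited works.
from ultralytics import YOLO

model = YOLO("weeds_yolov8n.pt")                            # hypothetical weights
results = model.predict("uav_field_image.jpg", conf=0.25)   # illustrative path

for r in results:
    for box in r.boxes:
        cls_name = model.names[int(box.cls)]                # predicted class label
        print(cls_name, float(box.conf), box.xyxy.tolist()) # confidence and bounding box
```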
High-resolution imagery acquired from UAV platforms is commonly used to train deep learning models, with preprocessing strategies such as data augmentation and normalization applied to improve generalization across varying field conditions. These techniques are especially important for early growth stages, where weeds are small and difficult to distinguish from crops [74,75].
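A minimal example of such preprocessing, assuming RGB tiles and the commonly used ImageNet normalization statistics, might look like this:

```python
# Sketch of augmentation and normalization for UAV image tiles before training.
# The mean/std values are the standard ImageNet statistics, used only as an example.
from torchvision import transforms

train_tf = transforms.Compose([
    transforms.RandomHorizontalFlip(),
    transforms.RandomVerticalFlip(),
    transforms.RandomRotation(degrees=15),
    transforms.ColorJitter(brightness=0.2, contrast=0.2),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])
# train_tf would be passed to a Dataset/DataLoader wrapping the UAV tiles.
```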
Deep learning models are increasingly integrated into operational precision agriculture systems, including UAV-based monitoring workflows and smart spraying platforms. This integration enables near real-time weed detection and supports targeted herbicide application, contributing to more efficient and environmentally sustainable site-specific weed management [73,76,77].
Despite their advantages, DL approaches face challenges related to large labeled data requirements and computational complexity. To mitigate these limitations, transfer learning, data augmentation, and lightweight network architectures (e.g., ShuffleNetv2) have been proposed to reduce training effort and improve deployment feasibility [74,75,78]. Further improvements in model generalization and scalability remain key research priorities [79,80].
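As an illustration of the transfer-learning and lightweight-architecture strategy, the sketch below adapts an ImageNet-pretrained ShuffleNetV2 backbone to a hypothetical two-class crop/weed task; it is a generic example rather than the configuration used in the cited works.

```python
# Sketch: lightweight transfer learning for a two-class (crop vs. weed) problem.
import torch.nn as nn
from torchvision import models

model = models.shufflenet_v2_x1_0(weights="IMAGENET1K_V1")
for p in model.parameters():                     # freeze the pretrained backbone
    p.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, 2)    # new trainable head: crop vs. weed
# Only model.fc parameters are then optimized on the (smaller) weed dataset,
# which reduces labeled-data requirements and training effort.
```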

4.3. From Weed Detection to Precision Herbicide Application

The integration of AI-based weed detection outputs into precision herbicide application systems represents a critical step toward sustainable site-specific weed management. Weed distribution maps generated using machine learning or deep learning models can be translated into prescription maps that guide variable-rate or spot-specific spraying, enabling herbicides to be applied only where weeds are present [81,82,83]. This approach significantly reduces chemical inputs, minimizes environmental impacts, and lowers production costs compared with conventional broadcast spraying methods [83,84].
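The map-to-prescription step can be illustrated with a simple sketch: a weed-probability raster (here random numbers standing in for model output) is aggregated into grid cells and thresholded into a spray/no-spray prescription. The cell size and threshold are arbitrary illustration values.

```python
# Illustrative sketch: weed-probability raster -> coarse spray/no-spray grid.
import numpy as np

prob = np.random.default_rng(2).random((300, 300))   # placeholder weed-probability map
cell = 50                                            # grid cell size in pixels
threshold = 0.4                                      # spray if mean probability exceeds this

rows, cols = prob.shape[0] // cell, prob.shape[1] // cell
prescription = np.zeros((rows, cols), dtype=int)
for i in range(rows):
    for j in range(cols):
        block = prob[i*cell:(i+1)*cell, j*cell:(j+1)*cell]
        prescription[i, j] = int(block.mean() > threshold)   # 1 = treat this cell

print("Cells to spray:", prescription.sum(), "of", prescription.size)
```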
Recent advances in deep learning and computer vision have substantially improved the feasibility of real-time precision herbicide application. Object detection models such as YOLO, including versions YOLOv3 and YOLOv4, have been widely adopted for real-time weed detection using UAV- and camera-based imaging systems. These models are trained on annotated datasets of crops and weeds and can be integrated with drone-mounted cameras or smart spraying platforms to enable targeted herbicide application directly in the field [81,82]. Hybrid frameworks that combine YOLO-based object detection with U-Net semantic segmentation have further improved boundary delineation, enhancing the spatial precision of herbicide delivery [85].
Other deep learning architectures have also demonstrated strong potential for operational weed control. Approaches integrating DenseNet for high-accuracy classification with YOLO for spatial localization have achieved high validation accuracy and mean Average Precision (mAP), supporting efficient and scalable weed management strategies [82]. In addition, ensemble deep learning methods that combine multiple architectures, such as ResNet18, InceptionV3, and DenseNet201, have shown improved robustness and reduced false-positive and false-negative rates in weed detection, further enhancing the reliability of precision spraying systems [86].
In practical applications, high-resolution UAV imagery combined with advanced detection models, such as YOLOv8, has enabled selective herbicide application with minimal human intervention and improved real-time performance [87]. The integration of multispectral remote sensing data with UAV-based platforms has also facilitated site-specific herbicide application, particularly in large agricultural fields. For instance, the use of remotely piloted aerial application systems (RPAAS) in rice fields resulted in a 45% reduction in herbicide usage compared to conventional broadcast spraying methods, demonstrating the environmental and economic benefits of precision application strategies [83].
For invasive species such as Ambrosia artemisiifolia, early and targeted intervention is particularly important to prevent seed production and pollen release. UAV-derived weed maps and AI-driven detection systems support timely decision-making and enable precise treatment at early growth stages, thereby improving control efficiency and reducing long-term infestation risks. Overall, AI-enabled precision herbicide application contributes directly to sustainable farming practices by optimizing resource use, reducing environmental hazards, and improving weed management outcomes [82,84,88].

4.4. Role of AI in Supporting Sustainable Site-Specific Weed Management

Artificial intelligence plays a central role in enabling sustainable site-specific weed management (SSWM) by transforming remotely sensed data into precise, actionable interventions. By supporting accurate weed detection, spatial mapping, and targeted treatment, AI-driven approaches contribute directly to reduced herbicide inputs, lower off-target effects, and improved treatment efficiency, thereby promoting environmentally responsible weed control practices [89,90,91].
AI-based systems integrate machine learning and deep learning algorithms with image processing, computer vision, and multispectral data to accurately identify and classify weed species. This capability enables site-specific and species-specific interventions, which are increasingly important under conditions of climate change that promote faster and prolonged weed growth cycles [92,93,94]. In particular, convolutional neural networks (CNNs) have enhanced weed detection performance through automated feature learning, improving robustness to spatio-temporal variability and heterogeneous field conditions [95].
Beyond detection, AI facilitates real-time decision-making through integration with UAV platforms, smart sprayers, and robotic systems. AI-enabled spot-spraying technologies can apply herbicides directly to detected weeds with minimal overspray, substantially reducing chemical usage while preserving crop health [96]. Most notably, UAVs provide very high spatial resolution, enabling the identification and mapping of small and early-stage weed patches that are often undetectable in satellite imagery due to coarse resolution [97].
Despite these advances, the effectiveness of AI-driven SSWM depends strongly on data quality, sensor selection, and the reliability of real-time sensing and control systems. Challenges remain related to sensor interoperability, system scalability, and the deployment of robust real-time solutions under field conditions [90,98]. In this context, high-resolution UAV imagery supports more accurate early-stage weed detection and timely site-specific weed management [99]. Continued research, technological integration, and collaboration among agronomists, engineers, and policymakers are therefore essential to fully realize the potential of AI in sustainable weed management.

5. Unmanned Aerial Vehicles (UAVs) in Weed Management

Unmanned Aerial Vehicles (UAVs) play an increasingly important role in modern agriculture, particularly as platforms for high-resolution data acquisition that support weed detection and site-specific management. While UAV-based spraying technologies exist and offer advantages such as improved targeting and reduced labor demand [100], this review focuses primarily on UAV-based imaging, sensor technologies, and analytical methods for weed detection and mapping. The section highlights the advantages of UAV-based remote sensing over conventional platforms, discusses key technical constraints in image acquisition and processing, and outlines emerging solutions relevant to site-specific weed management.

5.1. Advantages of UAV-Based Remote Sensing Compared to Other Platforms

UAV-based remote sensing offers several advantages over satellite and manned airborne platforms for site-specific weed management. Most notably, UAVs provide very high spatial resolution, enabling the identification and mapping of small and early-stage weed patches that are often undetectable in satellite imagery due to coarse resolution and cloud interference [101,102,103,104]. This capability is particularly important for early-season detection, which is critical for timely and effective weed management interventions [103,105]. UAVs also offer on-demand data acquisition and reduced sensitivity to cloud cover compared with satellite platforms, making them particularly suitable for site-specific weed detection and mapping [103].
UAVs also offer greater temporal flexibility, as they can be deployed on demand and timed to coincide with optimal phenological stages or favorable illumination conditions, unlike satellites that rely on fixed revisit schedules [101,102,104]. From an operational perspective, UAVs are generally more cost-effective than manned aircraft and reduce data acquisition costs by enabling frequent, field-scale monitoring using relatively low-cost sensors [101,106,107,108].
In addition, UAV platforms support advanced imaging modalities, including RGB, multispectral, and hyperspectral sensors, which enhance species-level discrimination and improve classification accuracy when combined with machine learning and deep learning approaches [107,109,110]. These capabilities facilitate near–real-time data processing and decision-making, which are difficult to achieve with traditional remote sensing platforms. Collectively, these advantages make UAV-based remote sensing particularly well suited for site-specific weed management, offering both environmental benefits through reduced chemical inputs and economic benefits through improved management efficiency [101,102,108,110].

5.2. UAV-Based Imaging Systems and Data Acquisition for Weed Detection

Unmanned aerial vehicles (UAVs) can rapidly acquire high-resolution imagery and detect weed patches across large agricultural areas within short time intervals [111]. Current research highlights three main camera types used for UAV-based weed detection: RGB, multispectral, and hyperspectral sensors. Image processing approaches such as convolutional neural networks, deep neural networks, and object-based image analysis are commonly applied to analyze UAV imagery [112,113]. Detection accuracy is influenced by factors including flight altitude, sensor resolution, and UAV platform characteristics.
Furthermore, UAVs integrated with Geographic Information System (GIS) technologies are increasingly applied in weed management strategies. The combination of UAVs with robotics, artificial intelligence, and additional sensors has been shown to improve detection accuracy, reduce labor requirements, and support more efficient agricultural management [7]. Weed detection systems can be broadly categorized into ground-based and UAV-based image acquisition platforms; however, UAVs are particularly well-suited for field crops and grasslands due to their rapid coverage and operational flexibility [114].

5.3. Type of Cameras for Weed Detection

UAV-based weed detection relies on various imaging sensors that differ in cost, spectral resolution, data volume, and detection accuracy. RGB imaging is widely available and cost-effective, offering high spatial resolution; however, it provides limited spectral information and is highly sensitive to illumination conditions, which reduces its accuracy for precise weed identification [103,115,116,117,118]. Multispectral cameras capture reflectance in multiple discrete wavebands beyond the visible spectrum, providing enhanced spectral information that improves weed crop discrimination compared to RGB imagery, albeit at higher cost and system complexity [103,115,118]. Hyperspectral imaging systems offer the highest spectral resolution, typically capturing hundreds of narrow spectral bands. These sensors enable detailed biochemical and structural analysis and generally achieve the highest weed detection accuracy. However, their practical use is constrained by high cost, large data volumes, and complex data processing requirements [116,119,120,121,122,123,124,125].
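As a simple example of how the additional wavebands are exploited, the snippet below computes NDVI from co-registered red and near-infrared bands and thresholds it into a vegetation mask; the arrays and the 0.3 threshold are placeholders.

```python
# Minimal sketch: NDVI from red and near-infrared multispectral bands.
import numpy as np

red = np.random.default_rng(3).random((100, 100))    # red-band reflectance (placeholder)
nir = np.random.default_rng(4).random((100, 100))    # NIR-band reflectance (placeholder)

ndvi = (nir - red) / (nir + red + 1e-9)              # small epsilon avoids division by zero
vegetation_mask = ndvi > 0.3                         # example threshold for green vegetation
print("Vegetation fraction:", vegetation_mask.mean())
```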
Other sensing technologies, including thermal imaging and LiDAR, are primarily used for specific applications such as irrigation management, plant water stress detection, and structural analysis. While valuable for crop monitoring, these sensors are not suitable for direct weed identification due to limited spectral discrimination capabilities [120,126]. Similarly, fluorescence imaging and ultrasonic sensing have niche applications in plant stress detection and structural analysis, respectively, but are not appropriate for comprehensive weed detection [120]. A detailed overview of imaging technologies used in agricultural weed detection, including their advantages, challenges, and applications, is presented in Table 1.
Table 1. Overview of imaging technologies used in agriculture and weed detection, detailing their benefits, challenges, and specific applications.

5.4. Mitigation of Sensor Limitations and Emerging Solutions

Recent research has increasingly focused on overcoming the limitations of single-sensor UAV systems through sensor fusion, advanced data processing, and optimized operational strategies. The integration of RGB, multispectral, hyperspectral, and LiDAR data enables a more comprehensive characterization of weed-crop systems by combining spectral, structural, and contextual information, thereby improving discrimination accuracy under complex field conditions [120,131]. Such multimodal approaches are particularly effective for detecting small or early-stage weed patches that are difficult to identify using individual sensors alone.
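A feature-level fusion of this kind can be sketched as stacking co-registered layers from different sensors into one per-pixel feature array and training a single classifier; the layers and labels below are synthetic and serve only to show the structure of the approach.

```python
# Conceptual sketch of feature-level sensor fusion: an RGB-derived index, a
# multispectral index, and a LiDAR canopy-height layer stacked per pixel.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(5)
h, w = 120, 120
exg = rng.random((h, w))          # RGB-derived excess-green index (placeholder)
ndvi = rng.random((h, w))         # multispectral NDVI (placeholder)
chm = rng.random((h, w))          # LiDAR canopy height model in m (placeholder)

X = np.stack([exg, ndvi, chm], axis=-1).reshape(-1, 3)   # one feature row per pixel
y = rng.integers(0, 2, h * w)                            # placeholder crop/weed labels

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
weed_map = clf.predict(X).reshape(h, w)                  # fused per-pixel weed map
```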
Advances in deep learning architectures have further enhanced the value of fused UAV data. Object detection and segmentation models such as YOLO, Mask R-CNN, and transformer-based frameworks have demonstrated high accuracy in weed identification, even under variable illumination and canopy overlap [132,133]. Recent developments targeting small-object detection have shown improved performance and processing efficiency, supporting near–real-time applications in site-specific weed management [134]. To address challenges related to data scarcity and environmental variability, researchers have also explored synthetic data generation, data augmentation, and hybrid learning strategies, which improve model robustness across diverse field conditions [84,135].
From an operational perspective, UAV-based weed detection is increasingly embedded within site-specific weed management (SSWM) frameworks, enabling targeted interventions that reduce herbicide use and environmental impact [136,137]. The integration of UAV platforms with IoT-based decision support systems allows real-time monitoring, rapid data transmission, and timely management actions [138]. Machine learning plays a central role in these integrated systems by continuously adapting detection models to new data, weed species, and environmental conditions, thereby enhancing long-term reliability and scalability [107,139,140,141]. Collectively, these developments indicate a clear shift toward intelligent, integrated UAV systems that combine sensing, analytics, and decision support to enable sustainable weed management.

5.5. Image Processing and Machine Learning Workflows for Weed Detection

To differentiate weeds from crops, UAV-acquired images can be analyzed using a combination of spectral properties, morphological characteristics, texture features, and spatial context. A common method for weed detection is image processing, which includes steps such as preprocessing, segmentation, feature extraction, and classification [18], as illustrated in Figure 4.
Figure 4. Typical image processing procedures for weed identification [18].
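The pipeline of Figure 4 can be condensed into a short sketch: an excess-green (ExG) index is computed as preprocessing, Otsu thresholding performs the segmentation, and per-object shape descriptors are extracted as features for a downstream classifier. The input image is a random placeholder standing in for a UAV RGB frame scaled to [0, 1].

```python
# Simplified sketch of the Figure 4 pipeline: preprocessing -> segmentation -> features.
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label, regionprops

rgb = np.random.default_rng(6).random((200, 200, 3))      # placeholder RGB image
r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]

exg = 2 * g - r - b                                        # preprocessing: excess-green index
mask = exg > threshold_otsu(exg)                           # segmentation: vegetation mask

# Feature extraction: per-object size and shape descriptors for later classification
for region in regionprops(label(mask)):
    if region.area > 20:
        print(region.area, region.eccentricity)
```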
Due to their potential to expand agricultural mapping, machine learning methods have generated significant interest in remote sensing research during the past decade. Machine learning algorithms have been shown to classify weeds effectively [142]. The main machine learning (ML) algorithms utilized for weed detection in agricultural applications include supervised learning methods such as convolutional neural networks (CNNs), as well as the use of deep features for unsupervised data analysis. Deep learning (DL) enhances weed detection in precision agriculture by automating weed identification, reducing herbicide use, and improving sustainability [55,66,68,143,144,145,146,147,148]. A detailed summary of the current state of machine learning and deep learning applications in site-specific weed management is given in Table 2.
Table 2. Comparative overview of Machine Learning (ML) and Deep Learning (DL) in site-specific weed management.

5.6. Synthesis of ML and DL Approaches for UAV-Based Weed Detection

As summarized in Table 2, machine learning (ML) and deep learning (DL) approaches offer complementary strengths for site-specific weed management. ML methods are computationally efficient and suitable for resource-limited environments but rely on handcrafted features and often show reduced robustness under complex field conditions [131,156]. In contrast, DL models generally achieve higher detection accuracy and better generalization by automatically learning discriminative features, though they require large labeled datasets and substantial computational resources [157,158].
Recent research has focused on mitigating these limitations through transfer learning, data augmentation, lightweight network architectures, and sensor-aware model design [131,156,157,158,159]. While these strategies improve model scalability and real-time deployment on UAV platforms, challenges remain related to domain adaptation, synthetic data realism, accuracy–efficiency trade-offs, and sensitivity to environmental variability [131,159,160,161,162]. Addressing these issues is critical for developing robust and operational UAV-based weed detection systems.
Numerous research studies have recently been conducted to automate the classification and identification of weeds. The machine learning algorithm support vector machine (SVM) was used for the effective classification of crops and weeds in digital images; the results show that SVM classified a collection of 224 test photos with an accuracy of more than 97% [163].
To identify weeds in soybean crop photos taken with a drone, ref. [164] applied a convolutional neural network (CNN, AlexNet), categorized the weeds as either grass or broadleaf, and then sprayed the appropriate herbicide on the identified weeds. In this experiment, broadleaf and grass weeds could be identified with 97% accuracy using the CNN, with soil and soybean present in the background (Figure 5).
Figure 5. Original UAV image (left) and corresponding classifications by the ConvNet (center) and SVM (right). Broadleaf weeds are shown in red, grasses in blue, soil in purple, and soybean remains in natural color [164].
Ref. [142] used UAV-based multispectral imagery and machine learning for precise weed detection in Calabria, Italy. Four classification algorithms were assessed: K-Nearest Neighbour (KNN), Support Vector Machines (SVMs), Random Forests (RFs), and Normal Bayes (NB). Support Vector Machines and Random Forests showed high stability, achieving overall accuracies above 81% and 91.2%, respectively.
In [165], a deep learning strategy was applied for feature extraction from multispectral drone data to detect weeds in a lettuce field, achieving an F1 score of 94%. Further studies that used various deep learning or machine learning models and algorithms for weed identification are listed in Table 3.
Table 3. Additional studies that used various deep learning or machine learning models or algorithms for weed detection.

5.7. Applications of UAV and AI Methods for Ambrosia artemisiifolia Detection

A number of researchers have applied these technologies to the detection of Ambrosia artemisiifolia. Ref. [171] used a Phantom 3 UAV flown at 10.1 m and an S1000 UAV flown at 8 m to collect high-resolution imagery for weed detection in soybean fields. The research integrated multispectral and thermal data to identify both common weed species and glyphosate-resistant biotypes at early growth stages. The study focused on four weed species: Ambrosia artemisiifolia, Amaranthus rudis, Kochia scoparia, and Chenopodium album. Six supervised classifiers, Parallelepiped, Mahalanobis Distance, Maximum Likelihood, Spectral Angle Mapper, Support Vector Machine, and Decision Tree, were applied using pixel-based (PBIA) and object-based (OBIA) image analysis. OBIA achieved the highest performance, with overall accuracy exceeding 86%. Thermal imagery was employed to identify glyphosate resistance by assessing variations in canopy temperature, resulting in classification accuracies of 88% for kochia, 93% for Amaranthus, and 92% for Ambrosia, demonstrating the effectiveness of this method for SSWM.
Another study presents a multi-format, open-source weed image dataset designed to support real-time weed detection in precision agriculture [172]. Five major weed species, including common ragweed, were documented using both aerial and ground-based imaging. Aerial data were collected with a DJI Phantom 4 Pro (V2.0) UAV equipped with its built-in 1-inch CMOS RGB camera, capturing high-resolution 5472 × 3648 px images at approximately 12 ft altitude to obtain clear weed features. Additional individual-plant images were taken using a handheld Canon 90D camera to enrich the dataset. Images were manually annotated using LabelImg and exported in TXT, XML, and JSON formats for training deep learning models. The final dataset contains 3975 images and 10,090 annotated weed instances, including 650 ragweed instances in aerial images. The dataset provides diverse lighting, occlusion, and environmental conditions, making it suitable for training robust weed-detection models for both aerial and ground robotics applications [172]. In addition, a conference paper showed that drones equipped with compressed deep neural networks can identify Ambrosia artemisiifolia efficiently from the air [173]. By training detection and segmentation models and then optimizing them through shunt connections, knowledge distillation, and TensorRT, the system was able to run on an NVIDIA Jetson TX2 with inference times of 200–400 ms. This approach significantly lowers the cost of ragweed monitoring and enables rapid, large-scale aerial detection [173].
Another study used RGB images captured with a Google Pixel 5 device equipped with a 12.2-MP main camera (F1.7 aperture, 28 mm focal length, 1.4 μm pixels) and a 16-MP ultra-wide camera (F2.2 aperture, 1 μm pixels). The researchers extracted texture features to categorize four weed species (horseweed, kochia, ragweed, waterhemp) and six crops (black bean, canola, corn, flax, soybean, sugar beet) utilizing Support Vector Machine (SVM) and VGG16 deep learning models, based on 3792 images captured in a greenhouse. The VGG16 model outperformed SVM, obtaining F1-scores between 93% and 97.5%, with an excellent 100% for maize, highlighting its applicability in site-specific weed management within precision agriculture [174]. A further study used UAV-based sensing to detect glyphosate-resistant weeds, including kochia, waterhemp, redroot pigweed, and ragweed. A DJI M600 equipped with a Zenmuse XT2 thermal camera and a Micasense RedEdge-MX Dual multispectral sensor captured canopy data, which were classified using Maximum Likelihood, Random Trees, and SVM learning methods. Multispectral features (NDVI and the 705/740/842 nm bands) outperformed thermal imaging, with the best result, 87.2% accuracy for ragweed, achieved using the Random Trees classifier [115]. Using 5-band multispectral UAV imagery captured at 15 m, another study evaluated species differentiation among Palmer amaranth, common ragweed, and sicklepod at plant heights of 5, 10, 15, and 30 cm. Supervised image classification achieved 24–100% accuracy, with Palmer amaranth consistently identified at 100%, and ragweed and sicklepod showing lower but usable accuracy. Although spectral responses varied with species and height, clear separation, especially in bands 3 and 4n, demonstrated the potential of multispectral sensing for weed species discrimination [175].
Ref. [176] employed hyperspectral imaging with a Cubert UHD185 camera mounted on a tripod for ground-based imaging at a fixed distance of approximately 90 cm to detect invasive and weed species, such as Ambrosia artemisiifolia, Euphorbia seguieriana, Atriplex tatarica, Glycyrrhiza glabra, and Setaria pumila, in grain agroecosystems following the harvest of winter wheat. Utilizing statistical approaches and machine learning techniques (Principal Component Analysis, decision tree, random forest), they computed 80 vegetation indices (VIs) and successfully differentiated among weed types. The research highlighted the Derivative index (D1), Chlorophyll content index (Datt3), and Pigment-specific normalized difference (PSND) as critical metrics for accurate weed identification.
Ref. [177] used a ground-based hyperspectral imaging system (400–1000 nm, ImSpector V10E spectrograph with a Pixelfly QE CCD camera mounted on a fixed indoor imaging setup) to differentiate Ambrosia artemisiifolia (ragweed) from Artemisia vulgaris (mugwort) across three growth stages by analyzing stem and leaf reflectance, particularly at 450, 550, 650, and 680–712 nm. The study found that wavelengths of 550 nm and 650 nm were especially effective for detecting A. artemisiifolia stems during the fruit development stage, regardless of the surrounding crop environment. The following section discusses these technical challenges and outlines future research directions for UAV-based SSWM.

6. Challenges, Limitations, and Future Directions of UAV-Based SSWM for Ambrosia artemisiifolia

While unmanned aerial vehicles offer distinct benefits for high-resolution monitoring and mapping of weeds, numerous challenges and limitations constrain their operational application in site-specific weed management (SSWM) of Ambrosia artemisiifolia. These include challenges associated with data accuracy, scalability, and environmental effects, as well as technical, financial, regulatory, and operational issues. Overcoming these obstacles will require concerted efforts to establish standardized protocols, enhance UAV technology, and provide users with financial assistance and training [178,179,180,181].

6.1. Regulatory, Operational, and Economic Constraints

6.1.1. Fragmented Regulations

The deployment of UAVs may be restricted by the strict laws that frequently govern their operations. It is challenging to standardize UAV-based techniques for controlling Ambrosia artemisiifolia because these rules can differ greatly by location [178,179]. For instance, in Europe, the U-space concept aims to integrate UAVs into the airspace, but the regulatory framework is still evolving and varies between countries [182].

6.1.2. Safety and Risk Management

Regulatory authorities underscore safety, necessitating thorough risk evaluations and safety solutions for UAV operations, particularly in populated or crucial regions [183,184]. The Specific Operations Risk Assessment (SORA) framework assists in identifying and mitigating safety risks; however, it also introduces additional complexity to the regulatory approval process [184].

6.1.3. Technical and Operational Constraints

The deployment of UAVs in some circumstances may be limited by their need to adhere to operating limitations and technical standards [183]. Their application in the management of invasive species is further complicated by factors like airworthiness, flying duration, and range limitations [185].

6.1.4. Insurance, Liability, and Economic Constraints

Due to a lack of actuarial data, limited operational experience, and considerable regulatory uncertainty, aviation insurers struggle to offer adequate coverage for UAV operations. Many countries do not have complete data on UAV incidents and accidents, and many UAV models have not been in use long enough to establish reliable risk profiles. To address this coverage gap, a multi-sector cooperation framework involving aviation insurers, governments, and UAV manufacturers has been proposed [186]. The implementation of UAVs is also frequently hindered by substantial initial expenses, including the acquisition of the UAVs and the requisite sensors and equipment [187,188].

6.2. Sensor and Image Acquisition Constraints

6.2.1. Solar Angle

The accuracy and robustness of classification models can be significantly affected by the high sensitivity of UAV-based remote sensing to varying lighting conditions, especially in RGB and multispectral images. Sun-camera geometry, including the solar angle, strongly influences reflectance measurements. Even a 2° change in view angle can alter reflectance by more than ±50% of the nadir value, requiring correction for directional effects to ensure reliable data [189].

6.2.2. Flight Altitude and Spatial Resolution

Reflectance accuracy is also affected by the angle at which incident light sensors (ILS) face the sun and by the altitude of the aircraft [190]. Complex terrain and other topographic factors can skew sun illumination and interfere with optical remote sensing studies [191]. Identifying individual weed plants in varied environments requires high spatial resolution and therefore low-altitude flights [192,193,194]. Nonetheless, lower altitudes reduce the area covered per image and increase the number of images required, complicating the mission and increasing data volume [193,195].
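The altitude-resolution trade-off follows directly from the standard ground sampling distance (GSD) relation, as the worked example below shows; the sensor parameters are illustrative values, not tied to any specific UAV camera.

```python
# Worked example of the altitude/resolution trade-off via the standard GSD relation.
def gsd_cm(altitude_m, pixel_size_um=2.4, focal_length_mm=8.8):
    """GSD in cm/pixel = (sensor pixel size * flight altitude) / focal length."""
    return (pixel_size_um * 1e-6 * altitude_m) / (focal_length_mm * 1e-3) * 100

for alt in (10, 30, 60, 120):
    print(f"{alt:>4} m altitude -> {gsd_cm(alt):.2f} cm/pixel")
# A lower flight altitude gives a finer GSD (better for small weeds) but a smaller
# footprint per image, and hence more images per field.
```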

6.2.3. Motion Blur and Platform Instability

Wind-related UAV instability can cause motion blur, reducing image quality and hindering accurate weed detection. This issue is amplified during high-speed flights, where rapid movements further intensify the blur [196,197,198]. A UAV-based photogrammetric inspection study quantified how motion blur degrades reconstruction accuracy. The researchers found that motion-induced blurring significantly distorts image features, increasing reconstruction standard deviation and peak-to-peak error by up to a factor of two under typical flight conditions. In the worst tested scenario, overall reconstruction error degraded by a factor of 13 compared to the optimal setup, demonstrating the severe sensitivity of image-based inspections to motion blur [197]. Similarly, another study noted that motion blur and other environmental factors could reduce the accuracy of weed detection models [199].
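A rough way to reason about this effect is to express blur length in pixels as forward speed times exposure time divided by the GSD, as in the small sketch below (all values illustrative).

```python
# Back-of-the-envelope estimate of motion blur in image pixels.
def blur_pixels(speed_m_s, exposure_s, gsd_m):
    return speed_m_s * exposure_s / gsd_m

# e.g., 8 m/s flight, 1/500 s exposure, 1 cm GSD -> ~1.6 px of blur
print(blur_pixels(8.0, 1 / 500, 0.01), "pixels of blur")
# Slower flight, a faster shutter, or a coarser GSD all reduce the blur.
```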

6.3. Weed–Crop Spectral and Structural Complexity

Ambrosia artemisiifolia poses significant challenges in agricultural fields because its color, texture, and shape are similar to those of crops and other broadleaf species, especially during early phenological stages, making it difficult to distinguish [141,200,201,202]. Crops and weeds such as ragweed share similar spectral signatures, especially in the visible and near-infrared (NIR) spectrum, complicating their differentiation using standard RGB or multispectral sensors [203,204,205]. The interaction of light with complex canopy structures can result in distorted or corrupted spectral data, further hindering detection [206]. The application of particular vegetation indices (VIs), such as the Derivative index (D1), Chlorophyll content index (Datt3), and Pigment-specific normalized difference (PSND), has demonstrated enhanced efficacy in identifying weed species such as ragweed, even in obstructed situations [176].

6.4. Potential Directions to Overcome Limitations

Building on the technical and operational challenges discussed in previous sections, several priority research and development directions can help address current limitations in UAV-based detection of Ambrosia artemisiifolia. First, the use of optimized UAV platforms, particularly rotary-wing systems for low-altitude, high-resolution imaging and fixed-wing UAVs for larger-area surveys, can improve detection performance across different spatial scales [207,208,209,210]. Integrating multispectral and RGB data can also strengthen detection reliability across varying growth stages and environmental conditions [211].
Advanced computational methods are another priority. Deep neural networks (DNNs), including compressed models optimized for embedded platforms like Nvidia Jetson TX2, allow for faster and more accurate detection while enabling real-time or near–real-time processing onboard UAVs [173]. Complementary machine learning approaches, such as vegetation indices (e.g., TDVI), Support Vector Machines (SVM), Maximum Likelihood (ML) classifiers, and multilayer perceptrons (MLP-ARD), can assist in overcoming spectral instability and improving classification accuracy under variable lighting and field conditions [177,211,212]. Object-based image analysis and fuzzy-logic frameworks offer further benefits by incorporating spatial context and landscape characteristics, enabling stable prediction of ragweed likelihood in highly heterogeneous areas, including urban landscapes [210].
Optimizing flight parameters such as altitude, image overlap, and timing under stable illumination helps reduce blur, shadow effects, and spectral inconsistencies that currently hinder detection reliability [112,207]. Finally, improved detection algorithms, including YOLOv5-based models and novelty-detection classifiers, can better handle challenges such as overlapping leaves, morphological variability, and dense ragweed patches that commonly occur in the field [209,213]. These directions underline the necessity of integrating new sensing technologies, enhanced data processing methodologies, and optimized UAV operations to attain more resilient and scalable detection of Ambrosia artemisiifolia in varied situations.

7. Conclusions

Ambrosia artemisiifolia is among the most damaging invasive weeds affecting agricultural productivity and public health in Hungary and across Europe. In arable systems, it readily colonizes disturbed habitats and low-cover crop environments, where its competitiveness increases infestation pressure and contributes to yield losses and higher management costs [214,215,216]. Beyond agriculture, ragweed produces highly allergenic pollen that drives allergic rhinitis and asthma burdens in Central and Eastern Europe, with pollen and seed production varying across regions and seasons [36,214,217]. Accordingly, effective management requires prevention of further spread and scalable interventions, including disturbance reduction and sustainable options [218,219].
Uniform herbicide-based control is increasingly unsustainable due to environmental impacts, resistance risks, and regulatory pressure to reduce chemical inputs [220,221,222], consistent with goals such as the EU Green Deal [223]. Precision and site-specific weed management (SSWM) offer a practical alternative by enabling detection-driven interventions that reduce unnecessary herbicide use; such interventions fundamentally depend on the spatially explicit characterization of vegetation condition derived from spectral remote sensing information [224,225,226]. In particular, UAV-based remote sensing supports flexible, on-demand, high-resolution monitoring for early-season detection and patch-level mapping [113], and recent advances in UAV weed detection using deep learning further improve the feasibility of producing actionable weed maps for targeted treatments [134,227].
UAV-based remote sensing using RGB, multispectral, and hyperspectral sensors provides high-resolution and timely weed detection that is not achievable with satellite platforms, particularly during early growth stages critical for effective control. Multispectral and hyperspectral data enhance weed-crop discrimination, while flexible UAV deployment enables targeted monitoring of invasive species such as Ambrosia artemisiifolia [107,228]. When combined with machine learning and deep learning models, including SVMs and CNNs, UAV imagery supports accurate weed mapping and site-specific herbicide application, reducing chemical inputs and environmental impacts [107,212,229].
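As a simple illustration of how a detection layer can feed site-specific application, the sketch below aggregates a binary weed mask into a coarse management grid and flags only cells whose weed cover exceeds a threshold, which is the basic logic behind a prescription map; the grid size, cover threshold, and synthetic mask are assumptions for demonstration only.

```python
import numpy as np

# Synthetic stand-in for a per-pixel weed mask produced by a CNN or SVM classifier.
rng = np.random.default_rng(1)
H, W = 120, 160
weed_mask = rng.random((H, W)) > 0.97          # True where weeds were detected

cell = 20                                      # pixels per management-grid cell
cells_y, cells_x = H // cell, W // cell
spray_map = np.zeros((cells_y, cells_x), dtype=bool)

for i in range(cells_y):
    for j in range(cells_x):
        block = weed_mask[i * cell:(i + 1) * cell, j * cell:(j + 1) * cell]
        spray_map[i, j] = block.mean() > 0.02  # treat only cells above the cover threshold

print(f"cells flagged for treatment: {100 * spray_map.mean():.0f}% of the field")
```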
Despite these advantages, several technical, sensor-related, operational, and regulatory challenges continue to limit the widespread adoption of UAV-based weed management. From a technical perspective, UAV operations are constrained by limited flight duration and payload capacity, which restrict sensor selection and coverage area [230,231]. Sensor-related challenges include the limited spectral information of RGB cameras, the high cost and large data volumes associated with multispectral and hyperspectral sensors, and image quality issues caused by variable illumination, motion blur, and platform instability, all of which affect detection accuracy and data-processing efficiency [231,232,233]. In addition, onboard data processing and real-time analysis remain challenging, particularly for computationally intensive machine learning and deep learning models [232,233]. These technical constraints are compounded by high initial investment costs and regulatory restrictions on UAV operations, which pose additional barriers for small and medium-sized farms and highlight the need for cooperative UAV-sharing schemes, policy support, and targeted training programs to enable broader adoption [185,188,234].
Future research should prioritize the development of lightweight and sensor-aware deep learning models, along with multi-sensor data fusion strategies, to improve the robustness and transferability of UAV-based detection of Ambrosia artemisiifolia under variable field conditions [173,228,235]. Particular emphasis should be placed on integrating RGB, multispectral, and hyperspectral imagery with optimized machine learning and deep learning architectures that enable real-time or near–real-time onboard processing [141,228,236,237]. In parallel, advances in mission planning, standardized acquisition protocols, and cloud and edge computing frameworks are required to enhance the scalability and operational efficiency of UAV-based site-specific weed management [141,235]. Beyond technical developments, interdisciplinary research combining agronomy, ecology, and policy analysis is essential to evaluate long-term ecological impacts, address regulatory constraints, and support wider adoption through cooperative UAV-sharing models and targeted policy incentives [34,228,236]. Finally, effective translation of UAV-based technologies into practical and sustainable solutions for controlling Ambrosia artemisiifolia in Hungary and across Europe will depend on strong collaboration among agronomists, engineers, data scientists, and policymakers, supported by targeted farmer training programs and enabling policy frameworks [187,238,239,240,241]. Such coordinated efforts are essential to overcome technical, economic, and regulatory barriers, promote adoption among small and medium-sized farms, and ensure that UAV-driven site-specific weed management contributes meaningfully to long-term agricultural sustainability [242,243,244].

Author Contributions

Conceptualization, S.Y.H.; methodology, S.Y.H.; software, S.Y.H. and G.M.; validation, S.Y.H.; resources, S.Y.H.; data curation, S.Y.H. and G.M.; writing—original draft preparation, S.Y.H.; writing—review and editing, S.Y.H.; supervision and funding acquisition, G.M. and G.P.K. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Data Availability Statement

No new data were created or analyzed in this study.

Acknowledgments

The authors gratefully acknowledge the support of the Hungarian University of Agriculture and Life Sciences (MATE) and the Stipendium Hungaricum Scholarship Programme. The publication of this article was financially supported by MATE and the Stipendium Hungaricum Scholarship.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
AI	Artificial Intelligence
CNN	Convolutional Neural Networks
D1	Derivative Index
DL	Deep Learning
EAN	European Aeroallergen Network
EPI	European Pollen Information
FAO	Food and Agriculture Organization
GPU	Graphics Processing Unit
ILS	Incident Light Sensors
KNN	K-Nearest Neighbour
ML	Machine Learning
NB	Naïve Bayes
OBIA	Object-Based Image Analysis
PA	Precision Agriculture
PBIA	Pixel-Based Image Analysis
PSND	Pigment-Specific Normalized Difference
RF	Random Forests
RGB	Red, Green, Blue
RPAAS	Remotely Piloted Aerial Application Systems
SDM	Species Distribution Models
SORA	Specific Operations Risk Assessment
SSWM	Site-Specific Weed Management
SVM	Support Vector Machines
UAVs	Unmanned Aerial Vehicles
VIs	Vegetation Indices
WSSA	Weed Science Society of America

References

  1. FAO. How to Feed the World in 2050: Insights from an Expert Meeting at FAO; FAO: Rome, Italy, 2009; pp. 1–35. [Google Scholar]
  2. Castellano, G.; De Marinis, P.; Vessio, G. Weed mapping in multispectral drone imagery using lightweight vision transformers. Neurocomputing 2023, 562, 126914. [Google Scholar] [CrossRef]
  3. Burke, M.; Driscoll, A.; Lobell, D.B.; Ermon, S. Using satellite imagery to understand and promote sustainable development. Science 2021, 371, eabe8628. [Google Scholar] [CrossRef]
  4. Vougioukas, S.G. Agricultural Robotics. Annu. Rev. Control Robot. Auton. Syst. 2019, 2, 365–392. [Google Scholar] [CrossRef]
  5. Stafford, J.V. Implementing precision agriculture in the 21st century. J. Agric. Eng. Res. 2000, 76, 267–275. [Google Scholar] [CrossRef]
  6. Zhang, N.; Wang, M.; Wang, N. Precision agriculture—A worldwide overview. Comput. Electron. Agric. 2002, 36, 113–132. [Google Scholar] [CrossRef]
  7. Meesaragandla, S.; Jagtap, M.P.; Khatri, N.; Madan, H.; Vadduri, A.A. Herbicide spraying and weed identification using drone technology in modern farms: A comprehensive review. Results Eng. 2024, 21, 101870. [Google Scholar] [CrossRef]
  8. Vijayakumar, V.; Ampatzidis, Y.; Schueller, J.K.; Burks, T. Smart spraying technologies for precision weed management: A review. Smart Agric. Technol. 2023, 6, 100337. [Google Scholar] [CrossRef]
  9. Abo-Habaga, M.; Imara, Z.; Okasha, M. Development of a Combine Hoeing Machine for Flat and Ridged Soil. J. Soil Sci. Agric. Eng. 2018, 9, 817–820. [Google Scholar] [CrossRef]
  10. Roshan, P.; Kulshreshtha, A.; Hallan, V. Global Weed-Infecting Geminiviruses. In Geminiviruses: Impact, Challenges and Approaches; Springer International Publishing: Cham, Switzerland, 2019. [Google Scholar]
  11. Zandstra, B.H.; Motooka, P.S. Beneficial Effects of Weeds in Pest Management—A Review. PANS 1978, 24, 333–338. [Google Scholar] [CrossRef]
  12. Duffus, J.E. Role of Weeds in the Incidence of Virus Diseases. Annu. Rev. Phytopathol. 1971, 9, 319–340. [Google Scholar] [CrossRef]
  13. Byron, M.; Treadwell, D.D.; Dittmar, P.J. Weeds as Reservoirs of Plant Pathogens Affecting Economically Important Crops. Edis 2019, 2019, 7. [Google Scholar] [CrossRef]
  14. Makra, L.; Juhász, M.; Borsos, E.; Béczi, R. Meteorological variables connected with airborne ragweed pollen in Southern Hungary. Int. J. Biometeorol. 2004, 49, 37–47. [Google Scholar] [CrossRef]
  15. Sala, O.E.; Stuart Chapin, F.; Armesto, J.J.; Berlow, E.; Bloomfield, J.; Dirzo, R.; Huber-Sanwald, E.; Huenneke, L.F.; Jackson, R.B.; Kinzig, A.; et al. Global Biodiversity Scenarios for the Year 2100. Science 2000, 287, 1770–1774. [Google Scholar] [CrossRef]
  16. Vitousek, P.M.; Mooney, H.A.; Lubchenco, J.; Melillo, J.M. Human Domination of Earth’s Ecosystems. Science 1997, 277, 494–499. [Google Scholar] [CrossRef]
  17. Bradley, B.A.; Blumenthal, D.M.; Wilcove, D.S.; Ziska, L.H. Predicting plant invasions in an era of global change. Trends Ecol. Evol. 2010, 25, 310–318. [Google Scholar] [CrossRef]
  18. Liu, B.; Bruch, R. Weed Detection for Selective Spraying: A Review. Curr. Robot. Rep. 2020, 1, 19–26. [Google Scholar] [CrossRef]
  19. Lanini, W.T.; Le Strange, M. Low-input management of weeds in vegetable fields. Calif. Agric. 1991, 45, 11–13. [Google Scholar] [CrossRef]
  20. Amare, T.; Sharma, J.J.; Zewdie, K. Effect of Weed Control Methods on Weeds and Wheat (Triticum aestivum L.) Yield. World J. Agric. Res. 2014, 2, 124–128. [Google Scholar] [CrossRef]
  21. Bullock, J.M.; Chapman, D.; Schafer, S.; Roy, D.; Girardello, M.; Haynes, T.; Beal, S.; Wheeler, B.; Dickie, I.; Phang, Z.; et al. Assessing and Controlling the Spread and the Effects of Common Ragweed in Europe. Final Report: ENV.B2/ETU/2010/0037. 2010. Available online: https://circabc.europa.eu/sd/d/d1ad57e8-327c-4fdd-b908-dadd5b859eff/FinalFinalReport.pdf (accessed on 10 December 2025).
  22. Jager, O.R.S. Ambrosia (ragweed) in Europe. Allergy Clin. Immunol. Int. 2001, 13, 60–66. [Google Scholar]
  23. Jäger, S. Global aspects of ragweed in Europe. In Proceedings of the 6th International Congress on Aerobiology. Satellite Symposium Proceedings: Ragweed in Europe, Perugia, Italy, 22–26 August 1998; ALK Abelló: Hørsholm, Denmark, 1998; pp. 6–10. [Google Scholar]
  24. Juhász, M. History of ragweed in Europe. In Proceedings of the 6th International Congress on Aerobiology. Satellite Symposium Proceedings: Ragweed in Europe, Perugia, Italy, 22–26 August 1998; ALK Abelló: Hørsholm, Denmark, 1998; pp. 11–14. [Google Scholar]
  25. Makra, L.; Matyasovszky, I.; Deák, Á.J. Ragweed in Eastern Europe. In Invasive Species and Global Climate Change; CABI: Wallingford, UK, 2014; pp. 117–128. [Google Scholar] [CrossRef]
  26. Gaudeul, M.; Giraud, T.; Kiss, L.; Shykoff, J.A. Nuclear and chloroplast microsatellites show multiple introductions in the worldwide invasion history of common ragweed, Ambrosia artemisiifolia. PLoS ONE 2011, 6, e17658. [Google Scholar] [CrossRef]
  27. Makra, L.; Matyasovszky, I.; Hufnagel, L.; Tusnády, G. The history of ragweed in the world. Appl. Ecol. Environ. Res. 2015, 13, 489–512. [Google Scholar] [CrossRef]
  28. Járai-Komlódi, M.; Juhász, M. Ambrosia elatior (L.) in Hungary (1989–1990). Aerobiologia 1993, 9, 75–78. [Google Scholar] [CrossRef]
  29. Makra, L.; Juhász, M.; Béczi, R.; Borsos, E.K. The history and impacts of airborne Ambrosia (Asteraceae) pollen in Hungary. Grana 2005, 44, 57–64. [Google Scholar] [CrossRef]
  30. European Aeroallergen Network (EAN). Distribution of Ambrosia artemisiifolia Pollen in Europe. 2025. Available online: https://ean.polleninfo.eu/Ean (accessed on 10 December 2025).
  31. European Pollen Information (EPI). Pollen Information and Ragweed Pollen Distribution Data. 2025. Available online: https://www.polleninformation.at/en/allergy/pollen-load-map-of-europe (accessed on 10 December 2025).
  32. Tóth, Á.; Hoffmanné, P.Z.; Szentey, L. A parlagfű (Ambrosia elatior) helyzet 2003-ban Magyarországon. A Levegő Pollenszám Csökkentésének Nehézségei 2004, 14. [Google Scholar]
  33. Pinke, G.; Kolejanisz, T.; Vér, A.; Nagy, K.; Milics, G.; Schlögl, G.; Bede-Fazekas, Á.; Botta-Dukát, Z.; Czúcz, B. Drivers of Ambrosia artemisiifolia abundance in arable fields along the Austrian-Hungarian border. Preslia 2019, 91, 369–389. [Google Scholar] [CrossRef]
  34. Knolmajer, B.; Jócsák, I.; Taller, J.; Keszthelyi, S.; Kazinczi, G. Common Ragweed—Ambrosia artemisiifolia L.: A Review with Special Regards to the Latest Results in Protection Methods, Herbicide Resistance, New Tools and Methods. Agronomy 2025, 15, 1765. [Google Scholar] [CrossRef]
  35. Gerber, E.; Schaffner, U.; Gassmann, A.; Hinz, H.L.; Seier, M.; Müller-Schärer, H. Prospects for biological control of Ambrosia artemisiifolia in Europe: Learning from the past. Weed Res. 2011, 51, 559–573. [Google Scholar] [CrossRef]
  36. Leru, P.M.; Eftimie, A.M.; Anton, V.F.; Thibaudon, M. Assessment of the risks associated with the invasive weed Ambrosia artemisiifolia in urban environments in Romania. Ecocycles 2019, 5, 56–61. [Google Scholar] [CrossRef]
  37. Bohren, C.; Mermillod, G.; Delabays, N. Ambrosia artemisiifolia L.—Control measures and their effects on its capacity of reproduction. J. Plant Dis. Prot. 2008, 21, 311–316. [Google Scholar]
  38. Attila, M. Ambrosia vs. authority: Tasks, methods and results of the land management in ragweed prevention and monitoring of common interest. Geod. Kartogr. 2009, 61, 32–34. [Google Scholar]
  39. Vidović, B.; Cvrković, T.; Rančić, D.; Marinković, S.; Cristofaro, M.; Schaffner, U.; Petanović, R. Eriophyid mite Aceria artemisiifoliae sp.nov. (Acari: Eriophyoidea) potential biological control agent of invasive common ragweed, Ambrosia artemisiifolia L. (Asteraceae) in Serbia. Syst. Appl. Acarol. 2016, 21, 919–935. [Google Scholar] [CrossRef]
  40. Lommen, S.T.; Jolidon, E.F.; Sun, Y.; Bustamante Eduardo, J.I.; Müller-Schärer, H. An early suitability assessment of two exotic Ophraella species (Coleoptera: Chrysomelidae) for biological control of invasive ragweed in Europe. Eur. J. Entomol. 2017, 114, 160–169. [Google Scholar] [CrossRef]
  41. Dirr, L.; Bastl, K.; Bastl, M.; Berger, U.E.; Bouchal, J.M.; Kofol Seliger, A.; Magyar, D.; Ščevková, J.; Szigeti, T.; Grímsson, F. The Ragweed Finder: A Citizen-Science Project to Inform Pollen Allergy Sufferers About Ambrosia artemisiifolia Populations in Austria. Appl. Sci. 2025, 15, 12333. [Google Scholar] [CrossRef]
  42. Valkó, O.; Deák, B.; Török, P.; Kelemen, A.; Miglécz, T.; Tóth, K.; Tóthmérész, B. Abandonment of croplands: Problem or chance for grassland restoration? case studies from hungary. Ecosyst. Health Sustain. 2016, 2, e01208. [Google Scholar] [CrossRef]
  43. Kazinczi, G.; Beres, I.; Novák, R.; Biro, K.; Pathy, Z. Common ragweed (Ambrosia artemisiifolia): A review with special regards to the results in Hungary. I. Taxonomy, origin and distribution, morphology, life cycle and reproduction strategy. Herbologia 2008, 9, 55–91. [Google Scholar]
  44. Qin, Z.; Mao, D.J.; Quan, G.M.; Zhang, J.-E.; Xie, J.F.; DiTommaso, A. Physiological and morphological responses of invasive Ambrosia artemisiifolia (common ragweed) to different irradiances. Botany 2012, 90, 1284–1294. [Google Scholar] [CrossRef]
  45. Zhao, W.; Xue, Z.; Liu, T.; Wang, H.; Han, Z. Factors affecting establishment and population growth of the invasive weed Ambrosia artemisiifolia. Front. Plant Sci. 2023, 14, 1251441. [Google Scholar] [CrossRef]
  46. Smith, M.; Cecchi, L.; Skjøth, C.A.; Karrer, G.; Šikoparija, B. Common ragweed: A threat to environmental health in Europe. Environ. Int. 2013, 61, 115–126. [Google Scholar] [CrossRef] [PubMed]
  47. Sărățeanu, V.; Moisuc, A.; Cotuna, O. Ambrosia artemisiifolia L. an invasive weed from ruderal areas to disturbed grasslands. Lucr. Ştiinţifice—Ser. Agron. 2010, 53, 235–238. [Google Scholar]
  48. Lehoczky, E.; Szabó, R.; Nelima, M.O.; Nagy, P.; Béres, I. Examination of common ragweed’s (Ambrosia artemisiifolia L.) allelopathic effect on some weed species. Commun. Agric. Appl. Biol. Sci. 2010, 75, 107–111. [Google Scholar]
  49. Lehoczky, E.; Gólya, G.; Szabó, R.; Szalai, A. Allelopathic effects of ragweed (Ambrosia artemisiifolia L.) on cultivated plants. Commun. Agric. Appl. Biol. Sci. 2011, 76, 545–549. [Google Scholar]
  50. Beam, S.C.; Cahoon, C.W.; Haak, D.C.; Holshouser, D.L.; Mirsky, S.B.; Flessner, M.L. Integrated Weed Management Systems to Control Common Ragweed (Ambrosia artemisiifolia L.) in Soybean. Front. Agron. 2021, 2, 598426. [Google Scholar] [CrossRef]
  51. Pannacci, E.; Lattanzi, B.; Tei, F. Non-chemical weed management strategies in minor crops: A review. Crop Prot. 2017, 96, 44–58. [Google Scholar] [CrossRef]
  52. Xu, K.; Shu, L.; Xie, Q.; Song, M.; Zhu, Y.; Cao, W.; Ni, J. Precision weed detection in wheat fields for agriculture 4.0: A survey of enabling technologies, methods, and research challenges. Comput. Electron. Agric. 2023, 212, 108. [Google Scholar] [CrossRef]
  53. Carver, B.F. Wheat Science and Trade; Wiley: Hoboken, NJ, USA, 2009; pp. 1–569. [Google Scholar] [CrossRef]
  54. European Food Safety Authority (EFSA). The 2015 European Union report on pesticide residues in food. EFSA J. 2017, 15, e04791. [Google Scholar] [CrossRef] [PubMed]
  55. Hasan, A.S.M.M.; Sohel, F.; Diepeveen, D.; Laga, H.; Jones, M.G.K. A survey of deep learning techniques for weed detection from images. Comput. Electron. Agric. 2021, 184, 106067. [Google Scholar] [CrossRef]
  56. Wiles, L.J. Beyond patch spraying: Site-specific weed management with several herbicides. Precis. Agric. 2009, 10, 277–290. [Google Scholar] [CrossRef]
  57. Rai, N.; Zhang, Y.; Ram, B.G.; Schumacher, L.; Yellavajjala, R.K.; Bajwa, S.; Sun, X. Applications of deep learning in precision weed management: A review. Comput. Electron. Agric. 2023, 206, 107698. [Google Scholar] [CrossRef]
  58. da Costa Lima, A.; Mendes, K.F. Variable Rate Application of Herbicides for Weed Management in Pre- and Postemergence. In Pests, Weeds and Diseases in Agricultural Crop and Animal Husbandry Production; Kontogiannatos, D., Kourti, A., Mendes, K.F., Eds.; IntechOpen: Rijeka, Croatia, 2020. [Google Scholar] [CrossRef]
  59. Velumani, K.; Madec, S.; de Solan, B.; Lopez-Lozano, R.; Gillet, J.; Labrosse, J.; Jezequel, S.; Comar, A.; Baret, F. An automatic method based on daily in situ images and deep learning to date wheat heading stage. Field Crops Res. 2020, 252, 107793. [Google Scholar] [CrossRef]
  60. Barbedo, J.G.A. Detection of nutrition deficiencies in plants using proximal images and machine learning: A review. Comput. Electron. Agric. 2019, 162, 482–492. [Google Scholar] [CrossRef]
  61. Madeshwar, M.; Priyan, M.V.; Manvizhi, N. Hybrid Vision Transformer and CNN-Based System for Real-Time Weed Detection in Precision Agriculture. In Proceedings of the 2025 International Conference on Emerging Technologies in Engineering Applications, ICETEA, Puducherry, India, 5–6 June 2025. [Google Scholar] [CrossRef]
  62. Bazrafkan, A.; Kosugi, Y.; Flores, P. A machine learning extension built on ArcGIS for the detection of weeds in cornfields. In Proceedings of the 2024 ASABE Annual International Meeting, Anaheim, CA, USA, 28–31 July 2024. [Google Scholar] [CrossRef]
  63. Sharma, A.; Sharma, S.; Malik, A.; Sobti, R.; Hussain, M. Enhancing Sustainable Farming with Automated Weed Detection: A Hybrid Approach Using Image Processing and Machine Learning; Lecture Notes in Networks and Systems, LNNS; Springer: Cham, Switzerland, 2025; Volume 1399, pp. 54–68. [Google Scholar] [CrossRef]
  64. Sunil, G.C.; Upadhyay, A.; Zhang, Y.; Howatt, K.; Peters, T.; Ostlie, M.; Aderholdt, W.; Sun, X. Field-based multispecies weed and crop detection using ground robots and advanced YOLO models: A data and model-centric approach. Smart Agric. Technol. 2024, 9, 100538. [Google Scholar] [CrossRef]
  65. Pérez-Ortiz, M.; Peña, J.M.; Gutiérrez, P.A.; Torres-Sánchez, J.; Hervás-Martínez, C.; López-Granados, F. Selecting patterns and features for between- and within- crop-row weed mapping using UAV-imagery. Expert. Syst. Appl. 2016, 47, 85–94. [Google Scholar] [CrossRef]
  66. Al-Badri, A.H.; Ismail, N.A.; Al-Dulaimi, K.; Salman, G.A.; Khan, A.R.; Al-Sabaawi, A.; Salam, M.S.H. Classification of weed using machine learning techniques: A review—Challenges, current and future potential techniques. J. Plant Dis. Prot. 2022, 129, 745–768. [Google Scholar] [CrossRef]
  67. Shilaskar, S.; Gholap, R.; Bhatlawande, S.; Ranade, N.; Ghadge, S. Computer Vision Based Detection of Weed in Ginger and Sugarcane Crop for Automated Farming System. In Proceedings of the 2023 International Conference on IoT, Communication and Automation Technology, ICICAT 2023, Gorakhpur, India, 23–24 June 2023. [Google Scholar] [CrossRef]
  68. Adhinata, F.D.; Wahyono; Sumiharto, R. A comprehensive survey on weed and crop classification using machine learning and deep learning. Artif. Intell. Agric. 2024, 13, 45–63. [Google Scholar] [CrossRef]
  69. de Villiers, C.; Munghemezulu, C.; Mashaba-Munghemezulu, Z.; Chirima, G.J.; Tesfamichael, S.G. Weed Detection in Rainfed Maize Crops Using UAV and PlanetScope Imagery. Sustainability 2023, 15, 13416. [Google Scholar] [CrossRef]
  70. Kavitha, K.; Gopalakrishnan, K.; Balaji, S.; Jeevanantham, J.; Aakhila Hayathunisa, M. Crop Classification using Convolutional Neural Network. In Proceedings of the 15th International Conference on Advances in Computing, Control, and Telecommunication Technologies, ACT 2024, Hyderabad, India, 21–22 June 2024; Volume 2, pp. 5438–5444. [Google Scholar]
  71. Das, A.; Yang, Y.; Subburaj, V.H. YOLOv7 for Weed Detection in Cotton Fields Using UAV Imagery. AgriEngineering 2025, 7, 313. [Google Scholar] [CrossRef]
  72. Nidhya, R.; Pavithra, D.; Smilarubavathy, G.; Mythrayee, D. Automated Weed Detection in Crop Fields Using Convolutional Neural Networks: A Deep Learning Approach for Smart Farming. Data Metadata 2025, 4, 848. [Google Scholar] [CrossRef]
  73. Kumar, G.; Sharma, C. Weed Identification and Classification in Mixed Crop Using Deep Learning Technique: A Review. In Proceedings of the IEEE International Conference on Signal Processing, Computing and Control, Solan, India, 6–8 March 2025; pp. 681–686. [Google Scholar] [CrossRef]
  74. Reedha, R.; Dericquebourg, E.; Canals, R.; Hafiane, A. Transformer Neural Network for Weed and Crop Classification of High Resolution UAV Images. Remote Sens. 2022, 14, 592. [Google Scholar] [CrossRef]
  75. Hossain, N.; Rahman, S.T.; Sun, S. An End-to-End Deep Learning Framework for Multi-Scale Cross-Domain Weed Segmentation. In Proceedings of the 2025 ASABE Annual International Meeting, Toronto, ON, Canada, 13–16 July 2025. [Google Scholar] [CrossRef]
  76. Anantha Sivaprakasam, S.; Senthil Pandi, S.; Yuva Prathima, S.; Varshini, V. Weed Plant Detection in Agricultural Field Using Deep Learning Integrated with IOT. In Proceedings of the 2024 5th International Conference for Emerging Technology, INCET 2024, Karnataka, India, 24–26 May 2024. [Google Scholar] [CrossRef]
  77. Modi, R.U.; Kancheti, M.; Subeesh, A.; Raj, C.; Singh, A.K.; Chandel, N.S.; Dhimate, A.S.; Singh, M.K.; Singh, S. An automated weed identification framework for sugarcane crop: A deep learning approach. Crop Prot. 2023, 173, 106360. [Google Scholar] [CrossRef]
  78. Li, Y.; Ke, A.; Guo, Z.; Tan, Q.; Yang, J. WeedNet-X: A lightweight field weed detection algorithm. Eng. Appl. Artif. Intell. 2025, 162, 112441. [Google Scholar] [CrossRef]
  79. Zhang, Y.; Wang, M.; Zhao, D.; Liu, C.; Liu, Z. Early weed identification based on deep learning: A review. Smart Agric. Technol. 2023, 3, 100123. [Google Scholar] [CrossRef]
  80. Hussain, A.A.; Nair, P.S. An Efficient Deep Learning Based AgriResUpNet Architecture for Semantic Segmentation of Crop and Weed Images. Ing. Des. Syst. D’information 2024, 29, 1829–1845. [Google Scholar] [CrossRef]
  81. Thenmozhi, T.; Subashini, S.; Pradeepa, K.; Pooja, E. Automated Weed Detection and Removal using Deep Learning. In Proceedings of the 2024 4th Asian Conference on Innovation in Technology, ASIANCON 2024, Pune, India, 23–25 August 2024. [Google Scholar] [CrossRef]
  82. Abhiram, K.; Abhishek, C.; Sharma, D.A. An Automated Weed-Detection Approach Using Deep Learning in Agriculture System. In Proceedings of the International Conference on Advanced Computing Technologies, ICoACT 2025, Sivakasi, India, 14–15 March 2025. [Google Scholar] [CrossRef]
  83. Gurjar, B.; Sapkota, B.; Torres, U.; Ceperkovic, I.; Kutugata, M.; Kumar, V.; Zhou, X.-G.; Martin, D.; Bagavathiannan, M. Site-Specific Treatment of Late-Season Weed Escapes in Rice Utilizing a Remotely Piloted Aerial Application System. Weed Technol. 2025, 39, e74. [Google Scholar] [CrossRef]
  84. Xu, B.; Werle, R.; Chudzik, G.; Zhang, Z. Enhancing weed detection using UAV imagery and deep learning with weather-driven domain adaptation. Comput. Electron. Agric. 2025, 237, 110673. [Google Scholar] [CrossRef]
  85. Awedat, K. A Dual-Stage Deep Learning Framework for Weed Detection. In Proceedings of the IEEE International Conference on Electro Information Technology, Valparaiso, IN, USA, 29–31 May 2025; pp. 17–22. [Google Scholar] [CrossRef]
  86. Babu, V.S.; Venkatram, N. Ensemble Learning for Weed Detection in Soybean Crop. In Proceedings of the 2024 11th International Conference on Computing for Sustainable Global Development, INDIACom 2024, New Delhi, India, 28 February–1 March 2024; pp. 867–871. [Google Scholar] [CrossRef]
  87. Ram Prakash, L.; Pravinesh, H.; Ariyamala, V. A Novel Approach for Classification of Crops and Weeds using Deep Learning. In Proceedings of the IEEE International Conference on Electronic Systems and Intelligent Computing, ICESIC 2024, Chennai, India, 22–23 November 2024; pp. 284–288. [Google Scholar] [CrossRef]
  88. Singh, P.; Zhao, B.; Shi, Y. Computer Vision for Site-Specific Weed Management in Precision Agriculture: A Review. Agriculture 2025, 15, 2296. [Google Scholar] [CrossRef]
  89. Rao, R.; Pushparaj Shetty, K.S.; Morabad, S.S.; Revathi, G.P. AI-Powered Weed Control: Image Processing and Robotics in Agriculture. In Proceedings of the 2024 IEEE Conference on Engineering Informatics, ICEI 2024, Melbourne, Australia, 20–28 November 2024. [Google Scholar] [CrossRef]
  90. Das, S.; Upadhyay, A.; Sun, X. Weed Management Strategies Employing Artificial Intelligence and Robotics. In Smart Farming, Smarter Solutions: Revolutionizing Agriculture with Artificial Intelligence; CRC Press: Boca Raton, FL, USA, 2025; pp. 154–179. [Google Scholar] [CrossRef]
  91. Vasileiou, M.; Kyrgiakos, L.S.; Kleisiari, C.; Kleftodimos, G.; Vlontzos, G.; Belhouchette, H.; Pardalos, P.M. Transforming weed management in sustainable agriculture with artificial intelligence: A systematic literature review towards weed identification and deep learning. Crop. Prot. 2024, 176, 106522. [Google Scholar] [CrossRef]
  92. Riksen, V.; Shpak, V. Convolutional Neural Network for Identification and Classification of Weeds in Buckwheat Crops. In Machine Intelligence for Smart Applications; Studies in Computational Intelligence; Springer: Cham, Switzerland, 2023; Volume 1105, pp. 61–72. [Google Scholar] [CrossRef]
  93. Singh, S.; Yadav, P. A Critical Analysis Toward Sustainable Farming: The Role of AI in Weed Identification and Management. In Communication and Intelligent Systems; Lecture Notes in Networks and Systems, LNNS; Springer: Singapore, 2024; Volume 967, pp. 369–382. [Google Scholar] [CrossRef]
  94. Mathur, G.; Pandey, H. Precision Farming with Automated Weed Detection Using Machine Learning. In Applying Remote Sensing and GIS for Spatial Analysis and Decision-Making; IGI Global Scientific Publishing: Hershey, PA, USA, 2024; pp. 267–310. [Google Scholar] [CrossRef]
  95. Gómez, A.; Moreno, H.; Valero, C.; Andújar, D. Spatio-temporal stability of intelligent modeling for weed detection in tomato fields. Agric. Syst. 2025, 228, 104394. [Google Scholar] [CrossRef]
  96. Anne, P.; Gasser, S.; Göttl, M.; Tanner, S. The reduction of chemical inputs by ultra-precise smart spot sprayer technology maximizes crop potential by lowering phytotoxicity. Front. Environ. Econ. 2024, 3, 1394315. [Google Scholar] [CrossRef]
  97. Dastres, E.; Esmaeili, H.; Edalat, M. Species distribution modeling of Malva neglecta Wallr. weed using ten different machine learning algorithms: An approach to site-specific weed management (SSWM). Eur. J. Agron. 2025, 167, 127579. [Google Scholar] [CrossRef]
  98. Gerhards, R.; Andújar Sanchez, D.; Hamouz, P.; Peteinatos, G.G.; Christensen, S.; Fernandez-Quintanilla, C. Advances in site-specific weed management in agriculture—A review. Weed Res. 2022, 62, 123–133. [Google Scholar] [CrossRef]
  99. Borra-Serrano, I.; Peña, J.M.; Torres-Sánchez, J.; Mesas-Carrascosa, F.J.; López-Granados, F. Spatial quality evaluation of resampled unmanned aerial vehicle-imagery for weed mapping. Sensors 2015, 15, 19688–19708. [Google Scholar] [CrossRef]
  100. Xiang, H.; Tian, L. Development of a low-cost agricultural remote sensing system based on an autonomous unmanned aerial vehicle (UAV). Biosyst. Eng. 2011, 108, 174–190. [Google Scholar] [CrossRef]
  101. Hassanein, M.; El-Sheimy, N. An efficient weed detection procedure using low-cost UAV imagery system for precision agriculture applications. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2018, 42, 181–187. [Google Scholar] [CrossRef]
  102. Huang, Y.; Reddy, K.N.; Fletcher, R.S.; Pennington, D. UAV Low-Altitude Remote Sensing for Precision Weed Management. Weed Technol. 2018, 32, 2–6. [Google Scholar] [CrossRef]
  103. Fernández-Quintanilla, C.; Peña, J.M.; Andújar, D.; Dorado, J.; Ribeiro, A.; López-Granados, F. Is the current state of the art of weed monitoring suitable for site-specific weed management in arable crops? Weed Res. 2018, 58, 259–272. [Google Scholar] [CrossRef]
  104. Zhu, W.; Li, S.; Zhang, X.; Li, Y.; Sun, Z. Estimation of winter wheat yield using optimal vegetation indices from unmanned aerial vehicle remote sensing. Nongye Gongcheng Xuebao/Trans. Chin. Soc. Agric. Eng. 2018, 34, 78–86. [Google Scholar] [CrossRef]
  105. Pérez-Ortiz, M.; Peña, J.M.; Gutiérrez, P.A.; Torres-Sánchez, J.; Hervás-Martínez, C.; López-Granados, F. A semi-supervised system for weed mapping in sunflower crops using unmanned aerial vehicles and a crop row detection method. Appl. Soft Comput. J. 2015, 37, 533–544. [Google Scholar] [CrossRef]
  106. Zhang, J.; Yu, F.; Zhang, Q.; Wang, M.; Yu, J.; Tan, Y. Advancements of UAV and Deep Learning Technologies for Weed Management in Farmland. Agronomy 2024, 14, 494. [Google Scholar] [CrossRef]
  107. Amarasingam, N.; Kelly, J.E.; Sandino, J.; Hamilton, M.; Gonzalez, F.; Dehaan, R.L.; Zheng, L.; Cherry, H. Bitou bush detection and mapping using UAV-based multispectral and hyperspectral imagery and artificial intelligence. Remote Sens. Appl. 2024, 34, 101151. [Google Scholar] [CrossRef]
  108. Hunt, E.R.; Daughtry, C.S.T. What good are unmanned aircraft systems for agricultural remote sensing and precision agriculture? Int. J. Remote Sens. 2018, 39, 5345–5376. [Google Scholar] [CrossRef]
  109. Castellano, G.; De Marinis, P.; Vessio, G. Applying Knowledge Distillation to Improve Weed Mapping with Drones. In Proceedings of the 18th Conference on Computer Science and Intelligence Systems, FedCSIS 2023, Warsaw, Poland, 17–20 September 2023; pp. 393–400. [Google Scholar] [CrossRef]
  110. Luo, W.; Chen, Q.; Wang, Y.; Fu, D.; Mi, Z.; Wang, Q.; Li, H.; Shi, Y.; Su, B. Real-time identification and spatial distribution mapping of weeds through unmanned aerial vehicle (UAV) remote sensing. Eur. J. Agron. 2025, 169, 127699. [Google Scholar] [CrossRef]
  111. Li, H.; He, Y.; Qin, C.; Liu, D.; Zhang, K.; Zhang, L. Ecological Analysis on Spray Performance of Multi-rotor Unmanned Aerial Sprayer in Soybean Field. Ekoloji 2019, 28, 4573–4579. [Google Scholar] [CrossRef]
  112. Tsouros, D.C.; Bibi, S.; Sarigiannidis, P.G. A review on UAV-based applications for precision agriculture. Information 2019, 10, 349. [Google Scholar] [CrossRef]
  113. Maes, W.H.; Steppe, K. Perspectives for Remote Sensing with Unmanned Aerial Vehicles in Precision Agriculture. Trends Plant Sci. 2019, 24, 152–164. [Google Scholar] [CrossRef] [PubMed]
  114. Wang, T.; Gao, M.; Cao, C.; You, J.; Zhang, X.; Shen, L. Winter wheat chlorophyll content retrieval based on machine learning using in situ hyperspectral data. Comput. Electron. Agric. 2022, 193, 106728. [Google Scholar] [CrossRef]
  115. Eide, A.; Koparan, C.; Zhang, Y.; Ostlie, M.; Howatt, K.; Sun, X. UAV-Assisted Thermal Infrared and Multispectral Imaging of Weed Canopies for Glyphosate Resistance Detection. Remote Sens. 2021, 13, 4606. [Google Scholar] [CrossRef]
  116. Farooq, A.; Hu, J.; Jia, X. Weed classification in hyperspectral remote sensing images via deep convolutional neural network. In Proceedings of the 2018 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), Valencia, Spain, 22–27 July 2018; Volume 2018, pp. 3816–3819. [Google Scholar] [CrossRef]
  117. Farooq, U.; Rehman, A.; Khanam, T.; Amtullah, A.; Bou-Rabee, M.A.; Tariq, M. Lightweight Deep Learning Model for Weed Detection for IoT Devices. In Proceedings of the 2022 2nd International Conference on Emerging Frontiers in Electrical and Electronic Technologies, ICEFEET 2022, Patna, India, 24–25 June 2022. [Google Scholar] [CrossRef]
  118. Fischer, B.; Gauweiler, P.; Gruna, R.; Beyerer, J. Utilizing multispectral imaging for improved weed and crop detection. In Optical Instrument Science, Technology, and Applications III; SPIE: Bellingham, WA, USA, 2024; Volume 13024. [Google Scholar] [CrossRef]
  119. Okamoto, H.; Murata, T.; Kataoka, T.; Hata, S.I. Plant classification for weed detection using hyperspectral imaging with wavelet analysis: Research paper. Weed Biol. Manag. 2007, 7, 31–37. [Google Scholar] [CrossRef]
  120. Ahsen, R.; Di Bitonto, P.; De Trizio, L.; Magarelli, M.; Romano, D.; Novielli, P. Precision Agriculture: Integrating Sensors for Weed Detection using Machine Learning in Agriculture Fields. In Proceedings of the 2024 7th IEEE International Humanitarian Technologies Conference, IHTC 2024, Bari, Italy, 27–30 November 2024. [Google Scholar] [CrossRef]
  121. Barrero, O.; Perdomo, S.A. RGB and multispectral UAV image fusion for Gramineae weed detection in rice fields. Precis. Agric. 2018, 19, 809–822. [Google Scholar] [CrossRef]
  122. Ronay, I.; Lati, R.N.; Kizel, F. Weed Species Identification: Acquisition, Feature Analysis, and Evaluation of a Hyperspectral and RGB Dataset with Labeled Data. Remote Sens. 2024, 16, 2808. [Google Scholar] [CrossRef]
  123. Wendel, A.; Underwood, J. Self-supervised weed detection in vegetable crops using ground based hyperspectral imaging. In Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 16–21 May 2016; pp. 5128–5135. [Google Scholar] [CrossRef]
  124. Noshiri, N.; Beck, M.A.; Bidinosti, C.P.; Henry, C.J. A comprehensive review of 3D convolutional neural network-based classification techniques of diseased and defective crops using non-UAV-based hyperspectral images. Smart Agric. Technol. 2023, 5, 100316. [Google Scholar] [CrossRef]
  125. Lu, B.; Dao, P.D.; Liu, J.; He, Y.; Shang, J. Recent advances of hyperspectral imaging technology and applications in agriculture. Remote Sens. 2020, 12, 2659. [Google Scholar] [CrossRef]
  126. Srinivasa Rao, P.; Anantha Raman, G.R.; Rao, M.S.S.; Radha, K.; Ahmed, R. Enhancing Orchard Cultivation Through Drone Technology and Deep Stream Algorithms in Precision Agriculture. Int. J. Adv. Comput. Sci. Appl. 2024, 15, 781–795. [Google Scholar] [CrossRef]
  127. Petrova, T.; Marinov, M.; Petrov, Z. Study of an Index for Assessing Grass Quality in Pastures Using RGB Images Obtained by Unmanned Aerial Vehicle. In Proceedings of the 2024 23rd International Symposium INFOTEH-JAHORINA, INFOTEH 2024, East Sarajevo, Bosnia and Herzegovina, 20–22 March 2024. [Google Scholar] [CrossRef]
  128. Amziane, A.; Losson, O.; Mathon, B.; MacAire, L.; Duménil, A. Weed detection by analysis of multispectral images acquired under uncontrolled illumination conditions. In Fifteenth International Conference on Quality Control by Artificial Vision; SPIE: Bellingham, WA, USA, 2021; Volume 11794. [Google Scholar] [CrossRef]
  129. Bhandari, S.; Raheja, A.; Green, R.L.; Do, D. Towards collaboration between unmanned aerial and ground vehicles for precision agriculture. In Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping II; SPIE: Bellingham, WA, USA, 2017; Volume 10218. [Google Scholar] [CrossRef]
  130. Hassler, S.C.; Baysal-Gurel, F. Unmanned aircraft system (UAS) technology and applications in agriculture. Agronomy 2019, 9, 618. [Google Scholar] [CrossRef]
  131. Zhang, Y.; Su, Z.; Meng, L. A Review of UAV-Based Artificial Intelligence Applications in Agriculture. In Proceedings of the International Conference on Advanced Mechatronic Systems, ICAMechS, Xi’an, China, 19–22 September 2025; pp. 250–255. [Google Scholar] [CrossRef]
  132. Triska, D.; Uryeu, D.; Becerra, E.C.; Haddad, M.; Bhandari, S.; Raheja, A. Investigating the Performance of Monocular and Stereo Vision for the Detection of Weeds in a Strawberry Field using UAVs and Machine Learning Techniques. In Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IX; SPIE: Bellingham, WA, USA, 2024; Volume 13053. [Google Scholar] [CrossRef]
  133. Nguyen, V.-H.; Le, C.-D.; Truong, M.-T.; Huynh, P.-H. Weedy Rice Detection and Segmentation in UAV Imagery Using Deep Learning Models. In ICT for Intelligent Systems; Lecture Notes in Networks and Systems, LNNS; Springer: Singapore, 2025; Volume 1507, pp. 357–369. [Google Scholar] [CrossRef]
  134. Yang, S.; Lin, J.; Cernava, T.; Chen, X.; Zhang, X. WeedDETR: An efficient and accurate detection method for detecting small-target weeds in UAV images. Weed Sci. 2025, 73, 10035. [Google Scholar] [CrossRef]
  135. Sunitha, M.J.; Reddy, V.S.; Charan, K.G.V.; Sujith, L.; Chandrika, E.J.; Patel, M. Feature selection-based hybrid DL model for detecting the unwanted land using satellite data. In Proceedings of the 2024 International Conference on Knowledge Engineering and Communication Systems, ICKECS 2024, Chikkaballapur, India, 18–19 April 2024. [Google Scholar] [CrossRef]
  136. Dutta, A.K.; Albagory, Y.; Sait, A.R.W.; Keshta, I.M. Autonomous Unmanned Aerial Vehicles Based Decision Support System for Weed Management. Comput. Mater. Contin. 2022, 73, 899–915. [Google Scholar] [CrossRef]
  137. Singh, V.; Rana, A.; Bishop, M.; Filippi, A.M.; Cope, D.; Rajan, N.; Bagavathiannan, M. Unmanned aircraft systems for precision weed detection and management: Prospects and challenges. Adv. Agron. 2020, 159, 93–134. [Google Scholar] [CrossRef]
  138. Sreeja, K.A.; Pradeep, A.; Arshey, M.; Warrier, G.S.; Ragesh, K.T. Implementing Innovative Weed Detection Techniques for Environmental Sustainability. J. Environ. Nanotechnol. 2024, 13, 287–294. [Google Scholar] [CrossRef]
  139. Xia, F.; Lou, Z.; Sun, D.; Li, H.; Quan, L. Weed resistance assessment through airborne multimodal data fusion and deep learning: A novel approach towards sustainable agriculture. Int. J. Appl. Earth Obs. Geoinf. 2023, 120, 103352. [Google Scholar] [CrossRef]
  140. Lou, Z.; Quan, L.; Sun, D.; Xia, F.; Li, H.; Guo, Z. Multimodal deep fusion model based on Transformer and multi-layer residuals for assessing the competitiveness of weeds in farmland ecosystems. Int. J. Appl. Earth Obs. Geoinf. 2024, 127, 103681. [Google Scholar] [CrossRef]
  141. Sandoval-Pillajo, L.; García-Santillán, I.; Pusdá-Chulde, M.; Giret, A. Weed detection based on deep learning from UAV imagery: A review. Smart Agric. Technol. 2025, 12, 101147. [Google Scholar] [CrossRef]
  142. Modica, G.; De Luca, G.; Messina, G.; Praticò, S. Comparison and assessment of different object-based classifications using machine learning algorithms and UAVs multispectral imagery: A case study in a citrus orchard and an onion crop. Eur. J. Remote Sens. 2021, 54, 431–460. [Google Scholar] [CrossRef]
  143. Verma, J.; Snehi, M.; Kansal, I.; Tiwari, R.G.; Prasad, D. Real-time weed detection and classification using deep learning models and IoT-based edge computing for social learning applications. In Augmented and Virtual Reality in Social Learning: Technological Impacts and Challenges; De Gruyter: Berlin, Germany, 2023; pp. 241–268. [Google Scholar] [CrossRef]
  144. Jiveesha, J.; Vinothini, A. Comparative Analysis of Various Deep Learning Techniques for Weed Detection in Corn Crops. In Proceedings of the 8th International Conference on Electronics, Communication and Aerospace Technology, ICECA 2024, Coimbatore, India, 6–8 November 2024; pp. 812–819. [Google Scholar] [CrossRef]
  145. Gunapriya, B.; Thirumalraj, A.; Anusuya, V.S.; Kavin, B.P.; Seng, G.H. A smart innovative pre-trained model-based QDM for weed detection in soybean fields. In Advanced Intelligence Systems and Innovation in Entrepreneurship; IGI Global Scientific Publishing: Hershey, PA, USA, 2024; pp. 262–285. [Google Scholar] [CrossRef]
  146. Jabir, B.; Rabhi, L.; Falih, N. RNN- and CNN-based weed detection for crop improvement: An overview. Foods Raw Mater. 2021, 9, 387–396. [Google Scholar] [CrossRef]
  147. Mohidem, N.A.; Che’Ya, N.N.; Juraimi, A.S.; Fazlil Ilahi, W.F.; Mohd Roslim, M.H.; Sulaiman, N.; Saberioon, M.; Mohd Noor, N. How can unmanned aerial vehicles be used for detecting weeds in agricultural fields? Agriculture 2021, 11, 1004. [Google Scholar] [CrossRef]
  148. Kumar Nagothu, S.; Anitha, G.; Siranthini, B.; Anandi, V.; Siva Prasad, P. Weed detection in agriculture crop using unmanned aerial vehicle and machine learning. Mater. Today Proc. 2023; in press. [Google Scholar] [CrossRef]
  149. Pai, D.G.; Kamath, R.; Balachandra, M. Deep Learning Techniques for Weed Detection in Agricultural Environments: A Comprehensive Review. IEEE Access 2024, 12, 113193–113214. [Google Scholar] [CrossRef]
  150. Saini, P. Recent Advancement of Weed Detection in Crops Using Artificial Intelligence and Deep Learning: A Review. In Advances in Energy Technology; Lecture Notes in Electrical Engineering; Springer: Singapore, 2022; Volume 766, pp. 631–640. [Google Scholar] [CrossRef]
  151. Mwitta, C.; Rains, G.C.; Prostko, E. Evaluation of Inference Performance of Deep Learning Models for Real-Time Weed Detection in an Embedded Computer. Sensors 2024, 24, 514. [Google Scholar] [CrossRef]
  152. Jin, X.; Liu, T.; McCullough, P.E.; Chen, Y.; Yu, J. Evaluation of convolutional neural networks for herbicide susceptibility-based weed detection in turf. Front. Plant Sci. 2023, 14, 1096802. [Google Scholar] [CrossRef]
  153. Sanjay, M.; Vaideeswar, D.P.; Reddy, C.V.R.; Tavares, M.B. Weed Detection: A Vision Transformer Approach for Soybean Crops. In Proceedings of the 2023 14th International Conference on Computing Communication and Networking Technologies, ICCCNT 2023, Delhi, India, 6–8 July 2023. [Google Scholar] [CrossRef]
  154. Bah, M.D.; Hafiane, A.; Canals, R.; Emile, B. Deep features and One-class classification with unsupervised data for weed detection in UAV images. In Proceedings of the 2019 9th International Conference on Image Processing Theory, Tools and Applications, IPTA 2019, Istanbul, Turkey, 6–9 November 2019. [Google Scholar] [CrossRef]
  155. Haq, M.A. CNN Based Automated Weed Detection System Using UAV Imagery. Comput. Syst. Sci. Eng. 2021, 42, 837–849. [Google Scholar] [CrossRef]
  156. Saini, P.; Nagesh, D.S. A review of deep learning applications in weed detection: UAV and robotic approaches for precision agriculture. Eur. J. Agron. 2025, 168, 127652. [Google Scholar] [CrossRef]
  157. Gallo, I.; Rehman, A.U.; Dehkordi, R.H.; Landro, N.; La Grassa, R.; Boschetti, M. Deep Object Detection of Crop Weeds: Performance of YOLOv7 on a Real Case Dataset from UAV Images. Remote Sens. 2023, 15, 539. [Google Scholar] [CrossRef]
  158. Shaheen, A.Y.; El-Gayar, O. Weed Detection using Lightweight DL models with Transfer Learning & Hyperparameter Optimization. In Proceedings of the 30th Americas Conference on Information Systems, AMCIS 2024, Salt Lake City, UT, USA, 15–17 August 2024. [Google Scholar]
  159. Divyanth, L.G.; Guru, D.S.; Soni, P.; Machavaram, R.; Nadimi, M.; Paliwal, J. Image-to-Image Translation-Based Data Augmentation for Improving Crop/Weed Classification Models for Precision Agriculture Applications. Algorithms 2022, 15, 401. [Google Scholar] [CrossRef]
  160. López-Granados, F. Weed detection for site-specific weed management: Mapping and real-time approaches. Weed Res. 2011, 51, 1–11. [Google Scholar] [CrossRef]
  161. Harders, L.O.; Ufer, T.; Wrede, A.; Hussmann, S. UAV-based real-time weed detection in horticulture using edge processing. J. Electron. Imaging 2023, 32, 052405. [Google Scholar] [CrossRef]
  162. Islam, M.D.; Liu, W.; Izere, P.; Singh, P.; Yu, C.; Riggan, B.; Zhang, K.; Jhala, A.J.; Knezevic, S.; Ge, Y.; et al. Towards real-time weed detection and segmentation with lightweight CNN models on edge devices. Comput. Electron. Agric. 2025, 237, 110600. [Google Scholar] [CrossRef]
  163. Ahmed, F.; Al-Mamun, H.A.; Bari, A.S.M.H.; Hossain, E.; Kwan, P. Classification of crops and weeds from digital images: A support vector machine approach. Crop Prot. 2012, 40, 98–104. [Google Scholar] [CrossRef]
  164. dos Santos Ferreira, A.; Matte Freitas, D.; Gonçalves da Silva, G.; Pistori, H.; Folhes, M.T. Weed detection in soybean crops using ConvNets. Comput. Electron. Agric. 2017, 143, 314–324. [Google Scholar] [CrossRef]
  165. Osorio, K.; Puerto, A.; Pedraza, C.; Jamaica, D.; Rodríguez, L. A Deep Learning Approach for Weed Detection in Lettuce Crops Using Multispectral Images. AgriEngineering 2020, 2, 471–488. [Google Scholar] [CrossRef]
  166. Aaron, A.; Hassan, M.; Hamada, M.; Kakudi, H. A Lightweight Deep Learning Model for Identifying Weeds in Corn and Soybean Using Quantization. Eng. Proc. 2023, 56, 318. [Google Scholar] [CrossRef]
  167. Saqib, M.A.; Aqib, M.; Tahir, M.N.; Hafeez, Y. Towards deep learning based smart farming for intelligent weeds management in crops. Front. Plant Sci. 2023, 14, 1211235. [Google Scholar] [CrossRef]
  168. Alirezazadeh, P.; Schirrmann, M.; Stolzenburg, F. A comparative analysis of deep learning methods for weed classification of high-resolution UAV images. J. Plant Dis. Prot. 2024, 131, 227–236. [Google Scholar] [CrossRef]
  169. Hruška, A.; Hamouz, P. Verification of a machine learning model for weed detection in maize (Zea mays) using infrared imaging. Plant Prot. Sci. 2023, 59, 292–297. [Google Scholar] [CrossRef]
  170. Sohail, R.; Nawaz, Q.; Hamid, I.; Amin, H.; Chauhdary, J.N.; Gilani, S.M.M.; Mumtaz, I. A novel machine learning based algorithm to detect weeds in Soybean crop. Pak. J. Agric. Sci. 2021, 58, 1007–1015. [Google Scholar]
  171. Shirzadifar, A.; Bajwa, S.; Nowatzki, J.; Bazrafkan, A. Field identification of weed species and glyphosate-resistant weeds using high resolution imagery in early growing season. Biosyst. Eng. 2020, 200, 200–214. [Google Scholar] [CrossRef]
  172. Rai, N.; Villamil Mahecha, M.; Christensen, A.; Quanbeck, J.; Zhang, Y.; Howatt, K.; Ostlie, M.; Sun, X. Multi-format open-source weed image dataset for real-time weed identification in precision agriculture. Data Brief. 2023, 51, 109691. [Google Scholar] [CrossRef]
  173. Lechner, M.; Steindl, L.; Jantsch, A. Study of DNN-Based Ragweed Detection from Drones. In Embedded Computer Systems: Architectures, Modeling, and Simulation; Orailoglu, A., Reichenbach, M., Jung, M., Eds.; Springer International Publishing: Cham, Switzerland, 2022; pp. 187–199. [Google Scholar]
  174. Sunil, G.C.; Zhang, Y.; Koparan, C.; Ahmed, M.R.; Howatt, K.; Sun, X. Weed and crop species classification using computer vision and deep learning technologies in greenhouse conditions. J. Agric. Food Res. 2022, 9, 100325. [Google Scholar] [CrossRef]
  175. Sanders, J.T.; Everman, W.J.; Austin, R.; Roberson, G.T.; Richardson, R.J. Weed species differentiation using spectral reflectance land image classification. In Advanced Environmental, Chemical, and Biological Sensing Technologies XV; SPIE: Bellingham, WA, USA, 2019; Volume 11007, p. 110070P. [Google Scholar] [CrossRef]
  176. Dmitriev, P.A.; Kozlovsky, B.L.; Kupriushkin, D.P.; Dmitrieva, A.A.; Rajput, V.D.; Chokheli, V.A.; Tarik, E.P.; Kapralova, O.A.; Tokhtar, V.K.; Minkina, T.M.; et al. Assessment of Invasive and Weed Species by Hyperspectral Imagery in Agrocenoses Ecosystem. Remote Sens. 2022, 14, 2442. [Google Scholar] [CrossRef]
  177. Dammer, K.H.; Intress, J.; Beuche, H.; Selbeck, J.; Dworak, V. Discrimination of Ambrosia artemisiifolia and Artemisia vulgaris by hyperspectral image analysis during the growing season. Weed Res. 2013, 53, 146–156. [Google Scholar] [CrossRef]
  178. Maken, Q.-; Hussain, S.; Tariq, S.; Qaisrani, S.A.; Mubeen, M. UAVs’ Scope in Agricultural Water Management. In Innovations in Agricultural Water Management: Risks and Solutions; Springer: Cham, Switzerland, 2025; pp. 401–414. [Google Scholar] [CrossRef]
  179. Tamminga, A.; Hugenholtz, C.; Eaton, B.; Lapointe, M. Hyperspatial Remote Sensing of Channel Reach Morphology and Hydraulic Fish Habitat Using an Unmanned Aerial Vehicle (UAV): A First Assessment in the Context of River Research and Management. River Res. Appl. 2015, 31, 379–391. [Google Scholar] [CrossRef]
  180. Krebsz, M.; Dwivedi, D. Re-wilding Natural Habitats with Flying Robots, AI, and Metaverse Ecosystems. In The Business of the Metaverse: How to Maintain the Human Element Within This New Business Reality; CRC Press: Boca Raton, FL, USA, 2023; pp. 34–60. [Google Scholar] [CrossRef]
  181. Nieto, H. Mapping Evapotranspiration and Crop Stress with Unmanned Aerial Vehicles: A Cost-effective Approach. In Proceedings of the 2022 IEEE International Conference on Automation/25th Congress of the Chilean Association of Automatic Control: For the Development of Sustainable Agricultural Systems, ICA-ACCA 2022, Curicó, Chile, 24–28 October 2022. [Google Scholar] [CrossRef]
  182. Kotlinski, M.; Calkowska, J.K. U-Space and UTM Deployment as an Opportunity for More Complex UAV Operations Including UAV Medical Transport. J. Intell. Robot. Syst. Theory Appl. 2022, 106, 12. [Google Scholar] [CrossRef]
  183. Clothier, R.; Williams, B.; Washington, A. Development of a Template Safety Case for Unmanned Aircraft Operations over Populous Areas; SAE Technical Papers; SAE International: Warrendale, PA, USA, 2015. [Google Scholar] [CrossRef]
  184. Denney, E.; Pai, G.; Johnson, M. Towards a rigorous basis for specific operations risk assessment of UAS. In Proceedings of the AIAA/IEEE Digital Avionics Systems Conference, London, UK, 23–27 September 2018. [Google Scholar] [CrossRef]
  185. Gheorghe, A.V.; Ancel, E. Unmanned aerial systems integration to National Airspace System. In Proceedings of the 2008 1st International Conference on Infrastructure Systems and Services: Building Networks for a Brighter Future, INFRA 2008, Rotterdam, The Netherlands, 10–12 November 2008. [Google Scholar] [CrossRef]
  186. Ma, Q. Optimization and evolution of UAV insurance provision framework: Insights into multi-sector cooperation. J. Air Transp. Manag. 2025, 124, 102740. [Google Scholar] [CrossRef]
  187. Chiriacò, F.; Catellani, E.; Ciccullo, F. Exploring the adoption process of drones and robots in agriculture. In Proceedings of the Summer School Francesco Turco, Lecce, Italy, 10–12 September 2025. [Google Scholar]
  188. Gheorghe, G.-V.; Dumitru, D.-N.; Ciupercă, R.; Mateescu, M.; Mantovani, S.A.; Prisacariu, E.; Harabagiu, A. Advancing Precision Agriculture with UAVs: Innovations in Fertilization. Inmateh—Agric. Eng. 2024, 74, 1057–1072. [Google Scholar] [CrossRef]
  189. Jafarbiglu, H.; Pourreza, A. Impact of sun-view geometry on canopy spectral reflectance variability. ISPRS J. Photogramm. Remote Sens. 2023, 196, 270–286. [Google Scholar] [CrossRef]
  190. Raymond Hunt, E.; Stern, A.J. Evaluation of incident light sensors on unmanned aircraft for calculation of spectral reflectance. Remote Sens. 2019, 11, 2622. [Google Scholar] [CrossRef]
  191. Zhao, W.; Li, X.; Wang, W.; Wen, F.; Yin, G. DSRC: An Improved Topographic Correction Method for Optical Remote-Sensing Observations Based on Surface Downwelling Shortwave Radiation. IEEE Trans. Geosci. Remote Sens. 2022, 60, 3083754. [Google Scholar] [CrossRef]
  192. Pflanz, M.; Nordmeyer, H.; Schirrmann, M. Weed mapping with UAS imagery and a bag of visualwords based image classifier. Remote Sens. 2018, 10, 1530. [Google Scholar] [CrossRef]
  193. Gómez-Candón, D.; De Castro, A.I.; López-Granados, F. Assessing the accuracy of mosaics from unmanned aerial vehicle (UAV) imagery for precision agriculture purposes in wheat. Precis. Agric. 2014, 15, 44–56. [Google Scholar] [CrossRef]
  194. Ubben, N.; Pukrop, M.; Jarmer, T. Spatial Resolution as a Factor for Efficient UAV-Based Weed Mapping—A Soybean Field Case Study. Remote Sens. 2024, 16, 1778. [Google Scholar] [CrossRef]
  195. Mesas-Carrascosa, F.J.; Clavero Rumbao, I.; Torres-Sánchez, J.; García-Ferrer, A.; Peña, J.M.; López Granados, F. Accurate ortho-mosaicked six-band multispectral UAV images as affected by mission planning for precision agriculture proposes. Int. J. Remote Sens. 2017, 38, 2161–2176. [Google Scholar] [CrossRef]
  196. Zaw, H.H.; Hein, Z.; Mikhailovich, P.E. Development of Mathematical Methods and Algorithms for Filtering Images Obtained from Unmanned Aerial Vehicle Camera. In Proceedings of the 2023 International Conference on Industrial Engineering, Applications and Manufacturing, ICIEAM 2023, Sochi, Russia, 15–19 May 2023; pp. 837–844. [Google Scholar] [CrossRef]
  197. Zhang, D.; Watson, R.; Dobie, G.; MacLeod, C.; Khan, A.; Pierce, G. Quantifying impacts on remote photogrammetric inspection using unmanned aerial vehicles. Eng. Struct. 2020, 209, 109940. [Google Scholar] [CrossRef]
  198. Wan, Y.; Wei, W.; Zheng, Q. Study on the aerial image degradation caused by satellite swing and its correction. In Proceedings of the 2016 5th International Conference on Computer Science and Network Technology, ICCSNT 2016, Changchun, China, 10–11 December 2017; pp. 795–799. [Google Scholar] [CrossRef]
  199. Genze, N.; Ajekwe, R.; Güreli, Z.; Haselbeck, F.; Grieb, M.; Grimm, D.G. Deep learning-based early weed segmentation using motion blurred UAV images of sorghum fields. Comput. Electron. Agric. 2022, 202, 107388. [Google Scholar] [CrossRef]
  200. Pei, H.; Sun, Y.; Huang, H.; Zhang, W.; Sheng, J.; Zhang, Z. Weed Detection in Maize Fields by UAV Images Based on Crop Row Preprocessing and Improved YOLOv4. Agriculture 2022, 12, 975. [Google Scholar] [CrossRef]
  201. Che’Ya, N.N.; Dunwoody, E.; Gupta, M. Assessment of Weed Classification Using Hyperspectral Reflectance and Optimal Multispectral UAV Imagery. Agronomy 2021, 11, 1435. [Google Scholar] [CrossRef]
  202. Basinger, N.T.; Jennings, K.M.; Hestir, E.L.; Monks, D.W.; Jordan, D.L.; Everman, W.J. Phenology affects differentiation of crop and weed species using hyperspectral remote sensing. Weed Technol. 2020, 34, 897–908. [Google Scholar] [CrossRef]
  203. Mohammadi, V.; Minaei, S.; Gouton, P.; Mahdavian, A.R.; Khoshtaghaza, M.H. Spectral discrimination of crops and weeds using deep learning assisted by wavelet transform and statistical preprocessing. Weed Sci. 2024, 72, 536–545. [Google Scholar] [CrossRef]
  204. de Castro, A.I.; Jurado-Expósito, M.; Gómez-Casero, M.-T.; López-Granados, F. Applying Neural Networks to Hyperspectral and Multispectral Field Data for Discrimination of Cruciferous Weeds in Winter Crops. Sci. World J. 2012, 2012, 630390. [Google Scholar] [CrossRef]
  205. Smith, A.M.; Blackshaw, R.E. Weed-crop discrimination using remote sensing: A detached leaf experiment. Weed Technol. 2003, 17, 811–820. [Google Scholar] [CrossRef]
  206. Zhu, F.; Zhou, Z.; Shen, Y.; He, M.; Jiang, J.; Qiao, X.; Peng, J.; He, Y. A 3D spectral compensation method on close-range hyperspectral imagery of plant canopies. Comput. Electron. Agric. 2025, 231, 109955. [Google Scholar] [CrossRef]
  207. Ali, Z.A.; Yang, C.; Israr, A.; Zhu, Q. A Comprehensive Review of Scab Disease Detection on Rosaceae Family Fruits via UAV Imagery. Drones 2023, 7, 97. [Google Scholar] [CrossRef]
  208. Tamouridou, A.A.; Alexandridis, T.K.; Pantazi, X.E.; Lagopodi, A.L.; Kashefi, J.; Kasampalis, D.; Kontouris, G.; Moshou, D. Application of multilayer perceptron with automatic relevance determination on weed mapping using UAV multispectral imagery. Sensors 2017, 17, 2307. [Google Scholar] [CrossRef]
  209. Alexandridis, T.K.; Tamouridou, A.A.; Pantazi, X.E.; Lagopodi, A.L.; Kashefi, J.; Ovakoglou, G.; Polychronos, V.; Moshou, D. Novelty detection classifiers in weed mapping: Silybum marianum detection on UAV multispectral images. Sensors 2017, 17, 2007. [Google Scholar] [CrossRef]
  210. Ngom, R.; Gosselin, P. Development of a remote sensing-based method to map likelihood of common ragweed (Ambrosia artemisiifolia) presence in urban areas. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2014, 7, 126–139. [Google Scholar] [CrossRef]
  211. Marzialetti, F.; Frate, L.; De Simone, W.; Frattaroli, A.R.; Acosta, A.T.R.; Carranza, M.L. Unmanned aerial vehicle (Uav)-based mapping of acacia saligna invasion in the mediterranean coast. Remote Sens. 2021, 13, 3361. [Google Scholar] [CrossRef]
  212. Al-Ali, Z.M.; Abdullah, M.M.; Asadalla, N.B.; Gholoum, M. A comparative study of remote sensing classification methods for monitoring and assessing desert vegetation using a UAV-based multispectral sensor. Environ. Monit. Assess 2020, 192, 389. [Google Scholar] [CrossRef]
  213. Chen, X.; Chen, T.; Meng, H.; Zhang, Z.; Wang, D.; Sun, J.; Wang, J. An improved algorithm based on YOLOv5 for detecting Ambrosia trifida in UAV images. Front. Plant Sci. 2024, 15, 1360419. [Google Scholar] [CrossRef]
  214. Hall, R.M.; Urban, B.; Wagentristl, H.; Karrer, G.; Winter, A.; Czerny, R.; Kaul, H.-P. Common ragweed (Ambrosia artemisiifolia L.) causes severe yield losses in soybean and impairs bradyrhizobium japonicum infection. Agronomy 2021, 11, 1616. [Google Scholar] [CrossRef]
  215. Pinke, G.; Karácsony, P.; Czúcz, B.; Botta-Dukát, Z. Environmental and land-use variables determining the abundance of Ambrosia artemisiifolia in arable fields in Hungary. Preslia 2011, 83, 219–235. [Google Scholar]
  216. Krähmer, H. Overview of selected problems. In Atlas of Weed Mapping; Wiley: Hoboken, NJ, USA, 2016; pp. 103–112. [Google Scholar] [CrossRef]
  217. Lommen, S.T.E.; Hallmann, C.A.; Jongejans, E.; Chauvel, B.; Leitsch-Vitalos, M.; Aleksanyan, A.; Tóth, P.; Preda, C.; Šćepanović, M.; Onen, H.; et al. Explaining variability in the production of seed and allergenic pollen by invasive Ambrosia artemisiifolia across Europe. Biol. Invasions 2018, 20, 1475–1491. [Google Scholar] [CrossRef]
  218. Kröel-Dulay, G.; Csecserits, A.; Szitár, K.; Molnár, E.; Szabó, R.; Ónodi, G.; Botta-Dukát, Z. The potential of common ragweed for further spread: Invasibility of different habitats and the role of disturbances and propagule pressure. Biol. Invasions 2019, 21, 137–149. [Google Scholar] [CrossRef]
  219. Schaffner, U.; Steinbach, S.; Sun, Y.; Skjøth, C.A.; de Weger, L.A.; Lommen, S.T.; Augustinus, B.A.; Bonini, M.; Karrer, G.; Šikoparija, B.; et al. Biological weed control to relieve millions from Ambrosia allergies in Europe. Nat. Commun. 2020, 11, 1745. [Google Scholar] [CrossRef] [PubMed]
  220. Bajwa, A.A.; Khan, M.J.; Bhowmik, P.C.; Walsh, M.; Chauhan, B.S. Sustainable weed management. In Innovations in Sustainable Agriculture; Springer: Cham, Switzerland, 2019; pp. 249–286. [Google Scholar] [CrossRef]
  221. Mohan, A.V.; Baskaran, R.; Harisudan, C.; Kalpana, R.; Boominanthan, P.; Jaghadeeswaran, R. Advances and trends in weed management: A comprehensive review. Plant Sci. Today 2024, 11, 5141. [Google Scholar] [CrossRef]
  222. Roberts, J.; Florentine, S. Advancements and developments in the detection and control of invasive weeds: A global review of the current challenges and future opportunities. Weed Sci. 2024, 72, 205–215. [Google Scholar] [CrossRef]
  223. Tataridas, A.; Kanatas, P.; Chatzigeorgiou, A.; Zannopoulos, S.; Travlos, I. Sustainable Crop and Weed Management in the Era of the EU Green Deal: A Survival Guide. Agronomy 2022, 12, 589. [Google Scholar] [CrossRef]
  224. Infanta, S.C.; Selvakumar, T.; Ragavan, T.; Sathya Sheela, K.R.V.; Bharathi, C. Cultivating change: A review of progressive technologies in weed detection and management. Emir. J. Food Agric. 2024, 2024, 1–11. [Google Scholar] [CrossRef]
  225. Hakzi, K.; Pareeth, S.; Gaznayee, H.A.A.; Chukalla, A.; Safi, A.R.; de Fraiture, C. Assessing the Spatio-Temporal Variations of Wheat Yield and Water Productivity under Centre Pivot Irrigation Systems Using Open-Access Remote Sensing Data. Agric. Water Manag. 2025, 319, 109733. [Google Scholar] [CrossRef]
  226. Mahdi, K.; Gaznayee, H.A.A.; Aliehsan, P.H.; Zaki, S.H.; Keya, D.R.; Hakzi, K.; Alqrinawi, F.; Riksen, M. Intensifying Drought Patterns and Agricultural Water Stress in Erbil Governorate, Iraq: A Spatiotemporal Climate Analysis. Glob. Chall. 2026, 10, e00491. [Google Scholar] [CrossRef] [PubMed]
  227. Mesías-Ruiz, G.A.; Borra-Serrano, I.; Peña, J.M.; de Castro, A.I.; Fernández-Quintanilla, C.; Dorado, J. Weed species classification with UAV imagery and standard CNN models: Assessing the frontiers of training and inference phases. Crop. Prot. 2024, 182, 106721. [Google Scholar] [CrossRef]
  228. Vijayakumar, S.; Kumar, V.; Gurjar, B.; Bagavathiannan, M. The Role of Unmanned Aerial Vehicles and Sensor Technology in Site-Specific Weed Management. In Recent Advances in Weed Science; Springer: Cham, Switzerland, 2025; pp. 89–123. [Google Scholar] [CrossRef]
  229. Sankararao, A.U.G.; Rajalakshmi, P. UAV Based Hyperspectral Remote Sensing and CNN for Vegetation Classification. In Proceedings of the International Geoscience and Remote Sensing Symposium (IGARSS), Kuala Lumpur, Malaysia, 17–22 July 2022; pp. 7737–7740. [Google Scholar] [CrossRef]
  230. Agrawal, J.; Arafat, M.Y. Transforming Farming: A Review of AI-Powered UAV Technologies in Precision Agriculture. Drones 2024, 8, 664. [Google Scholar] [CrossRef]
  231. Unde, S.S.; Kurkute, V.K.; Chavan, S.S.; Mohite, D.D.; Harale, A.A.; Chougle, A. The expanding role of multirotor UAVs in precision agriculture with applications AI integration and future prospects. Discov. Mech. Eng. 2025, 4, 38. [Google Scholar] [CrossRef]
  232. Raouhi, E.M.; Lachgar, M.; Hrimech, H.; Kartit, A. Unmanned Aerial Vehicle-based Applications in Smart Farming: A Systematic Review. Int. J. Adv. Comput. Sci. Appl. 2023, 14, 1150–1165. [Google Scholar] [CrossRef]
  233. Katsigiannis, P.; Misopolinos, L.; Liakopoulos, V.; Alexandridis, T.K.; Zalidis, G. An autonomous multi-sensor UAV system for reduced-input precision agriculture applications. In Proceedings of the 24th Mediterranean Conference on Control and Automation, MED 2016, Athens, Greece, 21–24 June 2016; pp. 60–64. [Google Scholar] [CrossRef]
  234. Perera, H.; Perera, S.; Chamika, D.; Ashraff, S.; Sandagomi, S.; Balasingham, D.; Rajahrajasingh, H.; Primal, D.; Jayakody, A. Advancing Agriculture: A Review of UAV Technologies and Their Impact on Sustainable Farming. In Proceedings of the International Research Conference on Smart Computing and Systems Engineering, SCSE 2025, Colombo, Sri Lanka, 3 April 2025. [Google Scholar] [CrossRef]
  235. Xuan, T.D.; Khanh, T.D.; Minh, T.T.N. Implementation of Conventional and Smart Weed Management Strategies in Sustainable Agricultural Production. Weed Biol. Manag. 2025, 25, e70000. [Google Scholar] [CrossRef]
  236. Schindler, S.; Bayliss, H.R.; Essl, F.; Rabitsch, W.; Follak, S.; Pullin, A.S. Effectiveness of management interventions for control of invasive Common ragweed Ambrosia artemisiifolia: A systematic review protocol. Environ. Evid. 2016, 5, 11. [Google Scholar] [CrossRef]
  237. Li, H.; Fu, T.; Hao, H.; Yu, Z. MAVSD: A Multi-Angle View Segmentation Dataset for Detection of Solidago canadensis L. Sci. Data 2025, 12, 861. [Google Scholar] [CrossRef]
  238. Asmus, A.; Schroeder, J. Rethinking Outreach: Collaboration Is Key for Herbicide-Resistance Management. Weed Sci. 2016, 64, 655–660. [Google Scholar] [CrossRef]
  239. Pontes Junior, V.B.; Alberto da Silva, A.; D’Antonino, L.; Mendes, K.F.; de Paula Medeiros, B.A. Methods of Control and Integrated Management of Weeds in Agriculture. In Applied Weed and Herbicide Science; Springer: Cham, Switzerland, 2022; pp. 127–156. [Google Scholar] [CrossRef]
  240. Barathkumar, R.; Selvanayaki, S.; Deepa, N.; Kannan, P.; Prahadeeswaran, M. Impact of drone technology on agriculture—farmers’ perception analysis. Plant Sci. Today 2024, 11, 13. [Google Scholar] [CrossRef]
  241. Nafar, N.; Fatemi, M.; Rezaei-Moghaddam, K. Decoding drone adoption in agriculture: A comparative analysis of behavioral models. Inf. Process. Agric. 2025; in press. [Google Scholar] [CrossRef]
  242. Karupakula, S.R.; Maram, B.R.; Ram, V. Soil and field analysis using unmanned aerial vehicles for smart and sustainable farming. In Hyperautomation in Precision Agriculture: Advancements and Opportunities for Sustainable Farming; Academic Press: London, UK, 2024; pp. 147–158. [Google Scholar] [CrossRef]
  243. Samko, M.; Zatserkovnyi, V.; Vorokh, V.; Tsyguliov, I.; Ilchenko, A. Monitoring using uavs in precision farming technologies. In Proceedings of the 18th International Scientific Conference “Monitoring of Geological Processes and Ecological Condition of the Environment”, Monitoring 2025, Kyiv, Ukraine, 14–17 April 2025. [Google Scholar] [CrossRef]
  244. Nahiyoon, S.A.; Ren, Z.; Wei, P.; Li, X.; Li, X.; Xu, J.; Yan, X.; Yuan, H. Recent Development Trends in Plant Protection UAVs: A Journey from Conventional Practices to Cutting-Edge Technologies—A Comprehensive Review. Drones 2024, 8, 457. [Google Scholar] [CrossRef]