Review

Breeding Smarter: Artificial Intelligence and Machine Learning Tools in Modern Breeding—A Review

by Ana Luísa Garcia-Oliveira 1,*, Sangam L. Dwivedi 2, Subhash Chander 3, Charles Nelimor 4, Diaa Abd El Moneim 5 and Rodomiro Octavio Ortiz 6,*

1 Instituto Nacional de Investigação Agrária e Veterinária, I.P.—Estrada de Gil Vaz Ap. 6, 7350-901 Elvas, Portugal
2 Independent Researcher, Hyderabad 500016, India
3 Oilseeds Section, Department of Genetics & Plant Breeding, CCS Haryana Agricultural University, Hisar 125004, India
4 Council of Scientific and Industrial Research (CSIR), Savanna Agricultural Research Institute, Tamale P.O. Box 52, Ghana
5 Department of Plant Production, Faculty of Environmental and Agricultural Sciences, Arish University, El-Arish 45511, Egypt
6 Department of Plant Breeding, Swedish University of Agricultural Sciences, 23456 Alnarp, Sweden
* Authors to whom correspondence should be addressed.
Agronomy 2026, 16(1), 137; https://doi.org/10.3390/agronomy16010137
Submission received: 13 November 2025 / Revised: 22 December 2025 / Accepted: 22 December 2025 / Published: 5 January 2026
(This article belongs to the Collection AI, Sensors and Robotics for Smart Agriculture)

Abstract

Climate challenges, along with a projected global population increase of 2 billion by 2080, are intensifying pressures on agricultural systems, leading to biodiversity loss, land use constraints, declining soil fertility, and changes in water cycles, while crop yields struggle to meet the rising food demand. These challenges, coupled with evolving legislation and rapid technological advancements, require innovative, sustainable agricultural solutions. By reshaping farmers’ daily operations, real-time data acquisition and predictive models can support informed decision-making. In this context, smart farming applied to plant breeding can improve efficiency by reducing inputs and increasing outputs through the adoption of digital and data-driven technologies. Examples include investment in common ontologies and metadata standards for phenotypes and environments, standardization of high-throughput phenotyping (HTP) protocols, integration of prediction outputs into breeding databases and selection workflows, as well as the building of multi-partner field networks that collect diverse envirotypes. This review outlines how AI and machine learning (ML) can be integrated into modern plant breeding methodologies, including genomic selection (GS) and genetic algorithms (GAs), to accelerate the development of climate-resilient and sustainably performing crop varieties. While many reviews address smart farming or smart breeding independently, herein these domains are bridged to provide a coherent strategic landscape for enhancing breeding efficiency.

Graphical Abstract

1. Introduction

In agriculture, it is essential to improve traits to address the various challenges that impact crop production [1]. In plant breeding, this improvement is accomplished by combining genomic (DNA markers), phenomic (trait expression or plant phenotype), and enviromic data, all contributing to desired trait expression. These three factors are interdependent and necessary for gene discovery and manipulation via modern breeding tools, including gene editing and epigenetic engineering. Methods that aid in this discovery include quantitative analysis through linkage mapping and trait association—either via a genome-wide association study (GWAS) approach or quantitative trait locus (QTL) mapping. Other methods include biometrics and genomic selection (GS), particularly targeting multi-environmental scales [2,3,4]. With ongoing advancements in next-generation sequencing and, consequently, the discovery of a greater number of molecular markers, their effects can be further explored in terms of the number of variants and their contribution to trait variability in specific environments, for use in forward and reverse crop genetics through mutation breeding and CRISPR [5]. Chawade et al. [6] pointed out that “the degree of success in changing the population’s genotypic structure by altering its gene frequency depends on precise phenotyping and selection”. Consequently, given favorable allele frequency, selection efficiency and phenotyping accuracy are directly correlated.
Smart breeding refers to the integration of advanced tools, including genomic, phenomic, artificial intelligence (AI), and machine learning (ML) tools, to enhance the accuracy, speed, and effectiveness of breeding processes and enviromics [7] (Table 1). Hence, a variety of tools are currently available to assess the changes encoded by the genome at the phenomics level. Yet, one must be aware that phenotyping procedures can be highly time-consuming and complex, particularly when thousands of plots need to be measured, which poses challenges in balancing time, cost, and quality.
Table 1. Conceptual synergies between ‘Farming Smarter’ vs. ‘Breeding Smarter’ and their intersections.
| Aspects | Farming Smarter | Breeding Smarter | Intersection: Smart Agriculture Integration | References |
| --- | --- | --- | --- | --- |
| Focus | Managing and optimizing production systems using data and technology | Improving the genetic potential of crops/animals using genomic and artificial intelligence (AI) tools | Integrating genetic, environmental, and management data to co-optimize variety/breed performance and management practices | [8,9,10] |
| Scale | Field, farm, or regional level | Population or breeding program level | Multi-scale: linking genotype × environment × management (G × E × M) interactions across farms and breeding programs | |
| Core tools | Sensors, drones, IoT, robotics, ML-driven decision support, remote sensing for management | Genotyping, phenotyping, genomic prediction, gene editing, bioinformatics for selection | Shared AI and big data analytics platforms for both genetic and management optimization | |
| Time horizon | Short- to medium-term (seasonal improvements) | Long-term (genetic gains over generations) | Continuous: real-time feedback from farm data informs breeding targets; new varieties feed back into optimized farming | |
| Data used | Environmental, soil, weather, and management data | Genetic, genomic, and phenotypic data | Integrated datasets combining genotypic, phenotypic, and environmental information for holistic modeling | |
| Outcome | Higher efficiency, sustainability, and profitability of production systems | Higher yield potential, resilience, and quality in new cultivars | Accelerated genetic gain and improved field performance through adaptive management and precision breeding | |
| Type of innovation | Process innovation: improving how farming is performed (better decisions → higher efficiency) | Product innovation: improving what is farmed, i.e., cultivars/breeds (better varieties/breeds → higher yield/resilience) | System innovation: co-designing crops, environments, and practices for maximum synergy | |
| Role of technology, particularly AI | Support on decision-making for input use, disease and pest control, irrigation, and logistics | Predicts genotype performance, identifies key genes, and enhances selection accuracy | Enabling predictive agriculture, linking genomic prediction with environmental sensing and management optimization | |
In this context, high-throughput remote sensing provides a methodology for detecting real-time crop responses that can be immediately connected with the genotype, providing a tool for making immediate decisions (Figure 1). The overall goal in agriculture is to develop an integrated system that optimizes both plant and animal outputs in a sustainable and cost-effective manner, minimizing costs while maximizing outputs. In crop production, this optimization includes the integrated use of crossbreeding and agronomic practices with advanced technologies [11]. This includes the integrated use of sensors in agronomic practices such as sowing and spraying [6], robotics as an extension of precision farming [12], weed detection and management [13], germplasm selection, and physiological and photosynthesis efficiency evaluation [14,15], as well as sustainable fertilization decision modeling [16]. In the forestry industry, which also involves tree breeding and conservation, the efficient assessment and tracking of worldwide afforestation and deforestation, land degradation, and ecological management depend heavily on product analysis based on satellite imagery [17].
In this area, robust remote sensing techniques and equipment are necessary due to the large scale of operations required to maintain forest ecosystems and structural diversity. Remote sensing is also key to supporting regulatory bodies and policy-making in areas such as wood and cork productivity, greenhouse gas (GHG) emissions, and palm oil certification [18]. Additional areas of support include estimations of biodiversity loss and degradation due to fires [19,20], along with monitoring of the water status in various orchard crops, including almond, lime, and olive trees [21]. Climate change is expected to make water availability increasingly unpredictable and constrained, thereby affecting agricultural systems more frequently and severely [22]. In remote sensing applied to breeding, unmanned aerial vehicles (UAVs) are of great value, as they can rapidly assess phenotypic traits across large breeding populations. With their ability to gather high-resolution imagery, UAVs detect variations in traits such as growth patterns, disease resistance, and stress responses, which are essential for improving breeding outcomes. Because machine learning (ML) and artificial intelligence (AI) can process large volumes of heterogeneous data generated by remote sensing platforms, including UAVs, they have become crucial tools in modern breeding pipelines. By applying ML recognition methods such as convolutional neural networks (CNNs) or supervised regression models, UAV data can reveal hidden patterns that are difficult to observe through traditional breeding methods [23,24].
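To make the link between UAV imagery and plot-level phenotypes concrete, the minimal sketch below computes the Normalized Difference Vegetation Index (NDVI) from two reflectance bands and summarizes it per breeding plot. The band layout, plot grid, and synthetic reflectance values are illustrative assumptions, not the workflow of any specific platform cited here.

```python
# Minimal sketch: a vegetation index from UAV multispectral reflectance,
# aggregated per breeding plot. Band order, plot geometry, and array shapes
# are illustrative assumptions.
import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray, eps: float = 1e-9) -> np.ndarray:
    """Normalized Difference Vegetation Index, computed pixel-wise."""
    return (nir - red) / (nir + red + eps)

# Synthetic 100 x 100 pixel reflectance mosaic with two bands (red, NIR).
rng = np.random.default_rng(0)
red = rng.uniform(0.05, 0.15, size=(100, 100))
nir = rng.uniform(0.30, 0.60, size=(100, 100))
index_map = ndvi(red, nir)

# Assign each pixel to one of 25 hypothetical 20 x 20 pixel plots.
plot_ids = (np.arange(100)[:, None] // 20) * 5 + (np.arange(100)[None, :] // 20)

# Mean NDVI per plot: one phenotypic value per genotype plot for later modeling.
plot_means = {int(pid): float(index_map[plot_ids == pid].mean())
              for pid in np.unique(plot_ids)}
print(dict(list(plot_means.items())[:3]))
```

Plot-level means such as these are the kind of phenotypic values that downstream ML or genetic models would then relate to genotypes.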
Figure 1. Case studies on unmanned aerial vehicle (UAV) utility: (a) Tattaris et al. [25] demonstrated the capability of using image processing for monitoring crop growth and yield. This methodology involved the capture of high-resolution images of a crop field from a UAV, which were processed to calculate vegetation indices including the Normalized Difference Vegetation Index (NDVI) and Green Area Index (GAI). The technique was efficient in tracking both growth and yield over time with a high level of accuracy. (b) The Satellite Applications Catapult (Accessed on 21 December 2025; https://sa.catapult.org.uk/industry-news/ahdb-satellites-for-agriculture/) is an online farm management platform called FarmSAR that integrates satellite data with farm management practices. The platform allows farmers to observe real-time information on weather, soil moisture, and crop growth, thereby allowing them to make informed decisions about irrigation, fertilization, and harvesting. Machine learning algorithms were additionally used to analyze the data and provide recommendations for optimal crop management. (c) Using Sentinel-2 satellite imagery data, the European Space Agency (ESA) monitored crop growth and yield across different regions in Poland during two consecutive years. The data were then processed to develop crop maps and predict both yield and crop health in order to demonstrate the proof-of-concept on the feasibility and reliability of using satellite data in crop monitoring and yield forecasting [26].
Despite the increasing number of reviews and research articles in this area [Supplementary Figure S1d], most studies tend to either overlook the genetic aspect or address it in a simplified, non-integrative way. In this context, besides providing an update on proof-of-concept and small-scale technical trials, we discuss the integration of genetic algorithms and phenotyping techniques in the context of plant breeding for easy assimilation.

2. Review Methodology

Data Sources and Exclusion Criteria

Following the PRISMA 2020 guidelines [27], the methodology for this review employed a literature search followed by screening and qualitative synthesis. Relevant research studies were retrieved from several knowledge platforms, including the Scopus, Web of Science, PubMed, and Google Scholar databases. The search was conducted without date restrictions and limited to scientific articles and reviews using the keywords ‘plant breeding’, ‘artificial intelligence’, ‘machine learning’, ‘remote sensing’, and ‘genetic algorithm’. An additional search incorporated the terms ‘ethics and regulations’ to capture governance-related aspects. The literature search conducted on Scopus resulted in 294 articles being selected (Supplementary Figure S1a–d). As expected, the number of annual publications has increased markedly over the last 33 years, with a slightly lower number of publications when the keywords ‘ethics’ and ‘regulations’ are applied. Inclusion and exclusion criteria are detailed in Supplementary Table S1. Among the non-review articles selected after screening, the dataset also includes published frameworks, theses, and technical reports. These were consulted selectively to provide contextual, methodological, and regulatory insight not fully represented in the peer-reviewed literature. Article selection was carried out manually based on the assessment of titles and abstracts, and full texts were consulted when necessary. Additional relevant publications were identified through reference screening of key articles. No formal risk-of-bias assessment or structured frameworks such as PICO were applied, as the aim of this review was narrative synthesis rather than quantitative comparison. Overall, the trends indicate rapid growth in research on, and the viability of, ML, AI, and genetic algorithms in agriculture. This is particularly evident over the past five years (Supplementary Figure S1a–d), highlighting increasing interdisciplinarity relevant to plant breeding. It is interesting to pinpoint that some publication peaks (e.g., 2008) reflect studies from human epidemiology rather than agricultural research, underscoring the need for caution when interpreting keyword-based bibliometric trends.

3. Smart Farming

The emergence of smart farming has brought together farming management with modern information and communication technologies (ICTs) to increase efficiency. It includes the use of sensors, software, positioning technologies, robotics, and data analytics, which allow for precision agriculture and livestock farming facilitated by the use of AI and ML tools [28]. Regarded as the Third Green Agricultural Revolution, it includes processing, automation, and the use of drones to collect multispectral and thermal imagery. In modern agriculture, ML offers significant advantages in increasing accuracy and problem-solving, as it is used to identify, classify, quantify, and predict (ICQP) challenges within a development cycle [23]. Although often used interchangeably, AI and ML are not synonymous. ML is a subset of AI that focuses on algorithms capable of learning patterns from data without being explicitly programmed. In agricultural breeding and management, ML models enable data-driven decisions at increasingly fine spatial and temporal resolutions. For example, sensor data analyzed with ML can support decision-making at the level of individual plants or animals or per square meter in the field. This allows breeders and farmers to tailor interventions in operations such as irrigation, fertilization, or selection based on site-specific conditions [29].
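As an illustration of such fine-grained decision support, the hedged sketch below trains a simple classifier on simulated soil-moisture, canopy-temperature, and NDVI readings to flag grid cells that need irrigation; the feature set, thresholds, and data are assumptions chosen only to show the general workflow, not any deployed system.

```python
# Illustrative sketch (not a specific published pipeline): a supervised
# classifier flags square-meter grid cells needing irrigation from sensor
# features. Feature names, thresholds, and data are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)
n_cells = 500
# Simulated per-cell sensor readings: volumetric soil moisture (m3/m3),
# canopy temperature (degC), and NDVI.
soil_moisture = rng.uniform(0.05, 0.35, n_cells)
canopy_temp = rng.uniform(20, 38, n_cells)
ndvi = rng.uniform(0.2, 0.9, n_cells)
X = np.column_stack([soil_moisture, canopy_temp, ndvi])

# Toy ground truth: dry, hot cells are labeled as "needing irrigation".
y = ((soil_moisture < 0.15) & (canopy_temp > 30)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("hold-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
```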

4. Satellite Imaging, UAVs, and Proximal Phenotyping in Plants

4.1. Satellite Imaging and GIS

Satellite imaging is a technology that has been in use since the 1960s and is widely applied in agriculture for surveillance, for monitoring land and water resources, and for tracking the land’s main agricultural activities under governmental regulations [30]. This is performed through the utilization of Geographic Information Systems (GIS) and online resources to deliver multispectral imagery for superior agricultural management, utilizing data from satellites, aircraft, and UAVs to analyze land surface phenology (LSP) metrics and support effective decision-making. The use of GIS technology in remote sensing has facilitated the updating and correction of plot boundaries, as well as the reclassification of land occupation within a country, alongside photointerpretation [31,32]. These activities aid governments in offering fair subsidies to farmers based on the information gleaned from these resources [33]. Satellite imagery has proven to be incredibly effective, particularly for larger areas where usage restrictions are not a concern, with high- to medium-resolution capabilities. The utilization of satellite imagery has allowed for increased efficiency in various agricultural practices (Table 2).
The European Space Agency (ESA) currently hosts an impressive fleet of 92 satellites in space. Currently, between 70% and 97% of Global Positioning System/Global Navigation Satellite System (GPS/GNSS) tractors used throughout Europe rely on signals transmitted by the European navigation satellites through Galileo and EGNOS (https://www.euspa.europa.eu/, accessed on 21 December 2025) (Table 3). The system’s accuracy enables farmers to precisely steer their tractors. Ongoing projects in this area include ESA WorldCover and ESA WorldCereal from the European Copernicus Program (https://esa-worldcover.org/en, accessed on 21 December 2025). These platforms are utilized by key users such as the Food and Agriculture Organization of the United Nations (FAO), the Organization for Economic Cooperation and Development (OECD), the Center for International Forestry Research (CIFOR), and the United Nations Convention to Combat Desertification (UNCCD). Such platforms are employed in precision farming to dispense precise doses of fertilizers only where necessary. This is made possible by precise soil mapping and accurate positioning.

4.2. UAV-Based and Proximal Phenotyping

Small plots containing thousands of genotypes (with an average size of approximately 1 m2) cannot be adequately evaluated using satellite sensors such as WorldView-3. Compared with satellite-based techniques, unmanned aerial systems (UASs) increase throughput and frequency for phenotyping and provide the highest resolution [34]. In the event that past or present satellite imagery is required, such data can be accessed through web-based platforms such as the Google Earth Engine, Planet.com, Earth Data Search by the National Aeronautics and Space Administration (NASA), and LandViewer by the Earth Observing System [35]. For small plots, unmanned aerial vehicles (UAVs) and drones are more suitable, as they can be deployed frequently at any moment of the breeding process, particularly to minimize cost whilst maximizing use. Both technologies are superior to satellite imaging when it comes to accounting for environmental conditions such as cloud cover and to spatial resolution. Unmanned aerial vehicles come in different forms, including parachutes, blimps, rotocopters, and fixed-wing systems [36], and they have been tested for their utility in plant breeding for multiple purposes in diverse crops (Table 2). In some cases, single-propeller drones, combined with ML techniques, have been shown to accurately (98.5%) monitor and manage pests [37].
Table 2. Purposes and resolution systems that can be used in agricultural phenotyping in service of plant breeding.
| Purpose | Organism | Reference |
| --- | --- | --- |
| High-resolution satellite systems (HRSS) | | |
| Mapping leaf area index | Grapevine; giant bamboo | [38,39,40] |
| Surface soil properties, soil mapping, soil salinity, moisture, and pH | Soil | [31,32,33,34,35,36,37,38,39,40,41,42,43,44,45] |
| Yield monitoring and prediction | Sorghum, cotton, sugar beet, spring wheat, corn/maize, and sunflower | [46,47,48,49,50,51] |
| Disease detection | Wheat, rice, citrus | [52,53,54] |
| Agronomic parameters, N quantification and fertilization, protein content | Maize, barley, wheat, turfgrasses | [55,56,57,58,59,60,61] |
| Crop identification | | [62] |
| Forest burn index evaluation | Trees and forest ecosystem | [19] |
| Photosynthetic capacity | Various | [15] |
| Unmanned aerial vehicle (UAV) | | |
| Growth stage determination | Bambara groundnut; cotton | [63,64] |
| Structural/morphological trait evaluation (biomass, height, count) | Barley, sugarcane, maize | [65,66,67] |
| Leaf area index (LAI) | Soybean, maize, sorghum, bambara groundnut, vineyard | [39,68,69,70,71,72] |
| Yield forecast | Maize, wheat, barley, canola, field peas, rice, sugarcane, rye, cotton, bambara groundnut, soybean | [72,73,74,75,76,77,78,79,80,81] |
| Vegetation and soil segmentation | | [82] |
| Crop row detection, tree detection and classification, fire monitoring | Coniferous trees, forest ecosystem | [82,83,84] |
| Nitrogen (N) estimation | Soybean; bread wheat; sugarcane | [85,86,87,88] |
| Crop stress and crop phenotyping monitoring and evaluation | Sugarcane, citrus, wheat, oilseed rape, maize; black poplar | [85,89,90,91,92,93] |
| Must quality parameters, vigor zones, yield, diversity | Grapevine | [94,95,96,97,98] |
| Disease detection | Citrus, avocado, banana, wheat, groundnut | [99,100,101,102,103,104] |
| Irrigation scheduling | Fruit trees | [105] |
| Carbon stock and sequestration above ground, carbon dynamics | Forest trees, mangrove | [84,106,107] |
| Reproductive traits (floral opening) | Lettuce | [108] |
| Unmanned ground vehicle (UGV) | | |
| Row detection | Lettuce | [109] |
| Operations on peat fields | | [110] |
| Ground properties of greenhouses | | [111] |
Table 3. Machine learning models related to crops.
Table 3. Machine learning models related to crops.
OrganismTraitModelReference
Yield
Coffee treeNumber of branches, % of fruit weight and maturationSVM[112]
Cherry treeHarvesting mechanizationBM/GNB[113]
Citrus treeEarly yield mappingSVM[114]
GrassEstimation of biomassANNs and multitemporal remote sensing data[115]
Wheat, appleYield predictionSatellite imagery + soil data; MLP/CNN, SVR[116,117,118]
TomatoFruit detection/countingSensed RGB images, CNN[119,120]
RiceDevelopment stage predictionSVM and basic geographic information[121]
SugarcanePlant heigh and stalk density [80]
LemonQuality assessment/controlCNN[122]
RiceGrain protein contentDCGAN[123]
Land vegetationSoil heavy metal monitorizationVarious[124]
Biotic stresses
Mediterranean milk thistleInfection rate to smut fungus, weed detectionANN/XY-Fusion, ANN/CP[125]
StrawberryThrips detection; Botrytis sp., Penicillium sp., and Rhyzopus sp. discriminationSVM, NN[126,127]
RiceDisease and geographical origin detectionSVM, EL/RF[128,129]
WheatDisease infection rate to yellow rust and Septoria, N and H2O stress, weed managementANN/XY-Fusion, ANN/MLP, SVM/LS-SVM, ANN/SOM, DNN[13,125,130,131,132,133]
Maize, soybeanWeed detection and controlANN/one-class SOM; CNN; UFAB/DNN, DL[134,135,136,137]
PearsFragrancy detectionSVM/SPA-SVM[138]
Beans, soybeanIdentification and classification, root system architecture (RSA)DL/CNN, CNN[139,140]
Common grape vineHealth status, powdery mildew, black rot, downy mildewSVM, Gaussian Mixture Model (GMM)/LBPs[141]
BananaDisease and pest detection (e.g., Black Sigatoka)CNN/DCNN, CNN-VGG[101,142]
Quality Control/Quality assurance
TobaccoRecognition of non-tobacco-related materialsCNN: LRNTRM-YOLO[143]
Overall, the precision of error estimation and image capture can be enhanced by using high-resolution cameras, flying at lower altitudes, and employing ground control points (GCPs) [144]. Proximal phenotyping utilizes ground-based vehicles and sensors, bringing together automated AI with genomics, agronomy, and ecophysiology. Stress sensors are attached to vehicles, pegged in the ground, or suspended on strings to obtain data in the visible and thermal ranges; they are also used to collect canopy data on chlorophyll, plant water, nitrogen, leaf area, plant height, seedling vigor, maturity, biomass, diseases such as rust, and pests [145]. In the last few decades, several investigations have demonstrated the practical utility of proximal phenotyping in different crops [146,147,148,149,150,151,152]. Mobile platforms provide multiple benefits over handheld sensors by measuring multiple traits simultaneously, leading to significant savings in time, cost, and labor. However, achieving the best results may still require technical expertise. In the context of wheat cultivation, yield can be forecast with an average accuracy of up to 70% if the secondary traits are phenotyped at a high level of precision [153]. Notable examples of such devices in agriculture include the robot Oz used for mechanical weeding (https://www.naio-technologies.com/en/oz-robot/, accessed on 21 December 2025), the autonomous sprayer GUSS (https://gussag.com, accessed on 21 December 2025) operated by means of a laptop, and robot systems capable of autonomously identifying and removing weeds, such as the one offered by Blue River (https://www.theblifemovement.com/blue-river-technologys-precision-weed-control-machine/, accessed on 21 December 2025), among others. Such innovative solutions help farmers manage weeds without relying on herbicides or enhance the efficiency of herbicide spraying.
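As a simple illustration of forecasting yield from precisely phenotyped secondary traits, the sketch below fits a linear model to simulated plot-level NDVI, canopy temperature, and plant height; the coefficients, noise level, and trait set are assumptions chosen only to show the workflow, not the cited wheat study.

```python
# Minimal sketch of the idea that precisely phenotyped secondary traits
# (e.g., NDVI, canopy temperature, plant height) can forecast yield.
# The data are simulated; coefficients and noise levels are assumptions.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_plots = 300
ndvi = rng.normal(0.7, 0.08, n_plots)
canopy_temp = rng.normal(27, 2.0, n_plots)   # degC; cooler canopies tend to yield more here
height = rng.normal(95, 8.0, n_plots)        # cm
X = np.column_stack([ndvi, canopy_temp, height])

# Simulated yield (t/ha) driven by the secondary traits plus noise.
yield_tha = 2.0 + 6.0 * ndvi - 0.08 * canopy_temp + 0.01 * height + rng.normal(0, 0.4, n_plots)

model = LinearRegression()
r2_scores = cross_val_score(model, X, yield_tha, cv=5, scoring="r2")
print("cross-validated R2:", round(r2_scores.mean(), 2))
```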

4.3. Precision and Generalist Agriculture

Although phenotyping for precision and generalist agriculture may seem to have opposing goals, they actually converge towards the same target through different approaches. Phenotyping in generalist agriculture aims to maximize yield, with a focus on broad adaptability, ideal conditions, and average performance, while phenotyping for precision seeks to prevent factors that limit yield maximization, with a focus on site-specific management, stress response, and environment–genotype interaction [6]. Even if the ultimate objective of both approaches is the increase in crop productivity at a reduced production cost, they differ in focus. Generalist agricultural phenotyping aims to select traits that perform well under optimal conditions, whereas precision agricultural phenotyping aims to ensure consistent performance across variable or stressful conditions. In the latter, the key requirement for success will be the choice of appropriate sensors and methods that best fit the specific trait, environment, and costs. To maximize the return on investment, decision support platforms or systems (DSP or DSS) should be designed to integrate the data coming from phenotyping, weather, genotyping, economics, and satellite imaging [6]. This would enable more timely and informed decision-making in agriculture. Low-cost UAV data can be correlated with global satellite remote sensing databases to improve data integration and enhance decision-making. Furthermore, combining information from UAVs and Unmanned Ground Vehicles (UGVs) can lead to better decision-making outcomes.

5. Integration of Remote Sensing AI and Genetic Algorithms in Phenotyping to Identify Loci Associated with Agronomically Beneficial Traits

5.1. Genetic Algorithms: Principles and Optimization

The integration of machines, vehicles, and systems provides an accurate and reliable tool for the adoption of low-input, high-efficiency, and sustainable (LHS) agricultural decision support [154]. In the field of AI, several metaheuristic single- and multiple-population-based evolutionary algorithms have been proposed, including genetic algorithm (GA) methods [155]. The difference between traditional algorithms and GAs is well established, with the former following a fixed set of rules and logic to obtain a solution. Traditional algorithms are therefore broader and may refer to standard algorithms that solve common tasks such as sorting and searching and are not tied to evolutionary principles. Instead, GAs seek to optimize problem-solving strategies based on a natural genetic selection process that mimics biological evolution and incorporates search fine-tuning mechanisms such as selection, crossover, and mutation [156]. GAs work by trial and error: during the selection stage, individuals are chosen for reproduction and poor solutions are discarded, while the crossover (also known as recombination) and mutation stages (the genetic operators) explore the search space and may retain less-fit individuals [157]. In certain cases, these less-fit individuals are kept in later stages to maintain genetic diversity and to avoid premature convergence, which can occur if the population becomes too similar [158], preventing the algorithm from exploring promising areas of the search space. In addition, there may be a need to escape ‘local optima’, as these ‘to-discard’ individuals may contain useful traits to be combined in later generations [159]. This can ensure a more robust search in which the algorithm has the chance to explore multiple niches, in a so-called ‘steady-state and niching technique’ [160]. GAs have therefore been constructed and applied to a variety of optimization problems in which a population of chromosomes (also referred to as individuals) encodes potential solutions (Figure 2). These solutions then undergo mutation and recombination, giving rise to new offspring over a number of generations. To create an effective encoding and apply genetic operators, it is crucial to know input parameters such as the population size (Pop), the maximum number of iterations (iter), and the stopping criteria. Additionally, the proportion of elite individuals cloned across iterations (Elite), as well as the proportions of individuals generated by mutation (Mut) and crossover (Xover)—which simulates reproduction between two parent solutions—must fulfill the following expression:
1 = Elite + Mut + Xover.
Genetic encoding allows for the assignment of hypotheses to diverse options, and individuals can be generated either randomly or obtained by constructive methods. It is expected that the initial population, consisting of random permutations, will explore the maximum number of solutions to improve the GA. In order to perform natural selection, each individual is evaluated based on its fitness value, which is used for individual selection in line with the Darwinian theory of “survival of the fittest”. The cloning and mutation stages induce small random changes in the solution by gradually adding new characteristics, while the elitism stage maintains certain individuals from one generation to the next by cloning them [161].
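The hedged sketch below implements these steps for a toy objective, keeping the elitism, mutation, and crossover proportions constrained by Elite + Mut + Xover = 1; the fitness function, operator rates, and population size are illustrative assumptions, not a prescription from the cited studies.

```python
# Minimal genetic algorithm sketch following the text: the proportions of
# elitism, mutation, and crossover sum to one (Elite + Mut + Xover = 1).
# The fitness function and parameter values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(7)

def fitness(pop: np.ndarray) -> np.ndarray:
    # Toy objective: maximize the number of ones in a binary chromosome.
    return pop.sum(axis=1).astype(float)

def run_ga(pop_size=50, n_genes=30, n_iter=100, elite=0.1, mut=0.2, xover=0.7):
    assert abs(elite + mut + xover - 1.0) < 1e-9, "Elite + Mut + Xover must equal 1"
    pop = rng.integers(0, 2, size=(pop_size, n_genes))   # random initial population
    n_elite, n_mut = int(elite * pop_size), int(mut * pop_size)
    n_xover = pop_size - n_elite - n_mut
    for _ in range(n_iter):
        order = np.argsort(fitness(pop))[::-1]           # rank by fitness (descending)
        elites = pop[order[:n_elite]].copy()             # elitism: clone the best individuals
        # Crossover: mate pairs of parents drawn from the top half of the population.
        parents = pop[order[: pop_size // 2]]
        children = []
        for _ in range(n_xover):
            p1, p2 = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n_genes)
            children.append(np.concatenate([p1[:cut], p2[cut:]]))
        # Mutation: small random changes to randomly chosen individuals.
        mutants = pop[rng.integers(pop_size, size=n_mut)].copy()
        flips = rng.random(mutants.shape) < 0.05
        mutants[flips] = 1 - mutants[flips]
        pop = np.vstack([elites, np.array(children), mutants])
    best = pop[np.argmax(fitness(pop))]
    return best, fitness(best[None, :])[0]

best, best_fit = run_ga()
print("best fitness:", best_fit)
```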

5.2. Integration with Remote Sensing and UAV-Based Phenotyping

Remote sensing AI technology readings are based on the correlation between captured radiation and the properties of the targeted objects. This radiation or reflectance contains information at the physical and chemical levels, which is strongly associated with vibrating molecular bonds such as carbon–hydrogen (C-H), oxygen–hydrogen (O-H), or nitrogen–hydrogen (N-H) bonds [162]. The final output is a specific signature of reflected light used to calculate spectral indices based on algorithms [25]. To find the weights for each trait index evaluated, either by satellite or UAVs, the normalized spectrum values for each image pixel can be fed into a GA to identify an optimal function that relates the generated data [163]. The GA is based on the evolution of individuals, which carry alleles that represent them as a population. In each generation, new allele combinations are produced, and the best individuals are chosen as parental lines for a particular environment, passing on their genes to the next generation. The GA model created by [163] ranks a set of individuals with their individual weights for every generation using a fitness function, giving a score to each and every individual. This ultimately selects the best fit on the basis of three main steps: initialization (1), selection (2), and genetic operators (3) (Figure 3). This model could serve as a low-cost alternative to current plant phenotyping methods. Recently, the third-generation non-dominated sorting GA (NSGA-III) was described; it is intended to address optimization problems with multiple conflicting objectives, thereby focusing on improving the diversity of solutions and enhancing efficiency [164]. Ref. [108] utilized ML as a versatile tool to analyze large amounts of data and correlate them with genomic data. Specifically, they used drone-mediated imagery to track the spatiotemporal behavior of lettuce individuals and associated it with genetic profiles through ML and Bayesian inference methods. By doing so, they were able to identify two causal loci (daily floral opening, qDFO2.1 and qDFO8.1) related to differential floral opening and closing times, explaining 30% of the phenotypic variation in floral opening time. This study demonstrates the potential of UAV imaging technology to accelerate breeding efforts and adapt research frameworks for crops like lettuce, where flowering is a critical trait for genetic improvement [108].
A new multi-species binary-coded algorithm, METO, was proposed that differs from the traditional GA in that it produces two consecutive generations of offspring in each evolution epoch instead of just one [F1 = crossbreeding of F0 generation parents; F2 = self-breeding of the F1 generation parents] [165,166]. The METO algorithm uses two parallel routes for transferring genes from one generation to the next and may be helpful in resolving more complex problems. In soybean, it was shown that near-infrared spectroscopy (NIR) data can be used for phenotypic prediction at different stages of a breeding program, with comparable, and in some cases greater, predictive ability than genomic predictions [167]. The authors found that phenomic predictions—a term introduced by [168]—were less sensitive to relatedness between the training and prediction sets and could outperform genomic prediction in certain scenarios, including those involving seed yield and plant height. When a GA is applied (Figure 3), a small number of NIR wavelengths may be sufficient, without a loss in phenotypic prediction ability, compared with using biallelic genotypic markers such as single-nucleotide polymorphisms (SNPs) [167].
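The hedged sketch below illustrates this idea on simulated data: a simple genetic algorithm searches for a small subset of NIR wavelengths whose ridge-regression phenomic prediction retains accuracy. The data, GA settings, and model choice are assumptions and do not reproduce the METO algorithm or the soybean pipelines cited above.

```python
# Illustrative sketch, not the authors' exact method: a genetic algorithm
# selects a small subset of NIR wavelengths, and ridge regression on the
# selected bands is used for phenomic prediction. All data are simulated.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n_lines, n_waves = 200, 120                      # breeding lines x NIR wavelengths
spectra = rng.normal(size=(n_lines, n_waves))
true_waves = rng.choice(n_waves, 10, replace=False)
trait = spectra[:, true_waves] @ rng.normal(size=10) + rng.normal(0, 0.5, n_lines)

def fitness(mask: np.ndarray) -> float:
    """Cross-validated R2 of ridge regression on the wavelengths flagged in the mask."""
    if mask.sum() == 0:
        return -1.0
    X = spectra[:, mask.astype(bool)]
    return cross_val_score(Ridge(alpha=1.0), X, trait, cv=3, scoring="r2").mean()

pop_size, n_iter = 30, 25
pop = (rng.random((pop_size, n_waves)) < 0.1).astype(int)   # sparse initial masks
for _ in range(n_iter):
    scores = np.array([fitness(ind) for ind in pop])
    survivors = pop[np.argsort(scores)[::-1][: pop_size // 2]]   # selection of the fittest
    children = []
    for _ in range(pop_size - len(survivors)):
        p1, p2 = survivors[rng.integers(len(survivors), size=2)]
        cut = rng.integers(1, n_waves)
        child = np.concatenate([p1[:cut], p2[cut:]])              # crossover
        flips = rng.random(n_waves) < 0.02                        # mutation
        child[flips] = 1 - child[flips]
        children.append(child)
    pop = np.vstack([survivors, np.array(children)])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("selected wavelengths:", int(best.sum()), "| cross-validated R2:", round(fitness(best), 2))
```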
Figure 3. Workflow for developing a genetic algorithm (GA) using unmanned aerial systems (UASs): (a) an image of the electromagnetic spectrum indicating the spectral ranges used by sensing tools (retrieved from https://en.wikipedia.org/wiki/Infrared_vision, accessed on 21 December 2025), which feed into (b) the collection of phenotypic data, with a specific example of how to use hyperspectral reflectance in a rice paddy to reflect the seasonal change in the crop for better decisions [14]. After the dataset is chosen, data analysis is performed, including statistical analysis, visualization of input and output features [169], and a pre-processing step, as highlighted by Figure 1 of [170], which can include label encoding, data imputation, and data mapping and splitting, followed by the development of the GA per se. (c) The developed GA (based on [163]). In the elitism stage, the top individuals of the population are selected for the next generation. In the crossover stage, the parents are mated to create progeny carrying a combination of their genes. After crossover, the remaining population is then completed by removing the two parents and progeny and adding a mutation factor. At the end of the workflow, the best recommendations are made, which may lead to performance testing or evaluation.
In diploid organisms with three allelic classes (AA, Aa, and aa), the number of possible variations considered by the authors was as simple as 3^k, with k indicating the number of markers utilized. NIR data outcomes were found to be similar to multi-allelic markers observed in transcriptomic data, as these do not rely on a predefined number of elements and instead present continuous variation with multiple informative states [167]. Each wavelength must therefore be regarded as a phenotype influenced by many loci and treated in an interdependent manner. In such cases, a substantial reduction in the number of wavelengths used for prediction can maintain the phenomic predictive ability. However, this predictability is most effective for traits that produce clear trait-specific signals.
In contrast, for traits that vary primarily due to subtle differences in allelic states at a specific QTL, the signals may not be distinct enough to enable reliable predictions. If diagnostic markers are available for a specific trait, these could be used in the pre-selection of lines before phenotypic prediction in models (Table 4). Zhu et al. [167] suggested that phenomic prediction would be preferred for traits with non-additive genetic effects, where interactions between genes are important. In contrast, genomic prediction is more efficient and accurate for traits with additive effects, where each gene contributes independently and breeding values are needed [171]. However, phenomic prediction may be more useful in cases of higher complexity. In any case, a training population is needed for accuracy in all scenarios. For genomic predictions, the highest accuracy is obtained in full-sibs, followed by half-sibs and unrelated families, in that order. In contrast, phenomic prediction achieved similar accuracy in full- and half-sibs, whereas unrelated families performed similarly to half-sibs [167]. In triticale, the prediction accuracy when applying models from one family to another was lower than the average accuracy seen within the same family, based on full siblings [172]. The highest prediction results obtained in half-sib families in triticale were also obtained in other crops such as barley [173].
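To make the genomic-versus-phenomic comparison tangible, the minimal sketch below fits ridge regression to simulated biallelic SNPs and to simulated NIR spectra generated from the same underlying genetic values; the simulation design and model are illustrative assumptions, not a reanalysis of the cited studies.

```python
# A minimal sketch comparing genomic prediction (biallelic SNPs coded 0/1/2)
# with phenomic prediction (continuous NIR wavelengths) using ridge
# regression; simulated data, illustrative only.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(11)
n_lines, n_snps, n_waves = 300, 500, 150

snps = rng.integers(0, 3, size=(n_lines, n_snps)).astype(float)   # aa/Aa/AA coded 0/1/2
qtl = rng.choice(n_snps, 20, replace=False)
genetic_value = snps[:, qtl] @ rng.normal(size=20)
phenotype = genetic_value + rng.normal(0, 1.0, n_lines)

# NIR spectra simulated as noisy linear readouts of the same genetic value,
# mimicking the idea that spectra integrate many loci simultaneously.
nir = genetic_value[:, None] * rng.normal(size=(1, n_waves)) + rng.normal(0, 1.0, (n_lines, n_waves))

for name, X in [("genomic (SNPs)", snps), ("phenomic (NIR)", nir)]:
    r2 = cross_val_score(Ridge(alpha=10.0), X, phenotype, cv=5, scoring="r2").mean()
    print(f"{name}: cross-validated R2 = {r2:.2f}")
```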

5.3. Broader Applications in Agriculture and Food Systems

Besides their use in crop fields, the importance of GAs in the agrifood industry lies in their potential to enhance operational efficiency and optimization. Examples of their use in this area include the determination of the most efficient production schedules in food manufacturing facilities. Optimized schedules reduce costs and ensure the timely delivery of fresh produce [174]. Here, both productivity and customer satisfaction increase, as optimized schedules contribute to waste reduction within the supply chain. In soil sciences, which is a key integrative area in crop genetics and breeding, a recently introduced model combines multiple factors, including soil composition, weather conditions, and historical crop yields. This combined approach, referred to as a ‘hybrid model’, allows for better optimization of random forest (RF) classifiers while improving their ability to predict crop outcomes more accurately [175,176]. Using this model, ref. [170] achieved an impressive 99.3% accuracy in predicting the outcomes for 22 different crop groups [pulses & beans, cereals, fruits, oilseeds and fiber crops, vegetables and spices & beverages]. In Spain, the use of a GA as a decision support system to manage the irrigation frequency plan in the fields not only reduced water and energy consumption but also ensured optimal irrigation coverage, thereby contributing to resource efficiency [177]. This is particularly important in addressing sustainability challenges in water-scarce agricultural regions such as the south of the Iberian Peninsula, northern Africa, or the arid northern regions of India. Moreover, GAs can be used in various UAV optimization applications, alongside Particle Swarm Optimization (PSO), differential evolution (DE), and other bioinspired search methods. One such example is the gain added by GA tuning of a UAV fuzzy controller, leading to improvement in the path tracking of the UAV device [178]. Yet, as outlined by the authors, this tuning method may not be applicable to all devices. Similarly, in winter wheat, the optimization of RF parameters through a GA led to improvements in the accuracy of predicting both chlorophyll and anthocyanin contents [179,180]. However, as pointed out previously [181,182], this approach alone is not enough, and caution must be exercised when making recommendations. While hybrid models are useful for making accurate predictions through modeling [61], they can still provide highly inaccurate input application recommendations. Here, the choice of machine learning algorithm and covariate selection are crucial, as insufficient consideration of model uncertainty may lead to highly undesirable input use decisions [181]. The study conducted by the authors showed a difference in gains of about 340% due to these differentiated selections. Using the HI-WUE integrated index, which combines the harvest index and water use efficiency, ref. [183] suggested a framework that is able to identify rice ideotypes optimized for resource efficiency. By using a wide range of virtual cultivars under diverse environmental conditions, this multidimensional GA-based optimization of the CERES-Rice crop model was able to quantify the genetic distance between computationally optimized ideotypes and field-characterized cultivars. This approach shows that the co-development of crop models and phenotyping platforms has the potential to translate predictions into practical breeding outcomes for climate-resilient crop varieties.
Despite work emphasizing the predictive potential of mechanistic models and AI-based optimization strategies in defining crop ideotypes [183], it is also clear that the interaction between a crop’s genetics and local environmental factors strongly determines its performance. These interactions create barriers to providing breeding methods that work universally across all regions, as crops need to be tailored to specific climates.
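As a hedged illustration of the GA-tuned random forest approach reported for winter wheat above, the sketch below searches a small grid of hyperparameters with an integer-coded GA on simulated spectral indices; the data, parameter ranges, and GA settings are assumptions, not those of the cited studies.

```python
# Hedged sketch: tuning random forest hyperparameters with a small genetic
# algorithm to predict a canopy trait (e.g., chlorophyll content) from
# simulated spectral indices. All settings are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
X = rng.normal(size=(200, 12))                                    # simulated spectral indices
y = X[:, 0] * 2 + X[:, 3] - X[:, 7] + rng.normal(0, 0.5, 200)     # simulated chlorophyll proxy

N_EST = [50, 100, 200, 400]
MAX_DEPTH = [3, 5, 8, None]
MAX_FEAT = [0.3, 0.5, 0.8, 1.0]

def fitness(ind):
    """Cross-validated R2 for one integer-coded hyperparameter combination."""
    model = RandomForestRegressor(n_estimators=N_EST[ind[0]], max_depth=MAX_DEPTH[ind[1]],
                                  max_features=MAX_FEAT[ind[2]], random_state=0)
    return cross_val_score(model, X, y, cv=3, scoring="r2").mean()

pop = rng.integers(0, 4, size=(8, 3))            # integer-coded hyperparameter choices
for _ in range(5):                               # a few GA generations
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[::-1][:4]]  # selection of the best combinations
    children = []
    for _ in range(4):
        p1, p2 = parents[rng.integers(4, size=2)]
        child = np.where(rng.random(3) < 0.5, p1, p2)   # uniform crossover
        if rng.random() < 0.3:                          # occasional mutation
            child[rng.integers(3)] = rng.integers(4)
        children.append(child)
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("best settings:", N_EST[best[0]], MAX_DEPTH[best[1]], MAX_FEAT[best[2]])
```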

6. Data Integration—Multi-Omics Data to Enhance Genetic Predictions

6.1. Metabolomics, Multi-Sensor Integration, and ML Approaches

In plant phenotyping, metabolome analysis is helpful for understanding complex biological traits, particularly when the integration of extensive physical and spectral data, including chemistry data, is required. However, accurately identifying metabolites to determine their biological relevance remains a significant challenge. Here, the identification of specific biomarkers could support the understanding of the biological significance of the data [184]. Despite the promise of non-invasive, automated metabolomics, cost remains a major obstacle. The datasets generated from metabolomics are usually difficult to visualize and interpret and require advanced modeling approaches, as these assays usually do not capture the systemic environment of metabolites [185]. Integrating multiple sensors, such as optical molecular spectroscopy, imaging, and mass spectrometry, into automated plant phenotyping facilities can improve the understanding of complex plant features. Data fusion techniques, such as statistical multimodal data analysis and deep learning, can enhance the knowledge obtained from metabolomics, transcriptomics, and imaging data, supported by software tools that are available to integrate omics data (Table 1; [184]). In this context, the application of ML and network analysis not only allowed for the prediction of biochemical pathways in tomato with metabolite data [186] but also enabled phenotype prediction using genes, transcripts, and metabolites together [187]. At this data analysis stage, ML can improve accuracy, and, depending on the datasets, supervised, unsupervised, and semi-supervised approaches may be used [188]. In order to categorize or generate predictions based on input attributes, a supervised ML method uses a labeled dataset. The labeled examples from the training datasets are used to teach the algorithm a mapping from the input data to a target output or label. It has been shown that supervised models like ‘DeepGS’ [189] can perform better than conventional GS models when used in genome-wide (GW) analysis predictions [190]. Suggestions exist that other models may perform better than DeepGS [191]. However, this needs more evidence, as it seems that no single model works well for all traits studied in wheat [192]. Nevertheless, the goal of unsupervised ML models is to find patterns, structures, or correlations in unlabeled data without the need for explicit supervision or a predetermined target variable [193] (Figure 4). By using a hybrid approach, a model can enhance its performance by utilizing both the additional unlabeled data and the limited amount of labeled data.
Overall, integrating metabolomics with multi-sensor data and ML enables more accurate phenotype prediction since it captures complementary biological signals that are not accessible through single-omics approaches alone, thereby reducing breeding cycle time and resource use.
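The short sketch below contrasts the two learning modes described above on a simulated metabolite matrix: a supervised regressor maps features to a labeled trait, while dimension reduction and clustering look for unlabeled structure. The data layout and model choices are illustrative assumptions only.

```python
# Minimal sketch contrasting supervised and unsupervised ML on a simulated
# multi-omics feature matrix (e.g., metabolite intensities).
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(8)
n_samples, n_features = 150, 60
omics = rng.normal(size=(n_samples, n_features))                   # metabolite/transcript matrix
trait = omics[:, :5].sum(axis=1) + rng.normal(0, 0.5, n_samples)   # labeled phenotype

# Supervised: learn a mapping from omics features to the labeled trait.
r2 = cross_val_score(GradientBoostingRegressor(random_state=0), omics, trait,
                     cv=5, scoring="r2").mean()
print("supervised cross-validated R2:", round(r2, 2))

# Unsupervised: find structure without labels (dimension reduction + clustering).
scores = PCA(n_components=2).fit_transform(omics)
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)
print("cluster sizes:", np.bincount(clusters))
```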

6.2. Genomic Resources, Causal Gene Discovery, and Smart Laboratory Platforms

In the field of genomics, it is crucial to have foundational reference genomes that act as a ‘map’ in which genetic variations, gene locations, and structural features of DNA are known [194]. Lacking this information is analogous to training a navigation system without access to a map of roads and cities, resulting in unreliable predictions. Although ML has demonstrated utility in identifying genomic regions in crops [195,196,197,198], multi-omics and/or multi-regulation network analyses, as performed in maize [199] and rice [200], respectively, are preferred. ML algorithms such as QTG-Finder2 can support the discovery of causal genes in agricultural plant species and facilitate agricultural trait improvement [201]. Still within the genomics space, and to support this, AI is employed for real-time PCR analysis and endpoint PCR data analysis to create a connected platform for smart and digital laboratories (https://www.illumina.com/informatics/ai-in-genomics.html, accessed on 21 December 2025). Marker scoring after sequencing is one of the most time-consuming processes during genotyping, and it is a subjective process, as there is often no identifiable root cause for underperforming assays. Therefore, non-standardized pieces of software are common across multiple PCR platforms. For instance, FastFinder from UgenTec (www.ugentec.com; https://www.ugentec.com/fastfinder, accessed on 21 December 2025) can help diagnostic laboratories by automating assay scoring through AI, saving operational time, providing data intelligence for quality assurance and improvement, and being customizable to customer needs. Such software can be used for genotyping, pathogen testing at scale, and quality control (QC) in industry, animal, and seed health. The software from UgenTec can save up to 80% of the scoring time and can provide a solution for multiple platform usage such as KASP SNP (Kraken), Nexar, Araya, IntelliQube, and Fluidigm, among others (https://www.ugentec.com/fastfinder/genotyper, accessed on 21 December 2025). The usage of such software offers several benefits, including access to historical data repositories and the ability to apply intelligent algorithms tailored to specific assays, organisms, and workflows. By incorporating high-throughput phenotyping (HTP) into ML-driven genotype/phenotype models, breeders can not only enhance the pace at which they develop new cultivars but also work more efficiently in trait discovery. Together, reference genomes, causal gene discovery tools, and AI-enabled laboratory platforms form an integrated framework that accelerates trait discovery to improve breeding efficiency.
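To illustrate in general terms what automated marker scoring involves (without reproducing any commercial algorithm such as FastFinder), the sketch below clusters simulated two-channel endpoint fluorescence values, as produced by KASP-style assays, into three genotype classes; the signal levels and class layout are assumptions.

```python
# Generic illustration (not the FastFinder algorithm) of automating marker
# scoring: clustering two-channel endpoint fluorescence into three genotype
# classes. Signal levels are simulated.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(9)
# Simulated FAM/HEX endpoint signals for homozygous allele 1, heterozygous,
# and homozygous allele 2 sample groups.
hom1 = rng.normal([1.0, 0.1], 0.05, size=(40, 2))
het  = rng.normal([0.6, 0.6], 0.05, size=(40, 2))
hom2 = rng.normal([0.1, 1.0], 0.05, size=(40, 2))
signals = np.vstack([hom1, het, hom2])

calls = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(signals)
for cluster in range(3):
    centre = signals[calls == cluster].mean(axis=0).round(2)
    print(f"cluster {cluster}: n = {(calls == cluster).sum()}, mean FAM/HEX = {centre}")
```

In practice, automated scoring tools additionally flag ambiguous samples for manual review and track assay performance over time, which is where most of the operational savings cited above arise.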

7. Simulation Models in Support of Plant Breeders

7.1. Crop, Environmental, and Genomic Prediction Models

To predict how different genotypes perform under various environmental and developmental growth conditions, plant breeders can take advantage of simulation models. These models use computational tools to simulate biological, genetic, and environmental processes, helping to predict genotype performance under various conditions. Simulation models support breeders not only in understanding trait interactions and evaluating breeding strategies but also in optimizing selection decisions without relying solely on field experiments. Using crop models, one can test and refine breeding strategies more efficiently and take better-informed decisions in terms of trait selection and cultivar development. For example, coupling pest and disease damage modules with the CSM-NWheat model allows the effects of biotic stresses on wheat crops to be simulated, giving farmers and researchers accurate predictions of yield losses [202]. In the northern Indo-Gangetic basin of Bangladesh, crop simulation models such as APSIM and DSSAT were further calibrated and validated to understand the impacts of climate change on rice and wheat production [203]. The results provide insights for adjustments in practices such as sowing and irrigation. In maize, genomic prediction models based on deep learning provided better prediction accuracy when G×E interactions were not considered; however, deep learning (DL) models may not be as transparent and easy to interpret as other models, such as BMTME, which was found to be superior when G×E interactions were considered [204]. The opposite was shown for barley when predicting Fusarium Head Blight (FHB), where a DL model using multiple networks through a transformer-based genomic prediction approach was shown to be as good as or better than GS methods such as BLUP and MLP [205]. Independently of model preference, three user-friendly software packages are currently known to be able to integrate multiple data sources. The R software package ‘learnMet’ 1.0.0 [206] allows users to employ traditional ML methods on their data, whereas the statistical machine-learning toolkit ‘SKM library’ was designed to be used in any prediction task with input–output data [207]. More recently, the Hyperfidelis 1.0 geospatial software package was presented to support researchers as a user-friendly graphical interface employing machine learning techniques, merging information from plant science, agronomy, remote sensing, and data science [208]. These are user-friendly tools that streamline genomic selection workflows, including model fitting and performance evaluation. Other practical examples of use include aiding rice yield estimates during the peak season in Bangladesh [209] and salinity forecasting [210], both of which contribute to improving food security management and decision-making. Remote sensing and ML are also valuable tools for monitoring and evaluating large regional interventions, minimizing the need for extensive data collection from the field. This is evident in the case of assessing the resilience of rice production to climate in Senegal [211]. Such examples are just a few among many described elsewhere [212,213,214,215,216,217,218,219]. Altogether, these examples illustrate how simulation models, when supported by genomic prediction and remote sensing data, can serve as powerful decision-support tools.
By allowing the anticipation of genotype performance across environments and the optimization of selection strategies, such models bridge the gap between field experimentation and predictive breeding, ultimately accelerating cultivar development under climate change.
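As a schematic illustration of what such simulation models do (deliberately far simpler than CSM-NWheat, APSIM, or DSSAT), the sketch below accumulates daily thermal time to drive biomass and yield for hypothetical genotypes in two temperature environments; all parameters and equations are toy assumptions.

```python
# Toy crop simulation sketch (not CSM-NWheat, APSIM, or DSSAT): daily thermal
# time drives biomass accumulation for genotypes differing in two simulated
# parameters, illustrating how models compare genotypes across environments.
import numpy as np

rng = np.random.default_rng(12)

def simulate_yield(rue, maturity_gdd, daily_temp, base_temp=5.0, harvest_index=0.45):
    """Accumulate thermal time and biomass until the genotype reaches maturity."""
    gdd, biomass = 0.0, 0.0
    for temp in daily_temp:
        gdd += max(temp - base_temp, 0.0)             # growing degree days
        biomass += rue * max(temp - base_temp, 0.0)   # simplified radiation-use proxy
        if gdd >= maturity_gdd:
            break
    return harvest_index * biomass

# Two environments: a cool and a warm growing season of 150 days.
cool = rng.normal(16, 3, 150)
warm = rng.normal(24, 3, 150)

# Three hypothetical genotypes: (radiation-use proxy, thermal time to maturity).
genotypes = {"early": (0.9, 1400), "medium": (1.0, 1800), "late": (1.1, 2200)}

for name, (rue, maturity) in genotypes.items():
    print(name, "| cool:", round(simulate_yield(rue, maturity, cool), 1),
          "| warm:", round(simulate_yield(rue, maturity, warm), 1))
```

Even this toy model reproduces the qualitative point made above: genotype rankings can change between environments, which is why G×E-aware prediction and simulation matter for selection decisions.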

7.2. Generative Adversarial Networks (GANs): The Next Frontier

Generative adversarial networks (GANs), originally introduced by [220], are generative models that generate new data instances resembling the training data. GANs discover and learn patterns in the input data in such a way that the model can generate new examples based on a set of attributes or variables from the original dataset. These networks consist of two sub-models, the generator and the discriminator; the former generates new hypotheses, while the latter classifies each hypothesis as real or fake. This process generates numerous plausible and realistic hypotheses for researchers to address diverse problems. One well-known application of GANs is the translation of satellite photographs into map-style renderings, as shown by early image-to-image translation models [221]. Today, GANs are commonly used for tasks including image synthesis [222], super-resolution [223,224], anomaly detection [225], and handling missing data [226], as well as text-to-image processing [227], across various domains. In agriculture, GANs are useful for modeling high-dimensional data produced from multispectral imaging [228] and for providing multimodal outputs through deep-learning-based methods [61,229,230]. Specifically in plant breeding, with their ability to simulate genetic variation, predict environmental adaptability, and accelerate breeding cycles, GANs can be a powerful tool in the quest to develop crops that are more resilient to climate change and therefore capable of feeding a growing global population. This is supported, for example, by software packages such as Scion Image 4.0.3.2 [231], Leaf Doctor [232], and Quantitative Plant (https://quantitative-plant.org, accessed on 21 December 2025), which allow for imaging analysis. As a practical example, the LeafGAN system provides an image-to-image translation model and has been used as a data augmentation tool to improve plant disease diagnostics performance [233,234]. ML is widely utilized to increase crop productivity and quality, namely in seed retail systems, but it is also employed to create better crops and to identify natural enemies of crop pests and diseases. The uses of robotics and sensors for predicting crop health and yield, monitoring deforestation, providing cartographic updates for agricultural registration, confirming land occupation, and monitoring boundaries are all part of the suite of apps and services designed to improve agricultural productivity [235]. In this realm, the EarthOne Platform (previously known as the Descartes Labs Platform; https://ag.earthdaily.com, accessed on 21 December 2025) promotes sustainable farming practices by allowing data providers and diverse stakeholders to access and manage large datasets. The data collected can then be used to generate ML-based predictions without requiring cross-checking with other datasets. It has previously been demonstrated that using ultrasonic sensors could lead to input savings ranging from 22% to 70% in crop production [236], while deep learning models have enhanced the performance of traditional image processing techniques, achieving an average accuracy of 92.51% across various agricultural applications [237]. In plant breeding, the availability of diverse datasets, such as genomic, transcriptomic, and metabolomic data, together with predictive models, can help in predicting complex agronomic traits [238]. This approach can not only improve model accuracy but also support a more complete understanding of the different biological mechanisms that drive trait variability.
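To make the generator/discriminator interplay concrete, the minimal PyTorch sketch below trains a tiny GAN on synthetic one-dimensional "spectral" curves; the architecture, data, and training settings are illustrative assumptions and not any published agricultural GAN such as LeafGAN.

```python
# Minimal GAN sketch in PyTorch: a generator learns to produce 1-D curves
# that a discriminator cannot distinguish from synthetic "real" spectra.
import torch
import torch.nn as nn

torch.manual_seed(0)
n_bands, latent_dim = 32, 8

# Real data: smooth synthetic reflectance-like curves with small noise.
grid = torch.linspace(0, 1, n_bands)
real_data = torch.sin(2 * torch.pi * grid) * 0.3 + 0.5 + 0.02 * torch.randn(512, n_bands)

generator = nn.Sequential(nn.Linear(latent_dim, 64), nn.ReLU(), nn.Linear(64, n_bands))
discriminator = nn.Sequential(nn.Linear(n_bands, 64), nn.ReLU(), nn.Linear(64, 1))

opt_g = torch.optim.Adam(generator.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for step in range(500):
    real = real_data[torch.randint(0, 512, (64,))]
    fake = generator(torch.randn(64, latent_dim))

    # Discriminator: label real samples 1 and generated samples 0.
    d_loss = loss_fn(discriminator(real), torch.ones(64, 1)) + \
             loss_fn(discriminator(fake.detach()), torch.zeros(64, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator: try to make the discriminator label generated samples as real.
    g_loss = loss_fn(discriminator(fake), torch.ones(64, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

print("final losses  D:", round(d_loss.item(), 3), " G:", round(g_loss.item(), 3))
```

The same adversarial training loop scales to images or multispectral patches, which is how GAN-based data augmentation for disease diagnostics is typically set up.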

8. Ethics on the Use of Aerial Systems, Geospatial Information, ‘Big Data’, and Governance Policies

8.1. Ethical and Regulatory Considerations for UAVs and Aerial Systems in Plant Breeding

Ethical and regulatory frameworks governing UAVs, geospatial data, and AI directly influence the feasibility, reliability, and fairness of data-driven plant breeding, particularly in high-throughput phenotyping and AI-assisted selection pipelines. The rapid expansion of UAV services has prompted increased regulatory oversight, which directly influences the deployment of UAV-based phenotyping in agriculture [239]. Despite the decreasing costs associated with UAV systems, aviation regulations in many countries continue to restrict drone use, which can limit the frequency, timing, and spatial resolution of high-throughput phenotyping campaigns essential for modern breeding programs [Supplementary Materials—Supplementary File S1]. With respect to drone usage, ethical considerations pertain not only to privacy but also to the sharing, storage, and use of data. Guidelines also address flight approval times [240], administrative processes and documentation, user demands, and safety during operations concerning individuals and property.
To realize the full benefit of UAVs in agriculture, their use requires not only transparent rules and strategies to ensure ethical and responsible behavior but also assurance that these rules and strategies are free of embedded biases and discriminatory potential. In the context of plant breeding, UAVs are widely deployed for high-throughput phenotyping and trial monitoring; restrictive or unclear regulations can therefore directly affect breeders’ ability to collect timely and reliable data. Consequently, computer scientists and developers need to take responsibility for ensuring that their technology operates ethically and responsibly [TUNAT: transparent, unbiased, non-maleficent, accountable, trustworthy while protecting privacy], and the delegation of tasks to technology should not absolve them of this responsibility, as technology’s actions and decisions still implicate and involve ethical considerations [241]. For plant breeders, this is particularly relevant when AI-driven genomic prediction or trait evaluation models are used, since algorithmic biases could skew selection outcomes or reduce trust in breeding decisions. It is important for those involved in developing and implementing algorithms to be mindful of the values and biases embedded in their technology and to work actively towards transparency, fairness, and accountability. Potential risks posed by AI technology include opaque decision-making, gender-based and other forms of discrimination, privacy concerns, and the potential for criminal activities to be carried out using AI [242]. To address some of these risks, guidelines for AI have been developed, including the Asilomar AI principles [243] and the prohibited AI practices [244] set out in Regulation (EU) 2024/1689 (the AI Act) [245].

8.2. Big Data Governance, Ownership, and Ethical Use in Agriculture

Bulk data generation, which includes data that are generated, captured, copied, and consumed, is projected to exceed 394 zettabytes by 2028 [246]. Yet the governance of big data remains a major challenge, particularly with respect to privacy, security, sharing, costs, ownership, and the level at which data analytics are performed [247,248]. In breeding programs, these issues translate into questions of who owns and controls genomic, phenotypic, and environmental datasets, which are increasingly generated through public–private partnerships. In the EU, organizations processing research data are required to protect the data properly to prevent loss or misuse, as there could be legal, reputational, and financial consequences for the data processor [249]. To increase transparency and trust, the EU Code of Conduct on Agricultural Data Sharing by Contractual Agreement was voluntarily released in 2018 [250]. Following this initiative, the Farm Data Code of Practice was published in Australia and New Zealand with the purpose of minimizing potential misunderstandings between farmers and their partners [250,251] (http://www.farmdatacode.org.nz, accessed on 21 December 2025). Numerous co-signatory organizations are working together to produce a non-binding code that sheds greater light on contractual relations and provides guidance on the use of agricultural data [250,252] (https://www.fao.org/family-farming/detail/en/c/1370911, accessed on 21 December 2025). In 2020, the GODAN/CTA/GFAR online tool was released, providing a platform for users, especially farmers, to create their own codes of conduct based on their specific needs (https://www.godan.info, accessed on 21 December 2025). Besides these codes of ethics, effective risk management policies need to be in place to address the challenges of robotics governance in least-developed countries, where farmers may face difficulties in adopting new technologies [253].
With climate change increasingly affecting agriculture, the adaptability of robotics to changing conditions such as drought, heat, fires, floods, and new pests becomes critical for their continued effectiveness and timely responsiveness. In this context, robotics may raise issues of data ownership similar to those of AI and ML. Farmers must be aware that some agricultural technology providers (ATPs) may include licensing fees for data usage, with penalties for contract breaches [254], and may also update pricing as market costs rise, independently of the contract originally provided. Agricultural and environmental organizations, in conjunction with governmental bodies, play a significant role in storing and managing plant pathological data. These data are usually organized by taxonomic, symptomatic, and geographical distribution features. Yet such records remain poorly linked to genomic sequence data, remote imaging, metadata, and ML [255]. Hence, innovative initiatives are needed to address this gap and to provide accessible resources and information for research, policymaking, and users regarding geospatial data sources and tools, including property rights (https://ethicalgeo.org, accessed on 21 December 2025). The development of online reference databases will aid understanding of existing ethics and geospatial technology.
In summary, many ethical and legal questions regarding data in agriculture remain to be addressed, including the establishment of a common set of ethical principles for data handling and transfer; directives on who owns and controls the data; determination of the entitlement value of the data; ensuring accessibility of data to all actors involved in agriculture; ascertaining compliance with data protection regulations and deciding who will enforce it; establishment of fair regulations for farmers’ rights and benefits when sharing information with agribusiness and vice versa; and decisions on the appropriate legislation, policies, and ethical considerations for the use of data in agriculture [24,256]. Finally, the right of farmers to access capacity-building initiatives should also be addressed. This will enable them to cope with new technologies (UAVs, AI, ML) and will mitigate the risks of displacement. For plant breeders, the establishment of ethical and transparent frameworks for UAV-based phenotyping, genomic data sharing, and AI-driven prediction is critical to ensure fair access, reliable trait evaluation, and farmer participation in breeding pipelines.

9. Overcoming Challenges

The hallmarks of precision agriculture are the integration of data acquisition and analysis with decision-support systems and variable rate application, i.e., applying inputs such as water, fertilizers, or pesticides at different rates across a field according to site-specific needs. Remote sensing plays a critical role in this process, and despite significant advances in sensor quality and capacity, challenges such as the selection and processing of satellite images persist. Current bottlenecks in the efficient use of HTP systems include data extraction, even from simpler devices, and the need for larger, labeled datasets. These processes are time-consuming and costly, and standardized procedures for retrieving specific types of data, such as temperature data, are currently lacking [257]. End-use protocols, standard operating procedures (SOPs), and guidelines would facilitate the retrieval of such data. For breeders, the lack of standardized and timely data extraction directly slows trait evaluation and reduces the reliability of downstream genomic prediction models.
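To make the variable rate application concept above concrete, the short sketch below assigns hypothetical nitrogen rates to field management zones according to an NDVI threshold; the zone NDVI values, thresholds, and rates are illustrative assumptions, not a validated agronomic prescription.

```python
# Illustrative variable-rate prescription: map NDVI-derived zones to N rates.
# NDVI values, thresholds, and rates are hypothetical placeholders.
import numpy as np

ndvi = np.array([0.35, 0.52, 0.68, 0.41, 0.74, 0.59])  # one value per zone

def nitrogen_rate(ndvi_value: float) -> float:
    """Return a hypothetical N rate (kg/ha): more N where vigour is low."""
    if ndvi_value < 0.45:
        return 120.0   # low vigour -> highest rate
    if ndvi_value < 0.65:
        return 90.0    # medium vigour -> intermediate rate
    return 60.0        # high vigour -> lowest rate

prescription = np.array([nitrogen_rate(v) for v in ndvi])
print(dict(zip(range(1, len(ndvi) + 1), prescription)))
```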
Another challenge facing precision agriculture relates to choosing and updating devices in response to a demanding market. This challenge could be mitigated by providers offering more frequent and simplified updates, followed by the deployment of end-user phenotyping devices. In addition, the turnaround time for image delivery is another constraint that could be reduced through a greater number of service providers. However, an increase in service providers may also lead to more stringent rules regarding the use of UASs owing to security and privacy concerns: as more service providers operate UASs, the risk of misuse or accidental breaches of privacy and security also increases, prompting tighter regulations to protect people and property [258]. Hence, key questions must be considered when determining the most suitable sensors, including the size of the area to be mapped, the complexity of traits relative to crop type, the time available for mapping, the prevailing environmental conditions, and the associated costs. For larger areas, satellite systems may be preferable, whereas drones may be advantageous for immediate, real-time assessment and greater flexibility in filter range over time. For breeding programs, these decisions determine whether phenotyping campaigns can be scaled to entire nurseries or targeted at specific experimental plots. Although sensors covering the visible (blue, green, red) and NIR spectrum [e.g., Sentinel-2 (European Space Agency), WorldView-3 (Maxar), ASD FieldSpec 4, the drone-mounted Parrot Sequoia, and Landsat 8] are sufficient for many plant measurements, it is important to consider advances in fluorescence and LiDAR technologies. Of special importance is the increased battery life of drones, which can greatly benefit farmers with extensive land areas, such as those involved in livestock production. In the case of LiDAR technology, both 2D and 3D sensors use laser pulses and time-of-flight to measure distances to objects along a specific direction. However, in environments with irregular terrain, the laser light can generate positioning errors [259]. Therefore, regardless of the technology used, the accuracy of ML-based assessments is critical, and attention must be given to metrics such as classification error, sensitivity, specificity, and false positive rates, as highlighted by [260], to avoid inaccurate recommendations. In breeding contexts, such errors can propagate into biased trait assessments, undermining the accuracy of selection decisions.
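The classification metrics mentioned above can be computed directly from a confusion matrix; the sketch below uses scikit-learn on a toy set of labels, where the ground-truth and predicted classes are invented purely for illustration.

```python
# Sensitivity, specificity, and false-positive rate from a confusion matrix.
# The labels below are invented (e.g., 1 = "stressed plot", 0 = "healthy plot").
from sklearn.metrics import confusion_matrix

y_true = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0, 1, 0]

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()

sensitivity = tp / (tp + fn)          # true positive rate (recall)
specificity = tn / (tn + fp)          # true negative rate
false_positive_rate = fp / (fp + tn)  # 1 - specificity

print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}, "
      f"FPR={false_positive_rate:.2f}")
```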
Despite ongoing efforts to establish public databases, deep learning (AI) networks still lack sufficient public benchmark datasets that address the full scope of agricultural needs. Current examples include LeafSnap [261] for visual recognition of tree species; CropDeep [262] for species classification and detection; PlantVillage [263,264] (https://plantvillage.psu.edu/plants, accessed on 21 December 2025) as well as the former PlantDoc [265] (https://hort.extension.wisc.edu/, accessed on 21 December 2025) for monitoring plant diseases and pests; the Tumaini app (https://tumainiaiapp.org/, accessed on 21 December 2025) and the Agriculture-Vision large-scale aerial farmland image dataset for analyzing agricultural patterns [266]; the DeepWeeds multiclass weed species image dataset [267] (https://github.com/AlexOlsen/DeepWeeds, accessed on 21 December 2025) for understanding and managing weed ecology; and FAOSTAT (https://www.fao.org/faostat/en/, accessed on 21 December 2025), the largest agricultural statistical data repository in the world. While many of these databases target general crop management, adapting them for breeding-specific applications such as trait scoring and genomic prediction remains a major unmet need.
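As a sketch of how such public image collections could be repurposed for breeding-specific trait scoring, the following code fine-tunes a pretrained CNN on images arranged in class-labeled folders; the directory layout, number of classes, and hyperparameters are assumptions, no specific dataset API is used, and the pretrained weights must be downloadable at run time.

```python
# Sketch: fine-tune a pretrained CNN for trait scoring using images arranged
# as <data_dir>/<class_name>/<image>.jpg. Paths, class count, and
# hyperparameters are hypothetical assumptions.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

data_dir = "trait_images/train"           # hypothetical directory
num_classes = 3                           # e.g., disease severity classes

tfm = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
train_ds = datasets.ImageFolder(data_dir, transform=tfm)
loader = torch.utils.data.DataLoader(train_ds, batch_size=16, shuffle=True)

model = models.resnet18(weights="IMAGENET1K_V1")
model.fc = nn.Linear(model.fc.in_features, num_classes)  # new scoring head

opt = torch.optim.Adam(model.fc.parameters(), lr=1e-3)   # train the head only
loss_fn = nn.CrossEntropyLoss()

model.train()
for images, labels in loader:             # one pass, for illustration only
    opt.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    opt.step()
```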
At the local level, systems such as Fruitlook (www.fruitlook.co.za) can serve as pre-operational services by providing weekly estimates of crop parameters to inform farmers about crop growth, water usage, nutrient status, and soil moisture content. At the regional and global levels, the JRC MARS (Joint Research Centre Monitoring Agricultural Resources) crop monitoring service (https://ec.europa.eu/jrc/en/mars, accessed on 21 December 2025) relies on static and real-time data for weather forecasting. Yet there is a need to integrate data analysis, ML implementation, and decision-making in an automated manner so that the overall process is functionally connected at the decision-making level. There is therefore ample opportunity to explore how farmers can make the best use of the acquired data; examples in this area include the development of annual crop maps (https://agriculture.canada.ca/en/agricultural-production/geospatial-products, accessed on 21 December 2025). Additional improvement opportunities involve optimizing noise reduction and reducing data redundancy, particularly in regression tasks [268], which will be crucial as big data geodatabases continue to expand. With a large number of agricultural machines expected to connect to service centers, there will also be a need to anticipate and manage network traffic and storage systems, particularly in real-time data scenarios [165]. For example, a smart self-driving tractor can currently collect more than 240 GB of crop data daily [269]. John Deere launched the first autonomous tractor that enables 360-degree obstacle detection using pixel classification within approximately 100 ms to support decision-making based on the type of obstacle detected. In this case, the farmer only needs to transport the machine to the field and make the necessary configurations; the farmer can then leave the field and monitor the machine’s status from a mobile device. The private company provides farmers with access to live videos, images, data, and metrics. This efficiency comes with the promise of reducing the carbon footprint, improving safety, streamlining operations, and increasing profits. For plant breeding programs, integrating such real-time data into trial management systems could accelerate selection cycles and reduce costs.
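One simple form of the noise reduction mentioned above is smoothing high-frequency sensor streams before they enter regression models or decision rules; the sketch below applies a Savitzky–Golay filter to a simulated canopy-temperature series, where the diurnal signal, noise level, and filter settings are all illustrative assumptions.

```python
# Smoothing a noisy sensor time series before regression / decision-making.
# The "canopy temperature" signal is synthetic and purely illustrative.
import numpy as np
from scipy.signal import savgol_filter

rng = np.random.default_rng(42)
t = np.linspace(0, 24, 288)                       # 24 h at 5-min intervals
signal = 22 + 6 * np.sin((t - 6) * np.pi / 12)    # idealized diurnal cycle
noisy = signal + rng.normal(0, 0.8, t.size)       # added sensor noise

smoothed = savgol_filter(noisy, window_length=21, polyorder=3)

print(f"raw residual std:      {np.std(noisy - signal):.2f}")
print(f"smoothed residual std: {np.std(smoothed - signal):.2f}")
```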
The need for effective data management software and applications is equally important. A study conducted in 14 countries provided insights into the information used in this regard [270]. Web-based geospatial applications such as Crop Condition and Soil Moisture Analytics (Crop-CASMA) (https://cloud.csiss.gmu.edu/Crop-CASMA//, accessed on 21 December 2025) have been developed to enable users to remotely access geospatial soil moisture and vegetation index data; this platform has been applied across the USA. The Global Agricultural and Disaster Assessment System (GADAS, https://geo.fas.usda.gov/GADAS/, accessed on 21 December 2025) of the United States Department of Agriculture (USDA) provides real-time satellite information on weather, crops, and disasters. Such information is critical for policy development, crop productivity forecasting, tracking extreme climate events such as floods and droughts, and mitigating natural disasters, including tsunamis and the risk of pandemics. In breeding contexts, these data feed into environment-specific genomic prediction models, strengthening genotype-by-environment analyses.
In the space of smart crop farming and AI, there is an evolving role for farm advisory services and agricultural advisors, whose expertise is essential for bridging the gap between cutting-edge technologies and practical on-farm applications. These services not only help farmers understand and trust AI tools such as precision irrigation, crop health diagnostics, and yield prediction but also provide direct support in tailoring technological solutions to local conditions and smallholder realities. This is particularly true in Africa and Asia; in Africa, the majority of farms belong to smallholder farmers cultivating no more than 2 hectares [271]. These technical advisory professionals can also provide contextual interpretation of AI-generated insights, further supporting farmers in interpreting satellite imagery, drone data, or Internet-of-Things (IoT) sensor outputs for decision-making. Advisory services will be equally important in training farmers on how to use apps, sensors, and AI dashboards, as well as on services offered by third parties. Advisers of organizations such as the Kenya Dairy Farmers Federation (KDFF; https://www.kenaff.org/, accessed on 21 December 2025), the Oromia Coffee Farmers Cooperative Union (OCFCU, https://oromiacoffeeunion.com/, accessed on 21 December 2025) in Ethiopia, the Uganda National Farmers Federation (UNFFE; https://www.fo-mapp.com/farm/uganda-national-farmers-federation-unffe/, accessed on 21 December 2025) in Uganda, the Ghanaian Kuapa Kokoo Cooperative Cocoa Farmers and Marketing Union Limited (KKFU; https://kuapakokoo.com, accessed on 21 December 2025), and the European Research Infrastructure for Plant Phenotyping (EMPHASIS, https://emphasis.plant-phenotyping.eu/, accessed on 21 December 2025) build digital capacity, organize field demonstrations, and conduct digital literacy programs. In this extension space, the AI chatbot of the AgriTalk-IoT platform for precision farming of soil cultivation and the assistant created by DigitalGreen offer similar support to Asian and East African farmers, respectively [272,273]. Other IoT software used in agriculture is described elsewhere [11,274]. Equally important, these services act as a bridge between users and developers by offering feedback to agri-tech companies and NGOs about user experience, cultural context, and ground realities. As AI provides more granular insight, advisors can offer hyper-local recommendations as well as personalized plans and risk assessments based on AI models and current support apps. Examples of apps currently able to support small-scale farmers include the Darli AI chatbot, launched in 2024, for crop-specific guidance on regenerative farming practices, disease diagnosis, soil health, water conservation, and market and logistics advice [275]; the RiceAdvice app for integrating rice farmers into the value chain (https://www.cari-project.org/, accessed on 21 December 2025); and the FertiCal-P App KP for fertilizer calculations, recommendations, and price breakups [276]. On the livestock side, the application from the Swiss company Datamars supports farmers in managing animal productivity and welfare while advising them on meeting sustainability targets [273].
Another important area to explore and expand includes helping farmers to access digital credit, crop insurance, and cost–benefit analysis, as well as how to use the more frequent and diverse agri-fintech platforms. For breeders, strengthening the advisory role ensures that phenotypic data generated on-farm is both reliable and trusted, directly feeding into breeding databases and genomic prediction workflows.

10. Final Considerations

This review highlights that the convergence of UAV-based phenotyping, multi-omics data, and AI/ML approaches is reshaping plant breeding by enabling higher-resolution trait characterization, scalable data acquisition, and more informed selection decisions. However, the effectiveness of these technologies depends critically on data quality, model robustness, and context-aware implementation. Importantly, while technological advances can enhance breeding efficiency, genetic gains are expected to remain largely incremental and must be supported by breeders’ expertise, environmental validation, and integrated management strategies.
Small-scale farmers aim to maximize their profits, and although most current plant breeding programs still rely on manual phenotyping methods, this is changing with the emergence of new technologies, such as smartphone-based AI applications for detecting and diagnosing diseases, as well as UAV systems, cloud computing, and the IoT. The synoptic view offered by these technologies drives their adoption, supported by the availability of open data standards, a high degree of homogeneity, ease of data integration with GIS datasets, inexpensive data acquisition, and the availability of data products and services [18]. However, caution is needed when interpreting traditionally collected data against data obtained with more recent AI-related collection methodologies, as the reliability of traditional methods may be lower than that of more recent ones. It is also important to keep in mind that, although scaling could reduce prices through wider usage, many farmer issues are context-specific, with nuances and governmental legal frameworks that should be considered. Therefore, it is important for farmers to be aware of the different methodologies developed, as many tools focus on phenotyping rather than overall management. AI-ML technology has enormous potential for improving crop and livestock management and enhancing pre-breeding efforts. However, priority must be given to the quality and suitability of the image datasets used for AI-ML analysis. Advances in forward and/or reverse genetic engineering and mutation breeding can greatly improve the efficiency of genomic selection for increased profits at the farmer level. As the complexity of questions increases, the need for innovations in in situ data collection and analysis will also grow, creating further opportunities for AI-ML and robotics in agriculture. This is particularly important in the context of labor shortages, efficiency, and the increasing availability of low-cost robotics [277]. A SWOT analysis can help identify the strengths, weaknesses, opportunities, and threats associated with the use of AI-ML in agriculture and guide decision-making for future research and development efforts [278]. An example of a SWOT analysis, together with specific case studies, is provided in Table 5. Increases in data throughput and quality, reductions in cost per data point, and technology acceptance and usage by farmers will dictate the pace of successful adoption of AI-ML in agriculture.
Because technology advances at a breathtaking pace, one might think that data-driven companies, or those focused on editing crop genomes, could do much of the work for agricultural scientists, as these fields are pushing technological boundaries to an extent never seen before. However, a substantial caveat must be considered: plant yield and stress tolerance have plateaued for many crops [279,280,281,282]. Even as crops evolved to become more efficient, diminishing returns set in as they approached the upper limits of their biological potential. This is particularly evident in terms of yield, which is directly measurable, and tolerance to environmental stresses, which is typically assessed through scoring systems or performance under controlled conditions. In many crops, the current genetic pools are limited, which hinders the development of varieties better suited to current challenges. While genetically modified crops (GMCs) can support agricultural systems, the constraints imposed by Earth’s planetary boundaries, some of which are already overstretched (https://www.planetaryhealthcheck.org, accessed on 21 December 2025), further restrict the sustainability of production.
One might think that finding new genes or alleles that allow incremental gains (by adopting genetic innovations to increase photosynthesis, reduce photorespiration, or harness genetic variants associated with plant–microbiome interactions, including endophytes and seed and root microbiomes) is a straightforward task, but this is only partially true. While great progress is being made in understanding how to increase photosynthetic efficiency and reduce photorespiration [283], it is not trivial to find new genes or alleles that produce incremental gains. Genetic innovations in photosynthesis are complex because they often involve trade-offs, and any improvement can have cascading effects across physiological processes. Identification of the specific genetic variants that contribute to beneficial plant–microbiome interactions [284] is likewise complex and requires detailed study; it is therefore far from a trivial task. In plant breeding, MAS and GS are two techniques currently used to transfer and characterize favorable traits in crops. Yet limitations remain, including the low power to detect genes with small effects for the majority of agriculturally important traits [194]. Evidence suggests that comparing predictive ability estimates between phenomic and genomic prediction models is invalid for assessing their relative effectiveness, as it may falsely imply that phenomic models are more accurate [285]. It has been suggested instead that plant breeding should use phenomic selection to report predictive ability, whereas genomic prediction would be used to report prediction accuracy. Therefore, gene editing using CRISPR/Cas and mutation breeding, along with ML and artificial systems (ASs) powered by artificial intelligence (AI), the latter advancing through the diversification of training models and cheaper devices [286], are powerful tools for inclusion and routine use in agricultural research [5]. Nevertheless, the gains these tools bring are also incremental.
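To make the distinction above concrete: predictive ability is commonly reported as the correlation between cross-validated predictions and observed phenotypes, while prediction accuracy divides that correlation by the square root of trait heritability. The sketch below only illustrates this arithmetic on simulated values; the phenotypes, predictions, and assumed heritability are invented for illustration.

```python
# Predictive ability vs. prediction accuracy (illustrative arithmetic only).
# Observed phenotypes, predictions, and heritability are simulated values.
import numpy as np

rng = np.random.default_rng(7)
observed = rng.normal(0, 1, 200)                      # simulated phenotypes
predicted = 0.6 * observed + rng.normal(0, 0.8, 200)  # simulated GS output
h2 = 0.5                                              # assumed heritability

predictive_ability = np.corrcoef(predicted, observed)[0, 1]  # r(y_hat, y)
prediction_accuracy = predictive_ability / np.sqrt(h2)       # r / sqrt(h2)

print(f"predictive ability:  {predictive_ability:.2f}")
print(f"prediction accuracy: {prediction_accuracy:.2f}")
```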
Table 5. SWOT analysis for new technologies in precision agriculture.

Internal

Strengths:
- Time saving in performing tasks;
- Increased homogeneity;
- Reduced redundancy and increased accuracy;
- Predictions for problem-solving;
- Automation in data collection, processing, and analysis of large volumes of data in a shorter time;
- Upscaling;
- Better solutions for farmers;
- Cost-effectiveness;
- Environmentally friendly and sustainable systems;
- Data transparency and public availability benefit farmers through awareness for better choices, particularly via phenotypic data apps and soil nutrition and preservation apps.

Weaknesses:
- Technology may not be applicable in areas with heterogeneous landscapes or terrain, where data collection may be inconsistent or unreliable;
- The use of AI-ML in agriculture may be hindered in areas with limited access to a continuous energy supply, which is required for data collection and processing; the significant need for computational power also leads to higher energy consumption that contributes to global warming, since such housing facilities require extreme cooling systems;
- Reduced coordination at the human–AI interface, particularly when multiple robots are involved;
- Lack of use of management tools (SWOT and PESTEL analyses) to incorporate new technologies in farms;
- Non-consideration of small farming businesses, particularly in Africa and Asia;
- Tasks that do not require precision movements are better performed;
- Smallholder farmers are not all small start-ups;
- Limited to use only in developed countries;
- Lack of real data in the majority of training models used in AI and ML.

External

Opportunities:
- Best applied in areas where there is a shortage of labor or where labor costs are too high;
- Increase in robotics/AI, sensors, big data, ML, knowledge, and availability, enabling real-time monitoring and precise irrigation, pesticide, and fertilizer application;
- Better understanding of where robotics/AI/ML can be used without excluding human workforce knowledge;
- Best used in practices where human health is in danger;
- Real-time predictions for usage;
- Higher safety conditions for farmers and the environment, and reduction of the agricultural environmental footprint;
- Predictions for problem avoidance.

Threats:
- Lack of understanding of where the new technologies are best used depending on the use case;
- Lack of substantial experience in modeling, particularly with regard to deep learning models;
- Safe use of new technologies with no harm;
- Social–human aspects, including farmers’ concerns, are not considered;
- Lack or misuse of information and technology;
- Lack of leadership vision and resistance to embracing change;
- Availability of funds and software solutions;
- Lack of trust among partnerships, due to non-existing agreeable frameworks;
- Not all technologies are able to scale up.

Examples of successes (case studies): increase in data throughput, better-quality data, reduced cost per data point, fewer safety incidents, and large usage of technology. See https://pestdisplace.org/; http://www.terra-i.org/terra-i.html; https://croppie.org/, all accessed on 21 December 2025.
To address complexities in agricultural systems, companies and institutes at the forefront will need to translate advances into practical solutions, even though scalable real-world solutions promise to be a complex task. Future progress in data-driven plant breeding will rely on the integration of UAV-derived phenomics with genomic, transcriptomic, and metabolomic data across environments and seasons, supported by large-scale, cross-location training datasets. Advances in model interpretability and transparency will be essential to foster breeder trust in AI-assisted decision-making, while hybrid approaches combining phenomic and genomic prediction are likely to improve robustness for complex traits. Continued investment in interoperable data infrastructures, explainable ML models, and real-world validation will be key to translating technological innovation into sustainable breeding outcomes. In conclusion, greater investment in agricultural research, stronger partnerships, and the complementary use of diverse tools in a holistic manner, including more advanced genetic algorithms, can accelerate agricultural applicability while supporting sustainability for the coming generations. While informatics and artificial intelligence will empower the next generation of plant breeders, real-world validation, environmental adaptability, and breeders’ knowledge and intuition will remain irreplaceable in guiding effective crop improvement.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/agronomy16010137/s1, Figure S1: (a): Published work shown in the search platform Google Scholar using the chosen keywords * in the last 33 years; (b): Published work shown in the search platform Google Scholar using the chosen key words * in the last 33 years; (c): Publications with and without ethics and regulations on artificial intelligence related (directly and indirectly) to agriculture; (d): Distribution of publications used in this research review by year and only from Google Scholar platform. Table S1: Inclusion and exclusion criteria used in this review. Supplementary File S1: Market forecasts, regulatory insights, country-by-country UAV legislation and licensing, and operational constraints with references.

Author Contributions

Writing—original draft preparation, A.L.G.-O.; writing—review and editing, S.L.D., S.C., C.N., D.A.E.M. and R.O.O.; funding acquisition, R.O.O. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

No new data were created or analyzed in this study. Data sharing is not applicable to this article.

Acknowledgments

A.L.G.-O. is thankful to Soren K. Rasmussen from the University of Copenhagen, Denmark, for insightful comments on the manuscript, as well as to Valentina Carrillo and Sylvia M. Pineda from the Alliance Bioversity & CIAT, Palmira, Colombia, for their support in improving the quality of the manuscript figures.

Conflicts of Interest

The authors declare no conflicts of interest. This paper mentions various private companies and products for illustrative purposes only. The authors do not have any financial interests or affiliations with these companies, nor is there any endorsement or recommendation intended. The inclusion of these companies and products is solely to provide options and ideas relevant to the discussion.

Glossary

Accuracy check: Related to the genetic algorithm, this is the evaluation of how well a given individual (solution) in the population performs with respect to a predefined fitness function (objective).
ANNs: Acronym for artificial neural networks, which are widely used for various tasks, including classification, regression, pattern recognition, and prediction in AI models.
Binary variables: Variables that can take one of two values (e.g., 0 and 1) and that are often used in classification tasks where the outcome is one of two possible classes or values.
Biological pressure: Percentage of individuals that reproduce; values vary between 0% and 100%, where 0% indicates that no individuals reproduce and 100% indicates that all individuals reproduce.
BM/GNB: Refers to the Bernoulli Naive Bayes variant of the Naive Bayes classifier, used for binary features.
BP: Acronym for backpropagation (backward propagation) neural network, a key algorithm used for training neural networks, including multilayer feed-forward networks. It involves propagating the error backward through the network to update the weights and minimize the loss.
Breed: Refers to a specific group of animals, plants, or organisms that share characteristics distinguishing them from other groups within the same species.
BREEDING 4.0: Refers to the advanced integration of artificial intelligence (AI), genomics, and multiplex gene editing technologies to optimize crop breeding. This approach enables the precise identification, modification, and enhancement of multiple genetic traits simultaneously, leading to the development of crops that are more resilient, resource-efficient, and high-yielding.
CNN: Acronym for convolutional neural network, a type of artificial neural network (ANN) designed to process and analyze grid-like data, such as images, video frames, and time series.
DCGAN: Acronym for deep convolutional generative adversarial network, a type of generative adversarial network (GAN) that uses deep convolutional neural networks for both the generator and discriminator and is used to generate new data, such as images, that are similar to a given training dataset.
DCNN: Acronym for deep convolutional neural network, a convolutional neural network (CNN) with multiple convolutional layers designed to automatically and hierarchically learn features from data such as images, video frames, and time series.
Elitism: Number of best-performing individuals carried unchanged into the next generation of the search.
Elitism stage: In the general sense, elitism refers to the process in a GA where the best individuals (solutions) from the current generation are carried over to the next generation without modification; the elitism stage preserves the best individuals from one generation to the next, thereby avoiding the loss of good solutions.
Fitness function: Function used to evaluate the performance of any proposed solution (e.g., a predicted gain).
GA: Acronym for genetic algorithm, used to find approximate solutions to optimization and search problems by mimicking the process of evolution. A GA uses techniques such as selection, crossover (recombination), mutation, and inheritance to evolve a population of candidate solutions over generations, improving the solutions with each iteration.
GA optimization: Refers to the optimization of the parameters of the random forest (RF), backpropagation (BP), and kernel extreme learning machine (KELM) models.
GMMs: Acronym for Gaussian mixture models, which are used for clustering and density estimation.
GS: Acronym for genomic selection, a breeding method that uses DNA data to predict genetic potential and select candidates based on estimates from genomic prediction models.
KELM: Acronym for kernel extreme learning machine, a learning machine based on the kernel function.
Ideotypes: In the context of AI and ML, ideotypes are crop models that combine beneficial traits to improve performance in specific environmental conditions. This concept differs from genetic ideotypes, which are specific genetic profiles designed to optimize performance based on inherited traits.
Index equation (IE): An equation used to access specific data in an indexed collection, such as arrays or databases; particularly relevant when dealing with large datasets, where indexing helps retrieve data efficiently.
Integer: A discrete variable that represents discrete categories or quantities, such as the number of items in a group, the number of visits to a website, or the count of certain features or events.
Labeled dataset: A dataset in which each input example (or data point) is paired with a corresponding target output or label. For example, in genomic studies, the input attributes might include genetic markers or sequence data, while the label could be the phenotypic trait (e.g., yield, disease resistance) associated with those markers.
MLP: Acronym for multilayer perceptron, a type of neural network used for classification and regression.
Mutation probability: The probability that an individual undergoes mutation. The value can vary from 0 to 1, where 0 means no mutation occurs and 1 means every individual will undergo mutation.
NIR: Acronym for near infrared, typically ranging from approximately 750 to 1400 nanometers on the electromagnetic spectrum.
NN: Acronym for neural network, a type of machine learning model (MLM) inspired by the structure and functioning of the human brain, composed of interconnected layers of nodes (neurons or artificial neurons) that work together to solve tasks such as classification, regression, and pattern recognition.
Parity: In the context of this review, parity refers to the number of times cattle have given birth.
Permutation: Arrangement or rearrangement of objects or elements in a specific order; in some machine learning (ML) algorithms (e.g., decision trees or ensemble methods), permutations can be used in feature selection or bootstrapping.
Population: Number of individuals used in the search.
Proximal phenotyping: The process of measuring traits or characteristics using sensors and technologies that are physically close to the plants but not in direct contact. This can be performed using drones, robotic systems, or ground-based devices, which collect high-resolution data on various attributes (e.g., leaf area, chlorophyll content, water stress, among others). Unlike traditional methodologies, these approaches are quick, non-invasive, and suited to large-scale agricultural practices.
R2: Refers to the coefficient of determination, which measures how well the regression model fits the data. An R2 value closer to 1 is better, as it indicates that a greater proportion of the variance in the dependent variable is explained by the model.
Range of the search: Values between which the best gains are sought; it ranges from the minimum value to the maximum value of the search space.
RGB: Acronym for red, green, and blue, the color model used for representing images.
RF: Acronym for random forest regression (also known as RFR), a data-driven ensemble (integrated) learning approach.
RMSE: Acronym for root mean square error; ideally, the smaller the value, the better, since it measures the average magnitude of the error between predicted and observed values. A smaller value means that the model’s predictions are closer to the actual values.
RPD: Acronym for relative percentage difference, used to check the quality of a predictive model, particularly in the context of spectroscopy or chemometrics. It compares the prediction error to the variation in the data. The following thresholds are used: <1.4, impossible estimation, indicating that the model’s predictions are highly inaccurate; ≥1.4 and <2, rough estimation, indicating that the predictions are moderately accurate but not precise; and ≥2, good estimation, indicating that the predictions are reliable and accurate (as described by [177]).
Small-scale farmer (SSF): Both FAO and CGIAR operate with this definition, although the exact criteria may differ depending on the region, crop/livestock type, and context. For crops, SSF systems typically operate on less than 2 hectares, rely primarily on family labor, and focus on subsistence or local markets. For livestock, SSF refers to systems with small herds or flocks.
SOM: Acronym for self-organizing map, a type of unsupervised neural network used for dimensionality reduction, clustering, and visualization of high-dimensional data.
Stop condition: The number of iterations for which a search is performed, which can vary from 1 to any specified maximum number of iterations. The exact number depends on the problem and the algorithm being used.
SVR: Acronym for support vector regression, usually used for regression tasks to predict continuous values instead of the discrete categories predicted in classification tasks. It is a powerful algorithm when the relationship between the input features and the target variable is complex and non-linear.
SVM: Acronym for support vector machine, which is also used for regression tasks. Its main strength lies in its ability to work efficiently with both linear and non-linear data using kernel functions.
UFAB: Acronym for universal function approximation block, a component used for approximating any given function.

References

  1. Alotaibi, M. Climate change, its impact on crop production, challenges, and possible solutions. Not. Bot. Horti Agrobot. Cluj-Napoca 2023, 51, 13020. [Google Scholar] [CrossRef]
  2. Meuwissen, T.H.E.; Hayes, B.J.; Goddard, M.E. Prediction of Total Genetic Value Using Genome-Wide Dense Marker Maps. Genetics 2001, 157, 1819–1829. [Google Scholar] [CrossRef] [PubMed]
  3. Washburn, J.D.; Cimen, E.; Ramstein, G.; Reeves, T.; O’Briant, P.; McLean, G.; Cooper, M.; Hammer, G.; Buckler, E.S. Predicting phenotypes from genetic, environment, management, and historical data using CNNs. Theor. Appl. Genet. 2021, 134, 3997–4011. [Google Scholar] [CrossRef] [PubMed]
  4. Jubair, S.; Domaratzki, M. Crop genomic selection with deep learning and environmental data: A survey. Front. Artif. Intell. 2023, 5, 1040295. [Google Scholar] [CrossRef]
  5. Garcia-Oliveira, A.L.; Ortiz, R.; Sarsu, F.; Rasmussen, S.K.; Agre, P.; Asfaw, A.; Kante, M.; Chander, S. The importance of genotyping within the climate-smart plant breeding value chain–integrative tools for genetic enhancement programs. Front. Plant Sci. 2025, 15, 1518123. [Google Scholar] [CrossRef]
  6. Chawade, A.; van Ham, J.; Blomquist, H.; Bagge, O.; Alexanderson, E.; Ortiz, R. High-Throughput Field-Phenotyping Tools for Plant Breeding and Precision Agriculture. Agronomy 2019, 9, 258. [Google Scholar] [CrossRef]
  7. Resende, R.T.; Chenu, K.; Rasmussen, S.K.; Heinemann, A.B.; Fritsche-Neto, R. Editorial. Enviromics in Plant Breeding. Front. Plant Sci. 2022, 13, 935380. [Google Scholar] [CrossRef]
  8. Kick, D.R.; Wallace, J.G.; Schnable, J.C.; Kolkman, J.M.; Alaca, B.; Beissinger, T.M.; Edwards, J.; Ertl, D.; Flint-Garcia, S.; Gage, J.L.; et al. Yield prediction through integration of genetic, environment, and management data through deep learning. G3 Genes Genomes Genet. 2023, 13, jkad006. [Google Scholar] [CrossRef]
  9. Tomura, S.; Wilkinson, M.J.; Cooper, M.; Powell, O. Improved genomic prediction performance with ensembles of diverse models. G3 Genes Genomes Genet. 2025, 15, jkaf048. [Google Scholar] [CrossRef]
  10. Xu, Y.; Zhang, X.; Li, H.; Zheng, H.; Zhang, J.; Olsen, M.S.; Varshney, R.K.; Prasanna, B.M.; Qian, Q. Smart breeding driven by big data, artificial intelligence, and integrated genomic-enviromic prediction. Mol. Plant 2022, 15, 1664–1695. [Google Scholar] [CrossRef]
  11. Thilakarathne, N.N.; Bakar, M.S.A.; Abas, P.E.; Yassin, H. Internet of things enabled smart agriculture: Current status, latest advancements, challenges and countermeasures. Heliyon 2025, 11, E42136. [Google Scholar] [CrossRef] [PubMed]
  12. Sparrow, R.; Howard, M. Robots in agriculture: Prospects, impacts, ethics, and policy. Precis. Agric. 2021, 22, 818–833. [Google Scholar] [CrossRef]
  13. Zou, K.; Liao, Q.; Zhang, F.; Che, X.; Zhang, C. A segmentation network for smart weed management in wheat fields. Comput. Electron. Agric. 2022, 202, 107303. [Google Scholar] [CrossRef]
  14. Inoue, Y.; Peñuelas, J.; Miyata, A.; Mano, M. Normalized Difference Spectral Indices for Estimating Photosynthetic Efficiency and Capacity at a Canopy Scale Derived from Hyperspectral and CO2 Flux Measurements in Rice. Remote Sens. Environ. 2008, 112, 156–172. [Google Scholar] [CrossRef]
  15. Fu, P.; Montes, C.M.; Siebers, M.H.; Gomez-Casanova, N.; McGrath, J.M.; Ainsworth, E.A.; Bernacchi, C.J. Advances in field-based high-throughput photosynthetic phenotyping. J. Exp. Bot. 2022, 73, 3157–3172. [Google Scholar] [CrossRef]
  16. Carroll, O.H.; Seabloom, E.W.; Borer, E.T.; Harpole, W.S.; Wilfahrt, P.; Arnillas, C.A.; Bakker, J.D.; Blumenthal, D.M.; Boughton, E.; Bugalho, M.N.; et al. Frequent failure of nutrients to increase plant biomass supports the need for precision fertilization in agriculture. Sci. Rep. 2025, 15, 14564. [Google Scholar] [CrossRef]
  17. Wang, T.; Zuo, Y.; Manda, T.; Hwarari, D.; Yang, L. Harnessing Artificial Intelligence, Machine Learning and Deep Learning for Sustainable Forestry Management and Conservation: Transformative Potential and Future Perspectives. Plants 2025, 14, 998. [Google Scholar] [CrossRef]
  18. Lechner, A.M.; Foody, G.M.; Boyd, D.S. Applications in Remote Sensing to Forest Ecology and Management. One Earth 2020, 2, 405–412. [Google Scholar] [CrossRef]
  19. Sannigrahi, S.; Pilla, F.; Basu, B.; Basu, A.S.; Sarkar, K.; Chakraborti, S.; Joshi, P.K.; Zhang, Q.; Wang, Y.; Bhatt, S.; et al. Examining the effects of forest fire on terrestrial carbon emission and ecosystem production in India using remote sensing approaches. Sci. Total Environ. 2020, 725, 138331. [Google Scholar] [CrossRef]
  20. Gao, Y.; Skutsch, M.; Paneque-Galvez, J.; Ghilardi, A. Remote sensing of forest degradation: A review. Environ. Res. Lett. 2020, 15, 103001. [Google Scholar] [CrossRef]
  21. Velazquez-Chavez, L.J.; Daccache, A.; Mohamed, A.Z.; Centritto, M. Plant-based and remote sensing for water status monitoring of orchard crops. Systematic review and meta-analysis. Agric. Water Manag. 2024, 303, 109051. [Google Scholar] [CrossRef]
  22. Muzammal, H.; Zaman, M.; Safdar, M.; Shahid, M.A.; Sabir, M.K.; Khil, A.; Raza, A.; Faheem, M.; Ahmed, J.; Sattar, S.M.; et al. Climate Change Impacts on Water Resources and Implications for Agricultural Management. In Transforming Agricultural Management for a Sustainable Future; World Sustainability Series; Kanga, S., Singh, S.K., Shevkani, K., Pathak, V., Sajan, B., Eds.; Springer: Cham, Switzerland, 2024; pp. 21–45. [Google Scholar] [CrossRef]
  23. Singh, A.; Ganapathysubramanian, B.; Singh, A.K.; Sarkar, S. Machine Learning for High-Throughput Stress Phenotyping in Plants. Trends Plant Sci. 2016, 21, 110–124. [Google Scholar] [CrossRef] [PubMed]
  24. Mahato, S.; Bi, H.; Neethirajan, S. Dairy DigiD: A keypoint-based deep learning system for classifying dairy cattle by physiological and reproductive status. Front. Artif. Intell. 2025, 8, 1545247. [Google Scholar] [CrossRef] [PubMed]
  25. Tattaris, M.; Reynolds, M.P.; Chapman, S.C. A Direct Comparison of Remote Sensing Approaches for High-Throughput Phenotyping in Plant Breeding. Front. Plant Sci. 2016, 7, 1131. [Google Scholar] [CrossRef]
  26. ESA (European Space Agency). Sentinel Data Enables New System for Agricultural Monitoring in Poland. 2020. Available online: https://www.esa.int/Applications/Observing_the_Earth/Copernicus/Sentinel_data_enables_new_system_for_agricultural_monitoring_in_Poland (accessed on 27 October 2025).
  27. Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E.; et al. The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. BMJ 2021, 372, 71. [Google Scholar] [CrossRef]
  28. Assimakopoulos, F.; Vassilakis, C.; Margaris, D.; Kotis, K.; Spiliotopoulos, D. AI and related technologies in the fields of smart agriculture: A review. Information 2025, 16, 100. [Google Scholar] [CrossRef]
  29. Ghosh, A.; Sumit, R.; Ashoka, P.; Kotyal, K.; Sabarinathan, B.; Anjali, S.S.; Sivakumar, K.P.; Panotra, N.; Pandey, S.K. Data-driven decision making in agriculture with sensors, satellite imagery and AI analytics by digital farming. Arch. Curr. Res. Int. 2025, 25, 37–52. [Google Scholar] [CrossRef]
  30. Tatem, A.J.; Goetz, S.J.; Hay, S.I. Fifty years of Earth observation satellites: Views from above have led to countless advances on the ground in both scientific knowledge and daily life. Am. Sci. 2008, 96, 390–398. [Google Scholar] [CrossRef]
  31. Badola, S. Role of remote sensing and GIS in land use planning. Int. J. Eng. Res. Manag. Technol. 2019, 6, 59–65. Available online: https://www.ijermt.org/publication/41/306.%20ijernt%20JULY%202019.pdf (accessed on 21 December 2025).
  32. Raihan, A. A Comprehensive review of the recent advancement in integrating deep learning with geographic information systems. Res. Briefs Inf. Commun. Technol. Evol. 2023, 9, 98–115. [Google Scholar] [CrossRef]
  33. Trivedi, A.; Rao, K.V.R.; Yadav, D.; Verma, N.S. Remote sensing and geographic information system applications for precision farming and natural resource management. Indian J. Ecol. 2022, 49, 1624–1633. [Google Scholar] [CrossRef]
  34. Xie, C.; Yang, C. A review on plant high-throughput phenotyping traits using UAV-based sensors. Comput. Electron. Agric. 2020, 178, 105731. [Google Scholar] [CrossRef]
  35. Chandra, A.L.; Desai, S.V.; Guo, W.; Balasubramanian, V.N. Computer vision with deep learning for plant phenotyping in agriculture: A survey. arXiv 2020, arXiv:2006.11391. [Google Scholar] [CrossRef]
  36. Sankaran, S.; Khot, L.R.; Espinoza, C.Z.; Jarolmasjed, S.; Sathuvalli, V.R.; Vandemark, G.J.; Miklas, P.N.; Carter, A.H.; Pumphrey, M.O.; Knowles, N.R.; et al. Low-altitude, high-resolution aerial imaging systems for row and field crop phenotyping: A review. Eur. J. Agron. 2015, 70, 112–123. [Google Scholar] [CrossRef]
  37. Duarte, A.; Acevedo-Munoz, L.; Goncalves, C.I.; Mota, L.; Sarmento, A.; Silva, M.; Fabres, S.; Borralho, N.; Valente, C. Detection of longhorned borer attack and assessment in eucalyptus plantations using UAV imagery. Remote Sens. 2020, 12, 3153. [Google Scholar] [CrossRef]
  38. Johnson, L.F.; Roczen, D.E.; Youkhana, S.K.; Nemani, R.R.; Bosch, D.F. Mapping vineyard leaf area with multispectral satellite imagery. Comput. Electron Agric. 2003, 38, 33–44. [Google Scholar] [CrossRef]
  39. Johnson, L.F. Temporal stability of an NDVI-LAI relationship in a Napa Valley vineyard. Aust. J. Grape Wine Res. 2008, 9, 96–101. [Google Scholar] [CrossRef]
  40. Qin, Z.; Yang, H.; Shu, Q.; Yu, J.; Yang, Z.; Ma, X.; Duan, D. Estimation of Dendrocalamus giganteus leaf area index by combining multi-source remote sensing data and machine learning optimization model. Front. Plant Sci. 2025, 15, 1505414. [Google Scholar] [CrossRef]
  41. Sullivan, D.G.; Shaw, J.N.; Rickman, D. IKONOS imagery to estimate surface soil property variability in two Alabama physiographies. Soil Sci. Soc. Am. J. 2005, 69, 1789–1798. [Google Scholar] [CrossRef]
  42. Ping, J.L.; Ferguson, R.B.; Dobermann, A. Site-specific nitrogen and plant density management in irrigated maize. Agron. J. 2008, 100, 1193–1204. [Google Scholar] [CrossRef]
  43. Kumar, N.; Anouncia, S.; Madhavan, P. Application of satellite remote sensing to find soil fertilization by using soil colour. Int. J. Online Eng. 2013, 9, 2530. [Google Scholar] [CrossRef]
  44. Ghazali, M.; Wikantika, K.; Harto, A.; Kondoh, A. Generating soil salinity, soil moisture, soil ph from satellite imagery and its analysis. Inf. Process Agric. 2019, 7, 294–306. [Google Scholar] [CrossRef]
  45. Montaldo, N.; Gaspa, A.; Corona, R. Multiscale assimilation of sentinel and Landsat data for soil moisture and leaf area index predictions using an ensemble-Kalman-filter-based assimilation approach in a heterogeneous ecosystem. Remote Sens. 2022, 14, 3458. [Google Scholar] [CrossRef]
  46. Dobermann, A.; Ping, J.L. Geostatistical integration of yield monitor data and remote sensing improves yield maps. Agronomy J. 2004, 96, 285–297. [Google Scholar] [CrossRef]
  47. Yang, C.; Everitt, J.H.; Bradford, J.M. Comparison of QuickBird satellite imagery and airborne imagery for mapping grain sorghum yield patterns. Precis. Agric. 2006, 7, 33–44. [Google Scholar] [CrossRef]
  48. Yang, C.; Everitt, J.H.; Bradford, J.M. Evaluating high resolution QuickBird satellite imagery for estimating cotton yield. Trans. ASABE 2006, 49, 1599–1606. [Google Scholar] [CrossRef]
  49. Yang, C.; Everitt, J.H.; Fletcher, R.S.; Murden, D. Using high resolution QuickBird imagery for crop identification and area estimation. Geocarto Int. 2007, 22, 219–233. [Google Scholar] [CrossRef]
  50. Yang, C.; Everitt, J.H.; Murden, D. Using high resolution SPOT 5 multispectral imagery for crop identification. Comput. Electron. Agric. 2011, 75, 347–354. [Google Scholar] [CrossRef]
  51. Bu, H.; Sharma, L.K.; Denton, A.; Franzen, D.W. Comparison of satellite imagery and ground-based active optical sensors as yield predictors in sugar beet, spring wheat, corn, and sunflower. Agron. J. 2017, 109, 299–308. [Google Scholar] [CrossRef]
  52. Franke, J.; Menz, G. Multi-temporal wheat disease detection by multispectral remote sensing. Precis. Agric. 2007, 8, 161–172. [Google Scholar] [CrossRef]
  53. Li, X.; Lee, W.S.; Li, M.; Ehsani, R.; Mishra, A.R.; Yang, C.; Mangan, R.L. Feasibility study on Huanglongbing (citrus greening) detection based on WorldView-2 satellite imagery. Biosyst. Eng. 2015, 132, 28–38. [Google Scholar] [CrossRef]
  54. Ghobadifar, F.; Aimrun, W.; Jebur, M.N. Development of an early warning system for brown planthopper (BPH) (Nilaparvata lugens) in rice farming using multispectral remote sensing. Precis. Agric. 2016, 17, 377–391. [Google Scholar] [CrossRef]
  55. Bausch, W.C.; Halvorson, A.D.; Cipra, J. QuickBird satellite and ground-based multispectral data correlations with agronomic parameters of irrigated maize grown in small plots. Biosyst. Eng. 2008, 101, 306–315. [Google Scholar] [CrossRef]
  56. Bausch, W.C.; Khosla, R. QuickBird satellite versus ground-based multi-spectral data for estimating nitrogen status of irrigated maize. Precis. Agric. 2010, 11, 274–290. [Google Scholar] [CrossRef]
  57. Söderström, M.; Borjesson, T.; Pettersson, C.G.; Nissen, K.; Hagner, O. Prediction of protein content in malting barley using proximal and remote sensing. Precis. Agric. 2010, 11, 587–599. [Google Scholar] [CrossRef]
  58. Wagner, P.; Hank, K. Suitability of aerial and satellite data for calculation of site-specific nitrogen fertilisation compared to ground based sensor data. Precis. Agric. 2013, 14, 135–150. [Google Scholar] [CrossRef]
  59. Caturegli, L.; Casucci, M.; Lulli, F.; Grossi, N.; Gaetani, M.; Magni, S.; Bonari, E.; Volterrani, M. GeoEye-1 satellite versus ground-based multispectral data for estimating nitrogen status of turfgrasses. Int. J. Remote Sens. 2015, 36, 2238–2251. [Google Scholar] [CrossRef]
  60. Magney, T.S.; Eitel, J.U.H.; Vierling, L.A. Mapping wheat nitrogen uptake from RapidEye vegetation indices. Precis. Agric. 2017, 18, 429–451. [Google Scholar] [CrossRef]
  61. Yu, Y.; Luo, Y.; Wang, X.; Wang, X.; Hu, C. Precise assimilation prediction of short-term and long-term maize irrigation water based on EnKF-DSSAT and fuzzy optimization-DSSAT models. IEEE Access 2025, 13, 27150–27166. [Google Scholar] [CrossRef]
  62. Rußwurm, M.; Lefevre, S.; Korner, M. Breizhcrops: A satellite time series dataset for crop type identification. In Proceedings of the Time Series Workshop of the 36th International Conference on Machine Learning, Long Beach, CA, USA, 9–15 June 2019; Volume 97. [Google Scholar]
  63. Suhairi, T.; Jahanshiri, E.; Nizar, N.M.M. Multicriteria land suitability assessment for growing underutilised crop, bambara groundnut in Peninsular Malaysia. IOP Conf. Ser. Earth Environ. Sci. 2018, 169, 012044. [Google Scholar] [CrossRef]
  64. Xu, R.; Li, C.; Paterson, A.H.; Jiang, Y.; Sun, S.; Robertson, J.S. Aerial images and convolutional neural network for cotton bloom detection. Front. Plant Sci. 2018, 8, 2235. [Google Scholar] [CrossRef] [PubMed]
  65. Bendig, J.; Yu, K.; Aasen, H.; Bolten, A.; Bennertz, S.; Broscheit, J.; Gnyp, M.L.; Bareth, G. Combining UAV-based plant height from crop surface models, visible, and near infrared vegetation indices for biomass monitoring in barley. Int. J. Appl. Earth Obs. Geoinf. 2015, 39, 79–87. [Google Scholar] [CrossRef]
  66. De Souza, C.H.W.; Lamparelli, R.A.C.; Rocha, J.V.; Magalhães, P.S.G. Height estimation of sugarcane using an unmanned aerial system (UAS) based on structure from motion (SfM) point clouds. Int. J. Remote Sens. 2017, 38, 2218–2230. [Google Scholar] [CrossRef]
  67. Gnädinger, F.; Schmidhalter, U. Digital Counts of Maize Plants by Unmanned Aerial Vehicles (UAVs). Remote Sens. 2017, 9, 544. [Google Scholar] [CrossRef]
  68. Colombo, R.; Bellingeri, D.; Fasolini, D.; Marino, C.M. Retrieval of leaf area index in different vegetation types using high resolution satellite data. Remote Sens. Environ. 2003, 86, 120–131. [Google Scholar] [CrossRef]
  69. Gano, B.; Dembele, J.S.B.; Tovignan, T.K.; Sine, B.; Vadez, V.; Diouf, D.; Audebert, A. Adaptation Responses to Early Drought Stress of West Africa Sorghum Varieties. Agronomy 2021, 11, 443. [Google Scholar] [CrossRef]
  70. Peng, X.; Han, W.; Ao, J.; Wang, Y. Assimilation of LAI derived from UAV multispectral data into the SAFY model to estimate maize yield. Remote Sens. 2021, 13, 1094. [Google Scholar] [CrossRef]
  71. Buthelezi, S.; Mutanga, O.; Sibanda, M.; Odindi, J.; Clulow, A.D.; Chimonyo, V.G.; Mabhaudhi, T. Assessing the prospects of remote sensing maize leaf area index using UAV-derived multi-spectral data in smallholder farms across the growing season. Remote Sens. 2023, 15, 1597. [Google Scholar] [CrossRef]
  72. Jewan, S.Y.Y.; Singh, A.; Billa, L.; Sparkes, D.; Murchie, E.; Gautam, D.; Cogato, A.; Pagay, V. Can Multi-Temporal Vegetation Indices and Machine Learning Algorithms Be Used for Estimation of Groundnut Canopy State Variables? Horticulturae 2024, 10, 748. [Google Scholar] [CrossRef]
  73. Shanahan, J.F.; Schepers, J.S.; Francis, D.D.; Varvel, G.E.; Wilhelm, W.W.; Tringe, J.M.; Schlemmer, M.R.; David, D.J. Use of remote-sensing imagery to estimate corn grain yield. Agron. J. 2001, 93, 583–589. [Google Scholar] [CrossRef]
  74. Swain, K.; Thomson, S.; Jayasuriya, H. Adoption of an unmanned helicopter for low-altitude remote sensing to estimate yield and total biomass of a rice crop. Trans. ASABE 2010, 53, 21–27. [Google Scholar] [CrossRef]
  75. Mkhabela, M.S.; Bullock, P.; Raj, S.; Wang, S.; Yang, Y. Crop yield forecasting on the Canadian Prairies using MODIS NDVI data. Agric. For. Meteorol. 2011, 151, 385–393. [Google Scholar] [CrossRef]
  76. Geipel, J.; Link, J.; Claupein, W. Combined spectral and spatial modeling of corn yield based on aerial images and crop surface models acquired with an unmanned aircraft system. Remote Sens. 2014, 6, 10335–10355. [Google Scholar] [CrossRef]
  77. Gracia-Romero, A.; Vergara-Díaz, O.; Thierfelder, C.; Cairns, J.E.; Kefauver, S.C.; Araus, J.L. Phenotyping conservation agriculture management effects on ground and aerial remote sensing assessments of maize hybrids performance in Zimbabwe. Remote Sens. 2018, 10, 349. [Google Scholar] [CrossRef]
  78. Galán, R.J.; Bernal-Vasquez, A.M.; Jebsen, C.; Piepho, H.P.; Thorwarth, P.; Steffan, P.; Gordillo, A.; Miedaner, T. Integration of genotypic, hyperspectral, and phenotypic data to improve biomass yield prediction in hybrid rye. Theor. Appl. Genet. 2020, 133, 3001–3015. [Google Scholar] [CrossRef]
  79. Ashapure, A.; Jung, J.; Chang, A.; Oh, S.; Yeom, J.; Maeda, M.; Maeda, A.; Dube, N.; Landivar, J.; Hague, S.; et al. Developing a machine learning based cotton yield estimation framework using multi-temporal UAS data. ISPRS J. Photogramm. Remote Sens. 2020, 169, 180–194. [Google Scholar] [CrossRef]
  80. Sumesh, K.C.; Ninsawat, S.; Som-ard, J. Integration of RGB-based vegetation index, crop surface model and object-based image analysis approach for sugarcane yield estimation using unmanned aerial vehicle. Comput. Electron. Agric. 2021, 180, 105903. [Google Scholar] [CrossRef]
  81. Alabi, T.R.; Abebe, A.T.; Chigeza, G.; Fowobaje, K.R. Estimation of soybean grain yield from multispectral high-resolution UAV data with machine learning models in West Africa. Remote Sens. Appl. Soc. Environ. 2022, 27, 100782. [Google Scholar] [CrossRef]
  82. Hassanein, M.; Lari, Z.; El-Sheimy, N. A new vegetation segmentation approach for cropped fields based on threshold detection from hue histograms. Sensors 2018, 18, 1253. [Google Scholar] [CrossRef]
  83. Merino, L.; Caballero, F.; Martínez-de-Dios, J.R.; Maza, I.; Ollero, A. An unmanned aircraft system for automatic forest fire monitoring and measurement. J. Intell. Robot. Syst. 2012, 65, 533–548. [Google Scholar] [CrossRef]
  84. Fujimoto, A.; Haga, C.; Matsui, T.; Machimura, T.; Hayashi, K.; Sugita, S.; Takagi, H. An end-to-end process development for UAV-SfM based forest monitoring: Individual tree detection, species classification and carbon dynamics simulation. Forests 2019, 10, 680. [Google Scholar] [CrossRef]
  85. Amaral, L.R.; Molin, J.P.; Portz, G.; Finazzi, F.B.; Cortinove, L. Comparison of crop canopy reflectance sensors used to identify sugarcane biomass and nitrogen status. Precis. Agric. 2015, 16, 15–28. [Google Scholar] [CrossRef]
  86. Maimaitijiang, M.; Ghulam, A.; Sidike, P.; Hartling, S.; Maimaitiyiming, M.; Peterson, K.; Shavers, E.; Fishman, J.; Peterson, J.; Kadam, S.; et al. Unmanned Aerial System (UAS)-based phenotyping of soybean using multi-sensor data fusion and extreme learning machine. ISPRS J. Photogramm. Remote Sens. 2017, 134, 43–58. [Google Scholar] [CrossRef]
  87. Zheng, H.; Cheng, T.; Li, D.; Zhou, X.; Yao, X.; Tian, Y.; Cao, W.; Zhu, Y. Evaluation of RGB, color-infrared and multispectral images acquired from unmanned aerial systems for the estimation of nitrogen accumulation in rice. Remote Sens. 2018, 10, 824. [Google Scholar] [CrossRef]
  88. Hagn, L.; Mittermayer, M.; Kern, A.; Kimmelmann, S.; Maidl, F.-X.; Hülsbergen, K.-J. Effects of sensor-based, site-specific nitrogen fertilizer application on crop yield, nitrogen balance, and nitrogen efficiency. Sensors 2025, 25, 795. [Google Scholar] [CrossRef]
  89. Ludovisi, R.; Tauro, F.; Salvati, R.; Khoury, S.; Mugnozza, G.S.; Harfouche, A. UAV-based thermal imaging for high-throughput field phenotyping of black poplar response to drought. Front. Plant Sci. 2017, 8, 1681. [Google Scholar] [CrossRef]
  90. Ampatzidis, Y.; Partel, V.; Meyering, B.; Albrecht, U. Citrus rootstock evaluation utilizing UAV-based remote sensing and artificial intelligence. Comput. Electron. Agric. 2019, 164, 104900. [Google Scholar] [CrossRef]
  91. Bhandari, M.; Baker, S.; Rudd, J.C.; Ibrahim, A.M.H.; Chang, A.; Xue, Q.; Jung, J.; Landivar, J.; Auvermann, B. Assessing the effect of drought on winter wheat growth using unmanned aerial system (UAS)-based phenotyping. Remote Sens. 2021, 13, 1144. [Google Scholar] [CrossRef]
  92. Brewer, K.; Clulow, A.; Sibanda, M.; Gokool, S.; Odindi, J.; Mutanga, O.; Naiken, V.; Chimonyo, V.G.P.; Mabhaudhi, T. Estimation of maize foliar temperature and stomatal conductance as indicators of water stress based on optical and thermal imagery acquired using an unmanned aerial vehicle (UAV) platform. Drones 2022, 6, 169. [Google Scholar] [CrossRef]
  93. Yang, Y.; Wei, X.; Wang, J.; Zhou, G.; Wang, J.; Jiang, Z.; Zhao, J.; Ren, Y. Prediction of seedling oilseed rape crop phenotype by drone-derived multimodal data. Remote Sens. 2023, 15, 3951. [Google Scholar] [CrossRef]
  94. Fiorillo, E.; Crisci, A.; de Filippis, T.; di Gennaro, S.F.; di Blasi, S.; Matese, A.; Primicerio, J.; Vaccari, F.P.; Genesio, L. Airborne high-resolution images for grape classification: Changes in correlation between technological and late maturity in a Sangiovese vineyard in Central Italy. Aust. J. Grape Wine Res. 2012, 18, 80–90. [Google Scholar] [CrossRef]
  95. Bonilla, I.; de Toda, F.M.; Martínez-Casasnovas, J.A. Vine vigor, yield and grape quality assessment by airborne remote sensing over three years: Analysis of unexpected relationships in cv Tempranillo. Span. J. Agric. Res. 2015, 13, e0903. [Google Scholar] [CrossRef]
  96. Ledderhof, D.; Brown, R.; Reynolds, A.; Jollineau, M. Using remote sensing to understand Pinot noir vineyard variability in Ontario. Can. J. Plant Sci. 2016, 96, 89–108. [Google Scholar] [CrossRef]
  97. Ferrer, M.; Echeverría, G.; Pereyra, G.; Gonzalez-Neves, G.; Pan, D.; Mirás-Avalos, J.M. Mapping vineyard vigor using airborne remote sensing: Relations with yield, berry composition and sanitary status under humid climate conditions. Precis. Agric. 2019, 21, 178–197. [Google Scholar] [CrossRef]
  98. Garcia-Fernandez, M.; Sanz-Ablanedo, E.; Rodríguez-Pérez, J.R. High-resolution drone-acquired RGB imagery to estimate spatial grape quality variability. Agronomy 2021, 11, 655. [Google Scholar] [CrossRef]
  99. Abdulridha, J.; Batuman, O.; Ampatzidis, Y. UAV-based remote sensing technique to detect citrus canker disease utilizing hyperspectral imaging and machine learning. Remote Sens. 2019, 11, 1373. [Google Scholar] [CrossRef]
  100. Harihara, J.; Fuller, J.; Ampatzidis, Y.; Abdulridha, J.; Lerwill, A. Finite difference analysis and bivariate correlation of hyperspectral data for detecting laurel wilt disease and nutritional deficiency in avocado. Remote Sens. 2019, 11, 1748. [Google Scholar] [CrossRef]
  101. Selvaraj, M.G.; Vergara, A.; Ruiz, H.; Elayabalan, S.; Ocimati, W.; Blomme, G. AI-powered banana diseases and pest detection. Plant Methods 2019, 15, 92. [Google Scholar] [CrossRef]
  102. Bhandari, M.; Ibrahim, A.M.H.; Xue, Q.; Jung, J.; Chang, A.; Rudd, J.C.; Maeda, M.; Rajan, N.; Neely, H.; Landivar, J. Assessing winter wheat foliage disease severity using aerial imagery acquired from small unmanned aerial vehicle (UAV). Comput. Electron. Agric. 2020, 176, 105665. [Google Scholar] [CrossRef]
  103. Chang, A.; Yeom, J.; Jung, J.; Landivar, J. Comparison of canopy shape and vegetation indices of citrus trees derived from UAV multispectral images for characterization of citrus greening disease. Remote Sens. 2020, 12, 4122. [Google Scholar] [CrossRef]
  104. Kassim, Y.B.; Oteng-Frimpong, R.; Puozaa, D.K.; Sie, E.K.; Rasheed, A.; Rashid, A.; Danquah, A.; Akogo, D.A.; Rhoads, J.; Hoisington, D.; et al. High-throughput plant phenotyping (HTPP) in resource-constrained research programs: A working example in Ghana. Agronomy 2022, 12, 2733. [Google Scholar] [CrossRef]
  105. Gonzalez-Dugo, V.; Zarco-Tejada, P.; Nicolas, E.; Nortes, P.A.; Alarcon, J.J.; Intrigliolo, D.S.; Fereres, E. Using high resolution UAV thermal imagery to assess the variability in the water status of five fruit tree species within a commercial orchard. Precis. Agric. 2013, 14, 660–678. [Google Scholar] [CrossRef]
  106. Hashem, A. Estimation of Aboveground Biomass/Carbon Sequestration Using UAV Imagery at Kebun Raya Unmul Samarinda Education Forest, East Kalimantan, Indonesia. Master’s Thesis, University of Twente, Enschede, The Netherlands, 2019; p. 77. [Google Scholar]
  107. Anand, A.; Pandey, P.C.; Petropoulos, G.P.; Pavlides, A.; Srivastava, P.K.; Sharma, J.K.; Malhi, R.K.M. Use of Hyperion for mangrove forest carbon stock assessment in Bhitarkanika Forest Reserve: A contribution towards Blue Carbon Initiative. Remote Sens. 2020, 12, 597. [Google Scholar] [CrossRef]
  108. Han, R.; Wong, A.J.Y.; Tang, Z.; Truco, M.J.; Lavelle, D.O.; Kozik, A.; Jin, Y.; Michelmore, R.W. Drone phenotyping and machine learning enable discovery of loci regulating daily floral opening in lettuce. J. Exp. Bot. 2021, 72, 2979–2994. [Google Scholar] [CrossRef]
  109. Bonadies, S.; Gadsden, A.S. An overview of autonomous crop row navigation strategies for unmanned ground vehicles. Eng. Agric. Environ. Food 2019, 12, 24–31. [Google Scholar] [CrossRef]
  110. Kägo, R.; Vellak, P.; Karofeld, E.; Noorma, M.; Ol, J. Assessment of using state of the art unmanned ground vehicles for operations on peat fields. Mires Peat 2021, 27, 11. [Google Scholar] [CrossRef]
  111. Ruiz-Larrea, A.; Roldán, J.J.; Garzón, M.; del Cerro, J.; Barrientos, A. A UGV Approach to Measure the Ground Properties of Greenhouses. In Robot 2015: Second Iberian Robotics Conference; Reis, L., Moreira, A., Lima, P., Montano, L., Muñoz-Martinez, V., Eds.; Advances in Intelligent Systems and Computing; Springer: Cham, Switzerland, 2015; Volume 418. [Google Scholar] [CrossRef]
  112. Ramos, P.J.; Prieto, F.A.; Montoya, E.C.; Oliveros, C.E. Automatic fruit count on coffee branches using computer vision. Comput. Electron. Agric. 2017, 137, 9–22. [Google Scholar] [CrossRef]
  113. Amatya, S.; Karkee, M.; Gongal, A.; Zhang, Q.; Whiting, M.D. Detection of cherry tree branches with full foliage in planar architecture for automated sweet-cherry harvesting. Biosyst. Eng. 2016, 146, 3–15. [Google Scholar] [CrossRef]
  114. Sengupta, S.; Lee, W.S. Identification and determination of the number of immature green citrus fruit in a canopy under different ambient light conditions. Biosyst. Eng. 2014, 117, 51–61. [Google Scholar] [CrossRef]
  115. Ali, I.; Cawkwell, F.; Dwyer, E.; Green, S. Modeling managed grassland biomass estimation by using multitemporal remote sensing data—A machine learning approach. IEEE J. Sel. Top Appl. Earth Obs. Remote Sens. 2016, 10, 3254–3264. [Google Scholar] [CrossRef]
  116. Pantazi, X.-E.; Moshou, D.; Alexandridis, T.K.; Whetton, R.L.; Mouazen, A.M. Wheat yield prediction using machine learning and advanced sensing techniques. Comput. Electron. Agric. 2016, 121, 57–65. [Google Scholar] [CrossRef]
  117. Bargoti, S.; Underwood, J.P. Image Segmentation for Fruit Detection and Yield Estimation in Apple Orchards. J. Field Robot. 2017, 34, 1039–1060. [Google Scholar] [CrossRef]
  118. Zhu, J.; Li, Y.; Wang, C.; Liu, P.; Lan, Y. Method for monitoring wheat growth status and estimating yield based on UAV multispectral remote sensing. Agronomy 2024, 14, 991. [Google Scholar] [CrossRef]
  119. Senthilnath, J.; Dokania, A.; Kandukuri, M.; Ramesh, K.N.; Anand, G.; Omkar, S.N. Detection of tomatoes using spectral-spatial methods in remotely sensed RGB images captured by UAV. Biosyst. Eng. 2016, 146, 16–32. [Google Scholar] [CrossRef]
  120. Zheng, W.; Dai, G.; Hu, M.; Wang, P. A Robust Tomato Counting Framework for Greenhouse Inspection Robots Using YOLOv8 and Inter-Frame Prediction. Agronomy 2025, 15, 1135. [Google Scholar] [CrossRef]
  121. Su, Y.; Xu, H.; Yan, L. Support vector machine-based open crop model (SBOCM): Case of rice production in China. Saudi J. Biol. Sci. 2017, 24, 537–547. [Google Scholar] [CrossRef]
  122. Jahanbakhshi, A.; Momeny, M.; Mahmoudi, M.; Zhang, Y.-D. Classification of sour lemons based on apparent defects using stochastic pooling mechanism in deep convolutional neural networks. Sci. Hortic. 2020, 263, 109133. [Google Scholar] [CrossRef]
  123. Zheng, H.; Tang, W.; Yang, T.; Zhou, M.; Guo, C.; Cheng, T.; Cao, W.; Zhu, Y.; Zhang, Y.X. Grain Protein Content Phenotyping in Rice via Hyperspectral Imaging Technology and a Genome-Wide Association Study. Plant Phenomics 2024, 6, 0200. [Google Scholar] [CrossRef]
  124. Lovynska, V.; Bayat, B.; Bol, R.; Moradi, S.; Rahmati, M.; Raj, R.; Sytnyk, S.; Wiche, O.; Wu, B.; Montzka, C. Monitoring Heavy Metals and Metalloids in Soils and Vegetation by Remote Sensing: A Review. Remote Sens. 2024, 16, 3221. [Google Scholar] [CrossRef]
  125. Pantazi, X.E.; Moshou, D.; Oberti, R.; West, J.; Mouazen, A.M.; Bochtis, D. Detection of biotic and abiotic stresses in crops by using hierarchical self organizing classifiers. Precis. Agric. 2017, 18, 383–393. [Google Scholar] [CrossRef]
  126. Pan, L.; Zhang, W.; Zhu, N.; Mao, S.; Tu, K. Early detection and classification of pathogenic fungal disease in post-harvest strawberry fruit by electronic nose and gas chromatography—Mass spectrometry. Food Res. Int. 2014, 62, 162. [Google Scholar] [CrossRef]
  127. Ebrahimi, M.A.; Khoshtaghaza, M.H.; Minaei, S.; Jamshidi, B. Vision-based pest detection based on SVM classification method. Comput. Electron. Agric. 2017, 137, 52–58. [Google Scholar] [CrossRef]
  128. Chung, C.L.; Huang, K.J.; Chen, S.Y.; Lai, M.H.; Chen, Y.C.; Kuo, Y.F. Detecting Bakanae disease in rice seedlings by machine vision. Comput. Electron. Agric. 2016, 121, 404–411. [Google Scholar] [CrossRef]
  129. Maione, C.; Batista, B.L.; Campiglia, A.D.; Barbosa, F.; Barbosa, R.M. Classification of geographic origin of rice by data mining and inductively coupled plasma mass spectrometry. Comput. Electron. Agric. 2016, 121, 101–107. [Google Scholar] [CrossRef]
  130. Moshou, D.; Bravo, C.; West, J.; Wahlen, S.; McCartney, A.; Ramon, H. Automatic detection of ‘yellow rust’ in wheat using reflectance measurements and neural networks. Comput. Electron. Agric. 2004, 44, 173–188. [Google Scholar] [CrossRef]
  131. Moshou, D.; Bravo, C.; Oberti, R.; West, J.; Bodria, L.; McCartney, A.; Ramon, H. Plant disease detection based on data fusion of hyper-spectral and multi-spectral fluorescence imaging using Kohonen maps. Real-Time Imaging 2005, 11, 75–83. [Google Scholar] [CrossRef]
  132. Moshou, D.; Bravo, C.; Wahlen, S.; West, J.; McCartney, A.; De Baerdemaeker, J.; Ramon, H. Simultaneous identification of plant stresses and diseases in arable crops using proximal optical sensing and self-organising maps. Precis. Agric. 2006, 7, 149–164. [Google Scholar] [CrossRef]
  133. Moshou, D.; Pantazi, X.-E.; Kateris, D.; Gravalos, I. Water stress detection based on optical multisensor fusion with a least squares support vector machine classifier. Biosyst. Eng. 2014, 117, 15–22. [Google Scholar] [CrossRef]
  134. Pantazi, X.E.; Tamouridou, A.A.; Alexandridis, T.K.; Lagopodi, A.L.; Kontouris, G.; Moshou, D. Detection of Silybum marianum infection with Microbotryum silybum using VNIR field spectroscopy. Comput. Electron. Agric. 2017, 137, 130–137. [Google Scholar] [CrossRef]
  135. Dos Santos, A.F.; Freitas, D.M.; da Silva, G.G.; Pistori, H.; Folhes, M.T. Weed detection in soybean crops using ConvNets. Comput. Electron. Agric. 2017, 143, 314. [Google Scholar] [CrossRef]
  136. You, J.; Liu, W.; Lee, J. A DNN-based semantic segmentation for detecting weed and crop. Comput. Electron. Agric. 2020, 178, 105750. [Google Scholar] [CrossRef]
  137. Sonawame, S.; Patil, N.N. Crop-weed segmentation and classification using YOLOv8 approach for smart farming. J. Stud. Sci. Eng. 2024, 4, 136–158. [Google Scholar] [CrossRef]
  138. Hu, H.; Pan, L.; Sun, K.; Tu, S.; Sun, Y.; Wei, Y.; Tu, K. Differentiation of deciduous-calyx and persistent-calyx pears using hyperspectral reflectance imaging and multivariate analysis. Comput. Electron. Agric. 2017, 137, 150–156. [Google Scholar] [CrossRef]
  139. Grinblat, G.L.; Uzal, L.C.; Larese, M.G.; Granitto, P.M. Deep learning for plant identification using vein morphological patterns. Comput. Electron. Agric. 2016, 127, 418–424. [Google Scholar] [CrossRef]
  140. Falk, K.G.; Jubery, T.; Mirnezami, S.; Parmley, K.A.; Sarkar, S.; Singh, A.; Ganapathysubramanian, B.; Singh, A. Computer vision and machine learning enabled soybean root phenotyping pipeline. Plant Methods 2020, 16, 5. [Google Scholar] [CrossRef]
  141. Pantazi, X.E.; Moshou, D.; Tamouridou, A.A. Automated leaf disease detection in different crop species through image features analysis and One Class Classifiers. Comput. Electron. Agric. 2019, 156, 96–104. [Google Scholar] [CrossRef]
  142. Ferentinos, K.P. Deep learning models for plant disease detection and diagnosis. Comput. Electron. Agric. 2018, 145, 311–318. [Google Scholar] [CrossRef]
  143. Zhang, C.; Yun, L.; Yang, C.; Chen, Z.; Cheng, F. LRNTRM-YOLO: Research on real-time recognition of non-tobacco-related materials. Agronomy 2025, 15, 489. [Google Scholar] [CrossRef]
  144. Khan, Z.; Rahimi-Eichi, V.; Haefele, S.; Garnett, T.; Miklavcic, S.J. Estimation of vegetation indices for high-throughput phenotyping of wheat using aerial imaging. Plant Methods 2018, 14, 20. [Google Scholar] [CrossRef]
  145. Jin, X.; Zarco-Tejada, P.J.; Schmidhalter, U.; Reynolds, M.P.; Hawkesford, M.J.; Varshney, R.K.; Yang, T.; Nie, C.; Li, Z.; Ming, B.; et al. High-throughput estimation of crop traits: A review of ground and aerial phenotyping platforms. IEEE Geosci. Remote Sens. Mag. 2021, 9, 200–231. [Google Scholar] [CrossRef]
  146. Debaeke, P.; Rouet, P.; Justes, E. Relationship between the normalized SPAD Index and the nitrogen nutrition index: Application to durum wheat. J. Plant Nutr. 2006, 29, 75–92. [Google Scholar] [CrossRef]
  147. Huang, W.; Lamb, D.W.; Niu, Z.; Zhang, Y.; Liu, L.; Wang, J. Identification of yellow rust in wheat using in-situ spectral reflectance measurements and airborne hyperspectral imaging. Precis. Agric. 2007, 8, 187–197. [Google Scholar] [CrossRef]
  148. Chawade, A.; Linden, P.; Brautigam, M.; Jonsson, R.; Jonsson, A.; Moritz, T.; Olsson, O. Development of a model system to identify differences in spring and winter oat. PLoS ONE 2012, 7, e29792. [Google Scholar] [CrossRef] [PubMed]
  149. Yang, H.; Yang, J.; Lv, Y.; He, J. SPAD values and nitrogen nutrition index for the evaluation of rice nitrogen status. Plant Prod. Sci. 2015, 17, 81–92. [Google Scholar] [CrossRef]
  150. Garriga, M.; Romero-Bravo, S.; Estrada, F.; Escobar, A.; Matus, I.A.; del Pozo, A.; Astudillo, C.A.; Lobos, G.A. Assessing wheat traits by spectral reflectance: Do we really need to focus on predicted trait-values or directly identify the elite genotypes group? Front. Plant Sci. 2017, 8, 280. [Google Scholar] [CrossRef]
  151. Andrianto, H.; Suhardi, S.; Faizal, A. Measurement of chlorophyll content to determine nutrition deficiency in plants: A systematic literature review. In Proceedings of the 2017 International Conference on Information Technology Systems and Innovation (ICITSI), Bandung, Indonesia, 23–24 October 2017; pp. 392–397. [Google Scholar] [CrossRef]
  152. Odilbekov, F.; Armoniene, R.; Henriksson, T.; Chawade, A. Proximal phenotyping and machine learning methods to identify Septoria tritici Blotch disease symptoms in wheat. Front. Plant Sci. 2018, 9, 685. [Google Scholar] [CrossRef]
  153. Rutkoski, J.; Poland, J.; Mondal, S.; Autrique, E.; Pérez, L.G.; Crossa, J.; Reynolds, M.; Singh, R. Canopy temperature and vegetation indices from high-throughput phenotyping improve accuracy of pedigree and genomic selection for grain yield in wheat. G3 Genes Genomes Genet. 2016, 6, 2799–2808. [Google Scholar] [CrossRef]
  154. Debangshi, U.; Sadhukhan, A.; Dutta, D.; Roy, S. Application of smart farming technologies in sustainable agriculture development: A comprehensive review on present status and future advancements. Int. J. Environ. Clim. Chang. 2023, 13, 3689–3704. [Google Scholar] [CrossRef]
  155. Fleming, S.W.; Goodbody, A.G. A machine learning metasystem for robust probabilistic nonlinear regression-based forecasting of seasonal water availability in the US West. IEEE Access 2019, 7, 119943–119964. [Google Scholar] [CrossRef]
  156. Alhijawi, B.; Awajan, A. Genetic algorithms: Theory, genetic operators, solutions, and applications. Evol. Intell. 2024, 17, 1245–1256. [Google Scholar] [CrossRef]
  157. Singh, G.; Gupta, N.; Khosravy, M. New crossover operators for real coded genetic algorithm (RCGA). In Proceedings of the 2015 International Conference on Intelligent Informatics and Biomedical Sciences (ICIIBMS), Okinawa, Japan, 28–30 November 2015; IEEE: Piscataway, NJ, USA, 2016. [Google Scholar] [CrossRef]
  158. Pandey, H.M.; Choudhary, A.; Mehrotra, D. A Comparative review of approaches to prevent premature convergence in GA. Appl. Soft. Comput. 2014, 24, 1047–1077. [Google Scholar] [CrossRef]
  159. Rocha, M.; Neves, J. Preventing premature convergence to local optima in genetic algorithms via random offspring generation. In Multiple Approaches to Intelligent Systems; Springer: Berlin/Heidelberg, Germany, 1999; pp. 127–136. [Google Scholar]
  160. Li, M.; Kou, J. A Novel type of niching methods based on steady-state genetic algorithm. In Advances in Natural Computation; Wang, L., Chen, K., Ong, Y.S., Eds.; ICNC 2005; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2005; Volume 3612. [Google Scholar] [CrossRef]
  161. Gracia, C.; Diezma-Iglesias, B.; Barreiro, P. A hybrid genetic algorithm for route optimization in the bale collecting problem. Span. J. Agric. Res. 2013, 11, 603–614. [Google Scholar] [CrossRef]
  162. Qu, J.-G.; Liu, D.; Cheng, J.-H.; Sun, D.-W.; Ma, J.; Pu, H.; Zeng, X.-A. Applications of near-infrared spectroscopy in food safety evaluation and control: A review of recent research advances. Crit. Rev. Food Sci. Nutr. 2015, 55, 1939–1954. [Google Scholar] [CrossRef] [PubMed]
  163. Costa, L.; Nunes, L.; Ampatzidis, Y. A new visible band index (vNDVI) for estimating NDVI values on RGB images utilizing genetic algorithms. Comput. Electron. Agric. 2020, 172, 105334. [Google Scholar] [CrossRef]
  164. Tang, C.; Ding, J.; Zhang, L. LEO satellite downlink distributed jamming optimization method using a non-dominated sorting genetic algorithm. Remote Sens. 2024, 16, 1006. [Google Scholar] [CrossRef]
  165. Gupta, N.; Khosravy, M.; Patel, N.; Dey, N.; Mahela, O.P. Mendelian evolutionary theory optimization algorithm. Soft. Comput. 2020, 24, 14345–14390. [Google Scholar] [CrossRef]
  166. Khosravy, M.; Gupta, N.; Patel, N.; Mahela, O.P.; Varshney, G. Tracing the points in search space in plant biology genetics algorithm optimization. In Frontier Applications of Nature Inspired Computation Springer Tracts in Nature-Inspired Computing; Khosravy, M., Gupta, N., Patel, N., Senjyu, T., Eds.; Springer: Singapore, 2020. [Google Scholar] [CrossRef]
  167. Zhu, X.; Leiser, W.L.; Hahn, V.; Würschum, T. Phenomic selection is competitive with genomic selection for breeding of complex traits. Plant Phenome J. 2021, 4, e20027. [Google Scholar] [CrossRef]
  168. Rincent, R.; Charpentier, J.-P.; Faivre-Rampant, P.; Paux, E.; Le Gouis, J.; Bastien, C.; Segura, V. Phenomic selection is a low-cost and high-throughput method based on indirect predictions, proof of concept on wheat and poplar. G3 Genes Genomes Genet. 2018, 8, 3961–3972. [Google Scholar] [CrossRef]
  169. Waskom, M. Seaborn: Statistical data visualization. J. Open Source Softw. 2021, 6, 3021. [Google Scholar] [CrossRef]
  170. Mahmud, T.; Datta, N.; Chakma, R.; Das, U.K.; Aziz, M.T.; Islam, M.; Salimullah, A.H.M.; Hossain, M.S.; Andersson, K. An approach for crop prediction in agriculture: Integrating genetic algorithms and machine learning. IEEE Access 2024, 12, 173583–173598. [Google Scholar] [CrossRef]
  171. Nishio, M.; Satoh, M. Including dominance effects in the genomic BLUP method for genomic evaluation. PLoS ONE 2014, 9, e85792. [Google Scholar] [CrossRef] [PubMed]
  172. Würschum, T.; Maurer, H.P.; Weissmann, S.; Hahn, V.; Leiser, W.L. Accuracy of within and among-family genomic prediction in triticale. Plant Breed. 2017, 136, 230–236. [Google Scholar] [CrossRef]
  173. Lorenz, A.J.; Smith, K.P. Adding genetically distant individuals to training populations reduces genomic prediction accuracy in barley. Crop Sci. 2015, 55, 2657–2667. [Google Scholar] [CrossRef]
  174. Liu, Y.C.; Sun, S.H.; Yang, S.M.; Chuang, C.Y. Application of genetic algorithm in production scheduling: A Case study on the food processing. Information 2012, 15, 6063–6075. [Google Scholar]
  175. Pérez, O.; Diers, B.; Martin, N. Maturity Prediction in soybean breeding using aerial images and the random forest machine learning algorithm. Remote Sens. 2024, 16, 4343. [Google Scholar] [CrossRef]
  176. El Sakka, M.; Ivanovici, M.; Chaari, L.; Mothe, J. A Review of CNN applications in smart agriculture using multimodal data. Sensors 2025, 25, 472. [Google Scholar] [CrossRef]
  177. Perea, R.G.; Moreno, M.Á.; da Silva Baptista, V.B.; Córcoles, J.I. Decision support system based on genetic algorithms to optimize the daily management of water abstraction from multiple groundwater supply sources. Water Resour. Manag. 2020, 34, 4739–4755. [Google Scholar] [CrossRef]
  178. Rodríguez-Abreo, O.; Rodríguez-Reséndiz, J.; García-Cerezo, A.; García-Martínez, J.R. Fuzzy logic controller for UAV with gains optimized via genetic algorithm. Heliyon 2024, 10, e26363. [Google Scholar] [CrossRef]
  179. Liu, X.; Li, Z.; Xiang, Y.; Tang, Z.; Huang, X.; Shi, H.; Sun, T.; Yang, W.; Cui, S.; Chen, G.; et al. Estimation of winter wheat chlorophyll content based on wavelet transform and the optimal spectral index. Agronomy 2024, 14, 1309. [Google Scholar] [CrossRef]
  180. Miao, H.; Chen, X.; Guo, Y.; Wang, Q.; Zhang, R.; Chang, Q. Estimation of anthocyanins in winter wheat based on band screening method and genetic algorithm optimization models. Remote Sens. 2024, 16, 2324. [Google Scholar] [CrossRef]
  181. Tanaka, T.S.T.; Heuvelink, G.B.M.; Mieno, T.; Bullock, D.S. Can machine learning models provide accurate fertilizer recommendations? Precis. Agric. 2024, 25, 1839–1856. [Google Scholar] [CrossRef]
  182. Rahman, Z.U.; Asaari, M.S.M.; Ibrahim, H.; Asidin, I.S.Z. Generative adversarial networks (GANs) for image augmentation in farming: A review. IEEE Access 2024, 12, 179912–179943. [Google Scholar] [CrossRef]
  183. Correa, E.S. Mechanistic crop modelling and AI for ideotype optimization: Crop-scale advances to enhance yield and water use efficiency. bioRxiv 2025. [Google Scholar] [CrossRef]
  184. Hall, R.D.; D’Auria, J.C.; Silva-Ferreira, A.C.; Gibon, Y.; Kruszka, D.; Mishra, P.; de Zedde, R. High-throughput plant phenotyping: A role for metabolomics? Trends Plant Sci. 2022, 27, 549–563. [Google Scholar] [CrossRef] [PubMed]
  185. Barupal, D.K.; Fan, S.; Fiehn, O. Integrating bioinformatics approaches for a comprehensive interpretation of metabolomics datasets. Curr. Opin. Biotechnol. 2018, 54, 1–9. [Google Scholar] [CrossRef] [PubMed]
  186. Toubiana, D.; Puzis, R.; Wen, L.; Sikron, N.; Kurmanbayeva, A.; Soltabayeva, A.; Wilhelmi, M.M.R.; Sade, N.; Fait, A.; Sagi, M.; et al. Combined network analysis and machine learning allows the prediction of metabolic pathways from tomato metabolomics data. Commun. Biol. 2019, 2, 214. [Google Scholar] [CrossRef] [PubMed]
  187. Knoch, D.; Werner, C.R.; Meyer, R.C.; Riewe, D.; Abbadi, A.; Lücke, S.; Snowdon, R.J.; Altmann, T. Multi-omics-based prediction of hybrid performance in canola. Theor. Appl. Genet. 2021, 134, 1147–1165. [Google Scholar] [CrossRef]
  188. Thomas, W.J.W.; Zhang, Y.; Amas, J.C.; Cantila, A.Y.; Zandberg, J.D.; Harvie, S.L.; Batley, J. Innovative Advances in Plant Genotyping. In Plant Genotyping: Methods and Protocols; Shavrukov, Y., Ed.; Methods in Molecular Biology; Springer: Berlin/Heidelberg, Germany, 2023; Volume 2638. [Google Scholar] [CrossRef]
  189. Ma, W.; Qiu, Z.; Song, J.; Li, J.; Cheng, Q.; Zhai, J.; Ma, C. A deep convolutional neural network approach for predicting phenotypes from genotypes. Planta 2018, 248, 1307–1318. [Google Scholar] [CrossRef]
  190. Varshney, R.K.; Bohra, A.; Roorkiwal, M.; Barmukh, R.; Cowling, W.A.; Chitikineni, A.; Lam, H.-M.; Hickey, L.T.; Croser, J.S.; Bayer, P.E.; et al. Fast-forward breeding for a food-secure world. Trends Genet. 2021, 37, 1124–1136. [Google Scholar] [CrossRef]
  191. Liu, Y.; Wang, D.; He, F.; Wang, J.; Joshi, T.; Xu, D. Phenotype prediction and genome-wide association study using deep convolutional neural network of soybean. Front. Genet. 2019, 10, 1091. [Google Scholar] [CrossRef]
  192. Sandhu, K.S.; Lozada, D.N.; Zhang, Z.; Pumphrey, M.O.; Carter, A.H. Deep learning for predicting complex traits in spring wheat breeding program. Front. Plant Sci. 2021, 11, 613325. [Google Scholar] [CrossRef] [PubMed]
  193. Lu, Y.; Chen, D.; Olaniyi, E.; Huang, Y. Generative adversarial networks (GANs) for image augmentation in agriculture: A systematic review. Comput. Electron. Agric. 2022, 200, 107208. [Google Scholar] [CrossRef]
  194. Farooq, M.A.; Gao, S.; Hassan, M.A.; Huang, Z.; Rasheed, A.; Hearne, S.; Prasanna, B.; Li, X.; Li, H. Artificial intelligence in plant breeding. Trends Genet. 2024, 40, 891–908. [Google Scholar] [CrossRef] [PubMed]
  195. Demirci, S.; Peters, S.A.; de Ridder, D.; van Dijk, A.D.J. DNA sequence and shape are predictive for meiotic crossovers throughout the plant kingdom. Plant J. 2018, 95, 686–699. [Google Scholar] [CrossRef]
  196. Bourgeois, Y.; Stritt, C.; Walser, J.-C.; Gordon, S.P.; Vogel, J.P.; Roulin, A.C. Genome-wide scans of selection highlight the impact of biotic and abiotic constraints in natural populations of the model grass Brachypodium distachyon. Plant J. 2018, 96, 438–451. [Google Scholar] [CrossRef]
  197. Sartor, R.C.; Noshay, J.; Springer, N.M.; Briggs, S.P. Identification of the expressome by machine learning on omics data. Proc. Natl. Acad. Sci. USA 2019, 116, 18119–18125. [Google Scholar] [CrossRef]
  198. Tong, H.; Nikoloski, Z. Machine learning approaches for crop improvement: Leveraging phenotypic and genotypic big data. J. Plant Physiol. 2021, 257, 153354. [Google Scholar] [CrossRef]
  199. McLoughlin, F.; Augustine, R.C.; Marshall, R.S.; Li, F.; Kirkpatrick, L.D.; Otegui, M.S.; Vierstra, R.D. Maize multi-omics reveal roles for autophagic recycling in proteome remodelling and lipid turnover. Nat. Plants 2018, 4, 1056–1070. [Google Scholar] [CrossRef]
  200. Gupta, C.; Ramegowda, V.; Basu, S.; Pereira, A. Using network-based machine learning to predict transcription factors involved in drought resistance. Front. Genet. 2021, 12, 652189. [Google Scholar] [CrossRef]
  201. Lin, F.; Lazarus, E.Z.; Rhee, S.Y. QTG-Finder2: A generalized machine-learning algorithm for prioritizing QTL causal genes in plants. G3 Genes Genomes Genet. 2020, 10, 2411–2421. [Google Scholar] [CrossRef]
  202. Ferreira, T.B.; Pavan, W.; Fernandes, J.M.C.; Asseng, S. Coupling a Pest and disease damage module with CSM-Nwheat: A wheat crop simulation model. Trans. ASABE 2021, 64, 2061–2071. [Google Scholar] [CrossRef]
  203. Chawdhery, M.R.A.; Al-Mueed, M.; Wazed, M.A.; Emran, S.-A.; Chowdhury, M.A.H.; Hussain, S.G. Climate change impacts assessment using crop simulation model intercomparison approach in northern Indo-Gangetic Basin of Bangladesh. Int. J. Environ. Res. Public Health 2022, 19, 15829. [Google Scholar] [CrossRef] [PubMed]
  204. Montesinos-López, O.A.; Montesinos-López, A.; Crossa, J.; Gianola, D.; Hernández-Suárez, C.M.; Martín-Vallejo, J. Multi-trait, multi-environment deep learning modeling for genomic-enabled prediction of plant traits. G3 Genes Genomes Genet. 2018, 8, 3829–3840. [Google Scholar] [CrossRef] [PubMed]
  205. Jubair, S.; Tucker, J.R.; Henderson, N.; Hiebert, C.W.; Badea, A.; Domaratzki, M.; Fernando, W.G.D. Gptransformer: A transformer-based deep learning method for predicting fusarium related traits in barley. Front. Plant Sci. 2021, 12, 761402. [Google Scholar] [CrossRef]
  206. Westhues, C.C.; Simianer, H.; Beissinger, T.M. learnMET: An R package to apply machine learning methods for genomic prediction using multi-environment trial data. G3 Genes Genomes Genet. 2022, 12, jkac226. [Google Scholar] [CrossRef]
  207. Montesinos-López, O.A.; Mosqueda-González, B.A.; Montesinos-López, A.; Crossa, J. Statistical machine-learning methods for genomic prediction using the SKM library. Genes 2023, 14, 1003. [Google Scholar] [CrossRef]
  208. Sagan, V.; Coral, R.; Bhadra, S.; Alifu, H.; Al Akkad, O.; Giri, A.; Esposito, F. Hyperfidelis: A Software toolkit to empower precision agriculture with GeoAI. Remote Sens. 2024, 16, 1584. [Google Scholar] [CrossRef]
  209. Tiwari, V.; Thorp, K.; Tulbure, M.G.; Gray, J.; Kamruzzama, M.; Krupnik, T.J.; Sankarasubramanian, A.; Ardon, M. Advancing food security: Rice yield estimation framework using time-series satellite data & machine learning. PLoS ONE 2024, 19, e0309982. [Google Scholar] [CrossRef]
  210. Behera, A.; Sena, D.R.; Matheswaran, K.; Jampani, M.; Hasib, M.R.; Mondal, M.K. Using Machine Learning Tools for Salinity Forecasting to Support Irrigation Management and Decision-Making in a Polder of Coastal Bangladesh; CGIAR Initiative on Asian Mega-Deltas; International Water Management Institute (IWMI): Colombo, Sri Lanka, 2024; p. 7. [Google Scholar]
  211. Fionnagein, D.O.; Geever, M.; Farrell, J.O.; Codyre, P.; Trearty, R.; Tessema, Y.M.; Reymondin, L.; Lobogerrera, A.M.; Spillane, C.; Golden, A. Assessing climate resilience in rice production: Measuring the impact of the Millennium Challenge Corporation’s IWRM scheme in the Senegal River Valley using remote sensing and machine learning. Environ. Res. Lett. 2024, 19, 074075. [Google Scholar] [CrossRef]
  212. Danilevicz, M.F.; Bayer, P.E.; Boussaid, F.; Bennamoun, M.; Edwards, D. Maize yield prediction at an early developmental stage using multispectral images and genotype data for preliminary hybrid selection. Remote Sens. 2021, 13, 3976. [Google Scholar] [CrossRef]
  213. Abdulridha, J.; Ampatzidis, Y.; Qureshi, J.; Roberts, P. Identification and classification of downy mildew severity stages in watermelon utilizing aerial and ground remote sensing and machine learning. Front. Plant Sci. 2022, 13, 791018. [Google Scholar] [CrossRef] [PubMed]
  214. Junior, A.C.; Sant’Anna, I.C.; da Silva, M.J.; Bhering, L.L.; Nascimento, M.; Carvalho, I.R.; da Silva, J.A.G.; Cruz, C.D. Trait prediction through computational intelligence and machine learning applied to the improvement of white oat (Avena sativa L.). Rev. Ceres 2024, 71, e71045. [Google Scholar] [CrossRef]
  215. Mora-Poblete, F.; Miere-Castro, D.; Junior, A.T.A.; Balach, M.; Maldonado, C. Integrating deep learning for phenomic and genomic predictive modeling of Eucalyptus trees. Ind. Crops Prod. 2024, 220, 119151. [Google Scholar] [CrossRef]
  216. Okada, M.; Barras, C.; Toda, Y.; Hamazaki, K.; Ohmori, Y.; Yamasaki, Y.; Takahashi, H.; Takanashi, H.; Tsuda, M.; Hirai, M.Y.; et al. High-throughput phenotyping of soybean biomass: Conventional trait estimation and novel latent feature extraction using UAV remote sensing and deep learning models. Plant Phenomics 2024, 6, 0244. [Google Scholar] [CrossRef]
  217. Sadeh, R.; Ben-David, R.; Hermann, I.; Peleg, Z. Spectral-genomic chain-model approach enhances the wheat yield component prediction under the Mediterranean climate. Physiol. Plant. 2024, 176, e14480. [Google Scholar] [CrossRef]
  218. Cheng, J.H.; Luo, M.T. AI-assisted genomic prediction models in cotton breeding. Cotton Genom. Genet. 2025, 16, 137–147. [Google Scholar] [CrossRef]
  219. Li, H.; Zhang, L.; Gao, S.; Wang, J. Prediction by simulation in plant breeding. Crop J. 2025, 13, 501–509. [Google Scholar] [CrossRef]
  220. Goodfellow, I.J.; Pouget-Abadie, J.; Mirza, M.; Xu, B.; Warde-Farley, D.; Ozair, S.; Courville, A.; Bengio, Y. Generative Adversarial Nets. In Proceedings of the NIPS’14: Proceedings of the 28th International Conference on Neural Information Processing Systems, Montreal, QC, Canada, 8–13 December 2014; Volume 2, pp. 2672–2680. [Google Scholar]
  221. Zhang, Y.; Yin, Y.; Zimmermann, R.; Wang, G.; Varadarajan, J.; Ng, S.-K. An enhanced GAN model for automatic satellite-to-map image conversion. IEEE Access 2020, 8, 176704–176716. [Google Scholar] [CrossRef]
  222. Zhang, T.; Fu, H.; Zhao, Y.; Cheng, J.; Guo, M.; Gu, Z.; Yang, B.; Xiao, Y.; Gao, S.; Liu, J. SkrGAN: Sketching-rendering unconditional generative adversarial networks for medical image synthesis. In Medical Image Computing and Computer Assisted Intervention—MICCAI 2019; Shen, D., Liu, S., Peters, T.M., Staib, L.H., Essert, C., Zhou, C., Yap, P.T., Khan, A., Eds.; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2019; Volume 11767. [Google Scholar] [CrossRef]
  223. Romero, L.S.; Marcello, J.; Vilaplana, V. Super-resolution of sentinel-2 imagery using generative adversarial networks. Remote Sens. 2020, 12, 2424. [Google Scholar] [CrossRef]
  224. Daihong, J.; Sai, Z.; Lei, D.; Yueming, D. Multi-scale generative adversarial network for image super-resolution. Soft Comput. 2022, 26, 3631–3641. [Google Scholar] [CrossRef]
  225. Wolleb, J.; Bieder, F.; Sandkuhler, R.; Cattin, P.C. Diffusion models for medical anomaly detection. In Medical Image Computing and Computer Assisted Intervention—MICCAI 2022, 25th International Conference, Singapore, 18–22 September 2022, Proceedings, Part VIII; Springer: Cham, Switzerland, 2022; pp. 35–45. [Google Scholar]
  226. Shahbazian, R.; Trubitsyna, I. DEGAIN: Generative-Adversarial-Network-based missing data imputation. Information 2022, 13, 575. [Google Scholar] [CrossRef]
  227. Ramesh, A.; Pavlov, M.; Goh, M.G.; Gray, S.; Voss, C.; Radford, A.; Chen, M.; Sutskever, I. Zero-shot text-to-image generation. In Proceedings of the International Conference on Machine Learning (PMLR 2021), Virtual, 18–24 July 2021; pp. 8821–8831. [Google Scholar]
  228. Farooque, A.A.; Afzaal, H.; Benlamri, R.; Al-Naemi, S.; MacDonald, E.; Abbas, F.; MacLeod, K.; Ali, H. Red-green-blue to normalized difference vegetation index translation: A robust and inexpensive approach for vegetation monitoring using machine vision and generative adversarial networks. Precis. Agric. 2023, 24, 1097–1115. [Google Scholar] [CrossRef]
  229. Li, L.; Hassan, M.A.; Yang, S.; Jing, F.; Yang, M.; Rasheed, A.; Wang, J.; Xia, X.C.; He, Z.H.; Xiao, Y.G. Development of image-based wheat spike counter through a faster R-CNN algorithm and application for genetic studies. Crop J. 2022, 10, 1303–1311. [Google Scholar] [CrossRef]
  230. Jozdani, S.; Chen, D.; Pouliot, D.; Johnson, B.A. A review and meta-analysis of Generative Adversarial Networks and their applications in remote sensing. Int. J. Appl. Earth Obs. Geoinf. 2022, 108, 102734. [Google Scholar] [CrossRef]
  231. Wijekoon, C.P.; Goodwin, P.H.; Hsiang, T. Quantifying fungal infection of plant leaves by digital image analysis using Scion Image software. J. Microbiol. Methods 2008, 74, 94–101. [Google Scholar] [CrossRef]
  232. Pethybridge, S.J.; Nelson, S.C. Leaf doctor: A new portable application for quantifying plant disease severity. Plant Dis. 2015, 99, 1310–1316. [Google Scholar] [CrossRef]
  233. Cap, Q.H.; Uga, H.; Kagiwada, S.; Iyatomi, H. LeafGAN: An effective data augmentation method for practical plant disease diagnosis. IEEE Trans. Autom. Sci. Eng. 2022, 19, 1258–1267. [Google Scholar] [CrossRef]
  234. Singh, A.K.; Rao, A.; Chattopadhyay, P.; Maurya, R.; Singh, L. Effective plant disease diagnosis using Vision Transformer trained with leafy-generative adversarial network-generated images. Expert Syst. Appl. 2024, 254, 124387. [Google Scholar] [CrossRef]
  235. Yu, L.; Du, Z.; Li, X.; Zheng, J.; Zhao, Q.; Wu, H.; Weise, D.; Yang, Y.; Zhang, Q.; Li, X.; et al. Enhancing global agricultural monitoring system for climate-smart agriculture. Clim. Smart Agric. 2025, 2, 100037. [Google Scholar] [CrossRef]
  236. Colaço, A.F.; Molin, J.P.; Rosell-Polo, J.R.R.; Escola, A. Application of light detection and ranging and ultrasonic sensors to high-throughput phenotyping and precision horticulture: Current status and challenge. Hortic. Res. 2018, 5, 35. [Google Scholar] [CrossRef]
  237. Darwin, B.; Dharmaraj, P.; Prince, S.; Popescu, D.E.; Hemanth, D.J. Recognition of bloom/yield in crop images using deep learning models for smart agriculture: A Review. Agronomy 2021, 11, 646. [Google Scholar] [CrossRef]
  238. Montesinos-López, O.A.; Montesinos-López, A.; Mosqueda-González, B.A.; Delgado-Enciso, I.; Chavira-Flores, M.; Crossa, J.; Dreisigacker, S.; Sun, J.; Ortiz, R. Genomic prediction powered by multi-omics data. Front. Genet. 2025, 16, 1636438. [Google Scholar] [CrossRef] [PubMed]
  239. European Commission (EC). Press Release Aviation: Commission Is Taking the European Drone Sector to New Heights. 2017. Available online: http://europa.eu/rapid/press-release_IP-17-1605_en.htm#_ftn2 (accessed on 27 October 2025).
  240. Rango, A.; Laliberte, A.S. Impact of flight regulations on effective use of unmanned aircraft systems for natural resources applications. J. Appl. Remote Sens. 2010, 4, 043539. [Google Scholar] [CrossRef]
  241. Martin, K. Ethical implications and accountability of algorithms. J. Bus. Ethics 2019, 160, 835–850. [Google Scholar] [CrossRef]
  242. European Commission (EC). WHITE PAPER—On Artificial Intelligence—A European Approach to Excellence and Trust, Brussels, 19.2.2020, COM(2020) 65 Final. 2020; p. 26. Available online: https://commission.europa.eu/system/files/2020-02/commission-white-paper-artificial-intelligence-feb2020_en.pdf (accessed on 27 October 2025).
  243. Asilomar Conference. Asilomar: AI Principles, Future of Life Institute. 2017. Available online: https://futureoflife.org/ai-principles/ (accessed on 7 November 2025).
  244. European Commission (EC). Ethics Guidelines for Trustworthy AI. High-Level Expert Group on AI, Directorate-General for Communications Networks, Content and Technology. 2019. Available online: https://op.europa.eu/en/publication-detail/-/publication/d3988569-0434-11ea-8c1f-01aa75ed71a1 (accessed on 27 October 2025). [CrossRef]
  245. European Commission (EC). Commission Publishes the Guidelines on Prohibited Artificial Intelligence (AI) Practices, as Defined by the AI Act. 2025. Available online: https://digital-strategy.ec.europa.eu/en/library/commission-publishes-guidelines-prohibited-artificial-intelligence-ai-practices-defined-ai-act (accessed on 27 October 2025).
  246. Statista. Volume of Data/Information Created, Captured, Copied, and Consumed Worldwide from 2010 to 2025. 2025. Available online: https://www.statista.com/statistics/871513/worldwide-data-created/ (accessed on 28 October 2025).
  247. Shankarnarayan, V.K.; Ramakrishna, H. Paradigm change in Indian agricultural practices using Big Data: Challenges and opportunities from field to plate. Inf. Process. Agric. 2020, 7, 355–368. [Google Scholar] [CrossRef]
  248. Ahmed, N.; Shakoor, N. Advancing Agriculture through IoT, Big Data, and AI: A Review of Smart Technologies Enabling Sustainability. Smart Agric. Technol. 2025, 10, 100848. [Google Scholar] [CrossRef]
  249. European Commission (EC). Ethics and Data Protection, 2008. Available online: https://ec.europa.eu/info/funding-tenders/opportunities/docs/2021-2027/horizon/guidance/ethics-and-data-protection_he_en.pdf (accessed on 5 November 2025).
  250. van der Burg, S.; Wiseman, L.; Krkeljas, J. Trust in farm data sharing: Reflections on the EU code of conduct for agricultural data sharing. Ethics Inf. Technol. 2021, 23, 185–198. [Google Scholar] [CrossRef]
  251. Lucock, X.; Westbrooke, V. Trusting in the “Eye in the Sky”? Farmers’ and auditors’ perceptions of drone use in environmental auditing. Sustainability 2021, 13, 13208. [Google Scholar] [CrossRef]
  252. National Farmers Federation. Australian Farm Data Code. 2020. Available online: https://nff.org.au/programs/australian-farm-data-code/ (accessed on 28 October 2025).
  253. Fleming, A.; Jakku, E.; Lim-Camacho, L.; Taylor, B.; Thorburn, P. Is big data for big farming or for everyone? Perceptions in the Australian grains industry. Agron. Sustain. Dev. 2018, 38, 24. [Google Scholar] [CrossRef]
  254. Ryan, M. Ethics of using AI and big data in agriculture: The case of a large agriculture multinational. ORBIT J. 2019, 2, 1–27. [Google Scholar] [CrossRef]
  255. Hu, Y.; Wilson, S.; Schwessinger, B.; Rathjen, R. Blurred lines: Integrating emerging technologies to advance plant biosecurity. Curr. Opin. Plant. Biol. 2020, 56, 127–134. [Google Scholar] [CrossRef] [PubMed]
  256. Ryan, M. The social and ethical impacts of artificial intelligence in agriculture: Mapping the agricultural AI literature. AI Soc. 2023, 38, 2473–2485. [Google Scholar] [CrossRef]
  257. Yang, C. High resolution satellite imaging sensors for precision agriculture. Front. Agric. Sci. Eng. 2018, 5, 393–405. [Google Scholar] [CrossRef]
  258. Wang, Z.; Li, Y.; Wu, S.; Zhou, Y.; Yang, L.; Xu, Y.; Zhang, T.; Pan, Q. A survey on cybersecurity attacks and defenses for unmanned aerial systems. J. Syst. Archit. 2023, 138, 102870. [Google Scholar] [CrossRef]
  259. Pallejà, T.; Tresanchez, M.; Teixido, M.; Sanz, R.; Rosell, J.R.; Palacin, J. Sensitivity of tree volume measurement to trajectory errors from a terrestrial LIDAR scanner. Agric. For. Meteorol. 2010, 150, 1420–1427. [Google Scholar] [CrossRef]
  260. McCabe, M.F.; Tester, M. Digital insights: Bridging the phenotype-to-genotype divide. J. Exp. Bot. 2021, 72, 2807–2810. [Google Scholar] [CrossRef]
  261. Kumar, N.; Belhumeur, P.N.; Biswas, A.; Jacobs, D.W.; Kress, W.J.; Lopez, I.C.; Soares, J.V. Leafsnap—A computer vision system for automatic plant species identification. In Computer Vision—ECCV 2012; Fitzgibbon, A., Lazebnik, S., Perona, P., Sato, Y., Schmid, C., Eds.; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2012; Volume 7573. [Google Scholar] [CrossRef]
  262. Zheng, Y.-Y.; Kong, J.-L.; Jin, X.-B.; Wang, X.-Y.; Su, T.-L.; Zuo, M. CropDeep: The crop vision dataset for deep-learning-based classification and detection in precision agriculture. Sensors 2019, 19, 1058. [Google Scholar] [CrossRef]
  263. Hughes, D.; Salathe, M. An open access repository of images on plant health to enable the development of mobile disease diagnostics through machine learning and crowdsourcing. arXiv 2015, arXiv:1511.08060. [Google Scholar] [CrossRef]
  264. Andrew, J.; Eunice, J.; Popescu, D.E.; Chowdary, M.K.; Hemanth, J. Deep learning-based leaf disease detection in crops using images for agricultural applications. Agronomy 2022, 12, 2395. [Google Scholar] [CrossRef]
  265. Singh, D.; Jain, N.; Jain, P.; Kayal, P.; Kumawat, S.; Batra, N. Plantdoc: A dataset for visual plant disease detection. arXiv 2020, arXiv:1911.10317. [Google Scholar] [CrossRef]
  266. Chiu, M.T.; Xu, X.; Wei, Y.; Huang, Z.; Schwing, A.; Brunner, R.; Khachatrian, H.; Karapetyan, H.; Dozier, I.; Rose, G.; et al. Agriculture-Vision: A large aerial image database for agricultural pattern analysis. In Proceedings of the 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Seattle, WA, USA, 14–19 June 2020. [Google Scholar] [CrossRef]
  267. Olsen, A.; Konovalov, D.A.; Philippa, B.; Ridd, P.; Wood, J.C.; Johns, J.; Banks, W.; Girgenti, B.; Kenny, O.; Whinney, J.; et al. DeepWeeds: A multiclass weed species image dataset for deep learning. Sci. Rep. 2019, 9, 2058. [Google Scholar] [CrossRef]
  268. Izquierdo-Verdiguier, E.; Zurita-Milla, R. An evaluation of guided regularized random forest for classification and regression tasks in remote sensing. Int. J. Appl. Earth Obs. Geoinf. 2020, 88, 102051. [Google Scholar] [CrossRef]
  269. Ohnsman, A. Here Come the Farm Robots: Startup Raises $20 Million for Autonomous Electric Tractors. Forbes. 2021. Available online: https://www.forbes.com/sites/alanohnsman/2021/03/16/here-come-the-farm-robots-startup-raises-20-million-for-autonomous-electric-tractors/?sh=edbf08f7e241 (accessed on 21 December 2025).
  270. Saiz-Rubio, V.; Rovira-Más, F. From smart farming towards Agriculture 5.0: A review on crop data management. Agronomy 2020, 10, 207. [Google Scholar] [CrossRef]
  271. Mugambiwa, S.S. Sustainable agriculture and sustainable developmental goals: A case study of smallholder farmers in sub-Saharan Africa. In Sustainable Agriculture and the Environment; Farooq, M., Gogoi, N., Pisante, M., Eds.; Academic Press: Cambridge, MA, USA, 2023; Chapter 3; pp. 91–103. [Google Scholar] [CrossRef]
  272. Chen, W.L.; Lin, Y.-B.; Lin, Y.-W.; Chen, R. AgriTalk: IoT for precision soil farming of turmeric cultivation. IEEE Internet Things J. 2019, 6, 5209–5223. [Google Scholar] [CrossRef]
  273. Somitsch, E. How Farmers Harvest New Insights with Generative AI. 2024. Available online: https://www.sap.com/japan/blogs/how-farmers-harvest-new-insights-with-generative-ai (accessed on 28 October 2025).
  274. Mmbando, G.S. Harnessing artificial intelligence and remote sensing in climate-smart agriculture: The current strategies needed for enhancing global food security. Cogent Food Agric. 2025, 11, 2454354. [Google Scholar] [CrossRef]
  275. BIS. Darli the Chatbot: Transforming Smart Farming with AI to Support Small-Scale Farmers. 2025. Available online: https://bisresearchreports.medium.com/darli-the-chatbot-transforming-smart-farming-with-ai-to-support-small-scale-farmers-1a59a01a98cd (accessed on 27 October 2025).
  276. Sharma, M.K.; Khediya, M.; Bhatt, C. FertiCal-P: An android-based decision support system (DSS) determines the NPK fertilizer recommendation by assessing pH and macronutrient of the soil. Curr. Agric. Res. 2025, 13, 288–292. [Google Scholar] [CrossRef]
  277. Hamner, B.; Bergerman, M.; Singh, S. Autonomous orchard vehicles for specialty crops production. In Proceedings of the 2011 American Society of Agricultural and Biological Engineers, Louisville, KY, USA, 7–10 August 2011; p. 1. [Google Scholar] [CrossRef]
  278. Lytridis, C.; Kaburlasos, V.G.; Pachidis, T.; Manios, M.; Vrochidou, E.; Kalampokas, T.; Chatzistamatis, S. An Overview of Cooperative Robotics in Agriculture. Agronomy 2021, 11, 1818. [Google Scholar] [CrossRef]
  279. Ray, D.K.; Ramankutty, N.; Mueller, N.D.; West, P.C.; Foley, J.A. Recent patterns of crop yield growth and stagnation. Nat. Commun. 2012, 3, 1293. [Google Scholar] [CrossRef]
  280. Ning, Y.; Liu, W.; Wang, G.-L. Balancing Immunity and Yield in Crop Plants. Trends Plant Sci. 2017, 22, 1069–1079. [Google Scholar] [CrossRef]
  281. Rahman, A.; Zhang, J. Trends in rice research: 2030 and beyond. Food Energy Secur. 2022, 12, e390. [Google Scholar] [CrossRef]
  282. Cudjoe, D.K.; Virlet, N.; Castle, M.; Riche, A.B.; Mhada, M.; Waine, T.W.; Mohareb, F.; Hawkesford, M.J. Field phenotyping for African crops: Overview and perspectives. Front. Plant Sci. 2023, 14, 1219673. [Google Scholar] [CrossRef]
  283. Smith, E.N.; van Aalst, M.; Tosens, T.; Niinemets, U.; Stich, B.; Morosinotto, T.; Alboresi, A.; Erb, T.J.; Gómez-Coronado, P.A.; Tolleter, D.; et al. Improving photosynthetic efficiency toward food security: Strategies, advances, and perspectives. Mol. Plant 2023, 16, 1547–1563. [Google Scholar] [CrossRef]
  284. Dwivedi, S.L.; Vetukuri, R.R.; Kelbessa, B.G.; Gepts, P.; Heslop-Harrison, P.; Araujo, A.S.F.; Sharma, S.; Ortiz, R. Exploitation of rhizosphere microbiome biodiversity in plant breeding. Trends Plant Sci. 2025, 30, 1033–1045. [Google Scholar] [CrossRef]
  285. Wang, F.; Feldman, M.J.; Runie, D.F. Do not benchmark phenomic prediction against genomic prediction accuracy. Plant Phenome J. 2025, 8, e70029. [Google Scholar] [CrossRef]
  286. Ravindran, S. Cutting AI down to size. Science 2025, 387, 818–821. [Google Scholar] [CrossRef]
Figure 2. (1) Simplified overview of a genetic algorithm (GA). (2) The basis of a GA: here, the phenotype is the actual solution/result obtained from testing, whereas the genotype is the encoded version of the solution that the algorithm works with. For simple problems, the encoded version (genotype) and the actual solution (phenotype) are the same, whereas for complex problems they differ. The population refers to the set of possible solutions (chromosomes) that the algorithm works with, i.e., the encoded solutions of the problem, while a chromosome is a single solution in the population, a “candidate” for the best solution. Within a chromosome, a gene is a specific element or position in the solution, and an allele is the actual value that a gene takes; if a gene represents “color,” an allele might be “blue” or “green.” A chromosome can be represented in different ways: (i) as binary values (0s and 1s, or ‘yes’ and ‘no’), (ii) as integers (e.g., 0, 1, 2, 3), or (iii) as permutations (e.g., a sequence of the numbers 0 to 9). As a real-life analogy, suppose we want to organize a bookshelf: a genotype is one particular way of organizing it (e.g., encoded as a list of numbers), whereas the phenotype is how the bookshelf looks after it has been organized, i.e., the actual result. The GA searches for the best real-world solution (phenotype) by iteratively manipulating coded versions of possible solutions (genotypes). (3) Example of a GA and its analogy to bookshelf organization.
Figure 2. (1). Oversimplified look of a genetic algorithm (GA). (2). The basis of a GA: Herein, the phenotype is the actual solution/result we obtain from testing, whereas the genotype is the encoded version of the solution and which the algorithm works with. For simple problems, the encoded version (genotype) and the actual solution (phenotype) are the same, whereas for complex problems these are different. The population refers to the possible solutions (chromosomes) that the algorithm works with—the encoded solution of the problem—while the chromosome is a single solution in the population—a “candidate” for the best solution. As a part of the chromosome, a gene is a specific element or position in the solution, while an allele is the actual value that a gene takes within the chromosome. If a gene represents “color,” an allele might be “blue” or “green.” In this space, a chromosome can be represented in different manners: (1) as a binary value (using 0 s and 1 s or ‘yes’ and ‘no’), (2) as an integer (using numbers such as 0, 1, 2, 3), and (3) as permutation representations (using a sequence of numbers such as 0 to 9). As a real-life analogy, we want to organize a bookshelf, and the genotype will be the distinct ways of how to organize it (e.g., in list of numbers), whereas the phenotype is the way this bookshelf looks after we organize it—the actual result. The algorithm will try different genotypes (i.e., ways of organizing) to find the best solution (phenotype). The GA will try to find the best real-world solution (phenotype) by manipulating coded versions of possible solutions (genotype). (3). Example of a GA and analogy to a bookshelf organization.
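To make the encoding and search loop described in the caption concrete, the following is a minimal Python sketch of a permutation-encoded GA for the bookshelf analogy. It is not code from the reviewed literature; the target arrangement, fitness function, population size, number of generations, and mutation rate are arbitrary illustrative assumptions.

```python
import random

# Toy "bookshelf" problem: find the ordering of 10 books that matches a
# hypothetical target arrangement. All settings below are illustrative.
TARGET = list(range(10))                      # the phenotype we want to reach
POP_SIZE, GENERATIONS, MUTATION_RATE = 30, 200, 0.2

def fitness(chromosome):
    # Number of books already in their target position (higher is better).
    return sum(g == t for g, t in zip(chromosome, TARGET))

def tournament(population, k=3):
    # Selection: return the fittest of k randomly chosen chromosomes.
    return max(random.sample(population, k), key=fitness)

def order_crossover(p1, p2):
    # Keep a slice of parent 1 and fill the remaining positions with the
    # missing genes in parent 2's order, so the child stays a valid permutation.
    a, b = sorted(random.sample(range(len(p1)), 2))
    child = [None] * len(p1)
    child[a:b] = p1[a:b]
    fill = [g for g in p2 if g not in child]
    for i in range(len(child)):
        if child[i] is None:
            child[i] = fill.pop(0)
    return child

def mutate(chromosome):
    # Swap mutation also keeps the permutation valid.
    if random.random() < MUTATION_RATE:
        i, j = random.sample(range(len(chromosome)), 2)
        chromosome[i], chromosome[j] = chromosome[j], chromosome[i]
    return chromosome

population = [random.sample(range(10), 10) for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    best = max(population, key=fitness)
    if fitness(best) == len(TARGET):          # perfect arrangement found
        break
    population = [mutate(order_crossover(tournament(population),
                                         tournament(population)))
                  for _ in range(POP_SIZE)]

print("Best arrangement found:", max(population, key=fitness))
```

In breeding applications, the same loop structure applies; only the encoding and the fitness function change (e.g., the predicted merit of a candidate mating or selection plan rather than the match to a target bookshelf).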
Figure 4. Steps involved in the three types of machine learning algorithms that can be applied to genomic data. t-Distributed Stochastic Neighbor Embedding (t-SNE) is used for visualizing high-dimensional data by mapping it to a lower-dimensional space while preserving similarities between data points, whereas principal component analysis (PCA) reduces the dimensionality of the data while preserving as much variance as possible by finding linear combinations of features. Unsupervised machine learning (ML) requires (unlabeled) historical data as input, whereas semi-supervised ML models are trained on datasets that include both labeled and unlabeled data points, combining aspects of supervised and unsupervised learning.
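As a concrete illustration of the dimensionality-reduction step mentioned in the caption, the short Python sketch below (a minimal example, assuming NumPy and scikit-learn are available) applies PCA and then t-SNE to a simulated marker matrix; the matrix size, 0/1/2 marker coding, and parameter values are arbitrary assumptions for demonstration only.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE

# Simulated genotype matrix: 100 lines x 500 biallelic markers coded 0/1/2.
# In practice this matrix would come from a SNP-calling pipeline.
rng = np.random.default_rng(seed=42)
genotypes = rng.integers(0, 3, size=(100, 500)).astype(float)

# PCA: linear combinations of markers capturing the most variance,
# often used to inspect population structure before model training.
pca = PCA(n_components=10)
pc_scores = pca.fit_transform(genotypes)
print("Variance explained by first 2 PCs:",
      pca.explained_variance_ratio_[:2].round(3))

# t-SNE: non-linear 2-D embedding that preserves local similarities;
# it is commonly run on the top principal components to reduce noise.
embedding = TSNE(n_components=2, perplexity=30,
                 init="pca", random_state=42).fit_transform(pc_scores)
print("t-SNE embedding shape:", embedding.shape)
```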
Table 4. Indices that can be used for trait-related measurements in breeding and agriculture using artificial intelligence, with the associated sensor wavelengths and examples of sensors. A brief worked example of computing selected indices is given after the table.
| Group | Indices | Sensor Wavelength | Examples of Sensors |
| --- | --- | --- | --- |
| Broadband Greenness [chlorophyll content, crop biomass, N deficiency at crop senescence, Leaf Area Index (LAI)] | Normalised Difference Vegetation Index (NDVI) + Visible Atmospherically Resistant Index (VARI), RGB-based vegetation indices 2 and 3 (RGBVI2 and RGBVI3) | Near-infrared (NIR) and visible (VIS) regions of the electromagnetic spectrum | Trimble GreenSeeker Handheld NDVI Sensor; UAV imagery |
| | Optimized Soil-Adjusted Vegetation Index (OSAVI) | Red, NIR | |
| | Soil-Adjusted Vegetation Index (SAVI) | Red, NIR | |
| | Renormalized Difference Vegetation Index (RDVI) | Red, NIR | |
| | Enhanced Vegetation Indices (EVIs) | Blue, red, NIR | |
| | Color Vegetation Indices (CVIs) | RGB sensors | |
| Light Use Efficiency | Photochemical Reflectance Index (PRI) | Green | SRS sensor |
| Leaf Pigments | Modified Chlorophyll Absorption Ratio Index (MCARI) | Green, red, NIR | FieldSpec 4; TriFlex; FRT GmbH’s Specim IQ |
| | Chlorophyll Content Index (CCI) | Green, NIR | |
| | Transformed Chlorophyll Absorption Ratio Index (TCARI) | Green, red, NIR | |
| | Anthocyanin Reflectance Index 2 (ARI2) | Blue, red, NIR | |
| | Carotenoid Reflectance Index 2 (CRI2) | Blue, red | |
| Water Stress | Crop Water Stress Index (CWSI) | RGB, thermal infrared | MicaSense RedEdge; FLIR Vue TZ20; FLIR A6750sc thermal camera |
| Water Content | Water Band Index (WBI) | Red, NIR | SFC/AIEE-based fluorescence sensor TPE-(An-CHO)4; Kapta™ 3000 series; i::SCAN probe |
Source: adapted from [77].
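To make the broadband-greenness entries in Table 4 concrete, the short Python sketch below computes NDVI, SAVI, and OSAVI from red and NIR reflectance using their widely published formulas; the reflectance values are made-up illustrative numbers, and the comment on the OSAVI scaling factor reflects variation across sources rather than a single canonical form.

```python
import numpy as np

def ndvi(nir, red):
    # NDVI = (NIR - Red) / (NIR + Red)
    return (nir - red) / (nir + red)

def savi(nir, red, L=0.5):
    # SAVI = (1 + L) * (NIR - Red) / (NIR + Red + L); L = 0.5 is the usual default.
    return (1 + L) * (nir - red) / (nir + red + L)

def osavi(nir, red):
    # OSAVI is commonly written as SAVI with L = 0.16
    # (some authors omit the 1 + L scaling factor).
    return savi(nir, red, L=0.16)

# Made-up per-plot mean reflectances (fractions of incident light),
# e.g., as extracted from UAV multispectral imagery.
red = np.array([0.08, 0.12, 0.20])
nir = np.array([0.45, 0.40, 0.30])

print("NDVI :", ndvi(nir, red).round(3))
print("SAVI :", savi(nir, red).round(3))
print("OSAVI:", osavi(nir, red).round(3))
```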