Article

Automated Crop Measurements with UAVs: Evaluation of an AI-Driven Platform for Counting and Biometric Analysis

by João Victor da Silva Martins 1,2,*, Marcelo Rodrigues Barbosa Júnior 1, Lucas de Azevedo Sales 1, Regimar Garcia dos Santos 1, Wellington Souto Ribeiro 2 and Luan Pereira de Oliveira 1,*

1 Department of Horticulture, University of Georgia, Tifton, GA 31793, USA
2 Department of Agronomy, Federal University of Viçosa, Viçosa 36570-900, MG, Brazil
* Authors to whom correspondence should be addressed.
Agriculture 2025, 15(21), 2213; https://doi.org/10.3390/agriculture15212213
Submission received: 16 September 2025 / Revised: 21 October 2025 / Accepted: 22 October 2025 / Published: 24 October 2025

Abstract

Unmanned aerial vehicles (UAVs) are transforming agriculture through enhanced data acquisition, improved monitoring efficiency, and support for data-driven decision-making. Complementing this, AI-driven platforms provide intuitive and reliable tools for advanced UAV analytics. However, their integration remains underexplored, particularly in specialty crops. Therefore, in this study, we evaluated the performance of an AI-driven web platform (Solvi) for automated plant counting and biometric trait estimation in two contrasting systems: pecan, a perennial nut crop, and onion, an annual vegetable. Ground-truth measurements included pecan tree number, tree height, and canopy area, as well as onion bulb number and diameter, the latter used for market class classification. Counting performance was assessed using precision, recall, and F1 score, while trait estimation was evaluated with linear regression analysis. UAV-based counts showed strong agreement with ground-truth data, achieving precision, recall, and F1 scores above 97% for both crops. For pecans, UAV-derived estimates of tree height (R2 = 0.98, error = 11.48%) and canopy area (R2 = 0.99, error = 23.16%) demonstrated high accuracy, while errors were larger in young trees compared with mature trees. For onions, UAV-derived bulb diameters achieved an R2 of 0.78 with a 6.29% error, and market class classification (medium, jumbo, colossal) was predicted with <10% error. These findings demonstrate that UAV imagery integrated with a user-friendly AI platform can deliver accurate, scalable solutions for biometric monitoring in both perennial and annual specialty crops, supporting applications in harvest planning, orchard management, and market supply forecasting.

1. Introduction

The integration of unmanned aerial vehicles (UAVs) with artificial intelligence (AI) is rapidly transforming agricultural systems through enhanced data acquisition, improved monitoring efficiency, and support for informed, data-driven decision-making [1]. These technologies form the foundation of modern agriculture, enabling non-destructive approaches, scalable solutions, and progress toward real-time analytics [2,3].
In modern agricultural research, automated tools such as UAVs and high-resolution cameras are widely used to rapidly and non-destructively capture large-scale trait data [4]. These image-based technologies enable researchers to simultaneously assess structural, physiological, and biochemical traits across time and space, thereby providing a comprehensive view of plant performance and variability. Moreover, this approach reduces subjectivity by minimizing human bias and supports real-time or near-real-time monitoring of crop development throughout the growing season [5,6]. At the same time, advances in web-based platforms powered by AI offer promising alternatives for crop monitoring by improving data analysis efficiency and equipping end users with accessible, user-friendly tools. For example, these platforms facilitate automated trait extraction from UAV-acquired images through intuitive interfaces, eliminating the need for advanced image-processing expertise and enabling broader implementation in both research and commercial contexts [2,7,8]. By combining AI algorithms, pre-trained computer vision models, and cloud-based processing, such platforms can reliably generate object counts, estimates of canopy size, plant height, object dimensions, and other phenotypic parameters from standard RGB or multispectral imagery [1,8]. These innovations have been widely applied in large-scale cropping systems (e.g., corn, soybeans, and cotton), but their application in specialty crops remains limited [9]. It is precisely in specialty crop systems, where high phenotypic diversity and labor-intensive measurements constrain traditional monitoring methods, that such technologies could prove most valuable.
As digital tools become more integrated into agricultural management, ensuring their accessibility and usability, particularly for smallholder farmers and non-expert users, has become essential [10,11]. One example is Solvi [12], a drone-based analytics platform designed for field trial assessments. Founded in 2015 with the goal of making advanced drone technology accessible and easy to use, Solvi originated as a pilot project with the Swedish University of Agricultural Sciences. Today, it is widely used by farmers, agronomists, and researchers to support more efficient and sustainable crop management. Due to its capabilities, Solvi has been applied in a few studies for image-based analysis, including the determination of in-season nitrogen rates [13] and prediction of grain protein concentration in winter wheat [14], as well as the estimation of leaf nitrogen content, leaf area, berry yield [15], and monitoring and prediction of wild blueberry [16]. However, despite these advantages in data analysis and time-saving potential, its application remains limited in the literature, particularly with respect to biometric and phenotypic data in specialty crops. This gap is especially evident in perennial and vegetable crops such as pecan and onion, which, despite their high economic value and spatially stable planting structures, have received considerably less attention than annual field crops [17].
Biometric and phenotypic data play a central role in modern crop management. In perennial crops such as pecan, canopy attributes such as area and tree height serve as key indicators of plant vigor, productivity potential, and overall orchard health. These measurements also enable producers to manage inputs (e.g., water, chemicals, fertilizers) more efficiently [18,19]. In vegetable crops such as onions, bulb size is directly linked to market classification standards and yield estimation, and strongly influences consumer preferences [20]. For both pecan and onion, conventional trait analysis still relies heavily on manual measurements. The integration of AI-based tools that require minimal user input can therefore broaden accessibility for growers, researchers, and Extension agents, while offering real-time decision support across diverse production environments [7,11].
Therefore, based on the premise that UAV- and AI-driven digital agriculture platforms can provide reliable phenotypic measurements across diverse crop morphologies while offering a user-friendly and scalable alternative to traditional field methods, we aimed to evaluate the performance of a commercial AI-powered digital agriculture web platform (Solvi) for automated biometric analysis in specialty crops (pecan and onion) using UAV-acquired imagery.

2. Materials and Methods

2.1. Study Environment

The study was conducted in two experimental fields at the University of Georgia, Georgia, USA (Figure 1A). The fields were designated as Field 1 (Pecan; 31°30′31.0″ N 83°39′06.2″ W) and Field 2 (Onions; 32°01′08.6″ N 82°13′19.6″ W), selected to represent a range of crop types, growth architectures, and phenotyping challenges. Field 1 was located at Ponder Farms in Tifton, GA, USA (Figure 1B), and the data collection was performed on 28 October 2024. It consisted of a pecan orchard with trees at three distinct developmental stages: young (1–5 years after transplanting), mid-age (5–10 years), and mature (over 10 years). Trees were transplanted in a 14 × 14 m grid. Field 2 was located at the Vidalia Onion and Vegetable Research Center in Reidsville, GA, USA (Figure 1C), and the data collection was performed on 15 May 2025. The onions were cultivated on beds spaced 1.8 m center-to-center. Seedlings were transplanted at 0.3 m between rows and 0.1 m between plants, with four rows of onions per bed, totaling 222,000 plants per hectare.
The climate of the region is classified as humid subtropical, with an average annual precipitation of 1400 mm and an average temperature of 20 °C. Moreover, cultivation practices were conducted in accordance with regional environmental conditions and the specific agronomic requirements of the crops.

2.2. UAV Image Data Collection

For both fields, RGB images were acquired using a multirotor UAV (DJI Mavic 3 Multispectral, Shenzhen, China) equipped with a dual-camera system. The primary RGB camera features a 4/3-inch CMOS sensor with an effective resolution of 20 megapixels (5280 × 3956 pixels), a field of view (FOV) of 84°, and a 24 mm equivalent focal length (35 mm full-frame format). The UAV also includes a 1/2.8-inch multispectral camera with four bands (Green, Red, RedEdge, and NIR), each with a resolution of 5 megapixels, a 25 mm equivalent focal length, and a 73.91° FOV. To ensure image quality and radiometric consistency, the UAV is equipped with a 3-axis mechanical gimbal (tilt, roll, and pan) and a sunlight sensor that records real-time irradiance for automatic radiometric correction during post-processing. In this study, only the RGB images were used. The UAV also integrates an RTK-GNSS module capable of receiving signals from GPS, GLONASS, BeiDou, and Galileo constellations. During the flights, positioning accuracy was maintained within 5 cm horizontally and vertically.
The flights were planned and executed using a mobile app (DJI Pilot 2, Shenzhen, China). All flights were conducted around solar noon (±1 h) under clear sky conditions to minimize shadow effects. Flight parameters were adjusted for each field to capture high-resolution imagery of the respective targets. For Field 1, the UAV was operated at a speed of 3 m/s, an altitude of 74 m above ground level (AGL), and with 80% frontal and side image overlap, resulting in a ground sample distance (GSD) of approximately 2.5 cm/pixel. For Field 2, the UAV was also operated at 3 m/s, but at an altitude of 12 m AGL, and with 85% frontal and side image overlap, yielding a GSD of 0.33 cm/pixel.
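The link between flight altitude and GSD can be sketched with the standard pinhole relation, GSD = (altitude × sensor width) / (focal length × image width). The sensor width and physical focal length below are illustrative assumptions (typical values for a 4/3-inch sensor), not specifications reported in this paper:

```python
def gsd_cm_per_px(altitude_m, sensor_width_mm, focal_length_mm, image_width_px):
    """Ground sample distance (cm/pixel) from the pinhole camera relation."""
    return (altitude_m * 100.0 * sensor_width_mm) / (focal_length_mm * image_width_px)

# Assumed optics: ~17.3 mm sensor width, ~12.3 mm physical focal length.
field1 = gsd_cm_per_px(74, 17.3, 12.3, 5280)  # Field 1: 74 m AGL
field2 = gsd_cm_per_px(12, 17.3, 12.3, 5280)  # Field 2: 12 m AGL
print(round(field1, 2), round(field2, 2))
```

With these assumed optics the results are broadly consistent with the ~2.5 and 0.33 cm/pixel values reported above; GSD scales linearly with altitude for a fixed camera.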

2.3. Ground-Truth Data Collection

Field 1: For tree counting, we considered all the trees in the field, totaling 242 trees. For tree height and canopy area measurements, we selected 18 pecan trees (6 for each development stage; Section 2.1) (Figure 2). Tree height was measured manually using a measuring tape, considering the vertical distance from ground level to the highest vegetative point. Canopy area was determined using a photogrammetry-driven approach with structure-from-motion software (Agisoft Metashape Professional 2.1.1, St. Petersburg, Russia). High-resolution UAV imagery was processed to generate orthomosaics. The canopy area was then calculated by manually delineating the horizontal projection of each tree canopy using the software’s built-in tools. This combination of manual height measurements and computer-aided canopy analysis was treated as ground-truth data. The pecan ground-truth measurements were later compared to the outputs produced by the AI-driven web platform (Solvi), which automatically counted and estimated tree height and canopy area from UAV imagery.
Field 2: For bulb counting, we delineated five plots of 10 m2 and counted all onions within them, totaling 697 onions. For diameter measurements, 20 georeferenced sample points were established across the field. At each point, ten onions were measured and labeled (Figure 3). Bulb diameter was measured using digital calipers at the equatorial point of each analyzed bulb. No soil removal was performed; onions remained under standard field conditions throughout data collection. Bulb diameter was also used for market classification: medium (<76.2 mm), jumbo (76.2–95.3 mm), and colossal (>95.3 mm) [21]. According to GDA [21] standards, there is a 5% tolerance for onions smaller than the minimum specified diameter and a 10% tolerance for larger onions. The diameter measurements and market classifications were later compared with the corresponding outputs of the AI-driven web platform (Solvi).
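The market-class rule above maps directly to a small helper; `onion_market_class` is a hypothetical function name used here for illustration:

```python
from collections import Counter

def onion_market_class(diameter_mm):
    """Assign a market class from equatorial bulb diameter (mm), using the
    thresholds in Section 2.3: medium < 76.2, jumbo 76.2-95.3, colossal > 95.3."""
    if diameter_mm < 76.2:
        return "medium"
    if diameter_mm <= 95.3:
        return "jumbo"
    return "colossal"

# Example: class distribution for a sample of measured diameters (mm)
sample = [70.1, 80.5, 96.2, 88.0, 75.9]
print(Counter(onion_market_class(d) for d in sample))
```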

2.4. AI-Driven Web Platform (Solvi)

UAV images collected in this study were processed using an AI-driven web platform (Solvi [12], Gothenburg, Sweden). The software supports imagery from a wide range of UAV-mounted sensors, including both RGB and multispectral cameras. After uploading, individual images were automatically stitched into georeferenced orthomosaics, with digital surface models (DSMs) generated prior to orthorectification; alternatively, users can upload their own DSMs. For tree height estimation, each detected tree is buffered by 1 m, and the lowest point within this buffer is defined as the ground level. Tree height is then calculated as the difference between the highest and lowest elevation points within the buffered area.
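Solvi's exact implementation is proprietary; the NumPy sketch below only illustrates the buffer rule described above (lowest DSM value within a 1 m buffer taken as ground level), with a hypothetical `tree_height_from_dsm` helper and a square window approximating the buffer:

```python
import numpy as np

def tree_height_from_dsm(dsm, row, col, buffer_m=1.0, gsd_m=0.025):
    """Height = highest minus lowest elevation (m) within a square window
    approximating a `buffer_m` radius around the detected tree centroid."""
    r = max(1, int(round(buffer_m / gsd_m)))  # buffer radius in pixels
    window = dsm[max(0, row - r): row + r + 1,
                 max(0, col - r): col + r + 1]
    return float(window.max() - window.min())

# Synthetic example: flat ground at 100 m elevation with a 7.5 m tall crown
dsm = np.full((200, 200), 100.0)
dsm[95:105, 95:105] = 107.5
print(tree_height_from_dsm(dsm, row=100, col=100))  # 7.5
```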
For automatic boundary creation, the user needs to provide the platform with a set of hand-drawn (annotation) examples of plant boundaries. The platform then fine-tunes its pre-trained model using these examples to improve measurement accuracy. The model architecture is based on Mask R-CNN with a ResNet101 backbone and Feature Pyramid Network (FPN), trained with large-scale jittering. Hand-drawn examples were created by manually outlining the visible canopy contours, with the operator zooming in and out to ensure precise edge delineation and complete boundary closure. In general, a greater number of examples results in better detection. For the pecan field, we created 12 examples, totaling 50 identified trees. For the onion field, we created 6 examples, totaling 298 identified onions. The number of examples was adjusted iteratively based on performance: initial inputs did not yield satisfactory results; therefore, additional examples were added until the outputs stabilized. Results were visualized as interactive maps, allowing for comparisons among plots and classification based on user-defined variables. All processed data can be exported in standard formats (e.g., CSV, SHP, TIFF) for downstream statistical analysis or accessed programmatically via an application programming interface (API). In this study, we used these segmentation and measurement tools to derive plant-level metrics from onion and pecan datasets, enabling direct comparisons with ground-truth observations (Figure 4).

2.5. Data Analysis

For counting measurement, we used the precision, recall, and F1 Score metrics to assess the platform’s efficiency. Additionally, for tree height, canopy area, and bulb diameter, linear regression models were performed to model estimated and observed values. The performance of the models was assessed using the coefficient of determination (R2), mean absolute error (MAE), root mean squared error (RMSE), and mean absolute percentage error (MAPE). For better validation, pecan tree analysis was stratified by developmental stage (young, mid-age, and mature). Conversely, onion analysis was grouped by market class (medium, jumbo, and colossal). Moreover, supplementary statistical analyses (interquartile range (IQR) and one-way ANOVA, p < 0.05) were performed to identify potential outliers and to evaluate differences between observed and estimated measurements, respectively. All statistical computations and data visualizations were conducted using Microsoft Excel.
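The regression metrics above can be computed as follows; this is a generic sketch (the paper's computations were performed in Excel), with R2 taken as the squared Pearson correlation, as in a simple linear regression of estimated against observed values:

```python
import numpy as np

def regression_metrics(observed, estimated):
    """R2, MAE, RMSE, and MAPE for estimated vs. observed measurements."""
    y = np.asarray(observed, dtype=float)
    yhat = np.asarray(estimated, dtype=float)
    resid = y - yhat
    r = np.corrcoef(y, yhat)[0, 1]  # Pearson correlation coefficient
    return {
        "R2": r ** 2,
        "MAE": float(np.mean(np.abs(resid))),
        "RMSE": float(np.sqrt(np.mean(resid ** 2))),
        "MAPE": float(100 * np.mean(np.abs(resid / y))),
    }

# Toy example: observed vs. UAV-estimated tree heights (m)
m = regression_metrics([2.0, 4.0, 6.0, 8.0], [2.2, 3.8, 6.1, 7.5])
print({k: round(v, 3) for k, v in m.items()})
```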

3. Results

3.1. Pecan Analysis: Counting, Tree Height, and Canopy Area

Regarding pecan tree counting, the platform demonstrated high accuracy in identifying trees (Table 1). Out of 242 trees in the field, 240 were correctly detected, while 2 trees were missed. Additionally, only 3 false positives were recorded. Overall, the detection performance was strong, with precision, recall, and F1 scores all above 0.98.
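These scores follow directly from the detection counts in Table 1 (240 true positives, 3 false positives, 2 missed trees); a minimal check:

```python
def counting_metrics(tp, fp, fn):
    """Precision, recall, and F1 from detection counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Pecan counts reported in Table 1
p, r, f1 = counting_metrics(tp=240, fp=3, fn=2)
print(f"precision={p:.3f} recall={r:.3f} F1={f1:.3f}")
```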
Regression analysis demonstrated that the AI-driven web platform provided highly accurate estimations of pecan tree biometric parameters compared to ground-truth data. For tree height, the overall regression showed a strong linear relationship, with an R2 of 0.98 and low error metrics (MAE = 0.72 m, RMSE = 0.93 m, and MAPE = 11.48%) (Figure 5).
Moreover, when results were split by tree age, the effectiveness of the estimations varied considerably. For young trees, the estimated values showed greater variability and deviated more from the observed values, resulting in R2 = 0.66, MAE = 0.62 m, RMSE = 0.71 m, and MAPE = 19.43%. For mid-age trees, estimated values were closer to the observed values and more consistent (R2 = 0.90, MAE = 0.41 m, RMSE = 0.45 m, MAPE = 5.56%). For mature trees, results were also more consistent compared to young trees (R2 = 0.81, MAE = 1.14 m, RMSE = 1.36 m, and MAPE = 8.98%). Additionally, the regression models exhibited an interesting pattern: young and mid-age trees were underestimated, while mature trees were overestimated compared to the observed data. However, the supplementary statistical comparison using a one-way ANOVA revealed a significant difference only for young trees (p = 0.032), indicating a slight underestimation of height values by the model at early growth stages (Figure S1). In contrast, no statistically significant differences were found for mid-age (p = 0.212) and mature trees (p = 0.178), confirming a strong consistency between estimated and observed measurements for advanced developmental stages.
Regarding canopy area estimation, the model also performed well (Figure 6). The estimated and observed data reached an almost perfect fit (R2 = 0.99), with low absolute errors (MAE = 2.89 m2, RMSE = 3.75 m2), although the relative error was higher (MAPE = 23.16%). When results were analyzed by tree age, the estimated canopy area for young trees was less accurate (R2 = 0.63, MAE = 0.76 m2, RMSE = 0.82 m2, and MAPE = 46.70%), whereas mid-age and mature trees exhibited high precision and accuracy metrics (R2 > 0.99, MAE = 3.15–4.75 m2, RMSE = 3.29–5.55 m2, and MAPE = 5.24–17.54%). Results for young trees underestimated the observed values, whereas results for mid-age and mature trees were very close to the observed values, with only slight underestimation. The ANOVA indicated a significant difference in canopy area for young trees (p = 0.011), indicating model underestimation at early stages (Figure S2). Conversely, no significant differences were found for mid-age (p = 0.226) or mature trees (p = 0.332), confirming strong agreement between estimated and observed values as canopy structure developed.

3.2. Onion Analysis: Onion Counting and Bulb Diameter

The AI-driven platform demonstrated high effectiveness in counting onion bulbs (Table 2). Out of 697 onions present in the target area, the platform successfully detected 686, with only 10 instances classified as false positives. Overall, the evaluation metrics highlighted the reliability of the platform, achieving precision, recall, and F1 scores all above 0.97.
The regression analysis yielded a significant linear relationship between the estimated and observed data (Figure 7). The estimated diameter values explained 78% of the variability in the ground-truth data while maintaining relatively low error metrics (MAE = 5.42 mm, RMSE = 6.80 mm, and MAPE = 6.29%). When analyzed by market class, the estimated data also produced promising results, with errors ranging from 4.9% to 10%, which are considered low. For the medium class, estimates tended to slightly overestimate observed values; however, the error remained below 10%. For the jumbo class, the results were particularly strong, with an error of only 4.90%. Furthermore, fewer than 4% of estimates overestimated bulb diameter, while fewer than 1% underestimated it. For the colossal class, estimates showed a tendency to underestimate bulb diameter, but the error remained low (MAPE = 6.62%). The comparison between observed and estimated bulb diameters revealed no significant difference for the medium (p = 0.056) and jumbo (p = 0.102) classes, indicating strong agreement between field and UAV-based measurements. In contrast, a highly significant difference was found for colossal bulbs (p < 0.001), suggesting that the model tended to underestimate diameters for the largest bulbs.

4. Discussion

Several studies have highlighted the advantages and limitations of UAV-based analysis and explored recent advances in AI [22,23,24]. However, most research has focused on field crops, leaving a gap in applications tailored to specialty crops. In particular, the use of commercially available AI-driven web platforms for UAV-based data remains largely underexplored in these systems. Our findings demonstrated that such platforms can accurately count the number of trees and onions, estimate tree height, canopy area, and bulb diameter under real-world field conditions, underscoring their potential for practical implementation in precision horticulture.
In pecan, the overall regression metrics were strong (R2 > 0.98), and errors were <23.16%. However, the model efficiency varied by developmental stage. For instance, young trees exhibited higher errors for height (19.43%) and canopy area (46.70%) measurement. Although MAE and RMSE were numerically smaller for young trees due to their limited size, relative errors (MAPE) were higher, indicating proportionally lower model efficiency rather than absolute error magnitude. Likely, their smaller stature and reduced pixel representation made the tree detection and further measurement more challenging. These findings are consistent with previous studies, which have shown that limited structural definition in early growth stages impairs UAV-based feature extraction [18,25]. Moreover, tree height estimates may have been affected by GNSS positional accuracy, particularly in altitude, which is generally associated with higher error. Similar findings have been reported in previous studies [26,27]. Conversely, canopy area estimation followed a similar pattern, with model performance improving as trees matured. MAPE decreased from 46.7% in young trees to just 5.5% in mature ones, indicating enhanced precision when applied to well-developed canopies with clearly defined geometry and greater visual separability from the background. To verify that the higher MAPE value was not driven by abnormal values, an outlier analysis using the interquartile range (IQR) method was performed, revealing no outliers in the dataset (Figures S1 and S2). Therefore, no data removal or cleaning was applied, and all analyses were conducted using the original dataset.
Canopy-related traits such as height and canopy area are agronomically critical for managing tree vigor, input allocation, and productivity in perennial nut crops. In pecan orchards, these attributes guide key decisions including irrigation scheduling, disease control, and yield potential modeling [18,19,25]. These attributes also inform decisions on chemical applications. The ability to extract these traits rapidly and non-destructively using UAV-based tools provides substantial value for both research and commercial orchard management. Perennial crops such as pecan present both opportunities and specific challenges for the application of precision agriculture technologies [28]. Their fixed spatial layout is well-suited to the use of UAVs and stationary sensors, but complicates tasks such as yield monitoring and fruit detection due to the three-dimensional variability of the canopy [17]. Within the precision agriculture framework proposed by Bhakta et al. [29], comprising data collection, data analysis, decision-making, and variable rate application, the platform evaluated in this study directly addresses the first two layers. It enables high-throughput trait acquisition and real-time analysis via cloud-based processing, offering an accessible and scalable alternative to more technically demanding phenotyping systems.
Onion is a high-value horticultural crop whose bulb size is a key determinant of both yield and market classification. Market classes such as medium, jumbo, and colossal are directly tied to consumer preferences and pricing strategies, making size prediction a valuable indicator of economic potential. Estimating market class distribution enables stakeholders to make informed decisions regarding logistics and field-level variability [20]. In this study, size estimation errors varied across market classes. For instance, jumbo bulbs had the lowest relative error (4.9%), while medium-sized bulbs showed more variability (9.9%). This trend matches previous findings suggesting larger, more regularly shaped objects are easier to segment and measure using RGB imagery. In contrast, smaller or less defined targets may suffer from lower resolution and reduced object detection accuracy [30]. However, in this study, larger bulbs tended to be slightly underestimated (Figure S3). This effect is likely associated with partial soil coverage at the bulb perimeter and the irregular curvature of the exposed bulb surface, which can cause the visible portion of the bulb in RGB images to appear smaller than its true equatorial diameter. Additionally, mild shadowing and saturation effects in bright field conditions may have contributed to reduced pixel contrast at the bulb edges, further influencing underestimation.
Our results expand this body of work by demonstrating that UAV-derived imagery, processed through an AI-driven platform, can directly estimate bulb diameter and classify onions into the official market categories. Such in-field estimations of market class distribution provide actionable information for growers, supporting early decisions on harvest scheduling, storage allocation, and marketing strategies. Anticipating the proportion of medium, jumbo, or colossal onions before harvest enables producers to align with contractual demands and optimize postharvest handling. Furthermore, mapping within-field variability in bulb size creates opportunities for targeted management interventions, such as site-specific fertilization or irrigation, to enhance uniformity and reduce losses. Collectively, these applications highlight the added value of UAV-based monitoring in onion production, where profitability is closely linked to size grading and market class compliance.
To the best of our knowledge, this is the first study to quantitatively evaluate the performance of the AI-driven web platform Solvi for tree counting and biometric estimations. Previous research has primarily relied on alternative image-processing pipelines, such as image stitching [13,14,15,16]. Consequently, direct comparisons with Solvi-based studies are not yet possible, which limits cross-study benchmarking but underscores the novelty and relevance of the present work.
Finally, our results underscore the potential of AI-driven platforms to democratize digital monitoring of specialty crops. Their intuitive interfaces, cloud-based automation, and reliable trait extraction make them particularly useful for growers, extension professionals, and researchers with limited access to technical infrastructure. However, caution is warranted when applying such tools to early growth stages or small structures.

5. Conclusions

The AI-powered web platform evaluated in this study demonstrated high accuracy in biometric trait estimation across morphologically distinct specialty crops, including a perennial tree (pecan) and an annual vegetable (onion). UAV-derived measurements of tree height and canopy area in pecan, as well as bulb diameter in onion, showed strong agreement with ground-truth data. Prediction errors decreased with increasing structural maturity and bulb size, reinforcing the platform’s suitability for automated morphological measurements under field conditions. These results confirm the viability of integrating accessible, cloud-based AI tools into operational precision horticulture workflows. Future work should expand validation across different phenological stages, environments, and crop types, and explore the integration of complementary sensing technologies to enhance performance in early developmental and structurally complex scenarios. Moreover, the platform could broaden its functionalities to include models for yield and quality prediction across a wider range of crops. It could also be extended to alternative data sources, such as satellites or ground-based robotics, thereby improving both scalability and resolution.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/agriculture15212213/s1, Figure S1: Interquartile range and one-way ANOVA for tree height; Figure S2: Interquartile range and one-way ANOVA for canopy area; and Figure S3: Interquartile range and one-way ANOVA for bulb diameter.

Author Contributions

Conceptualization, J.V.d.S.M., M.R.B.J. and L.P.d.O.; methodology, J.V.d.S.M., M.R.B.J. and L.P.d.O.; software, J.V.d.S.M., M.R.B.J. and L.P.d.O.; validation, J.V.d.S.M., M.R.B.J. and L.P.d.O.; formal analysis, J.V.d.S.M. and M.R.B.J.; investigation, J.V.d.S.M., M.R.B.J., L.d.A.S., R.G.d.S. and L.P.d.O.; resources, L.P.d.O.; data curation, J.V.d.S.M. and M.R.B.J.; writing—original draft preparation, J.V.d.S.M.; writing—review and editing, J.V.d.S.M., M.R.B.J., L.d.A.S., R.G.d.S., W.S.R. and L.P.d.O.; visualization, J.V.d.S.M., M.R.B.J., L.d.A.S., R.G.d.S., W.S.R. and L.P.d.O.; supervision, L.P.d.O.; project administration, L.P.d.O.; funding acquisition, J.V.d.S.M., M.R.B.J. and L.P.d.O. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Georgia Commodity Commission for Pecans, grant number AWD00019285, and the Vidalia Onions Committee, grant number FP00034280.

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the authors on request.

Acknowledgments

We would like to thank the Precision Horticulture Laboratory of the Department of Horticulture at the University of Georgia. The infrastructural support provided by this institution was instrumental in facilitating the data collection, analysis, and visualization processes involved in this study. Additionally, we would like to thank Solvi (https://solvi.ag) for the support and software availability.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
AGL: Above ground level
AI: Artificial intelligence
API: Application programming interface
GSD: Ground sample distance
MAE: Mean absolute error
MAPE: Mean absolute percentage error
R2: Coefficient of determination
RMSE: Root mean squared error
UAVs: Unmanned aerial vehicles

Figure 1. United States map highlighting the State of Georgia (A); pecan field (B); onion field (C).
Figure 2. Ground-truth data collection for the pecan field. Ladder and pole for measuring tree height (left). Computer-aided measurement of canopy area (right).
Figure 3. Ground-truth data collection for the onion field: orthomosaic display (left); sample point demonstration (middle); data classification (right). UAV imagery was acquired under standard field conditions without soil removal.
Figure 4. Pecan and onion measurements using the AI-based web platform (Solvi).
Figure 5. Linear regression between estimated and observed values of pecan tree height across three development stages: young (gray), mid-age (yellow), and mature (blue). The dashed black line represents the 1:1 reference, while the solid red line indicates the fitted regression model.
Figure 6. Linear regression between estimated and observed values of canopy area across three development stages: young (plum), mid-age (green), and mature (orange). The dashed black line represents the 1:1 reference, while the solid red line indicates the fitted regression model.
Figure 7. Linear regression between estimated and observed values of bulb diameter across three market classes: medium (brown), jumbo (red), and colossal (dark blue). The dashed black line represents the 1:1 reference, while the solid red line indicates the fitted regression model. A total of 200 onions were measured for medium (n = 33), jumbo (n = 103), and colossal (n = 64).
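The fitted regression lines in Figures 5–7 relate estimated to observed values; assuming an ordinary least-squares fit (a sketch of the standard method, not necessarily the authors' exact procedure), the slope and intercept follow from the usual closed-form solution:

```python
def ols_fit(x, y):
    """Ordinary least-squares slope and intercept for y ≈ slope * x + intercept."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))  # covariance term
    sxx = sum((xi - mean_x) ** 2 for xi in x)                         # variance term
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return slope, intercept
```

A fit close to the 1:1 reference line (slope near 1, intercept near 0) indicates strong agreement between UAV-derived estimates and ground-truth measurements.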
Table 1. Pecan tree counting analysis results.

Metric      Value
Precision   0.988
Recall      0.992
F1 Score    0.990
Table 2. Onion counting analysis results.

Metric      Value
Precision   0.985
Recall      0.970
F1 Score    0.978
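The metrics in Tables 1 and 2 follow the standard counting-evaluation definitions based on true positives (correct detections), false positives (spurious detections), and false negatives (missed plants). A minimal sketch of these formulas (not the platform's code):

```python
def counting_metrics(tp, fp, fn):
    """Precision, recall, and F1 score from detection counts."""
    precision = tp / (tp + fp)   # fraction of detections that are real plants
    recall = tp / (tp + fn)      # fraction of real plants that were detected
    f1 = 2 * precision * recall / (precision + recall)  # harmonic mean of the two
    return precision, recall, f1
```

Because F1 is the harmonic mean, it stays high only when both precision and recall are high, as in both crops here (all values above 0.97).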
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Martins, J.V.d.S.; Barbosa Júnior, M.R.; Sales, L.d.A.; Santos, R.G.d.; Ribeiro, W.S.; Oliveira, L.P.d. Automated Crop Measurements with UAVs: Evaluation of an AI-Driven Platform for Counting and Biometric Analysis. Agriculture 2025, 15, 2213. https://doi.org/10.3390/agriculture15212213


Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
