Article

Unmanned Aerial Vehicles and Low-Cost Sensors for Monitoring Biophysical Parameters of Sugarcane

by
Maurício Martello
,
Mateus Lima Silva
,
Carlos Augusto Alves Cardoso Silva
,
Rodnei Rizzo
*,
Ana Karla da Silva Oliveira
and
Peterson Ricardo Fiorio
Department of Biosystems Engineering, “Luiz de Queiroz” College of Agriculture, University of São Paulo, Piracicaba 13418-900, SP, Brazil
*
Author to whom correspondence should be addressed.
AgriEngineering 2025, 7(12), 403; https://doi.org/10.3390/agriengineering7120403
Submission received: 23 September 2025 / Revised: 18 November 2025 / Accepted: 20 November 2025 / Published: 1 December 2025
(This article belongs to the Section Remote Sensing in Agriculture)

Abstract

Unmanned Aerial Vehicles (UAVs) equipped with low-cost RGB and near-infrared (NIR) cameras represent an efficient and scalable technology for monitoring sugarcane crops. This study evaluated the potential of UAV imagery and three-dimensional crop modeling to estimate sugarcane height and yield under different nitrogen fertilization levels. The experiment comprised 28 plots subjected to four nitrogen rates, and images were processed using a Structure from Motion (SfM) algorithm to generate Digital Surface Models (DSMs). Crop Height Models (CHMs) were obtained by subtracting Digital Terrain Models (DTMs) from the DSMs. The most accurate CHM was derived from the combination of the reference DTM and the NIR-based DSM (R2 = 0.957; RMSE = 0.162 m), while the strongest correlation between height and yield was observed at 200 days after cutting (R2 = 0.725; RMSE = 4.85 t ha−1). The NIR-modified sensor, developed at a total cost of USD 61.59, demonstrated performance comparable with commercial systems that are up to two hundred times more expensive. These results demonstrate that the proposed low-cost NIR sensor provides accurate, reliable, and accessible data for three-dimensional modeling of sugarcane.

1. Introduction

Sugarcane is a globally relevant crop, supporting both the food and bioenergy sectors. The global demand for ethanol has increased steadily over the past two decades, rising from 18 billion liters in 2000 to 98.6 billion liters in 2020 [1]. Consequently, major producing countries such as Brazil, India and China, along with several other regions, have intensively pursued technological innovations to enhance sugarcane yield and meet the growing demand for food and renewable energy [2]. Sustaining yield gains requires both farmers and industrial plants to achieve high operational efficiency. Within this global context, remote sensing, and more recently UAV-based approaches, have become a key component of international efforts to improve crop monitoring, structural assessment, and yield prediction.
Given this growing reliance on technologies, remote sensing stands out as a powerful tool for early diagnosis and crop monitoring [3], enabling the extraction of biophysical parameters from imagery, the assessment of spatial variability within fields, and ultimately, greater management efficiency and economic return for producers [4,5]. Multispectral data acquired through satellite imaging sensors have been widely used to map sugarcane areas, estimate cultivated acreage, and discriminate management practices, including fallow fields and expansion zones [6,7].
However, despite the broad applicability of satellite data, important limitations persist. Coarse imagery often fails to detect row gaps, localized weed infestation, pest outbreaks, and other fine-scale anomalies that require high-resolution data [8]. In this context, multispectral sensors mounted on Unmanned Aerial Vehicles (UAVs) provide an effective alternative, allowing for frequent and on-demand image acquisition at very high spatial resolutions. Unlike satellite imagery, UAV-based data are not restricted to fixed revisit intervals and can be collected under favorable field and weather conditions according to producer needs [9].
UAV imagery has proven highly effective for extracting structural biophysical parameters, particularly crop height, which can be estimated through three-dimensional (3D) modeling of the canopy surface. Plant height is widely recognized as a key indicator of vegetative growth and potential yield, showing a strong correlation with biomass accumulation and productivity [10,11]. The use of Structure from Motion (SfM) photogrammetry offers an efficient, reproducible, and high-resolution alternative to traditional field measurements, which are time-consuming, prone to human error, and logistically challenging in large production areas.
The UAV-SfM approach has demonstrated high accuracy and consistency in modeling sugarcane growth under diverse conditions. Khuimphukhieo et al. [12] reported performance equivalent to that of field surveys, with a root mean square error (RMSE) of 0.15 m, validating the reliability of 3D reconstruction from aerial imagery. Similarly, Som-ard et al. [13] emphasized that, in scenarios with limited field data, UAV-derived products can complement ground observations and serve as reference data for training satellite-based models, enhancing prediction capabilities across regional scales.
Beyond accuracy, UAV-SfM also offers substantial operational efficiency. Holman et al. [14] demonstrated that data collection using UAV-based 3D modeling can be up to 72 times faster than Light Detection and Ranging (LiDAR) while maintaining comparable accuracy. Consistently, Souza et al. [15] reported a mean error of only 0.08 m in sugarcane height estimation using UAV imagery and reinforced the potential of SfM as a practical tool for structural mapping and management support. Likewise, Sumesh, Ninsawat, and Som-ard [16] found a high correlation (r = 0.95) between UAV-derived and field-measured heights, enabling yield predictions with an error of just 1.63 t ha−1. Collectively, these studies establish the UAV-SfM framework as a technically robust, cost-effective, and operationally efficient approach for 3D crop monitoring and pre-harvest yield estimation in sugarcane.
Despite the clear advantages of UAV technology, relatively few studies have investigated its methodological limitations, equipment costs, and process scalability [17]. Näsi et al. [18] emphasized the importance of assessing the accuracy of low-cost sensors mounted on UAVs, while Alarcão et al. [19] noted that the operational use of UAVs remains limited and requires further research and methodological improvements to enable large-scale applications.
Conventionally, sensors operating in the visible spectrum (RGB) are low-cost and widely available, which explains their extensive use in agricultural studies. Sumesh, Ninsawat, and Som-ard [16], working with sugarcane, and Cano et al. [20], studying coffee crops, demonstrated the efficiency of RGB cameras in predicting growth and yield parameters. However, demand for low-cost sensors capable of operating beyond the visible spectrum is increasing, particularly in the near-infrared (NIR) region, which is closely linked to vegetation vigor and photosynthetic activity [21,22]. Louw, Chen, and Avtar [23] showed that modifying RGB cameras to capture the NIR spectrum provides significant advantages in vegetation studies, as plant reflectance is high in this wavelength range, thereby enhancing spectral sensitivity and contrast.
Building on this line of research, several studies have evaluated whether low-cost RGB cameras modified for NIR sensitivity can match the performance of more advanced multispectral systems. Wang et al. [24] demonstrated that a low-cost RGB camera modified to capture NIR, with a total hardware cost of only USD 70–85, achieved excellent performance (R2 = 0.96) when compared with a high-end hyperspectral sensor for vegetation index estimation in maize. Similarly, Putra and Soni [25] showed that low-cost sensors modified with NIR filters provide reliable measurements of biophysical properties, yielding performance comparable to that of considerably more expensive multispectral systems.
Despite these advances, the international body of work primarily focuses on two-dimensional (2D) spectral analysis and vegetation indices. Crucially, the performance of these low-cost NIR-modified sensors for three-dimensional (3D) SfM modeling, specifically for reconstructing the complex canopy structure of tall crops like sugarcane and deriving accurate crop height, remains a significant methodological gap in the literature. Therefore, it is necessary to experimentally assess the potential of modified low-cost sensors for both crop height mapping and terrain detection, the latter being essential for generating Digital Terrain Models (DTMs) and, consequently, for accurately estimating crop height.

2. Objectives

This study aims to evaluate the performance of a low-cost RGB sensor converted to the near-infrared (NIR) spectrum for 3D modeling of sugarcane throughout different stages of the crop cycle. The DTMs and canopy surface models derived from the modified sensor were compared with those obtained using a conventional RGB camera and ground-based topographic data. Additionally, the estimated crop height was used to predict and spatialize yield. The central hypothesis is that the NIR-modified camera provides greater accuracy than the conventional RGB camera and achieves comparable performance to that of high-cost, research-grade sensors commonly reported in the literature.

3. Materials and Methods

3.1. Study Area and Experimental Design

The study site is located in Piracicaba Municipality, southeastern Brazil (22°41′02″ S, 47°38′44″ W). Twenty-eight experimental plots of approximately 0.5 ha were established following a randomized block design. The experimental treatments comprised four levels of nitrogen fertilization: a control (0 kg ha−1) and three other treatments (60, 120, and 180 kg ha−1). The control had four replications, while the other treatments had eight replications each (Figure 1). Field campaigns were conducted during the growing season to evaluate the second cut (first ratoon) of the IACSP95-5000 sugarcane cultivar.
Field measurements of sugarcane height were consistently taken at the same stools in all field campaigns. Five stools were selected within each plot, totaling 140 sampling locations (described hereafter as subplots) (Figure 1). Measurements were taken from the tallest stalk within each stool, extending from the ground to the tip of the flag leaf, and were performed on the dates listed in Table 1. The geographical coordinates and elevation of each subplot were defined using a total station and post-processed GNSS data, ensuring high positioning accuracy. The elevation data were subsequently used to generate a reference DTM of the study site through interpolation.
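The paper states only that the reference DTM was interpolated from the surveyed elevations, without naming the method. A minimal sketch of one common choice, inverse-distance weighting (IDW), is given below; both the method and the toy survey coordinates are illustrative assumptions, not the study's actual procedure or data.

```python
def idw(x, y, pts, power=2.0):
    """Inverse-distance-weighted elevation at (x, y).

    pts: iterable of (x, y, z) tuples from a topographic survey.
    IDW is an illustrative interpolator; the paper does not specify
    which method was used to build the reference DTM.
    """
    num = den = 0.0
    for px, py, pz in pts:
        d2 = (x - px) ** 2 + (y - py) ** 2
        if d2 == 0.0:
            return pz  # query point coincides with a survey point
        w = 1.0 / d2 ** (power / 2)
        num += w * pz
        den += w
    return num / den

# Toy survey: four corner points of a gently sloping field (hypothetical)
survey = [(0, 0, 540.0), (100, 0, 541.0), (0, 100, 540.5), (100, 100, 541.5)]
print(idw(50, 50, survey))  # mid-field elevation: 540.75 (mean of equidistant corners)
```

Evaluating `idw` on a regular grid of query points would yield a raster reference DTM analogous to the one used in the study.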

3.2. Remotely Piloted Aircraft and Image Acquisition

We used a multirotor-type UAV (Figure 2) equipped with an electronic flight control system, which included a gyroscope, accelerometer, barometer, and GNSS receiver. The ground control station featured a telemetry system operating at 433 MHz and 2.4 GHz, allowing for frequency switching in case of failure. The low-cost sensor (Table 2) for aerial imagery comprised two non-metric digital cameras equipped with CMOS (complementary metal-oxide-semiconductor) sensors. The sensor sizes were 1/2.3″ (6.17 × 4.55 mm) and 1/3″ (4.88 × 3.60 mm) for the visible (350–700 nm; RGB) and near-infrared (700–1100 nm; NIR) ranges, respectively.
A flight plan was designed to ensure high-density image overlap, both laterally and longitudinally, enabling accurate Crop Height Modeling. Each camera was configured with an internal timer to trigger every two seconds, while the UAV maintained a constant flight speed of 1.8 m s−1, resulting in image overlaps exceeding 90%. Flight paths were approximately 153 m long, with 14.5 m between flight lines, and an altitude of 40 m above ground level. Images were acquired at 67, 99, 144, 164, 200, 228, 269, and 326 days after cutting (DAC), as well as after the crop was harvested.
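The relationship between the reported flight parameters and image overlap can be sketched as follows. The exposure spacing (1.8 m s−1 × 2 s = 3.6 m) and the 14.5 m line spacing come from the text; the ground footprint sizes below are hypothetical placeholders, since the paper does not report the camera focal lengths needed to compute them.

```python
def overlap(spacing_m: float, footprint_m: float) -> float:
    """Fractional overlap between consecutive images (or adjacent strips).

    spacing_m:   distance between exposure stations or flight lines
    footprint_m: ground footprint of one image along that axis
    """
    return max(0.0, 1.0 - spacing_m / footprint_m)

# Exposure spacing from the reported flight parameters:
speed_m_s = 1.8                              # constant UAV ground speed
trigger_s = 2.0                              # camera interval timer
along_track_spacing = speed_m_s * trigger_s  # 3.6 m between photos

# Footprint sizes are HYPOTHETICAL (not given in the paper):
forward = overlap(along_track_spacing, 45.0)  # assumed 45 m along-track footprint
side = overlap(14.5, 58.0)                    # 14.5 m line spacing, assumed 58 m swath
print(f"forward ≈ {forward:.0%}, side ≈ {side:.0%}")
```

With these assumed footprints the forward overlap comes out above 90%, consistent with the overlap regime described in the text.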
Both RGB and near-infrared (NIR) images were processed using the Mission Planner software (version 1.3.82; ArduPilot Dev. Team). This software synchronized flight logs with image timestamps and subsequently performed georeferencing based on five ground control points (GCPs) established prior to the UAV flights (Figure 3), obtained via RTK with a positioning accuracy of 0.5 cm + 1 ppm for distances up to 20 km under the static positioning method. The mean error of the digital models (terrain and surface) in relation to the GCPs was 0.03 m. The positioning of the reference points and the target used during data collection are shown in Figure 3, while the general workflow of the methodological procedures adopted in this study is presented in Figure 4.

3.3. Design and Cost of the Modified Sensor

The NIR camera was adapted by replacing its RGB Bayer filter with a custom filter, which blocks electromagnetic radiation between 530 and 700 nm and enables the detection of NIR radiation (Figure 5). The filter’s efficiency was validated using a FieldSpec Pro 3 spectroradiometer (ASD Inc., Boulder, CO, USA) coupled with an integrating sphere, confirming that it blocked more than 90% of radiation in the visible range.
Regarding costs, a survey was conducted across ten purchasing sources in Brazil to estimate current prices in U.S. dollars, based on the exchange rate at the time of publication (BRL 5.35 = USD 1.00). In comparison with commercially available near-infrared sensors considered low-cost, such as the Mapir Survey 3W (MAPIR Inc., San Diego, CA, USA), which is sold at an average price of approximately USD 490.00, the proposed sensor is about 87.43% cheaper.
Table 2 shows the average cost of each component used and its corresponding value. The sensor was attached to the UAV using a custom-designed mount developed specifically for this study. The mount was designed to position both cameras as closely as possible, facilitating image alignment and ensuring standardized comparison between image sets. It also included an adjustable mechanism to compensate for the UAV’s tilt during flight, allowing for the acquisition of images as close to vertical as possible.
The cost of the custom-designed mount was estimated based on a market survey of 3D printing services, since ready access to a 3D printer is not common. However, the cost tends to be lower when the component is printed using one’s own equipment.

3.4. Digital Surface Models (DSMs) and Digital Terrain Models (DTMs)

Digital Surface Models (DSMs) and Digital Terrain Models (DTMs) were generated using Agisoft Metashape v2.2.1 (Agisoft LLC, St. Petersburg, Russia), which employs a Structure from Motion (SfM) algorithm for 3D scene reconstruction. This method identifies homologous key points across overlapping images and, through point triangulation, determines the relative positions and orientations of the cameras. The processing workflow comprised the following analytical steps:
1. Image alignment (high quality): Homologous key points are identified across the image set using a scale-invariant feature transform (SIFT) algorithm. The spatial relationships among images are determined by triangulating corresponding points, resulting in a sparse point cloud that defines the relative camera geometry. The sparse cloud is subsequently cleaned to remove noise and artifacts (spray points) based on reconstruction uncertainty, projection accuracy, and reprojection error thresholds, with approximately 10% of the points eliminated under each criterion.
2. Camera calibration and optimization: Intrinsic parameters (focal length, principal point, and lens distortion) are refined through bundle adjustment to minimize reprojection error.
3. Dense point cloud generation (high quality): The algorithm interpolates additional points using multi-view stereo (MVS) correlation, resulting in a detailed 3D representation of canopy and terrain surfaces.
4. Mesh generation and rasterization: The dense cloud is triangulated to create a continuous surface mesh, which is then converted to a raster grid with a spatial resolution of 0.06 m per pixel (DSM).
Fourteen DSMs were generated from the RGB and NIR images acquired 99, 144, 164, 200, 228, 269, and 326 days after cutting (DAC).
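The sparse-cloud cleaning step amounts to dropping the worst fraction of tie points for each quality metric. A minimal sketch of that idea is shown below; the simple percentile cut is our assumption, and Metashape's actual gradual-selection thresholds work differently in detail.

```python
def filter_worst(points, errors, frac=0.10):
    """Drop the fraction of tie points with the highest error metric.

    points: list of tie points; errors: matching per-point quality metric
    (e.g. reprojection error). Removing ~10% per criterion mirrors the
    cleaning described in the workflow; the percentile-cut strategy here
    is an illustrative simplification.
    """
    n_drop = int(len(points) * frac)
    if n_drop == 0:
        return list(points)
    order = sorted(range(len(points)), key=lambda i: errors[i])
    keep = order[: len(points) - n_drop]
    return [points[i] for i in sorted(keep)]

# Toy cloud of 10 points with synthetic reprojection errors
pts = [f"p{i}" for i in range(10)]
errs = [0.1, 0.9, 0.2, 0.8, 0.3, 0.7, 0.4, 0.6, 0.5, 1.0]
kept = filter_worst(pts, errs, frac=0.10)
print(len(kept))  # 9: the point with the largest error ('p9') is dropped
```

Applying the same cut successively for each criterion (reconstruction uncertainty, projection accuracy, reprojection error) reproduces the roughly 10%-per-criterion reduction described above.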
The DTMs were generated by identifying the lowest elevation points within the dense point clouds [26,27], using images acquired at two key moments of the crop cycle: the early growth stage and after harvest. The initial DTM (67 DAC) represented a period with minimal canopy cover, allowing for clear ground visualization and higher model accuracy. The post-harvest DTM (final DTM) was produced to evaluate the reliability of terrain reconstruction after significant biomass accumulation and residue deposition, considering possible soil disturbance. These two DTMs were used as terrain references for the calculation of Crop Height Models throughout the other acquisition dates, enabling the assessment of temporal consistency and stability of the modeling process (Figure 4).
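The lowest-elevation-point approach to DTM generation can be sketched as a minimum filter over a grid: the dense cloud is binned into cells and only the lowest return per cell is kept as ground. The 0.5 m cell size and the toy coordinates below are illustrative assumptions.

```python
from math import floor

def lowest_point_dtm(cloud, cell=0.5):
    """Grid the dense cloud and keep the minimum elevation per cell.

    cloud: iterable of (x, y, z) points; cell: grid cell size in metres
    (0.5 m is an illustrative choice, not the study's parameter).
    Returns a dict mapping (col, row) cell indices to ground elevation.
    """
    dtm = {}
    for x, y, z in cloud:
        key = (floor(x / cell), floor(y / cell))
        if key not in dtm or z < dtm[key]:
            dtm[key] = z
    return dtm

# Toy cloud: ground returns (~540 m) mixed with canopy points (~543 m)
cloud = [(0.1, 0.1, 543.2), (0.2, 0.3, 540.1),
         (0.8, 0.9, 540.0), (0.7, 0.6, 542.9)]
ground = lowest_point_dtm(cloud, cell=0.5)
print(ground)  # canopy points are discarded; one ground elevation per cell
```

In cells fully covered by canopy no ground return exists, which is why the early-season (67 DAC) acquisition, with minimal cover, yields the most reliable DTM.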

3.5. Crop Height Model (CHM)

DSM and DTM data were extracted for the 140 subplots and aggregated to represent the average elevation by plot. Crop Height Models (CHMs) were calculated for each date using models derived from RGB and NIR sensors, following the method of Hoffmeister et al. [28], as shown in Equation (1):
$\mathrm{CHM}_n = \mathrm{DSM}_n - \mathrm{DTM}_n$ (1)
where CHMn is the Crop Height Model at time n; DSMn is the Digital Surface Model at time n; and DTMn is the Digital Terrain Model at time n.
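Equation (1) is applied cell by cell over co-registered rasters. A minimal sketch follows; the NoData sentinel and the clamping of negative heights to zero are our assumptions, not details stated in the paper.

```python
def crop_height_model(dsm, dtm, nodata=-9999.0):
    """Per-cell CHM_n = DSM_n - DTM_n, following Equation (1).

    dsm, dtm: equally shaped 2-D lists of elevations (metres). Cells where
    either model is NoData stay NoData. Clamping negative heights (terrain
    noise) to zero is a common post-processing choice assumed here.
    """
    chm = []
    for dsm_row, dtm_row in zip(dsm, dtm):
        row = []
        for s, t in zip(dsm_row, dtm_row):
            if s == nodata or t == nodata:
                row.append(nodata)
            else:
                row.append(max(0.0, s - t))
        chm.append(row)
    return chm

# Toy 2x2 rasters: canopy at ~2.8-3.0 m, one bare cell, one NoData cell
dsm = [[542.8, 543.1], [540.2, -9999.0]]
dtm = [[540.0, 540.1], [540.3, 540.2]]
chm = crop_height_model(dsm, dtm)
print(chm)
```

In the study this subtraction was repeated for every DSM/DTM combination and acquisition date, yielding the CHM series evaluated against the field measurements.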
For CHM validation, five circular subplots with a diameter of 0.65 m (0.33 m2 each) were defined within each plot. These subplots corresponded spatially to the five marked stools where field biometric measurements were collected. The average CHM-derived heights from these five subplots (resulting in 28 samples) were compared with field-measured heights through linear regression analysis to evaluate model performance for different combinations of DTMs and DSMs (Figure 4).

3.6. Models’ Performance and Relationship with Sugarcane Yield

The accuracy and performance of the DTM, DSM, and CHM were evaluated with Pearson’s correlation coefficient (r)—Equation (2), coefficient of determination (R2)—Equation (3), relative error (RE)—Equation (4), and root mean square error (RMSE)—Equation (5):
$r = \dfrac{\sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y})}{\sqrt{\sum_{i=1}^{n}(x_i - \bar{x})^2 \sum_{i=1}^{n}(y_i - \bar{y})^2}}$, (2)
$R^2 = \dfrac{\left[\sum_{i=1}^{n}(y_i - \bar{y})(x_i - \bar{x})\right]^2}{\sum_{i=1}^{n}(y_i - \bar{y})^2 \sum_{i=1}^{n}(x_i - \bar{x})^2}$, (3)
$\mathrm{RE} = \dfrac{100}{\bar{y}} \sqrt{\dfrac{\sum_{i=1}^{n}(\hat{y}_i - y_i)^2}{n}}$, (4)
$\mathrm{RMSE} = \sqrt{\dfrac{\sum_{i=1}^{n}(\hat{y}_i - y_i)^2}{n}}$ (5)
where $x_i$ denotes the observed values, $y_i$ the predicted values, $\hat{y}_i$ the value predicted by the fitted model for sample $i$, $\bar{x}$ the mean of the observed data, $\bar{y}$ the mean of the predicted data, and $n$ the number of samples.
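Equations (2)–(5) translate directly into plain Python. The sketch below uses toy observed/predicted values (illustrative only, not data from the study); note that for a simple linear relationship Equation (3) equals the square of Equation (2).

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient, Equation (2)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sqrt(sum((xi - mx) ** 2 for xi in x) *
               sum((yi - my) ** 2 for yi in y))
    return num / den

def r_squared(x, y):
    """Coefficient of determination, Equation (3) (= r squared)."""
    return pearson_r(x, y) ** 2

def rmse(obs, pred):
    """Root mean square error, Equation (5)."""
    return sqrt(sum((p - o) ** 2 for o, p in zip(obs, pred)) / len(obs))

def relative_error(obs, pred):
    """Relative error in percent, Equation (4): 100 * RMSE / mean(pred)."""
    return 100.0 * rmse(obs, pred) / (sum(pred) / len(pred))

# Toy observed vs. predicted heights (m); values are illustrative
obs = [1.0, 2.0, 3.0, 4.0]
pred = [1.1, 1.9, 3.2, 3.8]
print(round(pearson_r(obs, pred), 3), round(rmse(obs, pred), 3))  # 0.991 0.158
```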
The relationship between sugarcane yield and CHM values was evaluated through linear regression analysis. Crop yield for each experimental plot was determined by manually harvesting the sugarcane and weighing the bundles using a load cell installed on a cane loader. Average height values were extracted from the CHM images by considering a 42.5 m2 (10 × 4.25 m) rectangular polygon located at the center of each plot. This height averaging aimed to generate a representative value that captured within-plot variability without restricting the analysis solely to the pre-defined subplots.
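The height-yield relationship was fitted by linear regression; a minimal ordinary-least-squares sketch is shown below. The toy heights and yields are illustrative only, and the resulting coefficients are not the equation fitted in the study.

```python
def fit_line(h, y):
    """Ordinary least squares for y = a*h + b, relating mean CHM height (m)
    to plot yield (t/ha). Returns (slope, intercept)."""
    n = len(h)
    mh, my = sum(h) / n, sum(y) / n
    a = (sum((hi - mh) * (yi - my) for hi, yi in zip(h, y)) /
         sum((hi - mh) ** 2 for hi in h))
    b = my - a * mh
    return a, b

# Toy data: taller canopies at 200 DAC paired with higher yields (hypothetical)
heights = [2.0, 2.4, 2.8, 3.2]     # mean CHM height per plot (m)
yields = [60.0, 70.0, 80.0, 90.0]  # measured plot yield (t/ha)
a, b = fit_line(heights, yields)
print(a, b)  # slope 25 t/ha per metre of height, intercept 10
```

Once fitted, applying `a * h + b` to every pixel of the 200 DAC CHM produces a spatially explicit yield map, as done in the study (Figure 9).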

4. Results

4.1. Comparison of Initial and Final DTMs (RGB and NIR) with the Reference DTM

The models derived from RGB and NIR sensors presented high R2 (0.93–0.99) and low RMSE values (0.05–0.17 m), with the NIR models performing slightly better in all comparisons (Figure 6). Although both the initial and final DTMs were highly related to the reference model, the initial DTMs had the best performance (0.98 < R2 < 0.99; 0.05 < RMSE < 0.09 m). This suggests that the time interval between reference data collection and image acquisition can influence the quality of DTMs generated from RGB and NIR imagery. The reduced accuracy of the final DTMs might be attributed to the substantial amount of crop residue deposited on the soil after harvest, which led to an increase in the elevation values.

4.2. Evaluation of Crop Height Models

Comparisons between the field measurements and CHMs indicated that the most accurate height predictions were achieved using the CHM generated from the NIR-derived DSM and the reference DTM (Table 3). This model exhibited high accuracy, with R2, RMSE, and RE values of 0.957, 0.162 m, and 5.97%, respectively.
The CHMs’ data dispersion indicated an increasing trend in crop height as the measurements progressed toward the maturation phase (Figure 7). Notably, heights measured at 269 and 326 DAC were very similar, indicating that the crop had reached its final height and entered the maturation stage. Additionally, data dispersion was higher at 99 DAC, corresponding to the early growth stage when variation in plant height is naturally more pronounced.

4.3. Relationship Between Crop Height and Yield

The potential of NIR-derived CHMs as a yield predictor was evaluated through linear regression analysis (Figure 8). CHMs from most acquisition dates were good indicators of yield (R2 > 0.62 and RMSE < 5.59 t ha−1), except for the 164 DAC model (R2 = 0.33 and RMSE = 7.44 t ha−1). The 200 DAC height model was the best yield predictor, with R2, RMSE, and RE of 0.725, 4.85 t ha−1, and 5.19%.
The 200 DAC linear regression (Figure 9) and its corresponding CHM were used to generate spatially explicit predictions of sugarcane yield. Yield variation was primarily related to nitrogen fertilization practices. Plots with no nitrogen (0 kg ha−1), followed by those receiving the 60 kg ha−1 dose, exhibited lower productivity compared with plots with higher nitrogen rates (Figure 9).

5. Discussion

5.1. Evaluation of Initial and Final RGB/NIR DTMs Relative to the Reference DTM

Among all terrain models, the initial DTMs derived from the RGB and NIR images achieved the best performance (R2 = 0.99 and 0.98, respectively), with RMSEs (0.05 and 0.09 m) lower than most errors reported in the literature [15,29,30,31]. Clapuyt et al. [29] found RMSEs ranging from 0.27 to 0.45 m when comparing UAV-derived models with LiDAR-based reference data. Similarly, Furby and Akhavian [30] assessed the accuracy of DTMs using UAV imagery and reported an RMSE of 0.1 m when compared with total station measurements.
Holman et al. [14] estimated wheat growth with high-resolution UAV imagery and found that post-harvest DTMs had lower accuracy due to landscape changes from harvesting operations. The reference DTM data in our study were collected simultaneously with the images used for the initial DTMs. In contrast, a 10-month interval separated the reference data and the images used during final DTM generation. Besides the possible impact of harvest operations on the terrain, this elapsed time may also have caused minor alterations, such as regrowth of surrounding vegetation.

5.2. Assessment of Crop Height Models

Estimating crop height from multiple combinations of DTMs and DSMs enabled the assessment of their influence on CHM accuracy. Our results corroborate previous findings [14,32], highlighting the potential of NIR data to distinguish vegetation from soil, thereby producing more accurate DSMs and, consequently, higher-quality CHMs. Upon evaluating the results, it became evident that although different DTM generation methods produced varied outputs, these differences did not significantly affect overall CHM performance.
Consistent with other studies [33,34], CHMs derived from UAV imagery tended to underestimate the actual crop height. Niu et al. [34] suggested that this underestimation may be associated with wind-induced movement of canopy leaves during UAV flights, which can disrupt the model reconstruction process. Moreover, the centimeter-scale resolution of UAV imagery may limit the accurate reconstruction of upper plant structures, such as maize tassels or sugarcane tops [34].
In addition to the effect of wind, there is also an influence from errors associated with the number and spatial distribution of ground control points (GCPs) used for georeferencing. It might have been beneficial to include additional tie points located within the crop area, as only one internal control point was used in this study (Figure 3). Furthermore, adding GCPs along the field edges and in more central positions could help enhance geometric stability and reduce distortions in the generated models.
The best performance was obtained for the CHM composed of the reference DTM and the DSM derived from the NIR sensor, which exhibited high accuracy (R2 ≥ 0.95 and RMSE ≤ 0.16 m), outperforming the RGB-based CHM. The superior accuracy achieved with NIR data can be attributed to the higher reflectance of vegetation in the near-infrared spectrum, whereas in the visible range, most incoming energy is absorbed for photosynthesis [35,36]. In contrast, the soil reflects much less NIR radiation, improving vegetation–soil segmentation and resulting in models with less noise and better canopy surface definition. Furthermore, RGB sensors are more susceptible to shadows and illumination variability, which can compromise 3D reconstruction quality [37,38,39].
NIR imagery, on the other hand, is less affected by atmospheric and illumination variations, contributing to greater photogrammetric stability [40]. Another relevant aspect is that NIR radiation interacts more effectively with the upper canopy layers, capturing overlapping vegetative structures more completely and producing denser and more coherent point clouds [41,42].
When compared with previous studies, the results obtained in this work stand out significantly. Canata et al. [31] used LiDAR to estimate sugarcane height and reported lower predictive performance (R2 = 0.38; RMSE = 0.70 m), reinforcing the efficiency of the photogrammetric approach adopted in the present study. Similarly, Yu et al. [11] achieved comparable accuracy (R2 = 0.96; RMSE = 0.18 m) using a commercial RGB camera, while Oliveira et al. [43] obtained higher precision (RMSE = 0.08 m; R2 = 0.88) by combining multispectral data with a Random Forest algorithm, demonstrating the potential of hybrid approaches that integrate structural and spectral information.
Intermediate results were also observed in other studies. For instance, Khuimphukhieo et al. [12] analyzed multiple sugarcane varieties and reported R2 = 0.89 and RMSE = 0.15 m using a multispectral sensor, while Souza et al. [15] found an RMSE of 0.40 m with a conventional RGB camera, underscoring the limitations of sensors restricted to the visible spectrum. Overall, the accuracy achieved in this study is comparable with that obtained with commercial and higher-cost sensors, demonstrating the reliability and potential of the proposed low-cost NIR-modified system for structural modeling of sugarcane canopies.

5.3. Crop Height-Yield Relationship Analysis

The strongest relationship between CHM-derived crop height and yield was observed at 200 DAC, with R2 = 0.725 and RMSE = 4.85 t ha−1. Portz et al. [44] noted that using height as a yield predictor tends to be more effective in the later stages of the crop cycle, when plants are fully developed. Similar results were reported by Som-ard et al. [45], who applied a Random Forest model to estimate yield from UAV-derived crop height, achieving high accuracy (R2 = 0.90; RMSE = 5.25 t ha−1) and confirming the relationship between productivity and plant height. In a sequential modeling approach, Sumesh, Ninsawat, and Som-ard [16] found strong correlations between UAV-estimated plant height and field-measured stem height (R2 = 0.79), as well as between stem height and yield (R2 = 0.77), enabling productivity estimation with an RMSE of 4.23 t ha−1.
It is important to note that some limitations still exist when using UAV-derived data to estimate stem height, since the stalks are covered by leaves. The optical sensors mounted on drones are unable to penetrate the canopy, as the reflected signal is captured mainly from the uppermost leaf surfaces rather than from the actual stem top. Even when accurate height estimations are achieved, the difference between the top vegetation detected (TVD point) and the upper leaves may vary depending on the variety, plant architecture, and field conditions. Consequently, models calibrated for a single cultivar may perform well locally but are not necessarily generalizable across different genotypes or growth environments.
Moreover, Yu et al. [11] highlighted that height measurements taken via UAVs can yield more precise yield estimates than those obtained from field-based measurements. These findings indicate that our height models performed comparably to those based on more sophisticated sensors, making it a viable alternative for data acquisition with reduced costs and field labor. Furthermore, aerial imagery-based modeling is not affected by crop height limitations, unlike the restrictions faced by Portz et al. [44] when using ultrasonic sensors on agricultural equipment, which struggled to accurately measure tall sugarcane plants.
Increasing nitrogen application rates evidently leads to higher productivity, as this nutrient directly influences biomass accumulation, which is strongly associated with plant height [46,47]. However, it is important to note that the excessive use of chemical nitrogen fertilizers can cause serious environmental impacts, such as nitrate leaching into groundwater, eutrophication of water bodies, emission of greenhouse gases (particularly nitrous oxide, N2O), and soil acidification [48,49]. Therefore, this study highlights the importance of adopting biological alternatives for nitrogen supply in sugarcane cultivation, enabling productivity gains in a sustainable manner without compromising environmental integrity [50].
Finally, although the CHM showed only a moderate correlation with yield, it proved highly effective at identifying spatial yield patterns, serving as a valuable tool for detecting areas warranting further investigation into sources of yield variability within the crop field.

5.4. Comparative Cost–Performance Analysis

The sensor system, evaluated at a total cost of USD 61.59, demonstrated performance comparable with that of an AI-based RGB photogrammetric system reported in the literature [30], which compared DTM accuracy against total station measurements. Regarding height model accuracy, the proposed sensor outperformed a LiDAR system by approximately 77% [31]. Currently, that LiDAR sensor is marketed at a price 211 times higher than the system developed in this study. In addition, the proposed sensor achieved results very close (0.11% difference) to those obtained with a commercial RGB camera [14], which is about seven times more expensive. When compared with the MicaSense multispectral sensor (MicaSense Inc., Seattle, WA, USA) [43], the proposed system was about 50% less accurate, although the MicaSense unit cost 182 times more.
It is important to emphasize that these comparisons are based on results reported in the literature. The experimental conditions and methodological particularities of each study certainly influence the observed differences. Nevertheless, the developed sensor evidently exhibited consistent performance. Another relevant point is that the proposed sensor still requires additional studies aimed at characterizing its spectral response and radiometric calibration. Specifically, the optical filter used in this study did not completely block radiation transmission in the blue wavelength range, which may introduce significant errors in the estimation of reflectance values. This phenomenon was also reported by Louw, Chen, and Avtar [23], who found that modified RGB cameras equipped with optical filters may introduce systematic measurement errors. Therefore, the current use of this sensor should be restricted to height modeling and structural applications, until further optical adjustments and spectral calibrations are performed.
In addition, the physical structure required to attach the camera to the UAV must be considered. Given the diversity of commercial drone models available, not all platforms will allow for direct installation of the mount, making custom fittings or supports necessary to ensure proper stability and alignment during flight [51,52]. Despite these limitations, the proposed system shows strong potential for practical application, particularly as a low-cost and easily replicable solution. This affordability makes it a promising tool for smallholder farmers, educational institutions, and low-budget research projects, thereby expanding access to remote sensing technologies and contributing to the democratization of precision agriculture.

5.5. Limitations and Future Perspectives

While the proposed approach showed promising results, some limitations must be acknowledged. The experiment was conducted in a single region and crop cycle, under specific environmental and management conditions. Consequently, the model’s performance may vary across different soil types, climatic conditions, or irrigation and fertilization regimes [53,54,55]. In addition, only one sugarcane variety was evaluated, and canopy architecture may differ significantly among genotypes, which can affect the accuracy of height estimations [12,56,57].
A further limitation concerns the temporal scope of this study, as the data were collected during a single growing season. Multi-year and multi-location experiments are required to validate the robustness and transferability of the model [58].
Although the linear regression model performed well in this study, as in other works focused on estimating crop productivity [59,60], its stability under different nitrogen fertilizer levels should be interpreted with caution. The model was calibrated using data from a specific experimental setup, and variations in nitrogen availability may alter the physiological response of the crop, leading to nonlinear relationships between plant height and yield.
Therefore, while linear regression provided satisfactory results for the tested conditions, future studies should explore more flexible and accurate modeling approaches, such as machine learning algorithms (Random Forest, Support Vector Regression, or Neural Networks), which are capable of handling multiple input variables and capturing complex interactions among growth stage, nitrogen levels, and spectral data [61,62]. Integrating biometric data from UAVs with spectral features in such models has already shown significant improvements in predictive accuracy and generalization potential [11,55].
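As a concrete illustration of the linear height–yield model discussed above, the sketch below fits an ordinary least-squares line and reports the R² and RMSE metrics used in this study. The height and yield values are synthetic placeholders, not the experimental data:

```python
# Hedged sketch: OLS fit of yield on CHM-derived height, with the R^2 and
# RMSE metrics used in the study. All data below are synthetic placeholders.
import math

def fit_linear(x, y):
    """Closed-form simple linear regression: returns (intercept, slope)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return my - slope * mx, slope

def r2_rmse(y, yhat):
    """Coefficient of determination and root mean square error."""
    my = sum(y) / len(y)
    ss_res = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1 - ss_res / ss_tot, math.sqrt(ss_res / len(y))

heights = [2.1, 2.4, 2.6, 2.9, 3.1]        # CHM height (m), synthetic
yields_ = [78.0, 84.0, 90.0, 97.0, 101.0]  # yield (t/ha), synthetic

a, b = fit_linear(heights, yields_)
preds = [a + b * h for h in heights]
r2, rmse = r2_rmse(yields_, preds)
```

A machine learning alternative (e.g., Random Forest or Support Vector Regression) would replace `fit_linear` with a nonlinear learner taking multiple predictors (height, growth stage, nitrogen level, spectral indices), at the cost of requiring more calibration data.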

6. Conclusions

This study presents a comprehensive analysis of three-dimensional sugarcane modeling using low-cost RGB and NIR sensors. High correlations were observed between the generated DTMs and the reference method, with both initial and final acquisition strategies providing good results and confirming the technique’s potential for DTM generation. Furthermore, CHMs derived from aerial imagery proved highly suitable for modeling sugarcane height, showing strong correlations with field measurements.
A comparative analysis between the sensors revealed that the NIR (low-cost) models exhibited higher correlations with field data than those derived from RGB imagery. Importantly, CHM-based height estimates correlated well with sugarcane yield and were comparable to findings in the existing literature.
This study also identified optimal time windows for data collection and sugarcane productivity prediction (between 200 and 269 days after cutting (DAC)). The resulting maps could be a valuable tool, offering crucial information for enhanced crop management and informed decision-making.

Author Contributions

Conceptualization, M.L.S., M.M. and P.R.F.; Methodology, M.L.S., A.K.d.S.O., M.M., R.R., C.A.A.C.S. and P.R.F.; Software, M.L.S., M.M., R.R. and C.A.A.C.S.; Validation, M.L.S., A.K.d.S.O. and C.A.A.C.S.; Formal analysis, M.L.S., A.K.d.S.O., M.M., R.R., C.A.A.C.S. and P.R.F.; Investigation, M.L.S., A.K.d.S.O., M.M., R.R., C.A.A.C.S. and P.R.F.; Data curation, M.M. and P.R.F.; Writing—preparation of the original draft, M.L.S., M.M., R.R. and P.R.F.; Writing—review and editing, M.L.S. and R.R.; Visualization, M.L.S., A.K.d.S.O., M.M., R.R., C.A.A.C.S. and P.R.F.; Supervision, R.R., C.A.A.C.S. and P.R.F.; Project administration, P.R.F.; Funding acquisition, P.R.F. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Luiz de Queiroz Agricultural Studies Foundation—FEALQ, Brazil. Additional support was provided by the São Paulo Research Foundation (FAPESP; grants 2024/10366-7, 2025/15610-6, and 2013/22435-9) and by the Coordination for the Improvement of Higher Education Personnel (CAPES; 88887.143366/2025-00, 88887.027867/2024-00, and 88887.146948/2025-00).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data will be made available upon request.

Acknowledgments

The authors thank the Luiz de Queiroz Agricultural Studies Foundation (FEALQ) for funding the publication of this work and the São Paulo Research Foundation (FAPESP) for their support.

Conflicts of Interest

The authors declare no known competing financial interests or personal relationships that could have influenced the work reported in this article.

References

  1. Hoang, T.-D.; Nghiem, N. Recent Developments and Current Status of Commercial Production of Fuel Ethanol. Fermentation 2021, 7, 314. [Google Scholar] [CrossRef]
  2. Vandenberghe, L.P.S.; Valladares-Diestra, K.K.; Bittencourt, G.A.; Zevallos Torres, L.A.; Vieira, S.; Karp, S.G.; Sydney, E.B.; de Carvalho, J.C.; Thomaz Soccol, V.; Soccol, C.R. Beyond Sugar and Ethanol: The Future of Sugarcane Biorefineries in Brazil. Renew. Sustain. Energy Rev. 2022, 167, 112721. [Google Scholar] [CrossRef]
  3. Karthikeyan, L.; Chawla, I.; Mishra, A.K. A Review of Remote Sensing Applications in Agriculture for Food Security: Crop Growth and Yield, Irrigation, and Crop Losses. J. Hydrol. 2020, 586, 124905. [Google Scholar] [CrossRef]
  4. Benami, E.; Jin, Z.; Carter, M.R.; Ghosh, A.; Hijmans, R.J.; Hobbs, A.; Lobell, D.B. Uniting remote sensing, crop modelling and economics for agricultural risk management. Nat. Rev. Earth Environ. 2021, 2, 140–159. [Google Scholar] [CrossRef]
  5. Sishodia, R.P.; Ray, R.L.; Singh, S.K. Applications of Remote Sensing in Precision Agriculture: A Review. Remote Sens. 2020, 12, 3136. [Google Scholar] [CrossRef]
  6. Poortinga, A.; Thwal, N.S.; Khanal, N.; Mayer, T.; Bhandari, B.; Markert, K.; Nicolau, A.P.; Dilger, J.; Tenneson, K.; Clinton, N.; et al. Mapping sugarcane in Thailand using transfer learning, a lightweight convolutional neural network, NICFI high resolution satellite imagery and Google Earth Engine. ISPRS Open J. Photogramm. Remote Sens. 2021, 1, 100003. [Google Scholar] [CrossRef]
  7. Zheng, Y.; Li, Z.; Pan, B.; Lin, S.; Dong, J.; Li, X.; Yuan, W. Development of a Phenology-Based Method for Identifying Sugarcane Plantation Areas in China Using High-Resolution Satellite Datasets. Remote Sens. 2022, 14, 1274. [Google Scholar] [CrossRef]
  8. Phang, S.K.; Chiang, T.H.A.; Happonen, A.; Chang, M.M.L. From Satellite to UAV-Based Remote Sensing: A Review on Precision Agriculture. IEEE Access 2023, 11, 127057–127076. [Google Scholar] [CrossRef]
  9. Ahmad, A.; Ordoñez, J.; Cartujo, P.; Martos, V. Remotely Piloted Aircraft (RPA) in Agriculture: A Pursuit of Sustainability. Agronomy 2021, 11, 7. [Google Scholar] [CrossRef]
  10. Desalegn, B.; Kebede, E.; Legesse, H.; Fite, T. Sugarcane productivity and sugar yield improvement: Selecting variety, nitrogen fertilizer rate, and bioregulator as a first-line treatment. Heliyon 2023, 9, e15520. [Google Scholar] [CrossRef]
  11. Yu, D.; Zha, Y.; Shi, L.; Jin, X.; Hu, S.; Yang, Q.; Huang, K.; Zeng, W. Improvement of sugarcane yield estimation by assimilating UAV-derived plant height observations. Eur. J. Agron. 2020, 121, 126159. [Google Scholar] [CrossRef]
  12. Khuimphukhieo, I.; Bhandari, M.; Enciso, J.; da Silva, J.A. Estimating sugarcane yield and its components using unoccupied aerial systems (UAS)-based high throughput phenotyping (HTP). Comput. Electron. Agric. 2025, 237, 110658. [Google Scholar] [CrossRef]
  13. Som-ard, J.; Immitzer, M.; Vuolo, F.; Atzberger, C. Sugarcane yield estimation in Thailand at multiple scales using the integration of UAV and Sentinel-2 imagery. Precis. Agric. 2024, 25, 1581–1608. [Google Scholar] [CrossRef]
  14. Holman, F.H.; Riche, A.B.; Michalski, A.; Castle, M.; Wooster, M.J.; Hawkesford, M.J. High Throughput Field Phenotyping of Wheat Plant Height and Growth Rate in Field Plot Trials Using UAV Based Remote Sensing. Remote Sens. 2016, 8, 1031. [Google Scholar] [CrossRef]
  15. Souza, C.H.W.; Lamparelli, R.A.C.; Rocha, J.V.; Magalhães, P.S.G. Height estimation of sugarcane using an unmanned aerial system (UAS) based on structure from motion (SfM) point clouds. Int. J. Remote Sens. 2017, 38, 2218–2230. [Google Scholar] [CrossRef]
  16. Sumesh, K.; Ninsawat, S.; Som-ard, J. Integration of RGB-based vegetation index, crop surface model and object-based image analysis approach for sugarcane yield estimation using unmanned aerial vehicle. Comput. Electron. Agric. 2021, 180, 105903. [Google Scholar] [CrossRef]
  17. Alexopoulos, A.; Koutras, K.; Ali, S.B.; Puccio, S.; Carella, A.; Ottaviano, R.; Kalogeras, A. Complementary Use of Ground-Based Proximal Sensing and Airborne/Spaceborne Remote Sensing Techniques in Precision Agriculture: A Systematic Review. Agronomy 2023, 13, 1942. [Google Scholar] [CrossRef]
  18. Näsi, R.; Viljanen, N.; Kaivosoja, J.; Alhonoja, K.; Hakala, T.; Markelin, L.; Honkavaara, E. Estimating Biomass and Nitrogen Amount of Barley and Grass Using UAV and Aircraft Based Spectral and Photogrammetric 3D Features. Remote Sens. 2018, 10, 1082. [Google Scholar] [CrossRef]
  19. Alarcão Júnior, J.C.; Nuñez, D.N.C. O uso de drones na agricultura 4.0. Braz. J. Sci. 2024, 3, 1–13. [Google Scholar] [CrossRef]
  20. Cano, M.d.J.A.; Marulanda, E.E.C.; Henao-Céspedes, V.; Cardona-Morales, O.; Garcés-Gómez, Y.A. Quantification of Flowering in Coffee Growing with Low-Cost RGB Sensor UAV-Mounted. Sci. Hortic. 2023, 309, 111649. [Google Scholar] [CrossRef]
  21. Rouse, J.W.; Haas, R.H.; Schell, J.A.; Deering, D.W. Monitoring Vegetation Systems in the Great Plains with ERTS. In Proceedings of the Third Earth Resources Technology Satellite—1 Symposium, NASA SP-351, Washington, DC, USA, 10–14 December 1973; pp. 309–317. [Google Scholar]
  22. Bannari, A.; Morin, D.; Bonn, F.; Huete, A.R. A Review of Vegetation Indices. Remote Sens. Rev. 1995, 13, 95–120. [Google Scholar] [CrossRef]
  23. Louw, A.S.; Chen, X.; Avtar, R. Assessing the Accuracy of an Infrared-Converted Drone Camera with Orange-Cyan-NIR Filter for Vegetation and Environmental Monitoring. Remote Sens. Appl. 2024, 35, 101229. [Google Scholar] [CrossRef]
  24. Wang, L.; Duan, Y.; Zhang, L.; Rehman, T.U.; Ma, D.; Jin, J. Precise Estimation of NDVI with a Simple NIR Sensitive RGB Camera and Machine Learning Methods for Corn Plants. Sensors 2020, 20, 3208. [Google Scholar] [CrossRef] [PubMed]
  25. Putra, B.T.; Soni, P. Evaluating NIR-Red and NIR-Red edge external filters with digital cameras for assessing vegetation indices under different illumination. Infrared Phys. Technol. 2017, 81, 148–156. [Google Scholar] [CrossRef]
  26. Geipel, J.; Link, J.; Claupein, W. Combined Spectral and Spatial Modeling of Corn Yield Based on Aerial Images and Crop Surface Models Acquired with an Unmanned Aircraft System. Remote Sens. 2014, 6, 10335–10355. [Google Scholar] [CrossRef]
  27. Díaz-Varela, R.A.; De la Rosa, R.; León, L.; Zarco-Tejada, P.J. High-Resolution Airborne UAV Imagery to Assess Olive Tree Crown Parameters Using 3D Photo Reconstruction: Application in Breeding Trials. Remote Sens. 2015, 7, 4213–4232. [Google Scholar] [CrossRef]
  28. Hoffmeister, D.; Waldhoff, G.; Korres, W.; Curdt, C.; Bareth, G. Crop height variability detection in a single field by multi-temporal terrestrial laser scanning. Precis. Agric. 2016, 17, 296–312. [Google Scholar] [CrossRef]
  29. Clapuyt, F.; Vanacker, V.; Van Oost, K. Reproducibility of UAV-based earth topography reconstructions based on Structure-from-Motion algorithms. Geomorphology 2016, 260, 4–15. [Google Scholar] [CrossRef]
  30. Furby, B.; Akhavian, R. A Comprehensive Comparison of Photogrammetric and RTK-GPS Methods for General Order Land Surveying. Buildings 2024, 14, 1863. [Google Scholar] [CrossRef]
  31. Canata, T.F.; Martello, M.; Maldaner, L.F. 3D Data Processing to Characterize the Spatial Variability of Sugarcane Fields. Sugar Tech 2022, 24, 419–429. [Google Scholar] [CrossRef]
  32. He, N.; Chen, B.; Lu, X.; Bai, B.; Fan, J.; Zhang, Y.; Li, G.; Guo, X. Integration of UAV Multi-Source Data for Accurate Plant Height and SPAD Estimation in Peanut. Drones 2025, 9, 284. [Google Scholar] [CrossRef]
  33. Wang, H.; Singh, K.D.; Poudel, H.P.; Natarajan, M.; Ravichandran, P.; Eisenreich, B. Forage Height and Above-Ground Biomass Estimation by Comparing UAV-Based Multispectral and RGB Imagery. Sensors 2024, 24, 5794. [Google Scholar] [CrossRef]
  34. Niu, Y.; Han, W.; Zhang, H.; Zhang, L.; Chen, H. Estimating maize plant height using a crop surface model constructed from UAV RGB images. Biosyst. Eng. 2024, 241, 56–67. [Google Scholar] [CrossRef]
  35. Tucker, C.J. Red and photographic infrared linear combinations for monitoring vegetation. Remote Sens. Environ. 1979, 8, 127–150. [Google Scholar] [CrossRef]
  36. Widjaja Putra, T.B.; Soni, P. Enhanced broadband greenness in assessing chlorophyll a and b, carotenoid, and nitrogen in Robusta coffee plantations using a digital camera. Precis. Agric. 2018, 19, 238–256. [Google Scholar] [CrossRef]
  37. Huete, A.; Didan, K.; Miura, T.; Rodriguez, E.P.; Gao, X.; Ferreira, L.G. Overview of the radiometric and biophysical performance of the MODIS vegetation indices. Remote Sens. Environ. 2002, 83, 195–213. [Google Scholar] [CrossRef]
  38. Mishra, P.; Sadeh, R.; Bino, E.; Polder, G.; Boer, M.P.; Rutledge, D.N.; Herrmann, I. Complementary Chemometrics and Deep Learning for Semantic Segmentation of Tall and Wide Visible and Near-Infrared Spectral Images of Plants. Comput. Electron. Agric. 2021, 186, 106226. [Google Scholar] [CrossRef]
  39. Kior, A.; Yudina, L.; Zolin, Y.; Sukhov, V.; Sukhova, E. RGB Imaging as a Tool for Remote Sensing of Characteristics of Terrestrial Plants: A Review. Plants 2024, 13, 1262. [Google Scholar] [CrossRef]
  40. Elder, T.; Strong, J. The infrared transmission of atmospheric windows. J. Frankl. Inst. 1953, 255, 189–208. [Google Scholar] [CrossRef]
  41. Jacquemoud, S.; Ustin, S.L. Leaf optical properties: A State of the art. In Proceedings of the 8th International Symposium of Physical Measurements & Signatures in Remote Sensing—CNES, Aussois, France, 8–12 January 2001. [Google Scholar]
  42. Winsen, M.; Hamilton, G. A Comparison of UAV-Derived Dense Point Clouds Using LiDAR and NIR Photogrammetry in an Australian Eucalypt Forest. Remote Sens. 2023, 15, 1694. [Google Scholar] [CrossRef]
  43. Oliveira, R.P.; Barbosa Júnior, M.R.; Pinto, A.A.; Oliveira, J.L.P.; Zerbato, C.; Furlani, C.E.A. Predicting Sugarcane Biometric Parameters by UAV Multispectral Images and Machine Learning. Agronomy 2022, 12, 1992. [Google Scholar] [CrossRef]
  44. Portz, G.; Amaral, L.R.; Molin, J.P.; Adamchuk, V.I. Field comparison of ultrasonic and canopy reflectance sensors used to estimate biomass and N-uptake in sugarcane. In Precision Agriculture’13; Springer: Dordrecht, The Netherlands, 2013; pp. 111–117. [Google Scholar] [CrossRef]
  45. Som-ard, J.; Hossain, M.D.; Ninsawat, S.; Veerachitt, V. Pre-harvest sugarcane yield estimation using UAV-based RGB images and ground observation. Sugar Tech 2018, 20, 645–657. [Google Scholar] [CrossRef]
  46. Yang, Y.; Gao, S.; Jiang, Y.; Lin, Z.; Luo, J.; Li, M.; Guo, J.; Su, Y.; Xu, L.; Que, Y. The Physiological and Agronomic Responses to Nitrogen Dosage in Different Sugarcane Varieties. Front. Plant Sci. 2019, 10, 406. [Google Scholar] [CrossRef]
  47. Zeng, X.-P.; Zhu, K.; Lu, J.-M.; Jiang, Y.; Yang, L.-T.; Xing, Y.-X.; Li, Y.-R. Long-Term Effects of Different Nitrogen Levels on Growth, Yield, and Quality in Sugarcane. Agronomy 2020, 10, 353. [Google Scholar] [CrossRef]
  48. Byrnes, B.H. Environmental effects of n fertilizer use—An overview. Fert. Res. 1990, 26, 209–215. [Google Scholar] [CrossRef]
  49. Tyagi, J.; Ahmad, S.; Malik, M. Nitrogenous fertilizers: Impact on environment sustainability, mitigation strategies, and challenges. Int. J. Environ. Sci. Technol. 2022, 19, 11649–11672. [Google Scholar] [CrossRef]
  50. Chakraborty, T.; Akhtar, N. Biofertilizers: Prospects and Challenges for Future. In Biofertilizers; John Wiley & Sons, Inc.: Hoboken, NJ, USA, 2021; pp. 575–590. [Google Scholar] [CrossRef]
  51. Mohsan, S.A.H.; Othman, N.Q.H.; Li, Y.; Alsharif, M.H.; Khan, M.A. Unmanned aerial vehicles (UAVs): Practical aspects, applications, open challenges, security issues, and future trends. Intell. Serv. Robot. 2023, 16, 109–137. [Google Scholar] [CrossRef] [PubMed]
  52. Laghari, A.A.; Jumani, A.K.; Laghari, R.A.; Nawaz, H. Unmanned aerial vehicles: A review. Cogn. Robot. 2023, 3, 8–22. [Google Scholar] [CrossRef]
  53. Silva, W.K.M.; Medeiros, S.E.L.; da Silva, L.P.; Coelho, L.M., Jr.; Abrahão, R. Sugarcane production and climate trends in Paraíba state (Brazil). Environ. Monit. Assess. 2020, 192, 392. [Google Scholar] [CrossRef]
  54. Amorim, M.T.A.; Silvero, N.E.; Bellinaso, H.; Gómez, A.M.R.; Greschuk, L.T.; Campos, L.R.; Demattê, J.A. Impact of soil types on sugarcane development monitored over time by remote sensing. Precis. Agric. 2022, 23, 1532–1552. [Google Scholar] [CrossRef]
  55. Alexandre, M.L.d.S.; e Lima, I.d.L.; Nilsson, M.S.; Rizzo, R.; Silva, C.A.A.C.; Fiorio, P.R. Sugarcane (Saccharum officinarum) Productivity Estimation Using Multispectral Sensors in RPAs, Biometric Variables, and Vegetation Indices. Agronomy 2025, 15, 2149. [Google Scholar] [CrossRef]
  56. Chen, X.L.; Huang, Z.H.; Fu, D.W.; Fang, J.T.; Zhang, X.B.; Feng, X.M.; Xie, J.F.; Wu, B.; Luo, Y.J.; Zhu, M.F.; et al. Identification of genetic loci for sugarcane leaf angle at different developmental stages by genome-wide association study. Front. Plant Sci. 2022, 13, 841693–841705. [Google Scholar] [CrossRef]
  57. Ruwanpathirana, P.P.; Sakai, K.; Jayasinghe, G.Y.; Nakandakari, T.; Yuge, K.; Wijekoon, W.M.C.J.; Priyankara, A.C.P.; Samaraweera, M.D.S.; Madushanka, P.L.A. Evaluation of Sugarcane Crop Growth Monitoring Using Vegetation Indices Derived from RGB-Based UAV Images and Machine Learning Models. Agronomy 2024, 14, 2059. [Google Scholar] [CrossRef]
  58. Priyatikanto, R.; Lu, Y.; Dash, J.; Sheffield, J. Improving generalisability and transferability of machine-learning-based maize yield prediction model through domain adaptation. Agric. For. Meteorol. 2023, 341, 109652. [Google Scholar] [CrossRef]
  59. Silva, M.L.; da Silva, A.R.A.; de Moura Neto, J.M.; Calou, V.B.C.; Fernandes, C.N.V.; Araújo, E.M. Enhanced Water Monitoring and Corn Yield Prediction Using Rpa-Derived Imagery. Eng. Agríc. 2025, 45, e20240092. [Google Scholar] [CrossRef]
  60. Sanches, G.M.; Duft, D.G.; Kölln, O.T.; Luciano, A.C.d.S.; De Castro, S.G.Q.; Okuno, F.M.; Franco, H.C.J. The potential for RGB images obtained using unmanned aerial vehicle to assess and predict yield in sugarcane fields. Int. J. Remote Sens. 2018, 39, 5402–5414. [Google Scholar] [CrossRef]
  61. Liakos, K.G.; Busato, P.; Moshou, D.; Pearson, S.; Bochtis, D. Machine Learning in Agriculture: A Review. Sensors 2018, 18, 2674. [Google Scholar] [CrossRef]
  62. Benos, L.; Tagarakis, A.C.; Dolias, G.; Berruto, R.; Kateris, D.; Bochtis, D. Machine Learning in Agriculture: A Comprehensive Updated Review. Sensors 2021, 21, 3758. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Site location and experimental design, including the marking of rows and subplots within each plot.
Figure 2. Hexacopter UAV equipped with dual sensors (RGB and NIR), a GNSS antenna, and a communication system.
Figure 3. Positioning of the ground control points and detail of the target used during data collection.
Figure 4. Workflow of the methodology adopted for Crop Height Modeling from UAV imagery, including Digital Surface and Terrain Model (DSM and DTM) generation and Crop Height Model (CHM) calculation.
Figure 5. Illustration of the camera modification.
Figure 6. Linear regression between the reference DTM and the DTMs generated at (A) the beginning and (B) the end of the crop cycle (post-harvest) using RGB and NIR images.
Figure 7. Linear regression between field-measured height and CHM-derived height (RGB and NIR) at 99, 144, 164, 200, 228, 269, and 326 DAC.
Figure 8. Linear regression between sugarcane yield (t ha−1) and crop height (m), derived from NIR imagery at 99, 144, 164, 200, 228, 269, and 326 DAC.
Figure 9. Spatial variability of sugarcane yield (t ha−1) based on CHM-derived height at 200 DAC.
Table 1. Data collection dates in days after cutting (DAC), month, crop stage, and precipitation between sampling dates.

| DAC | Month     | Crop Stage | Precipitation (mm) |
|-----|-----------|------------|--------------------|
| 67  | December  | Emergence  | 255.90             |
| 99  | January   | Tillering  | 89.70              |
| 144 | March     | Vegetative | 174.10             |
| 164 | March     | Vegetative | 95.60              |
| 200 | April     | Vegetative | 11.20              |
| 228 | May       | Vegetative | 79.40              |
| 269 | July      | Maturation | 1.90               |
| 326 | September | Maturation | 33.00              |
Table 2. Components and costs.

| Component                      | Mean Price (USD) |
|--------------------------------|------------------|
| Camera—Mobius 1S A2 1440P HD ¹ | 60.15            |
| Filter—650 nm Narrow Bandpass  | 1.44             |
| 3D-printed camera mount        | 9.00             |
| Total                          | 61.59            |

¹ The equipment was sourced from Huizhou Topcens Technology Co., Ltd., Huizhou, China.
Table 3. Pearson correlation coefficient (r), coefficient of determination (R²), root mean square error (RMSE), and relative error (RE, %) between field-measured height and CHM-derived height across seven collection dates.

| Combination               | r    | R²   | RMSE (m) | RE (%) |
|---------------------------|------|------|----------|--------|
| Initial RGB DTM + RGB DSM | 0.95 | 0.90 | 0.26     | 10.36  |
| Initial RGB DTM + NIR DSM | 0.98 | 0.95 | 0.17     | 6.34   |
| Initial NIR DTM + RGB DSM | 0.94 | 0.89 | 0.28     | 10.70  |
| Initial NIR DTM + NIR DSM | 0.97 | 0.94 | 0.20     | 7.16   |
| Final RGB DTM + RGB DSM   | 0.94 | 0.89 | 0.28     | 11.62  |
| Final RGB DTM + NIR DSM   | 0.97 | 0.94 | 0.19     | 7.44   |
| Final NIR DTM + RGB DSM   | 0.92 | 0.85 | 0.32     | 12.83  |
| Final NIR DTM + NIR DSM   | 0.95 | 0.90 | 0.26     | 9.57   |
| Reference DTM + RGB DSM   | 0.95 | 0.90 | 0.25     | 10.05  |
| Reference DTM + NIR DSM   | 0.98 | 0.96 | 0.16     | 5.97   |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
