1. Introduction
Agriculture plays a fundamental role in promoting the Sustainable Development Goals (SDGs), with emphasis on eradicating hunger, using natural resources sustainably, and tackling climate change [1]. To achieve the goals set by the UN as part of the 2030 Agenda, the FAO recommends adopting practices such as precision agriculture, which relies on decision-support systems based on advanced technologies like remote sensing. Remote sensing platforms provide images captured at different wavelengths of the electromagnetic spectrum, offering valuable information about crops that can be expressed through vegetation indices (VIs), thus enabling more efficient and sustainable agricultural production [2].
Multispectral remote sensing encompasses a variety of platforms, such as satellites, aircraft, unmanned aerial vehicles (UAVs), and agricultural machinery equipped with high-tech sensors. These tools have shown great potential for agricultural monitoring, improving forecast accuracy [3]. Each platform has specific characteristics, such as its spatial, spectral, radiometric, and temporal resolutions. UAVs, also known as drones, allow the acquisition of high-spatial-resolution (centimeter-level) images, capturing details not accessible to satellites. However, UAVs face limitations, including sensitivity to wind and precipitation, the need for ground operators, and large data volumes that require prior processing. In contrast, satellite platforms offer rapid and, in some cases, free image acquisition, though they are limited by spatial resolution and by cloud cover at the time of image capture [4,5]. Despite these limitations, multispectral remote sensing has advanced significantly, with new technologies offering resolutions as fine as 1 m, better addressing the needs of agriculture. The relationship between spatial resolution and within-field variability is essential for understanding differences in crop productivity: as pixel size coarsens from 3 m to 10 m, 20 m, and 30 m, the explained variability decreases from 100% to 86%, 72%, and 59%, respectively [6]. A comparative analysis of UAVs, aircraft, and nine satellite image providers found that UAVs had the highest costs and were not economically viable for variable-rate nitrogen fertilization in grain crops, followed by aircraft and then satellites. Although free satellites such as Landsat 7/8 and Sentinel-2 have lower optical quality, the consistency of the data they collect proved highly advantageous [7].
The growing number of satellite missions increases opportunities for continuous vegetation monitoring, which is essential for precision agriculture and environmental sustainability. However, estimating important agronomic characteristics such as nitrogen (N) concentration from Landsat 7, Landsat 8, and Sentinel-2 data remains a significant challenge because of differing sensor configurations and complex atmospheric interactions [8]. A comparative study evaluated the NDVI from three remote sensing platforms (a tractor and a UAV, each with a MicaSense Altum multispectral camera, and the Sentinel-2 satellite) for variable-rate N application in maize. The results indicated a moderately strong correlation among the measurements; however, the UAV data yielded higher values owing to the influence of soil reflectance [9]. Over the past two decades, numerous studies have investigated nitrogen status in maize using multispectral remote sensing. These studies show that different nitrogen levels can be expressed through vegetation indices (VIs). Some use UAVs equipped with multispectral cameras such as the senseFly Sequoia and MicaSense RedEdge-MX, while others use satellites such as Landsat-8 OLI and QuickBird. The results suggest that multispectral imagery is effective for in-field nitrogen management regardless of variety, as nitrogen levels are closely related to indices such as NDVI, NDRE, GNDVI, GRg, EVI, SAVI, NRI, and NCI [10,11,12,13].
Although N is essential for achieving high yields and ensuring global food security, about half of the nitrogen fertilizer applied is not absorbed by crops [14]. This low efficiency is attributed to processes such as volatilization, runoff, leaching, and denitrification, which significantly reduce the return on agricultural investment [15]. In addition, some of these loss pathways, such as nitrate (NO3−) leaching and the emission of greenhouse gases like nitrous oxide (N2O), contaminate surface water and groundwater and contribute to global warming, with the production and use of nitrogen fertilizers accounting for about 5% of global greenhouse gas (GHG) emissions. This inefficiency is particularly evident in maize, where nitrogen deficiency reduces chlorophyll concentration, decreasing radiation absorption and increasing reflectance around 550 nm; the spectral signature of vegetation thus reflects plant health. Such information is crucial for efficient nitrogen management, avoiding input waste and environmental damage [16,17]. Efficient N management is challenging without accurate knowledge of the crop's N status [18]. Various methods and tools are available for N management, including soil tests, plant tissue analysis, spectral response, and the use of VIs. No single method is considered fully sufficient, but precision agriculture tools such as UAVs and satellites have proven superior to conventional methods, which are time-consuming, destructive, and limited in throughput, making efficient laboratory analysis difficult [19]. The adoption of these technologies can ensure proper nitrogen management, resulting in yield increases of up to 18.8% by providing appropriate nutritional support to the crop. Furthermore, such technologies enable continuous, high-precision monitoring of soil and plant conditions, promoting more sustainable and efficient agricultural production [20,21].
Numerous remote sensing platforms have the potential to monitor crop nutritional status, but it is not yet clear whether they offer equally promising results in specific contexts. The hypothesis of this research is that different remote sensing platforms may provide different predictive capabilities for identifying nitrogen deficiency in agricultural crops. This study investigates how key factors (spatial, spectral, radiometric, and temporal resolution) influence the accuracy of deficiency detection. It also explores whether the platforms are complementary or whether one outperforms the others. The effectiveness of the platforms may also be influenced by the crop's phenological stage, highlighting the importance of vegetation indices as crucial analysis tools. Thus, the objective of this article is to compare the multispectral remote sensing platforms Sentinel-2, CBERS-04A, and an unmanned aerial vehicle (UAV) to determine whether these platforms can accurately detect different nitrogen doses (NDs) at various phenological stages of maize using vegetation indices (VIs).
2. Materials and Methods
2.1. Study Area
A field experiment was conducted between January and July 2023 (Figure 1) at the Lageado Experimental Farm of the Faculty of Agronomic Sciences, located in Botucatu, São Paulo (22°48′50″ S, 48°25′51″ W, at an altitude of 778 m).
According to Franco et al. [22], the Köppen–Geiger climate classification [23] for Botucatu is type Aw (tropical savanna climate), characterized by hot, rainy summers and cold, dry winters. The average annual temperature is 21.34 °C, relative humidity is around 70%, and total annual precipitation is approximately 1500 mm, distributed over 107 days per year. The soil in the experimental area was classified as a Typic Rhodudalf [24,25].
2.2. Conducting the Experiment
The experiment was conducted during the 2022/2023 growing season using FS710 hybrid maize (ForSeed®, Changsha, China). Soil preparation was carried out to a depth of 0.30 m by plowing followed by leveling harrowing. At sowing, NPK fertilizer was applied to supply 12 kg ha−1 of N, 42 kg ha−1 of P2O5, and 24 kg ha−1 of K2O, placed 0.05 m below and beside the seeds. Sowing took place on 9 January 2023, with a spacing of 0.90 m between rows and six plants per linear meter, targeting a density of approximately 66,000 plants per hectare.
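As a quick check, the target density follows directly from the row spacing and the in-row plant count:

```latex
\frac{6\ \text{plants m}^{-1}}{0.90\ \text{m between rows}} \times 10{,}000\ \text{m}^2\ \text{ha}^{-1} \approx 66{,}667\ \text{plants ha}^{-1} \approx 66{,}000\ \text{plants ha}^{-1}
```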
2.3. Experimental Design and Treatments
The experiment followed a completely randomized design with six treatments and four replicates. The plots were predefined to ensure a minimum number of pixels per plot and to respect the pixel structure and alignment of the multispectral satellite grids. The position of each plot was determined with a Garmin GPSmap 60CSx receiver (Garmin, Olathe, KS, USA), based on geographic coordinates predefined in QGIS software, version 3.22.10.
The treatments consisted of different levels of topdressed nitrogen fertilization. Application was performed manually by broadcast spreading at the V6 vegetative stage (28 days after emergence), using urea (45% N) as the N source. The applied nitrogen doses (NDs) were 0, 36, 84, 132, 180, and 228 kg ha−1 of N, corresponding to 0%, 30%, 60%, 90%, 120%, and 150%, respectively, of the recommended dosage of 160 kg ha−1 of N proposed by Cantarella et al. [26], applied in a single dose.
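The percentages match the topdressed amounts once the 12 kg ha−1 of N applied at sowing is counted toward the total, an interpretation that fits every non-zero dose; for the 30% level, for example:

```latex
0.30 \times 160\ \text{kg ha}^{-1} - 12\ \text{kg ha}^{-1} = 36\ \text{kg ha}^{-1}\ \text{of topdressed N}
```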
The experiment occupied a total area of 40,600 m2 (considering the plots and the spacing between them), composed of experimental plots of 900 m2, with a spacing of 10 m between them. Each treatment level of nitrogen fertilization occupied a total area of 3600 m2. The dimensions of each plot were 30 m in the north–south direction and 30 m in the east–west direction.
2.4. Obtaining Images
2.4.1. Unmanned Aerial Vehicle (UAV)
The unmanned aerial vehicle (UAV) used in the experiment was a Phantom 4 Pro (SZ DJI Technology Co., Ltd., Shenzhen, China) equipped with an RGB camera, model FC6310 (SZ DJI Technology Co., Ltd., Shenzhen, China), with a focal length of 8.8 mm and a resolution of 5472 × 3648 pixels, resulting in a ground pixel resolution of 0.0232 m. Additionally, it was integrated with an RGN (red, green, near-infrared) camera, model Survey3W (MAPIR, San Diego, CA, USA), with a focal length of 3.4 mm and a resolution of 4000 × 3000 pixels, providing a ground pixel resolution of 0.0402 m. This RGN model includes spectral bands at 0.55 µm (green), 0.66 µm (red), and 0.85 µm (near-infrared) [27].
The average flight time was approximately 12 min and 18 s, covering an area of about 13 ha at an altitude of 80 m above ground level. The data collected by the RGB and RGN cameras were processed using PIX4Dmapper software, version 4.10 (Pix4D S.A., Prilly, Switzerland), which generated orthomosaics and digital maps.
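As a rough consistency check (a sketch assuming the Phantom 4 Pro's nominal 1″ sensor, whose ≈2.4 µm pixel pitch is not stated in the text), the ground sampling distance (GSD) follows the usual pinhole relation from pixel pitch p, flight height H, and focal length f:

```latex
\mathrm{GSD} = \frac{p\,H}{f} = \frac{2.4 \times 10^{-6}\ \mathrm{m} \times 80\ \mathrm{m}}{8.8 \times 10^{-3}\ \mathrm{m}} \approx 0.022\ \mathrm{m}
```

which is of the same order as the 0.0232 m reported above.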
2.4.2. CBERS-04A Satellite
The medium-resolution remote sensing images from the CBERS-04A satellite were obtained from the Remote Sensing Data Center (CDSR), which is responsible for receiving, storing, and processing the images. The data were acquired from http://www.dgi.inpe.br/catalogo/explore (accessed on 15 June 2025) [28], via the image generation division of INPE|CGOBT|DIDGI.
The imagery was captured by CBERS-04A's Wide Scan Multispectral and Panchromatic camera (WPM), which has a spatial resolution of 8 m in the multispectral bands. The spectral bands used correspond to 0.45–0.52 µm (blue), 0.52–0.59 µm (green), 0.63–0.69 µm (red), and 0.77–0.89 µm (near-infrared). The revisit cycle over a fixed point is 31 days [29].
2.4.3. Sentinel-2 Satellite
Data from the Sentinel-2 satellite constellation were obtained via the Copernicus Data Space Ecosystem, which provides image access, navigation, and metadata export. The images were acquired from https://browser.dataspace.copernicus.eu/ (accessed on 15 June 2025) [30], via the data distribution service for the Sentinel-2 missions.

For data acquisition, Level-1C (MSIL1C) products from the Sentinel-2B (S2B) mission were used. The MSI multispectral sensor has 13 spectral bands; four 10 m bands were chosen, centered at 0.49 µm (blue), 0.56 µm (green), 0.665 µm (red), and 0.842 µm (near-infrared). The revisit time at the equator is 5 days for the two-satellite constellation, Sentinel-2A and Sentinel-2B [31].
2.5. Vegetation Indices for UAV, Sentinel-2, and CBERS-04A
After collecting multispectral images from the UAV, Sentinel-2, and CBERS-04A platforms, the satellite images were inspected in the open-source Geographic Information System (GIS) QGIS, version 3.22.10, to verify the presence of clouds that could compromise visualization of the experimental area. The spectral bands were then processed using the mathematical formulas of the vegetation indices (VIs) (Table 1). Index values were extracted as the mean pixel value within each experimental plot.
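As an illustration of this step, the sketch below computes two of the indices from a multispectral raster and averages the pixels inside each plot polygon; the file names, band order, and plot layer are hypothetical, and the "terra" package stands in for the raster calculator used in QGIS.

```r
# Sketch of the VI extraction step (hypothetical file names and band order).
library(terra)

img   <- rast("orthomosaic.tif")   # multispectral image; bands: green, red, NIR
plots <- vect("plots.gpkg")        # polygons of the 24 experimental plots

green <- img[[1]]
red   <- img[[2]]
nir   <- img[[3]]

# Two example vegetation indices
ndvi  <- (nir - red)   / (nir + red)
gndvi <- (nir - green) / (nir + green)

# Mean pixel value of each index within every plot
vi_means <- extract(c(ndvi, gndvi), plots, fun = mean, na.rm = TRUE)
names(vi_means) <- c("plot", "NDVI", "GNDVI")
head(vi_means)
```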
Throughout the maize growing season, nine data acquisition campaigns were conducted, covering three representative phenological windows: vegetative (V7), early reproductive (VT), and advanced reproductive (R3, R4, and R5). At each stage, imagery of the experimental area was acquired from all three remote sensing platforms at their respective spatial resolutions.
The UAV provided between 1,515,339 and 1,671,849 pixels per plot, owing to the exceptionally high spatial resolution of its multispectral camera, which yielded a large number of pixels for calculating the average vegetation indices (VIs). For the Sentinel-2 satellite, the average was based on the 4 pixels located at the center of each plot, a consequence of the plots' alignment with the sensor grid. For the CBERS-04A satellite, the average was calculated from 6 to 9 centrally located pixels per plot, owing to the finer spatial resolution of its camera, which allows greater pixel variability within the plots. The different spatial resolutions enabled a detailed analysis of the study area and a comparison between the remote sensing platforms used, as illustrated in Figure 2, Figure A1, and Figure A2 for the UAV, Sentinel-2, and CBERS-04A, respectively.
2.6. Leaf Sampling and Diagnosis
Fresh plant tissue was collected at the R1 stage (flowering), sampling 30 leaves as recommended by Malavolta et al. [41]. The leaves were dried in a forced-air circulation oven at 65 °C until they reached a constant weight and were then ground in a Wiley-type knife mill (1 mm mesh). Leaf nitrogen content was determined following AOAC methods [42], using sulfuric acid digestion to obtain extracts that were distilled by the Kjeldahl method, with N quantified by titration.
2.7. Statistical Analysis
2.7.1. Analysis of Variance
The data were analyzed using RStudio software (version 2022.07.1+554, “Spotted Wakerobin”). Data normality was verified using the Shapiro–Wilk and Royston tests, while homogeneity of variances was assessed using Bartlett's test. For data that did not meet these assumptions, transformations were applied with the “bestNormalize” package. One-way ANOVA was used to evaluate leaf nitrogen content, followed by quadratic regression analysis performed with the “stats” package; the data for the corresponding graph were organized with the “dplyr” package.
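A minimal sketch of this workflow is shown below, assuming a hypothetical data frame "leafN" with a numeric dose column ("dose", kg ha−1 of N) and the measured leaf N content ("n_content"); the multivariate Royston test is omitted.

```r
# Hypothetical data frame 'leafN' with columns 'dose' and 'n_content'.
library(bestNormalize)

shapiro.test(leafN$n_content)                          # normality (Shapiro–Wilk)
bartlett.test(n_content ~ factor(dose), data = leafN)  # homogeneity of variances

# Transform only if the assumptions above are violated
leafN$n_t <- bestNormalize(leafN$n_content)$x.t

# One-way ANOVA on the N doses, then quadratic regression ('stats' package)
summary(aov(n_t ~ factor(dose), data = leafN))
summary(lm(n_t ~ dose + I(dose^2), data = leafN))
```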
Two-way ANOVA was applied to assess the effects of nitrogen doses (NDs) and remote sensing platforms (RSPs) on vegetation indices (VIs). Three phenological stages of the maize crop were analyzed (V7, VT, and the reproductive stages R3, R4, and R5, which were analyzed together), allowing for a comprehensive comparison of the variables throughout the plant’s development. When significant differences were observed, the Least Significant Difference (LSD) multiple comparison test was applied at a 5% significance level.
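The two-way analysis can be sketched similarly; the text does not name the package used for the LSD test, so the "agricolae" implementation is shown here as one common option, again with a hypothetical data frame "vi_data".

```r
# Hypothetical data frame 'vi_data' with factors 'dose' and 'platform'
# and a numeric vegetation-index column 'vi'.
library(agricolae)  # one common implementation of the LSD test

fit <- aov(vi ~ dose * platform, data = vi_data)
summary(fit)

# LSD multiple comparisons at the 5% significance level
LSD.test(fit, "dose", alpha = 0.05)$groups
LSD.test(fit, "platform", alpha = 0.05)$groups
```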
2.7.2. Principal Component Analysis and Pearson Correlation
Principal component analysis (PCA) was used to reduce data dimensionality and identify relevant patterns among the variables. The analyses were carried out using RStudio software, version 2022.07.1+554 “Spotted Wakerobin”, with the statistical packages “FactoMineR” and “factoextra”, which enabled the extraction and visualization of quantitative and categorical variables. Additionally, Pearson’s correlation was calculated using the same variables, with the aid of the “corrplot” package.
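A sketch of this step, again using the hypothetical "vi_data" data frame, whose numeric columns (placeholder names below) hold the vegetation indices:

```r
# Hypothetical data frame 'vi_data'; 'vi_cols' names its index columns.
library(FactoMineR)
library(factoextra)
library(corrplot)

vi_cols <- c("NDVI", "GNDVI", "SAVI")  # placeholder index names

pca_res <- PCA(vi_data[, vi_cols], graph = FALSE)
fviz_pca_biplot(pca_res)               # variables and individuals together

# Pearson correlation matrix of the same variables
corrplot(cor(vi_data[, vi_cols], method = "pearson"), method = "circle")
```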