Article

An Innovative Method of Monitoring Cotton Aphid Infestation Based on Data Fusion and Multi-Source Remote Sensing Using Unmanned Aerial Vehicles

1 College of Mechanical Engineering, Xinjiang University, Urumqi 830017, China
2 Key Laboratory of Integrated Pest Management on Crop in Northwestern Oasis, Ministry of Agriculture and Rural Affairs, Institute of Plant Protection, Xinjiang Academy of Agricultural Sciences, Urumqi 830091, China
3 Xinjiang Uygur Autonomous Region Plant Protection and Quarantine Station, Urumqi 830006, China
* Authors to whom correspondence should be addressed.
Drones 2025, 9(4), 229; https://doi.org/10.3390/drones9040229
Submission received: 8 February 2025 / Revised: 11 March 2025 / Accepted: 19 March 2025 / Published: 21 March 2025
(This article belongs to the Section Drones in Agriculture and Forestry)

Abstract

Cotton aphids are the primary pests that adversely affect cotton growth, and they also transmit a variety of viral diseases, seriously threatening cotton yield and quality. Traditional single-source remote sensing improves monitoring efficiency to a certain extent, but it cannot capture the complex spatial distribution of aphid infestation or identify it accurately. There is therefore a pressing need for efficient, high-precision UAV remote sensing technology for identifying and localizing infestations. To address these problems, this study fused panchromatic and multispectral UAV images using the Gram–Schmidt technique, extracted multiple vegetation indices, and analyzed their correlation with aphid damage indices. After fusion, the correlation between vegetation indices and aphid infestation severity improved significantly, more accurately reflecting the spatial distribution of the infestation. Six machine learning methods were then applied to model and evaluate the multispectral and fused image data. Validation revealed that the GBDT (Gradient-Boosting Decision Tree) model built on the GLI, RVI, DVI, and SAVI vegetation indices from the fused data performed best, with an estimation accuracy of R2 = 0.88 and an RMSE of 0.0918, clearly surpassing the other five models. The fused panchromatic and multispectral images combined with the GBDT model also significantly outperformed single multispectral imagery in both precision and efficiency. In conclusion, this study demonstrates the effectiveness of image fusion combined with GBDT modeling for cotton aphid monitoring.

1. Introduction

Cotton is an important cash crop, widely used in the textile and chemical industries, and is cultivated in more than 70 countries and territories worldwide, with China, the United States, and India among the major producers [1]. Pests and diseases are among the major factors hindering the quality and efficiency of the cotton industry, and aphids are the main pest restricting its development, causing yield losses and infesting areas at rates far higher than other pests. The cotton aphid exhibits rapid population outbreaks, high dispersal capacity, and strong sensitivity to environmental fluctuations (e.g., temperature, humidity) [2]. At present, pest monitoring relies mainly on manual inspection, and control on insecticide spraying, leading to high costs and wasted resources, so research on precise monitoring and control is urgently needed. It is therefore crucial to establish an integrated monitoring framework for cotton aphid outbreaks: surveillance systems that detect infestations in a timely manner, precise control strategies to mitigate population expansion, and quantitative severity assessment models based on the fusion of multi-source data. Combining UAV spectral remote sensing with machine learning algorithms to generate pest heat maps can significantly improve the accuracy and timeliness of cotton aphid monitoring.
In recent years, monitoring technologies have been extensively employed in the agricultural sector [3,4], including satellite remote sensing [5,6], remote sensing by unmanned aerial vehicles (UAVs), and manual visual image processing. Satellite remote sensing offers a large observation range, real-time capability, objectivity, and efficiency, providing continuous farmland data and supporting monitoring of pests, diseases, and crop growth [7]. Despite technological advancements, challenges persist in applying satellite-based monitoring to cotton aphid management. High-spatial-resolution satellites are commercially available, but their operational costs remain prohibitive for routine agricultural use. Although spectral calibration tools mitigate illumination variability, persistent limitations remain: cloud cover obstructs optical data acquisition during critical infestation periods, and fine-scale aphid density maps are difficult to validate against ground truth because satellite pixels and field sampling plots are spatially mismatched. Manual visual monitoring relies on people or fixed equipment for image acquisition; although precise and convenient, it covers only a limited area, is inefficient, and cannot feasibly span large farmland regions.
In contrast, UAV remote sensing technology, which is inexpensive, highly mobile, and flexible in temporal and spatial resolution [8,9,10], has become a preferred means of monitoring agricultural pests and diseases [11], providing technological support for precise monitoring in small and medium-sized fields through high-resolution imagery and rich spectral information [12]. Pest and disease stress affects plant photosynthesis and leaf structure, changing spectral features and thereby enabling pest and disease identification and stress assessment [13]. Hu et al. (2022) proposed a method based on spectral index reconstruction that determines thresholds between neighboring severity grades through sequence matching; its threshold delineation accuracy was no less than 0.9, and the proposed hyperspectral feature reconstruction technique effectively graded cotton aphid infestation severity, showing good application value [14]. However, that study had a small sample size and lacked samples from complicated field situations. Guo et al. (2021) used derivative ratio spectroscopy (DRS) to screen the sensitive bands of cotton canopy spectra and combined the reflectance characteristic bands with derivative ratio (DR) spectral parameters to establish a multivariate composite prediction model with an accuracy of 0.612; the resulting distribution maps of cotton aphid severity classes provide zoning decision support for precise variable-rate pesticide application [15].
UAV remote sensing technology has achieved remarkable results in monitoring agricultural pests and diseases, effectively assessing changes in plants' spectral characteristics through high-resolution imagery and multispectral information and enabling accurate monitoring of crop stresses. However, the existing single-data-source approach (i.e., monitoring with only one sensor) generalizes relatively poorly across different environments and aphid damage levels, which can make the monitoring effect unstable. To compensate for this deficiency, fusing multispectral images with panchromatic images has become an effective way to enhance monitoring accuracy [16]. Fused images not only enhance pest and disease recognition but also improve stability and robustness in complex environments; the fusion technique reduces cost, improves processing efficiency, and supports refined spatial analysis [17]. In addition, fused images enhance model generalization, making performance more stable across regions and seasons.
To overcome the shortcomings of traditional remote sensing methods, machine learning has become the mainstream technology in remote sensing pest monitoring in recent years. Jiang et al. (2023) proposed a multi-algorithm insect prediction framework that integrates XGBoost and the Grey Wolf Optimizer (GWO) with support vector regression (SVR) to construct an aphid prediction model with a classification accuracy above 95%. Such methods can fully exploit the relationship between image features and aphid severity and improve monitoring accuracy [18], although the limited sensor band range used in that work makes recognition more challenging. In addition, combining field survey data with visual verification provides effective auxiliary validation of remote sensing results and increases the precision of pest identification and classification [19]. Xu et al. (2021) proposed a cotton yield prediction method integrating multi-source remote sensing data, resolving boll-cracking features in high-resolution visible images via the U-Net semantic segmentation network and building a multi-scale yield estimation model from multispectral vegetation indices and spatial coverage parameters [20]. Xiang et al. (2023) developed the CTFuseNet dual-stream feature fusion framework for fine-grained crop type classification, integrating the local texture extraction of CNNs with the global context modeling of Transformers to achieve 85.33% mean intersection over union and 92.46% pixel accuracy (PA) in crop segmentation of remotely sensed images. The method performs multi-scale adaptive feature fusion through a cross-attention mechanism, verifies the synergy of hybrid architectures in agricultural remote sensing, and provides algorithmic support for plot-level precision agriculture management [21].
In summary, existing methods for monitoring cotton pests and diseases still have deficiencies, mainly the limitations of single data sources and weak generalization ability. Satellite remote sensing has advantages for large-scale monitoring but is strongly affected by spatial resolution and weather and thus struggles to achieve precise monitoring. Ground visual monitoring is precise but covers little area and is inefficient. UAV remote sensing has achieved remarkable results in pest and disease monitoring, yet existing methods are unstable under complex outdoor conditions and limited by image data processing and generalization capabilities.
To solve these problems, this research compensates for the limitations of a single data source by combining the spectral features of multispectral images with the spatial detail of panchromatic images, and constructs a GBDT monitoring model based on vegetation indices strongly correlated with cotton aphid damage to enhance the accuracy of identification and monitoring. The multifactor model based on fused images can accurately identify the degree of infestation and guide plant protection equipment toward precise control, providing important support for precision and green agriculture.

2. Materials and Methods

2.1. Study Area

This experiment was conducted on a 6.07 ha study site in the 121st Corps, Shihezi City, Xinjiang Uygur Autonomous Region, China (85°33′22″ E, 44°47′21″ N; elevation 450 m; see Figure 1), which lies in a temperate continental arid climate zone with an average annual temperature of 6.5–7.2 °C, annual rainfall of 170 mm, and 2780 h of annual sunshine. The soil is mainly sandy, with a pH of 7.2–8.5 [12]. The early-maturing cotton variety Xinluzao 57 was sown by mechanized precision seeding on 20 April at a planting density of 18,000 plants/hm2. No potassium, phosphorus, or plant growth regulators were applied to the experimental field, and crop management otherwise followed local practice at non-experimental sites. A total of 150 sampling points were established.

2.2. Data Acquisition

2.2.1. UAV Image Acquisition

The study used a data acquisition platform consisting of a DJI M300 RTK industrial UAV (DJI, Shenzhen, China), a RedEdge-MX Dual 10-band multispectral camera (MicaSense, Seattle, WA, USA), and a DJI ZENMUSE P1 high-definition RGB camera (DJI, Shenzhen, China) (Figure 2a). Images were acquired at the experimental site on 20 July 2023 between 12:00 and 14:00, the period in which the cotton was under the most severe aphid stress. To limit weather effects on the data, collection took place at midday on a sunny day with a maximum wind speed of 4 m/s. The drone was flown twice, once per sensor. The multispectral data were collected at an altitude of 50 m with a Ground Sampling Distance (GSD) of 3.55 cm/pixel, and the camera was calibrated at a height of 1 m above a calibrated reflectance panel before and after each flight; the panchromatic data were collected with the RGB camera at an altitude of 20 m, with 80% forward and side overlap. The specific acquisition parameters are detailed in Table 1.
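For context, the ground sampling distance follows directly from flight altitude and camera geometry. The sketch below uses nominal optics values assumed purely for illustration (pixel pitch and focal length are not specified in the paper):

```python
def gsd_cm(altitude_m, pixel_pitch_um, focal_length_mm):
    """Ground sampling distance in cm/pixel from basic camera geometry."""
    return (pixel_pitch_um * 1e-6) * altitude_m / (focal_length_mm * 1e-3) * 100

# Nominal RedEdge-MX-class optics (assumed: 3.75 um pixels, 5.4 mm lens).
print(gsd_cm(50, 3.75, 5.4))  # ~3.47 cm/pixel at 50 m, near the reported 3.55 cm
```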

2.2.2. Cotton Aphid Field Survey Data Collection

Samples at different aphid severity levels were collected manually by plant protection experts in the cotton fields of the study area; leaves at different damage levels are shown in Figure 3. A 50 cm × 50 cm frame was used to survey aphid severity at each sample point (Figure 2b), with at least 5 cotton plants investigated per point. Severity was graded with reference to the national standard 'Cotton Aphid Measurement and Reporting Technical Specification' (GB/T 15799-2011) [14]: the aphid damage index (II) of each sample point was calculated and used to classify the infestation into five grades, as shown in Table 2 [22]. Sample point locations and infestation levels were recorded with a Qianxun differential GNSS positioning instrument (Qianxun SI, Huzhou, China) (Figure 2b).
The aphid damage index (II, %) [14] was calculated based on the cotton aphid damage severity grading criteria using the following formula:
$$II = \frac{\sum (n \cdot f)}{N \cdot \sum f} \times 100\% \tag{1}$$
In the aphid index formula, n denotes the aphid damage level, f the number of plants at that level, and N the highest aphid level in the grading scale.
According to the aphid index (II), the severity of damage to the cotton canopy was divided into five levels: normal (0): II = 0%; mild (1): 0% < II ≤ 25%; moderate (2): 25% < II ≤ 50%; severe (3): 50% < II ≤ 75%; and very severe (4): 75% < II ≤ 100%.
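A minimal worked example of this calculation for one hypothetical sample point:

```python
# Hypothetical counts for one 50 cm x 50 cm sample point:
# counts[n] = number of plants f observed at aphid level n (0-4).
counts = {0: 2, 1: 1, 2: 1, 3: 1, 4: 0}

N = 4  # highest aphid level in the grading scale
total_plants = sum(counts.values())

# II = sum(n * f) / (N * sum(f)) * 100%
II = sum(n * f for n, f in counts.items()) / (N * total_plants) * 100
print(f"II = {II:.1f}%")  # 30.0% -> moderate, grade 2 (25% < II <= 50%)
```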

2.2.3. Data Pre-Processing

The study used Agisoft Metashape Professional (Agisoft LLC, Saint Petersburg, Russia) to pre-process the spectral images taken by the UAV, including image import, camera calibration, and image alignment. After import, camera calibration was performed first to correct camera-induced geometric errors and guarantee photo accuracy. Image alignment then automatically identified overlapping areas between images, calculated matching points, and generated a dense point cloud. Digital Surface Models (DSMs) and Digital Orthophoto Maps (DOMs) were generated to correct geometric distortions so that each pixel aligned with its actual geographic coordinates. The exported images underwent further pre-processing in ENVI 5.6 (NV5 Geospatial, Boulder, CO, USA). First, radiometric correction was performed: a diffuse reflectance reference panel photographed before the UAV flight was used to remove the influence of the atmosphere, sensor, and other environmental factors on the radiometric values of the image, improving the precision of the spectral data and ensuring the values approximated surface reflectance. Geometric correction then accurately aligned the imagery with the ground coordinate system through ground control points (GCPs), providing the basis for subsequent spatial analyses and integration with other remote sensing data. Noise removal, filtering, and related operations were next applied to remove noise and unnecessary detail and enhance image quality. Finally, the processed images were cropped to the experimental area, and ENVI was used to map regions of interest (ROIs) and label the actual ground area, providing accurate data for subsequent spectral index calculation and other analyses.
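In essence, the panel-based radiometric step scales each band by the ratio of the panel's known reflectance to its observed signal. A one-band sketch of that idea (the panel reflectance value is illustrative, not taken from the paper):

```python
import numpy as np

def dn_to_reflectance(band_dn, panel_dn_mean, panel_reflectance=0.49):
    """Single-factor empirical calibration against a reference panel.

    panel_dn_mean: mean digital number over the panel pixels in this band.
    panel_reflectance: known albedo of the panel (illustrative value).
    """
    scale = panel_reflectance / panel_dn_mean
    return np.clip(band_dn * scale, 0.0, 1.0)  # reflectance bounded to [0, 1]
```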

2.2.4. Image Fusion and Alignment

Because the panchromatic and multispectral images were acquired by different sensors, their coordinate information deviates, necessitating coordinate and image alignment before the MSI data can be fused with the high-resolution panchromatic images. In ENVI, 127 feature point pairs were first identified in the panchromatic and MSI images for alignment, with the RMSE controlled within 3 mm; the alignment workflow then used cubic convolution interpolation; finally, the coordinate-aligned panchromatic and MSI datasets were obtained for fusion processing.
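Conceptually, this workflow pairs control points between the two images, fits a geometric transform, and resamples with cubic convolution. A minimal OpenCV sketch of that idea (8-bit grayscale inputs assumed; ENVI's own implementation differs):

```python
import cv2
import numpy as np

def register_to_pan(pan_gray, ms_band):
    """Align one MS band to the panchromatic grid via feature matching."""
    orb = cv2.ORB_create(500)
    k1, d1 = orb.detectAndCompute(pan_gray, None)
    k2, d2 = orb.detectAndCompute(ms_band, None)

    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)
    matches = sorted(matches, key=lambda m: m.distance)[:127]  # 127 pairs, as above

    src = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)

    h, w = pan_gray.shape
    # Cubic convolution resampling, matching the interpolation named above.
    return cv2.warpPerspective(ms_band, H, (w, h), flags=cv2.INTER_CUBIC)
```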
Gram–Schmidt (GS) fusion [23] is an image enhancement technique based on orthogonalized vector-space projection, which preserves the original spectral features as far as possible while raising spatial detail to the level of the panchromatic band. Within the GS orthogonal decomposition framework, a synthetic panchromatic image is generated by linearly combining the multispectral bands as the initial orthogonal basis vector, and the orthogonal components are calculated iteratively band by band. The high-resolution panchromatic image then replaces the first orthogonal basis vector, and the fused image is reconstructed by the inverse transformation, as shown in Formulas (2)–(5) [24]. Because the synthetic panchromatic image is locally spectrally consistent with the real panchromatic band, this type of fusion outperforms other image fusion methods based on matrix analysis and is well suited to agricultural pest monitoring. The method reduces data redundancy by compressing the spectral dimensions of the multispectral image via principal component analysis (PCA), extracting the orthogonal principal components with the greatest variance contribution [25], with the Gram–Schmidt orthogonalization process ensuring independence between components. The detailed information of the high-resolution panchromatic data is then precisely embedded into the first principal component of the multispectral data, with a fusion coefficient controlling the degree of fusion. Finally, the inverse PCA transform restores the fused components to the original image space to obtain the final high-resolution fused image (Figure 4).
$$GS_1 = HR_{sim} \tag{2}$$

$$GS_2 = \widehat{LR_1} - \frac{\left\langle \widehat{LR_1}, HR_{sim} \right\rangle}{\left\langle HR_{sim}, HR_{sim} \right\rangle} \times HR_{sim} \tag{3}$$

$$GS_3 = \widehat{LR_2} - \frac{\left\langle \widehat{LR_2}, HR_{sim} \right\rangle}{\left\langle HR_{sim}, HR_{sim} \right\rangle} \times HR_{sim} - \frac{\left\langle \widehat{LR_2}, GS_2 \right\rangle}{\left\langle GS_2, GS_2 \right\rangle} \times GS_2 \tag{4}$$

$$GS_{b+1} = \widehat{LR_b} - \frac{\left\langle \widehat{LR_b}, HR_{sim} \right\rangle}{\left\langle HR_{sim}, HR_{sim} \right\rangle} \times HR_{sim} - \sum_{k=2}^{b} \frac{\left\langle \widehat{LR_b}, GS_k \right\rangle}{\left\langle GS_k, GS_k \right\rangle} \times GS_k \tag{5}$$
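The substitution-and-inverse-transform procedure above is often implemented through its detail-injection equivalent, in which each band receives the high-resolution detail scaled by its Gram–Schmidt projection coefficient. A minimal NumPy sketch under that simplification (using the unweighted band mean as the simulated panchromatic is an assumption, not the paper's exact choice):

```python
import numpy as np

def gs_pansharpen(ms, pan):
    """Simplified Gram-Schmidt-style pan-sharpening (detail-injection form).

    ms:  (bands, H, W) multispectral cube, upsampled to the panchromatic grid.
    pan: (H, W) panchromatic image, co-registered with ms.
    """
    b, h, w = ms.shape
    X = ms.reshape(b, -1).astype(np.float64)
    p = pan.reshape(-1).astype(np.float64)

    # GS1: simulated low-resolution panchromatic (unweighted band mean here).
    sim = X.mean(axis=0)

    # Histogram-match the real pan to the simulated one so the injected
    # detail has a compatible mean and variance.
    p = (p - p.mean()) / p.std() * sim.std() + sim.mean()

    # Projection coefficient of each band onto GS1 -- the Gram-Schmidt step.
    gains = np.array([np.cov(X[k], sim)[0, 1] / np.cov(sim) for k in range(b)])

    # Swapping the real pan for GS1 and inverting the transform reduces to
    # injecting the high-resolution detail (p - sim) scaled by each gain.
    fused = X + gains[:, None] * (p - sim)[None, :]
    return fused.reshape(b, h, w)
```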

2.2.5. Cotton Aphid Vegetation Index Construction

Crops under pest stress undergo significant changes in the internal biophysical parameters of their leaves, producing a characteristic response in the spectral reflectance of the canopy; the spectral response is a function of water content, pigmentation, morphology, and structural properties [26,27]. The distinctive absorption and reflection of leaves in the visible and near-infrared bands have been used to derive vegetation indices that are systematically applied to phenotypic monitoring of field crops and early warning of pests and diseases [28,29,30]. Based on the pathology of aphid-infested leaves and the band characteristics of the sensors used, 10 vegetation indices (VIs) commonly used in cotton aphid analysis were constructed (see Table 3), including the Atmospheric Resistance Index (ARI), Green Leaf Index (GLI), Green Biome Index (GBI), and Soil-Adjusted Vegetation Index (SAVI).
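For illustration, the indices later screened as model inputs reduce to simple band arithmetic on the reflectance rasters. A minimal sketch following the formula forms in Table 3 (function and argument names are assumptions; L is the SAVI soil adjustment factor):

```python
import numpy as np

def vegetation_indices(B, G, R, NIR, L=0.5):
    """Selected indices from Table 3; inputs are reflectance arrays."""
    eps = 1e-9  # guard against division by zero
    return {
        "GLI":  (2 * G - R - B) / (2 * G + R + B + eps),
        "RVI":  R / (NIR + eps),
        "DVI":  NIR - R,
        "SAVI": (NIR - G) * (1 + L) / (NIR + G + L + eps),
    }
```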

2.3. Data Analysis and Machine Learning Model Construction

In this study, we first analyzed the spectral characteristics of cotton canopies under different aphid damage levels together with the sensitivities of the spectral features. Pearson's correlation analysis in R (R Core Team, Vienna, Austria) was applied between the ten spectral indices of the target area in the remote sensing images and the aphid damage levels measured on the ground, in order to screen out the indices related to aphid damage. The indices with the highest correlations were then modeled using six machine learning methods: Linear, Ridge, Decision Tree, Random Forest, AdaBoost, and GBDT [39]. The models used the field-measured aphid data as the target variable and the screened, highly correlated spectral indices as independent variables to predict the degree of aphid damage. Finally, the models were validated against ground-truth data to assess prediction accuracy. A total of 150 aphid ground sample datapoints were collected. Owing to high labor costs and the time constraints of data acquisition, the dataset remained small even though the sample size was maximized to improve training. Therefore, to balance model fitting against generalization assessment, a stratified random partitioning strategy was used to split the dataset into a training set (70%, 105 samples) and a test set (30%, 45 samples).
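A minimal scikit-learn sketch of this modeling workflow (the feature matrix here is random placeholder data, and the stratification on severity grade is omitted for brevity):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import (RandomForestRegressor, AdaBoostRegressor,
                              GradientBoostingRegressor)
from sklearn.metrics import r2_score, mean_squared_error

# X: (150, 4) matrix of screened indices (e.g., GLI, RVI, DVI, SAVI);
# y: (150,) measured aphid damage indices. Placeholder data shown here.
rng = np.random.default_rng(0)
X, y = rng.random((150, 4)), rng.random(150)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "Linear": LinearRegression(),
    "Ridge": Ridge(),
    "Decision Tree": DecisionTreeRegressor(random_state=0),
    "Random Forest": RandomForestRegressor(random_state=0),
    "AdaBoost": AdaBoostRegressor(random_state=0),
    "GBDT": GradientBoostingRegressor(random_state=0),
}
for name, m in models.items():
    m.fit(X_tr, y_tr)
    pred = m.predict(X_te)
    rmse = mean_squared_error(y_te, pred) ** 0.5
    print(f"{name}: R2={r2_score(y_te, pred):.2f}, RMSE={rmse:.4f}")
```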

2.4. Model Accuracy Evaluation

The performance of the regression models was jointly assessed using the coefficient of determination (R2) and the root mean square error (RMSE). R2 quantifies the model's ability to explain the variance of the target variable (0 ≤ R2 ≤ 1; values closer to 1 indicate a better fit). The RMSE reflects the absolute deviation of the predicted values from the true values, i.e., the average size of the prediction error; a smaller RMSE indicates better prediction performance. R2 and RMSE are calculated by Formulas (6) and (7).
$$R^2 = 1 - \frac{\sum_{i=1}^{n} \left( y_i - \hat{y}_i \right)^2}{\sum_{i=1}^{n} \left( y_i - \bar{y} \right)^2} \tag{6}$$

$$RMSE = \sqrt{\frac{1}{m} \sum_{i=1}^{m} \left( y_i - \hat{y}_i \right)^2} \tag{7}$$
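Implemented directly from Formulas (6) and (7), these metrics are a few lines of NumPy; a sketch:

```python
import numpy as np

def r2(y_true, y_pred):
    # Formula (6): 1 minus residual variation over total variation.
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1 - ss_res / ss_tot

def rmse(y_true, y_pred):
    # Formula (7): root mean square prediction error.
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return np.sqrt(np.mean((y_true - y_pred) ** 2))
```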

3. Results

3.1. Spectral Characterization of Cotton Canopies with Different Aphid Infestation Classes

Reduced chlorophyll content after aphid infestation lowers reflectance in the blue (450–495 nm) and red (620–700 nm) bands. As illustrated in Figure 5, the spectral features of the cotton canopy under aphid stress differed significantly among the pest index groups B0, B1, B2, B3, and B4.
Figure 5 illustrates the pronounced changes in the spectral properties of aphid-infested cotton. As infestation severity increased, chlorophyll concentration and photosynthesis were impaired, and spectral reflectance trended downward. Cotton spectral reflectance rose slowly from 444 to 560 nm, declined slowly from 560 to 668 nm, and, after reaching a minimum, increased gradually from 668 to 842 nm, with the steepest rise at 705–740 nm; canopies at different severity levels varied considerably. Differences in reflectance across the visible range (444–705 nm) were relatively small, whereas they were particularly noticeable at the red edge and in the near-infrared (705–842 nm). Variations in leaf reflectance became more pronounced as aphid damage increased, closely tied to growing tissue structural damage and declining chlorophyll content. Given this relationship between spectral features and infestation, vegetation indices were chosen as indicators for further analysis: they directly reflect vegetation growth and health, which are linked to the physiological alterations caused by aphids. Analyzing the correlation between vegetation indices and the aphid damage index therefore clarifies how infestation alters the spectral characteristics of cotton and supports more efficient techniques for tracking and forecasting aphid damage.

3.2. Data Modeling

3.2.1. Correlation Analysis

The selected spectral indices and aphid indices were analyzed for correlation, and the relationships are shown as a correlation heat map in Figure 6. ARVI (0.77), SAVI (0.68), GBI (0.66), and ARI (0.60) had the four highest absolute correlation coefficients in the multispectral images, whereas GLI (0.89), RVI (0.83), DVI (0.79), and SAVI (0.78) were the top four in the fused panchromatic and multispectral images. Aphid infestation levels and spectral indices were significantly correlated (Table 4). Further analysis revealed that the absolute correlation coefficients from the fused images were markedly higher than those from the original multispectral data, i.e., the fused images related more strongly to the extent of infestation (Table 4). These findings guide variable selection for the regression models: indices with larger |r| can be chosen as model inputs, and estimation precision can then be confirmed by contrasting predicted and measured values (R2, RMSE).
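The screening step itself is plain Pearson correlation between each index sampled at the ground points and the measured aphid index. Although the paper used R, an equivalent pandas sketch (file and column names are assumptions):

```python
import pandas as pd

# One row per sample point: index values sampled from the fused raster
# plus the field-measured aphid index (hypothetical file and columns).
df = pd.read_csv("sample_points.csv")

corr = df[["GLI", "RVI", "DVI", "SAVI", "aphid_index"]].corr(method="pearson")
# Rank indices by |r| against the aphid index to pick model inputs.
ranking = corr["aphid_index"].drop("aphid_index").abs().sort_values(ascending=False)
print(ranking)
```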

3.2.2. Machine Learning Modeling Comparison

Correlation analysis screened out the spectral indices significantly associated with cotton aphid infestation: ARVI, SAVI, GBI, and ARI for the multispectral data, and GLI, RVI, DVI, and SAVI for the fused data. These four indices per dataset, selected by the absolute value of their correlation with aphid infestation, were input into each of the six machine learning models for training. Aphid monitoring models of different levels were constructed and monitoring maps generated from the fitting results (e.g., Figure 7 and Figure 8); both the multispectral and the fused data achieved high fitting accuracy in the Random Forest, AdaBoost, and GBDT models. The fused data performed more prominently in these models, initially verifying that image fusion offers advantages for the accuracy and stability of aphid monitoring models.
To specifically assess the difference in monitoring performance between the two image datasets, Figure 9 shows the R2 ranges on the validation set: 0.32–0.83 for the multispectral image data versus 0.64–0.88 for the fused image data. As Table 5 shows, R2 improved to varying degrees after fusion for every model. Comparing the R2 values of the two datasets makes it evident that models built on the fused image data achieve noticeably better monitoring accuracy and stronger predictive capacity than those built on the multispectral image data. This shows that fused image data capture aphid damage information more effectively, providing important technical support for high-precision aphid damage monitoring and prediction models.
To further verify the prediction accuracy of the models built on fused image data, their performance was assessed by comparing the measured validation-set values against the predictions using R2 and RMSE. As shown in Figure 10, both the Random Forest and GBDT models reached R2 = 0.88 on the validation set, reflecting high fitting ability, with RMSEs of 0.0933 and 0.0918, respectively. Although the two models performed similarly, the GBDT model achieved the better overall prediction, with more stable and accurate results, allowing the aphid disease index in cotton fields to be predicted more effectively. This indicates that the GBDT model has clearer advantages in handling complex nonlinear relationships and capturing feature importance.

3.3. Cotton Aphid Damage Model Evaluation

Based on the validation results in Section 3.2, the four vegetation indices from the fused image data (GLI, RVI, DVI, and SAVI) were selected as key variables and input into the six models (Linear, Ridge, Decision Tree, Random Forest, AdaBoost, and GBDT) to construct the cotton aphid damage monitoring model. The GBDT regression model tracked aphid damage best, substantially outperforming the other models with a coefficient of determination of R2 = 0.88 and an RMSE of 0.0918. It was therefore selected for inverse prediction of cotton aphid damage across the experimental field. The prediction results are displayed in Figure 11, where colors deepen progressively from yellow to red, visually representing the spatial distribution of aphid damage and intuitively reflecting its trend across the field.
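Once trained, the selected model can be applied pixel-wise to the stacked index rasters to produce a map like Figure 11. A sketch assuming an already-fitted scikit-learn model and a (4, H, W) index stack (both are assumed inputs):

```python
import numpy as np
import matplotlib.pyplot as plt

def predict_map(model, index_stack):
    """index_stack: (4, H, W) rasters of GLI, RVI, DVI, SAVI from the fused image."""
    b, h, w = index_stack.shape
    X = index_stack.reshape(b, -1).T        # one row of features per pixel
    return model.predict(X).reshape(h, w)   # aphid damage index per pixel

# damage = predict_map(gbdt_model, stack)  # gbdt_model, stack: assumed inputs
# plt.imshow(damage, cmap="YlOrRd")        # yellow-to-red, as in Figure 11
# plt.colorbar(label="aphid damage index"); plt.show()
```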

4. Discussion

In this study, remote sensing monitoring of cotton aphids was carried out based on the fusion of multispectral and panchromatic images. By combining the high spatial resolution of panchromatic images with the rich spectral characteristics of multispectral images, aphid-infested areas were precisely identified and plant health comprehensively assessed. The results show that image fusion significantly improves monitoring accuracy and reliability compared with a single image source. The GBDT model handled the nonlinear relationship between spectral variables and pest severity and therefore performed best. Building a GBDT model on the fused images and the extracted core vegetation indices further optimized aphid severity monitoring and improved the accuracy of cotton aphid monitoring. This approach can be integrated into precision agriculture systems to guide localized pesticide applications, reduce costs, and minimize environmental impacts, providing a more efficient and precise technical means for remote sensing monitoring of agricultural pests and diseases and powerful support for pest management decision-making.
In previous studies, the use of panchromatic or multispectral images to monitor cotton pests and diseases has been a significant research direction in precision agriculture. With their high spatial resolution [40], panchromatic images clearly capture plant morphology and local damage features. Aphids suck sap from cotton plants, drawing water and nutrients from the leaves; the resulting drop in leaf water content changes spectral reflectance [41], and the accompanying inhibition of photosynthesis and loss of photosynthetic pigments alter reflectance in the visible and near-infrared wavelengths, providing refined support for analyzing the spatial distribution and morphological changes of pests and diseases. Multispectral images, in turn, capture reflectance in different spectral bands from which vegetation indices (e.g., NDVI, RVI) can be calculated, sensitively reflecting plant health and spectral feature changes, with particular advantages in early disease warning and in identifying pest and disease types [42]. Each has its limitations, however: panchromatic images lack spectral information, and multispectral images have lower spatial resolution. Recent studies have therefore explored fusing the two to achieve higher-precision monitoring of cotton pests and diseases by integrating the benefits of both. In traditional UAV remote sensing, a single image source often cannot provide abundant spectral information and excellent spatial resolution at the same time, which limits monitoring accuracy; this study accordingly combined panchromatic and multispectral imagery, exploiting their spatial and spectral complementarity through Gram–Schmidt image fusion to enhance aphid infestation detection at a finer scale [43]. The maximum absolute correlation coefficient of the vegetation indices obtained after fusion was 0.89, exceeding the roughly 0.5 reported by Jiang et al. (2023) using only multispectral images [18]. The fused images retained both spectral features and high-resolution detail, enabling the model to capture field growth conditions and aphid infestation characteristics more comprehensively and thus improving monitoring accuracy and reliability. In addition, the fusion procedure fully considered the differences in resolution and spectral features between the two image types, and accurate image alignment effectively avoided information loss or distortion during fusion. Nevertheless, this study has shortcomings. First, image fusion depends strongly on data quality: environmental conditions such as light intensity and weather changes significantly affect both panchromatic and multispectral images and may cause overexposure or abnormal spectral reflectance, degrading the monitoring results. Moreover, although image fusion shows high accuracy, its large-scale application may be limited by UAV availability and data processing costs.
In addition, because plants' physiological characteristics change only slightly during early aphid infestation, the fused images may struggle to capture such subtle changes, limiting the method's effectiveness for early warning. The fusion of multispectral and panchromatic images also faces the technical challenges of spatial alignment and spectral matching, and optimization of the fusion algorithm requires further research.
Future research can focus on the following areas. First, more data sources, such as thermal infrared and hyperspectral images, can be introduced to capture plant temperature changes and finer spectral features, realizing deep fusion of multi-source data. Second, convolutional neural networks (CNNs) and related deep learning techniques can automatically extract complex features from the fused images, improving the accuracy and efficiency of aphid monitoring. Temporal analysis can also be expanded: multi-temporal imagery can track the spectral trends of cotton across growth stages and enable early warning of aphid outbreaks. Finally, drones combined with Internet of Things (IoT) technology can automate the collection and processing of remote sensing data to provide timely, efficient decision support. Exploring these directions will further promote the application of fusion imaging in agricultural pest monitoring, providing strong technical support for precision agriculture and sustainable development.

5. Conclusions

In this study, we analyzed the spectral features of different pest classes in the cotton canopy by fusing panchromatic and multispectral UAV images, calculated the correlation between vegetation indices and the cotton aphid damage index based on their spectral sensitivities, and achieved accurate monitoring of aphid damage with a model built from the extracted core vegetation indices. Image fusion significantly improved the correlation between vegetation indices and aphid damage: the maximum correlation coefficient rose from 0.77 for the multispectral image to 0.89 for the fused image, providing a more reliable basis for model construction. Comparing six models (Linear, Ridge, Decision Tree, Random Forest, AdaBoost, and GBDT) showed that the GBDT model achieved a coefficient of determination R2 of 0.88 and an RMSE of 0.0918 on the validation set, the best estimation accuracy and stability. Compared with the single multispectral approach, fused-image monitoring locates and identifies aphid-damaged areas more accurately and significantly improves monitoring accuracy and efficiency. This study not only improves the accuracy of pest monitoring but also contributes to reducing economic and environmental losses in cotton production.

Author Contributions

Conceptualization, C.R. and B.L.; methodology, C.R.; software, Z.L. (Zhi Liang); validation, C.R., B.L. and Z.L. (Zhonglong Lin); formal analysis, X.L.; investigation, C.R.; resources, X.L. and X.Z.; data curation, C.R.; writing—original draft preparation, C.R.; writing—review and editing, Z.L. (Zhi Liang) and B.L.; visualization, W.W.; supervision, X.W.; project administration, X.Z.; funding acquisition, X.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Xinjiang Key R&D Program Project “R&D and Integrated Demonstration of Key Technologies and Equipment for Green Precision Prevention and Control of Cotton in Xinjiang” (2024B02003) and the Xinjiang Agricultural Machinery R&D, Manufacturing and Application Integration Project “R&D, Manufacturing and Application of Intelligent Cotton Field Plant Protection Equipment” (YTHSD2022-06).

Data Availability Statement

The original contributions presented in the study are included in the article; further inquiries can be directed to the corresponding authors.

Acknowledgments

We are sincerely grateful for the indispensable help and support we received from other laboratory team members not mentioned in this paper, whose comments and suggestions contributed greatly to the completion of this paper.

Conflicts of Interest

The authors declare no conflicts of interest.

References

1. Prabhakar, M.; Prasad, Y.G.; Vennila, S.; Thirupathi, M.; Sreedevi, G.; Rao, G.R.; Venkateswarlu, B. Hyperspectral indices for assessing damage by the solenopsis mealybug (Hemiptera: Pseudococcidae) in cotton. Comput. Electron. Agric. 2013, 97, 61–70.
2. Subramanian, K.S.; Pazhanivelan, S.; Srinivasan, G.; Santhi, R.; Sathiah, N. Drones in Insect Pest Management. Front. Agron. 2021, 3, 640885.
3. Chen, M.; Chen, Z.; Luo, L.; Tang, Y.; Cheng, J.; Wei, H.; Wang, J. Dynamic visual servo control methods for continuous operation of a fruit harvesting robot working throughout an orchard. Comput. Electron. Agric. 2024, 219, 108774.
4. Jamal Jumaah, H.; Adnan Rashid, A.; Abdul Razzaq Saleh, S.; Jamal Jumaah, S. Deep Neural Remote Sensing and Sentinel-2 Satellite Image Processing of Kirkuk City, Iraq for Sustainable Prospective. J. Opt. Photonics Res. 2024.
5. Zhang, Z.; Li, Z.; Ma, L.; Lv, X.; Zhang, L. Definition Management Zones of Drip Irrigation Cotton Field Based on the GIS and RS. In Computer and Computing Technologies in Agriculture X; Li, D., Ed.; Springer International Publishing: Cham, Switzerland, 2019; pp. 508–517.
6. Chao, D.; Wenjiang, H.; Shuang, Z.; Biyao, Z.; Yao, L.; Fang, H.; Yuanyuan, M. Greenup dates change across a temperate forest-grassland ecotone in northeastern China driven by spring temperature and tree cover. Agric. For. Meteorol. 2022, 314, 108780.
7. Palanisamy, S.; Selvaraj, R.; Ramesh, T.; Ponnusamy, J. Applications of Remote Sensing in Agriculture—A Review. Int. J. Curr. Microbiol. Appl. Sci. 2019, 8, 2270–2283.
8. Brinatti Vazquez, G.D.; Lacapmesure, A.M.; Martínez, S.; Martínez, O.E. SUPPOSe 3Dge: A Method for Super-Resolved Detection of Surfaces in Volumetric Fluorescence Microscopy. J. Opt. Photonics Res. 2024.
9. Jingcheng, Z.; Yanbo, H.; Ruiliang, P.; Pablo, G.-M.; Lin, Y.; Kaihua, W.; Wenjiang, H. Monitoring plant diseases and pests through remote sensing technology: A review. Comput. Electron. Agric. 2019, 165, 104943.
10. Li, X.; Hu, Y.; Jie, Y.; Zhao, C.; Zhang, Z. Dual-Frequency Lidar for Compressed Sensing 3D Imaging Based on All-Phase Fast Fourier Transform. J. Opt. Photonics Res. 2023, 1, 74–81.
11. Cocuzza, G. Aphis gossypii (cotton aphid). In Pest, Natural Enemy, Invasive Species, Vector of Plant Pest (CABI Compendium); CABI: Wallingford, UK, 2024.
12. Li, X.; Liang, Z.; Yang, G.; Lin, T.; Liu, B. Assessing the Severity of Verticillium Wilt in Cotton Fields and Constructing Pesticide Application Prescription Maps Using Unmanned Aerial Vehicle (UAV) Multispectral Images. Drones 2024, 8, 176.
13. Abdollahnejad, A.; Panagiotidis, D. Tree Species Classification and Health Status Assessment for a Mixed Broadleaf-Conifer Forest with UAS Multispectral Imaging. Remote Sens. 2020, 12, 3722.
14. Hu, X.; Qiao, H.; Chen, B.; Si, H. A Novel Approach to Grade Cotton Aphid Damage Severity with Hyperspectral Index Reconstruction. Appl. Sci. 2022, 12, 8760.
15. Guo, W.; Qiao, H.-b.; Zhao, H.-q.; Zhang, J.-j.; Pei, P.-c.; Liu, Z.-l. Cotton aphid damage monitoring using UAV hyperspectral data based on derivative of ratio spectroscopy. Spectrosc. Spectr. Anal. 2021, 41, 1543–1550.
16. Bandara, W.G.C.; Patel, V.M. HyperTransformer: A textural and spectral feature fusion transformer for pansharpening. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, New Orleans, LA, USA, 18–24 June 2022; pp. 1767–1777.
17. Zhang, C.; Chen, Y.; Yang, X.; Gao, S.; Li, F.; Kong, A.; Zu, D.; Sun, L. Improved Remote Sensing Image Classification Based on Multi-Scale Feature Fusion. Remote Sens. 2020, 12, 213.
18. Jiang, P.; Zhou, X.; Liu, T.; Guo, X.; Ma, D.; Zhang, C.; Li, Y.; Liu, S. Prediction Dynamics in Cotton Aphid Using Unmanned Aerial Vehicle Multispectral Images and Vegetation Indices. IEEE Access 2023, 11, 5908–5918.
19. Chen, Z.; Wu, R.; Lin, Y.; Li, C.; Chen, S.; Yuan, Z.; Chen, S.; Zou, X. Plant Disease Recognition Model Based on Improved YOLOv5. Agronomy 2022, 12, 365.
20. Xu, W.; Chen, P.; Zhan, Y.; Chen, S.; Zhang, L.; Lan, Y. Cotton yield estimation model based on machine learning using time series UAV remote sensing data. Int. J. Appl. Earth Obs. Geoinf. 2021, 104, 102511.
21. Xiang, J.; Liu, J.; Chen, D.; Xiong, Q.; Deng, C. CTFuseNet: A Multi-Scale CNN-Transformer Feature Fused Network for Crop Type Segmentation on UAV Remote Sensing Imagery. Remote Sens. 2023, 15, 1151.
22. Bao, W.; Cheng, T.; Zhou, X.-G.; Guo, W.; Wang, Y.; Zhang, X.; Qiao, H.; Zhang, D. An improved DenseNet model to classify the damage caused by cotton aphid. Comput. Electron. Agric. 2022, 203, 107485.
23. Kong, Y.; Hong, F.; Leung, H.; Peng, X. A Fusion Method of Optical Image and SAR Image Based on Dense-UGAN and Gram–Schmidt Transformation. Remote Sens. 2021, 13, 4274.
24. Pohl, C.; Van Genderen, J. Remote Sensing Image Fusion: A Practical Guide; CRC Press: Boca Raton, FL, USA, 2016.
25. Maurer, T. How to Pan-Sharpen Images Using the Gram-Schmidt Pan-Sharpen Method—A Recipe. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2013, 40, 239–244.
26. Zheng, Q.; Huang, W.; Cui, X.; Dong, Y.; Shi, Y.; Ma, H.; Liu, L. Identification of Wheat Yellow Rust Using Optimal Three-Band Spectral Indices in Different Growth Stages. Sensors 2019, 19, 35.
27. Liu, L.; Dong, Y.; Huang, W.; Du, X.; Ren, B.; Huang, L.; Zheng, Q.; Ma, H. A Disease Index for Efficiently Detecting Wheat Fusarium Head Blight Using Sentinel-2 Multispectral Imagery. IEEE Access 2020, 8, 52181–52191.
28. Guomin, S.; Wenting, H.; Huihui, Z.; Shouyang, L.; Yi, W.; Liyuan, Z.; Xin, C. Mapping maize crop coefficient Kc using random forest algorithm based on leaf area index and UAV-based multispectral vegetation indices. Agric. Water Manag. 2021, 252, 106906.
29. Zhi, J.; Dong, Y.; Lu, L.; Shi, J.; Luo, W.; Zhou, Y.; Geng, T.; Xia, J.; Jia, C. High-precision extraction method for maize planting information based on UAV RGB images. Trans. Chin. Soc. Agric. Eng. 2021, 37, 48–54.
30. Guo, Z.-c.; Wang, T.; Liu, S.-l.; Kang, W.-p.; Chen, X.; Feng, K.; Zhang, X.-q.; Zhi, Y. Biomass and vegetation coverage survey in the Mu Us sandy land—based on unmanned aerial vehicle RGB images. Int. J. Appl. Earth Obs. Geoinf. 2021, 94, 102239.
31. Gitelson, A.A.; Merzlyak, M.N.; Chivkunova, O.B. Optical Properties and Nondestructive Estimation of Anthocyanin Content in Plant Leaves. Photochem. Photobiol. 2001, 74, 38–45.
32. Louhaichi, M.; Borman, M.M.; Johnson, D.E. Spatially Located Platform and Aerial Photography for Documentation of Grazing Impacts on Wheat. Geocarto Int. 2001, 16, 65–70.
33. Aoki, M.; Yabuki, K.; Totsuka, T.; Nishida, M. Remote sensing of chlorophyll content of leaf (I) effective spectral reflection characteristics of leaf for the evaluation of chlorophyll content in leaves of dicotyledons. Environ. Control Biol. 1986, 24, 21–26.
34. Pearson, R.L.; Miller, L.D. Remote mapping of standing crop biomass for estimation of the productivity of the shortgrass prairie, Pawnee National Grasslands, Colorado. In Proceedings of the Eighth International Symposium on Remote Sensing of Environment, Ann Arbor, MI, USA, 2–6 October 1972.
35. Jordan, C.F. Derivation of Leaf-Area Index from Quality of Light on the Forest Floor. Ecology 1969, 50, 663–666.
36. Kaufman, Y.J.; Tanre, D. Atmospherically resistant vegetation index (ARVI) for EOS-MODIS. IEEE Trans. Geosci. Remote Sens. 1992, 30, 261–270.
37. Huete, A.R. A soil-adjusted vegetation index (SAVI). Remote Sens. Environ. 1988, 25, 295–309.
38. Haboudane, D.; Miller, J.R.; Tremblay, N.; Zarco-Tejada, P.J.; Dextraze, L. Integrated narrow-band vegetation indices for prediction of crop chlorophyll content for application to precision agriculture. Remote Sens. Environ. 2002, 81, 416–426.
39. Ren, L.; Ma, Y.; Shi, H.; Chen, X. Overview of Machine Learning Algorithms. In Signal and Information Processing, Networking and Computers; Wang, Y., Fu, M., Xu, L., Zou, J., Eds.; Springer: Singapore, 2020; pp. 672–678.
40. Yao, C.; Zhang, Y.; Zhang, Y.; Liu, H. Application of Convolutional Neural Network in Classification of High Resolution Agricultural Remote Sensing Images. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, 42, 989–992.
41. Chen, B.; Wang, K.-r.; Li, S.-k.; Jing, X.; Chen, J.-l.; Su, Y. Study on spectrum characteristics of cotton leaf and its estimating with remote sensing under aphid stress. Spectrosc. Spectr. Anal. 2010, 30, 3093–3097.
42. Chen, B.; Wang, Q.; Wang, J.; Liu, T.; Yu, Y.; Song, Y.; Chen, Z.; Bai, Z. The Estimate Severity Level of Cotton Verticillium Wilt Using New Multi-spectra of UAV Comprehensive Monitoring Disease Index. In Proceedings of the 2023 International Seminar on Computer Science and Engineering Technology (SCSET), New York, NY, USA, 29–30 April 2023; pp. 520–527.
43. Zhang, S.; Han, Y.; Wang, H.; Hou, D. Gram-Schmidt Remote Sensing Image Fusion Algorithm Based on Matrix Elementary Transformation. J. Phys. Conf. Ser. 2022, 2410, 012013.
Figure 1. Location distribution of the study area at the cotton field trial site of the 121st Corps, Shihezi City, Xinjiang.
Figure 2. Cotton field image acquisition system and remote sensing data analysis process. (a) Drone flight monitoring platform, (b) Collection of ground-truthing samples, (c) Image data collected by the UAV flight platform, (d) Fusion of multispectral images with panchromatic images, (e) Construction of machine learning models.
Figure 3. Cotton leaves with different levels of infestation: (a) level 0 leaves, (b) level 1 leaves, (c) level 2 leaves, (d) level 3 leaves, and (e) level 4 leaves.
Figure 4. Gram–Schmidt (GS) image fusion process.
Figure 5. Spectral reflectance at various levels of aphid infestation (B0, B1, B2, B3 and B4 denote the severity classes of the cotton aphid canopy, i.e., normal, mild, moderate, severe, and very severe).
Figure 6. Correlation between different vegetation indices and cotton aphid index for both images (*** denotes significant at the 0.001 level, ** indicates significant at the 0.01 level, and * indicates significant at the 0.05 level).
Figure 7. Plot of true versus predicted values of machine learning models fitted to multispectral image data: (a) training set; (b) test set.
Figure 8. Plot of true versus predicted values of machine learning models fitted to fused image data: (a) training set; (b) test set.
Figure 9. Comparison of R2 between the two image datasets.
Figure 10. Evaluation of the results of the machine learning models for fused image data (the dashed lines show the fitted lines).
Figure 11. Cotton aphid damage index prediction results.
Table 1. Ten band parameters in the 400–900 nm band range captured by the multispectral camera RedEdge-MX Dual (* denotes enhanced band).

| Band | Center Wavelength (nm) | Wavelength Width (nm) |
| --- | --- | --- |
| Coastal blue * | 444 | 28 |
| Blue | 475 | 32 |
| Green * | 531 | 14 |
| Green | 560 | 27 |
| Red * | 650 | 16 |
| Red | 668 | 14 |
| Red Edge * | 705 | 10 |
| Red Edge | 717 | 12 |
| Near-IR * | 740 | 18 |
| Near-IR | 842 | 57 |
Table 2. Classification standard of cotton aphid damage level.

| Aphid Level | Criteria |
| --- | --- |
| 0 | No aphids, flat leaves |
| 1 | Aphids are present; leaves are not damaged |
| 2 | There are aphids, and the leaves that are most seriously affected are wrinkled or slightly curled, almost semicircular |
| 3 | There are aphids, and the leaves that are most seriously affected are curled to half a circle or more, forming an arc shape |
| 4 | There are aphids, and the leaves that are most seriously affected are completely curled and ball-shaped |
Table 3. Vegetation index calculation formulas.

| Vegetation Index | Formula | References |
| --- | --- | --- |
| ARI (Atmospheric Resistance Index) | (1/G) − (1/R) | [31] |
| GLI (Green Leaf Index) | (2G − R − B)/(2G + R + B) | [32] |
| GBI (Green Biome Index) | (NIR − G)/(NIR − R) | [33] |
| RVI (Ratio Vegetation Index) | R/NIR | [34] |
| DVI (Difference Vegetation Index) | NIR − R | [35] |
| ARVI (Atmospherically Resistant Vegetation Index) | (NIR − B)/(NIR + B) | [36] |
| GNDVI (Green Normalized Difference Vegetation Index) | (NIR − G)/(NIR + G) | [32] |
| SAVI (Soil-Adjusted Vegetation Index) | (NIR − G)(1 + L)/(NIR + G + L) | [37] |
| SIPI (Structure-Insensitive Pigment Index) | (NIR − B)/(NIR − R) | [38] |
| TCARI (Transformed Chlorophyll Absorption in Reflectance Index) | 3[(RE − R) − 0.2(RE − G)(RE/R)] | [38] |
Table 4. The top four vegetation indices with strong correlation in the two image types (*** denotes significance at the 0.001 level).

| Multispectral Images | | Fused Images | |
| --- | --- | --- | --- |
| Vegetation Index | Correlation Coefficient | Vegetation Index | Correlation Coefficient |
| ARVI | −0.77 *** | GLI | −0.89 *** |
| SAVI | −0.68 *** | RVI | 0.83 *** |
| GBI | −0.66 *** | DVI | −0.79 *** |
| ARI | −0.60 *** | SAVI | −0.78 *** |
Table 5. Comparison of R2 values before and after fusion.

| Methods | R2 Before Fusion | Fused R2 | Gain After Fusion |
| --- | --- | --- | --- |
| Linear | 0.32 | 0.82 | +0.50 |
| Ridge | 0.33 | 0.64 | +0.31 |
| Decision Tree | 0.74 | 0.86 | +0.12 |
| Random Forest | 0.83 | 0.88 | +0.05 |
| AdaBoost | 0.78 | 0.86 | +0.09 |
| GBDT | 0.77 | 0.88 | +0.11 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
