Identification of Cotton Leaf Mite Damage Stages Using UAV Multispectral Images and a Stacked Ensemble Method
Abstract
1. Introduction
2. Materials and Methods
2.1. Study Area
2.2. Data Collection
2.2.1. Ground Survey Data Acquisition
2.2.2. UAV Data Acquisition and Preprocessing
2.3. Construction and Selection of VIs
2.4. Method
2.4.1. Classifier Model
2.4.2. Stacking Model
2.4.3. Accuracy Evaluation
3. Results
3.1. Single-Model Classification Results
3.2. Stacking and Integration of Individual Base Models
3.3. Stacked Integration of Two Base Models
3.4. Stacked Integration of Multiple Base Models
3.5. Confusion Matrix Analysis
3.6. Visualization of Detection Results
4. Discussion
4.1. Selection of the Optimal Base Model and Accuracy Evaluation
4.2. Discussion of Stacking Methods
4.3. Practical Usability and Runtime Analysis
4.4. Limitations of the Research
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Li, Y.; Yang, J. Few-shot cotton pest recognition and terminal realization. Comput. Electron. Agric. 2020, 169, 105240. [Google Scholar] [CrossRef]
- Xu, W.; Yang, W.; Chen, S.; Wu, C.; Chen, P.; Lan, Y. Establishing a model to predict the single boll weight of cotton in northern Xinjiang by using high resolution UAV remote sensing data. Comput. Electron. Agric. 2020, 179, 105762. [Google Scholar] [CrossRef]
- He, L.; Shi, L.; Liu, G.; Liang, C.T. Occurrence and control of pests and diseases in the Northwest Inland Cotton Area of China. Phytoparasitica 2025, 53, 81. [Google Scholar] [CrossRef]
- Zhu, H.; Lin, C.; Liu, G.; Wang, D.; Qin, S.; Li, A.; Xu, J.L.; He, Y. Intelligent agriculture: Deep learning in UAV-based remote sensing imagery for crop diseases and pests detection. Front. Plant Sci. 2024, 15, 1435016. [Google Scholar] [CrossRef] [PubMed]
- Zhang, S.; Li, X.; Ba, Y.; Lyu, X.; Zhang, M.; Li, M. Banana Fusarium Wilt Disease Detection by Supervised and Unsupervised Methods from UAV-Based Multispectral Imagery. Remote Sens. 2022, 14, 1231. [Google Scholar] [CrossRef]
- Chen, P.; Xu, W.; Zhan, Y.; Wang, G.; Yang, W.; Lan, Y. Determining application volume of unmanned aerial spraying systems for cotton defoliation using remote sensing images. Comput. Electron. Agric. 2022, 196, 106912. [Google Scholar] [CrossRef]
- Yang, W.; Xu, W.; Wu, C.; Zhu, B.; Chen, P.; Zhang, L.; Lan, Y. Cotton hail disaster classification based on drone multispectral images at the flowering and boll stage. Comput. Electron. Agric. 2021, 180, 105866. [Google Scholar] [CrossRef]
- Huang, H.; Deng, J.; Lan, Y.; Yang, A.; Deng, X.; Zhang, L.; Wen, S.; Jiang, Y.; Suo, G.; Chen, P. A two-stage classification approach for the detection of spider mite-infested cotton using UAV multispectral imagery. Remote Sens. Lett. 2018, 9, 933–941. [Google Scholar] [CrossRef]
- Ren, C.N.; Liu, B.; Liang, Z.; Lin, Z.L.; Wang, W.; Wei, X.Z.; Li, X.J.; Zou, X.J. An Innovative Method of Monitoring Cotton Aphid Infestation Based on Data Fusion and Multi-Source Remote Sensing Using Unmanned Aerial Vehicles. Drones 2025, 9, 229. [Google Scholar] [CrossRef]
- Sun, C.L.; Bin, A.A.; Wang, Z.Y.; Gao, X.X.; Ding, K. YOLO-UP: A High-Throughput Pest Detection Model for Dense Cotton Crops Utilizing UAV-Captured Visible Light Imagery. IEEE Access 2025, 13, 19937–19945. [Google Scholar] [CrossRef]
- Ali, T.; Zakir, R.; Ayaz, M.; Murtaza, M.; Hijji, M.; Aggoune, E.M.H. Cotton crop disease detection and classification using statistical prediction model in deep learning approach. Multimed. Tools Appl. 2025. [Google Scholar] [CrossRef]
- Alves, A.N.; Souza, W.S.R.; Borges, D.L. Cotton pests classification in field-based images using deep residual networks. Comput. Electron. Agric. 2020, 174, 105488. [Google Scholar] [CrossRef]
- Zheng, Z.J.; Yuan, J.H.; Yao, W.; Kwan, P.; Yao, H.X.; Liu, Q.Z.; Guo, L.F. Fusion of UAV-Acquired Visible Images and Multispectral Data by Applying Machine-Learning Methods in Crop Classification. Agronomy 2024, 14, 2670. [Google Scholar] [CrossRef]
- Pandiyaraju, V.; Anusha, B.; Senthil Kumar, A.M.; Jaspin, K.; Venkatraman, S.; Kannan, A. Spatial attention-based hybrid VGG-SVM and VGG-RF frameworks for improved cotton leaf disease detection. Neural Comput. Appl. 2025, 37, 8309–8329. [Google Scholar] [CrossRef]
- Qiu, Z.; Ma, F.; Li, Z.; Xu, X.; Ge, H.; Du, C. Estimation of nitrogen nutrition index in rice from UAV RGB images coupled with machine learning algorithms. Comput. Electron. Agric. 2021, 189, 106421. [Google Scholar] [CrossRef]
- Healey, S.P.; Cohen, W.B.; Yang, Z.; Kenneth Brewer, C.; Brooks, E.B.; Gorelick, N.; Hernandez, A.J.; Huang, C.; Joseph Hughes, M.; Kennedy, R.E.; et al. Mapping forest change using stacked generalization: An ensemble approach. Remote Sens. Environ. 2018, 204, 717–728. [Google Scholar] [CrossRef]
- Xiao, Y.; Guo, Y.; Yin, G.; Zhang, X.; Shi, Y.; Hao, F.; Fu, Y. UAV Multispectral Image-Based Urban River Water Quality Monitoring Using Stacked Ensemble Machine Learning Algorithms—A Case Study of the Zhanghe River, China. Remote Sens. 2022, 14, 3272. [Google Scholar] [CrossRef]
- Fu, B.; He, X.; Yao, H.; Liang, Y.; Deng, T.; He, H.; Fan, D.; Lan, G.; He, W. Comparison of RFE-DL and stacking ensemble learning algorithms for classifying mangrove species on UAV multispectral images. Int. J. Appl. Earth Obs. Geoinf. 2022, 112, 102890. [Google Scholar] [CrossRef]
- Yang, M.D.; Hsu, Y.C.; Chen, Y.H.; Yang, C.Y.; Li, K.Y. Precision monitoring of rice nitrogen fertilizer levels based on machine learning and UAV multispectral imagery. Comput. Electron. Agric. 2025, 237, 110523. [Google Scholar] [CrossRef]
- Deng, L.Q.; Li, Y.Y.; Zhang, Z.M.; Mu, J.J.; Jia, S.J.; Yan, Y.Q.; Zhang, W.P. Sorghum yield prediction using UAV multispectral imaging and stacking ensemble learning in arid regions. Front. Plant Sci. 2025, 16, 1636015. [Google Scholar] [CrossRef]
- Du, R.Q.; Lu, J.S.; Xiang, Y.Z.; Zhang, F.C.; Chen, J.Y.; Tang, Z.J.; Shi, H.Z.; Wang, X.; Li, W.Y. Estimation of winter canola growth parameter from UAV multi-angular spectral-texture information using stacking-based ensemble learning model. Comput. Electron. Agric. 2024, 222, 109074. [Google Scholar] [CrossRef]
- GB/T 15802-2011; Technical Specification for Cotton Leaf Mite Detection and Reporting. State Administration for Market Regulation: Beijing, China, 2011.
- Shapley, L.S. A value for n-person games. Class. Game Theory 1997, 69–79. [Google Scholar] [CrossRef]
- Wang, J.; Wiens, J.; Lundberg, S.M. Shapley Flow: A Graph-based Approach to Interpreting Model Predictions. Int. Conf. Artif. Intell. Stat. 2021, 130, 721–729. [Google Scholar]
- Lundberg, S.M.; Erion, G.; Chen, H.; DeGrave, A.; Prutkin, J.M.; Nair, B.; Katz, R.; Himmelfarb, J.; Bansal, N.; Lee, S.I. From Local Explanations to Global Understanding with Explainable AI for Trees. Nat. Mach. Intell. 2020, 2, 56–67. [Google Scholar] [CrossRef]
- Wang, F.; Yang, M.; Ma, L.; Zhang, T.; Qin, W.; Li, W.; Zhang, Y.; Sun, Z.; Wang, Z.; Li, F.; et al. Estimation of Above-Ground Biomass of Winter Wheat Based on Consumer-Grade Multi-Spectral UAV. Remote Sens. 2022, 14, 1251. [Google Scholar] [CrossRef]
- Gitelson, A.A.; Gritz, Y.; Merzlyak, M.N. Relationships between leaf chlorophyll content and spectral reflectance and algorithms for non-destructive chlorophyll assessment in higher plant leaves. J. Plant Physiol. 2003, 160, 271–282. [Google Scholar] [CrossRef]
- Qi, H.; Wu, Z.; Zhang, L.; Li, J.; Zhou, J.; Jun, Z.; Zhu, B. Monitoring of peanut leaves chlorophyll content based on drone-based multispectral image feature extraction. Comput. Electron. Agric. 2021, 187, 106292. [Google Scholar] [CrossRef]
- Xiao, Y.; Zhao, W.; Zhou, D.; Gong, H. Sensitivity Analysis of Vegetation Reflectance to Biochemical and Biophysical Variables at Leaf, Canopy, and Regional Scales. IEEE Trans. Geosci. Remote Sens. 2014, 52, 4014–4024. [Google Scholar] [CrossRef]
- Haboudane, D. Hyperspectral vegetation indices and novel algorithms for predicting green LAI of crop canopies: Modeling and validation in the context of precision agriculture. Remote Sens. Environ. 2004, 90, 337–352. [Google Scholar] [CrossRef]
- Gitelson, A.A.; Kaufman, Y.J.; Stark, R.; Rundquist, D. Novel algorithms for remote estimation of vegetation fraction. Remote Sens. Environ. 2002, 80, 76–87. [Google Scholar] [CrossRef]
- Ji, Y.; Chen, Z.; Cheng, Q.; Liu, R.; Li, M.; Yan, X.; Li, G.; Wang, D.; Fu, L.; Ma, Y.; et al. Estimation of plant height and yield based on UAV imagery in faba bean (Vicia faba L.). Plant Methods 2022, 18, 26. [Google Scholar] [CrossRef] [PubMed]
- Huete, A. A soil-adjusted vegetation index (SAVI). Remote Sens. Environ. 1988, 25, 295–309. [Google Scholar] [CrossRef]
- Zhen, Z.J.; Chen, S.B.; Yin, T.G.; Chavanon, E.; Lauret, N.; Guilleux, J.; Henke, M.; Qin, W.H.; Cao, L.S.; Li, J.; et al. The Negative Soil Adjustment Factor of Soil Adjusted Vegetation Index (SAVI) to Resist Saturation Effects and Estimate Leaf Area Index (LAI) in Dense Vegetation Areas. Sensors 2021, 21, 2115. [Google Scholar] [CrossRef] [PubMed]
- Ren, H.R.; Zhou, G.S.; Zhang, F. Using negative soil adjustment factor in soil-adjusted vegetation index (SAVI) for aboveground living biomass estimation in arid grasslands. Remote Sens. Environ. 2018, 209, 439–445. [Google Scholar] [CrossRef]
- Roujean, J.-L.; Breon, F.-M. Estimating PAR absorbed by vegetation from bidirectional reflectance measurements. Remote Sens. Environ. 1995, 51, 375–384. [Google Scholar] [CrossRef]
- Gitelson, A.A. Remote estimation of crop fractional vegetation cover: The use of noise equivalent as an indicator of performance of vegetation indices. Int. J. Remote Sens. 2013, 34, 6054–6066. [Google Scholar] [CrossRef]
- Bannari, A.; Asalhi, H.; Teillet, P.M. Transformed difference vegetation index (TDVI) for vegetation cover mapping. IEEE Int. Geosci. Remote Sens. Symp. 2002, 5, 3053–3055. [Google Scholar] [CrossRef]
- Jordan, C.F. Derivation of leaf-area index from quality of light on the forest floor. Ecology 1969, 50, 663–666. [Google Scholar] [CrossRef]
- Chen, J.M. Evaluation of Vegetation Indices and a Modified Simple Ratio for Boreal Applications. Can. J. Remote Sens. 2014, 22, 229–242. [Google Scholar] [CrossRef]
- Goel, N.S.; Qin, W. Influences of canopy architecture on relationships between various vegetation indices and LAI and Fpar: A computer simulation. Remote Sens. Rev. 1994, 10, 309–347. [Google Scholar] [CrossRef]
- Peng, G.; Ruiliang, P.; Biging, G.S.; Larrieu, M.R. Estimation of forest leaf area index using vegetation indices derived from hyperion hyperspectral data. IEEE Trans. Geosci. Remote Sens. 2003, 41, 1355–1362. [Google Scholar] [CrossRef]
- Su, J.; Yi, D.; Coombes, M.; Liu, C.; Zhai, X.; McDonald-Maier, K.; Chen, W.-H. Spectral analysis and mapping of blackgrass weed by leveraging machine learning and UAV multispectral imagery. Comput. Electron. Agric. 2022, 192, 106621. [Google Scholar] [CrossRef]
- Gitelson, A.A.; Merzlyak, M.N.; Chivkunova, O.B. Optical properties and nondestructive estimation of anthocyanin content in plant leaves. Photochem. Photobiol. 2001, 74, 38–45. [Google Scholar] [CrossRef] [PubMed]
- Gitelson, A.A.; Merzlyak, M.N.; Lichtenthaler, H.K. Detection of Red Edge Position and Chlorophyll Content by Reflectance Measurements Near 700 nm. J. Plant Physiol. 1996, 148, 501–508. [Google Scholar] [CrossRef]
- Siegmann, B.; Jarmer, T.; Lilienthal, H.; Richter, N.; Selige, T.; Höfled, B. Comparison of narrow band vegetation indices and empirical models from hyperspectral remote sensing data for the assessment of wheat nitrogen concentration. In Proceedings of the 8th EARSeL Workshop on Imaging Spectroscopy, Nantes, France, 8–10 April 2013; pp. 8–10. [Google Scholar]
- Hassan, M.; Yang, M.; Rasheed, A.; Jin, X.; Xia, X.; Xiao, Y.; He, Z. Time-Series Multispectral Indices from Unmanned Aerial Vehicle Imagery Reveal Senescence Rate in Bread Wheat. Remote Sens. 2018, 10, 809. [Google Scholar] [CrossRef]
- Wang, F.-m.; Huang, J.-f.; Tang, Y.-l.; Wang, X.-z. New Vegetation Index and Its Application in Estimating Leaf Area Index of Rice. Rice Sci. 2007, 14, 195–203. [Google Scholar] [CrossRef]
- Clevers, J.; Kooistra, L.; van den Brande, M. Using Sentinel-2 Data for Retrieving LAI and Leaf and Canopy Chlorophyll Content of a Potato Crop. Remote Sens. 2017, 9, 405. [Google Scholar] [CrossRef]
- Vincini, M.; Frazzi, E.; D’Alessio, P. A broad-band leaf chlorophyll vegetation index at the canopy scale. Precis. Agric. 2008, 9, 303–319. [Google Scholar] [CrossRef]
- Huete, A.; Didan, K.; Miura, T.; Rodriguez, E.P.; Gao, X.; Ferreira, L.G. Overview of the radiometric and biophysical performance of the MODIS vegetation indices. Remote Sens. Environ. 2002, 83, 195–213. [Google Scholar] [CrossRef]
- Jiang, Z.; Huete, A.; Didan, K.; Miura, T. Development of a two-band enhanced vegetation index without a blue band. Remote Sens. Environ. 2008, 112, 3833–3845. [Google Scholar] [CrossRef]
- Tucker, C.J. Red and photographic infrared linear combinations for monitoring vegetation. Remote Sens. Environ. 1979, 8, 127–150. [Google Scholar] [CrossRef]
- Daughtry, C.S.; Walthall, C.; Kim, M.; De Colstoun, E.B.; McMurtrey Iii, J. Estimating corn leaf chlorophyll concentration from leaf and canopy reflectance. Remote Sens. Environ. 2000, 74, 229–239. [Google Scholar] [CrossRef]
- García-Fernández, M.; Sanz-Ablanedo, E.; Rodríguez-Pérez, J.R. High-Resolution Drone-Acquired RGB Imagery to Estimate Spatial Grape Quality Variability. Agronomy 2021, 11, 655. [Google Scholar] [CrossRef]
- Dash, J.; Curran, P.J. The MERIS terrestrial chlorophyll index. Int. J. Remote Sens. 2010, 25, 5403–5413. [Google Scholar] [CrossRef]
- Gitelson, A.A.; Merzlyak, M.N. Remote estimation of chlorophyll content in higher plant leaves. Int. J. Remote Sens. 2010, 18, 2691–2697. [Google Scholar] [CrossRef]
- Agapiou, A.; Alexakis, D.D.; Stavrou, M.; Sarris, A.; Themistocleous, K.; Hadjimitsis, D.G. Prospects and limitations of vegetation indices in archeological research: The Neolithic Thessaly case study. Proc. SPIE Int. Soc. Opt. Eng. 2013, 8893, 969–970. [Google Scholar] [CrossRef]
- Parra, L.; Mostaza-Colado, D.; Marin, J.F.; Mauri, P.V.; Lloret, J. Methodology to Differentiate Legume Species in Intercropping Agroecosystems Based on UAV with RGB Camera. Electronics 2022, 11, 609. [Google Scholar] [CrossRef]
- Bendig, J.; Yu, K.; Aasen, H.; Bolten, A.; Bennertz, S.; Broscheit, J.; Gnyp, M.L.; Bareth, G. Combining UAV-based plant height from crop surface models, visible, and near infrared vegetation indices for biomass monitoring in barley. Int. J. Appl. Earth Obs. Geoinf. 2015, 39, 79–87. [Google Scholar] [CrossRef]
- Walsh, O.S.; Shafian, S.; Marshall, J.M.; Jackson, C.; McClintick-Chess, J.R.; Blanscet, S.M.; Swoboda, K.; Thompson, C.; Belmont, K.M.; Walsh, W.L. Assessment of UAV based vegetation indices for nitrogen concentration estimation in spring wheat. Adv. Remote Sens. 2018, 7, 71–90. [Google Scholar] [CrossRef]
- Verrelst, J.; Schaepman, M.E.; Koetz, B.; Kneubühler, M. Angular sensitivity analysis of vegetation indices derived from CHRIS/PROBA data. Remote Sens. Environ. 2008, 112, 2341–2353. [Google Scholar] [CrossRef]
- Raper, T.B.; Varco, J.J. Canopy-scale wavelength and vegetative index sensitivities to cotton growth parameters and nitrogen status. Precis. Agric. 2014, 16, 62–76. [Google Scholar] [CrossRef]
- Peñuelas, J.; Gamon, J.; Fredeen, A.; Merino, J.; Field, C. Reflectance indices associated with physiological changes in nitrogen-and water-limited sunflower leaves. Remote Sens. Environ. 1994, 48, 135–146. [Google Scholar] [CrossRef]
- Haboudane, D.; Miller, J.R.; Tremblay, N.; Zarco-Tejada, P.J.; Dextraze, L. Integrated narrow-band vegetation indices for prediction of crop chlorophyll content for application to precision agriculture. Remote Sens. Environ. 2002, 81, 416–426. [Google Scholar] [CrossRef]
- Broge, N.H.; Leblanc, E. Comparing prediction power and stability of broadband and hyperspectral vegetation indices for estimation of green leaf area index and canopy chlorophyll density. Remote Sens. Environ. 2001, 76, 156–172. [Google Scholar] [CrossRef]
- Velusamy, P.; Rajendran, S.; Mahendran, R.K.; Naseer, S.; Shafiq, M.; Choi, J.-G. Unmanned Aerial Vehicles (UAV) in Precision Agriculture: Applications and Challenges. Energies 2021, 15, 217. [Google Scholar] [CrossRef]
- Guo, A.; Huang, W.; Dong, Y.; Ye, H.; Ma, H.; Liu, B.; Wu, W.; Ren, Y.; Ruan, C.; Geng, Y. Wheat Yellow Rust Detection Using UAV-Based Hyperspectral Technology. Remote Sens. 2021, 13, 123. [Google Scholar] [CrossRef]
- Martin, D.E.; Latheef, M.A. Remote Sensing Evaluation of Two-spotted Spider Mite Damage on Greenhouse Cotton. J. Vis. Exp. 2017, 122, 54314. [Google Scholar] [CrossRef]
- Reid, A.M.; Chapman, W.K.; Prescott, C.E.; Nijland, W. Using excess greenness and green chromatic coordinate colour indices from aerial images to assess lodgepole pine vigour, mortality and disease occurrence. For. Ecol. Manag. 2016, 374, 146–153. [Google Scholar] [CrossRef]
- Xu, C.; Ding, J.; Qiao, Y.; Zhang, L. Tomato disease and pest diagnosis method based on the Stacking of prescription data. Comput. Electron. Agric. 2022, 197, 106997. [Google Scholar] [CrossRef]
- Cheng, Q.; Xu, H.; Fei, S.; Li, Z.; Chen, Z. Estimation of Maize LAI Using Ensemble Learning and UAV Multispectral Imagery under Different Water and Fertilizer Treatments. Agriculture 2022, 12, 1267. [Google Scholar] [CrossRef]
- Nguyen, C.; Sagan, V.; Skobalski, J.; Severo, J.I. Early detection of wheat yellow rust disease and its impact on terminal yield with multi-spectral uav-imagery. Remote Sens. 2023, 15, 3301. [Google Scholar] [CrossRef]
- Chen, J.; Saimi, A.; Zhang, M.; Liu, Q.; Ma, Z. Epidemic of Wheat Stripe Rust Detected by Hyperspectral Remote Sensing and Its Potential Correlation with Soil Nitrogen during Latent Period. Life 2022, 12, 1377. [Google Scholar] [CrossRef]
- Aeberli, A.; Robson, A.; Phinn, S.; Lamb, D.W.; Johansen, K. A Comparison of Analytical Approaches for the Spectral Discrimination and Characterisation of Mite Infestations on Banana Plants. Remote Sens. 2022, 14, 5467. [Google Scholar] [CrossRef]
| Infestation Class | Classification Criteria |
|---|---|
| 0 | No visible damage |
| 1 | Leaf blade intact, with sporadic white dots or sporadic yellow spots at the base of the leaf stem |
| 2 | Damage covers less than one-third of the leaf area; leaf blade intact but slightly distorted or with noticeable yellow or red patches |
| 3 | Red or yellow spots cover at least one-third of the leaf area, the leaf blade has holes or other damage, or the leaf is twisted and distorted by substantial damage |
| Flight Parameter | Value | Camera Parameter | Value |
|---|---|---|---|
| Takeoff weight | 1487 g | FOV | 62.7° |
| Diagonal distance | 350 mm | Focal length | 5.74 mm |
| Maximum flight height | 6000 m | Aperture | f/2.2 |
| Maximum ascent speed | 6 m/s | RGB sensor ISO | 200–800 |
| Maximum descent speed | 3 m/s | Monochrome sensor gain | 1–8× |
| Maximum speed | 50 km/h | Maximum image size | 1600 × 1300 |
| Maximum flight time | 27 min | Photo format | JPEG/TIFF |
| Operating temperature | 0~40 °C | File system support | ≥32 GB |
| Operating frequency | 5.72~5.85 GHz | Operating temperature | 0~40 °C |
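The camera's ground coverage at a given survey altitude follows directly from the FOV listed in the table. A minimal sketch of that geometry (the 20 m survey altitude is a hypothetical value for illustration, not a parameter reported here):

```python
import math

def ground_footprint(fov_deg: float, altitude_m: float) -> float:
    """Approximate ground swath (m) covered by a camera with the given
    field of view (degrees) at the given flight altitude (m), assuming
    nadir-pointing imaging over flat terrain."""
    return 2 * altitude_m * math.tan(math.radians(fov_deg) / 2)

# With the 62.7° FOV from the table and a hypothetical 20 m altitude:
swath = ground_footprint(62.7, 20.0)
print(f"{swath:.1f} m")  # ≈ 24.4 m swath
```

The same relation, inverted, gives the altitude needed to cover a plot of known width in a single pass.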
| NUM | Vegetation Index | Formula | Reference |
|---|---|---|---|
| 1 | NDVI | (NIR − RED)/(NIR + RED) | [26] |
| 2 | GNDVI | (NIR − GREEN)/(NIR + GREEN) | [27] |
| 3 | DVI-R | NIR − RED | [28] |
| 4 | LCI | (NIR − REG)/(NIR − RED) | [29] |
| 5 | MCARI2 | [30] | |
| 6 | VARI | (GREEN − RED)/(GREEN + RED − BLUE) | [31] |
| 7 | SIPI2 | (NIR − GREEN)/(NIR − RED) | [32] |
| 8 | SAVI | 1.5(NIR − RED)/(NIR + RED + 0.5) | [33,34,35] |
| 9 | RDVI | (NIR − RED)/(NIR + RED)^(1/2) | [36] |
| 10 | WDRVI | (0.2 ∗ NIR − RED)/(0.2 ∗ NIR + RED) | [37] |
| 11 | TDVI | 1.5 ∗ (NIR − RED)/(NIR² + RED + 0.5)^(1/2) | [38] |
| 12 | SRI | NIR/RED | [39] |
| 13 | MSRI | (NIR/RED − 1)/[(NIR/RED)^(1/2) + 1] | [40] |
| 14 | NLI | (NIR² − RED)/(NIR² + RED) | [41] |
| 15 | MNLI-R | 1.5 ∗ (NIR² − RED)/(NIR² + RED + 0.5) | [42] |
| 16 | GDVI | NIR − GREEN | [43] |
| 17 | ARI1 | (1/GREEN) − (1/REG) | [44] |
| 18 | ARI2 | NIR ∗ (1/GREEN − 1/REG) | [44] |
| 19 | CI-GREEN | (NIR/GREEN) − 1 | [28] |
| 20 | CI-ReEdge | (NIR/REG) − 1 | [28] |
| 21 | GARI | [45] | |
| 22 | GOSAVI | (NIR − GREEN)/(NIR + GREEN + 0.16) | [46] |
| 23 | NDREI | (REG − GREEN)/(REG + GREEN) | [47] |
| 24 | BNDVI | (NIR − BLUE)/(NIR + BLUE) | [48] |
| 25 | CI-RED | (NIR/RED) − 1 | [49] |
| 26 | CVI | (NIR/GREEN) ∗ (RED/GREEN) | [50] |
| 27 | DVI-G | NIR − GREEN | [27] |
| 28 | DVI-RE | NIR − REG | [27] |
| 29 | EVI | 2.5 ∗ (NIR − RED)/(NIR + 6 ∗ RED − 7.5 ∗ BLUE + 1) | [51] |
| 30 | EVI2 | 2.5 ∗ (NIR − RED)/(NIR + 2.4 ∗ RED + 1) | [52] |
| 31 | GRVI | (GREEN − RED)/(GREEN + RED) | [53] |
| 32 | MCARI1 | 1.2 ∗ [2.5 ∗ (NIR − RED) − 1.3(NIR − GREEN)] | [30] |
| 33 | MCARI | [(REG − RED) − 0.2 ∗ (REG − GREEN)] ∗ (REG/RED) | [54] |
| 34 | MNLI-G | (1.5 ∗ NIR2 − 1.5 ∗ GREEN)/(NIR2 + RED + 0.5) | [55] |
| 35 | MSR | [(NIR/RED) − 1]/[(NIR/RED) + 1]^(1/2) | [42] |
| 36 | MSR-REG | [(NIR/REG) − 1]/[(NIR/REG) + 1]^(1/2) | [40] |
| 37 | MTCI | (NIR − REG)/(REG − RED) | [56] |
| 38 | NDRE | (NIR − REG)/(NIR + REG) | [57] |
| 39 | NAVI | 1 − (RED/NIR) | [58] |
| 40 | OSAVI | 1.6 ∗ [(NIR − RED)/(NIR + RED + 0.16)] | [59] |
| 41 | OSAVI-G | 1.6 ∗ [(NIR − GREEN)/(NIR + GREEN + 0.16)] | [36] |
| 42 | OSAVI-REG | 1.6 ∗ [(NIR − REG)/(NIR + REG + 0.16)] | [60] |
| 43 | RDVI-REG | (NIR − REG)/(NIR + REG)^(1/2) | [61] |
| 44 | RGBVI | (GREEN2 − BLUE ∗ RED)/(GREEN2 + BLUE ∗ RED) | [62] |
| 45 | RTVI-CORE | 100(NIR − REG) − 10(NIR − GREEN) | [63] |
| 46 | SAVI-G | 1.5 ∗ (NIR − GREEN)/(NIR + GREEN + 0.5) | [64] |
| 47 | S-CCCI | NDRE/NDVI | [65] |
| 48 | SIPI | (NIR − BLUE)/(NIR − RED) | [66] |
| 49 | SR-REG | NIR/REG | [67] |
| 50 | TCARI | 3[(REG − RED) − 0.2(REG − GREEN) ∗ (REG/RED)] | [68] |
| 51 | T/O | TCARI/OSAVI | [69] |
| 52 | TVI | 0.5[120(NIR − GREEN) − 200(RED − GREEN)] | [70] |
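Most indices in the table are simple per-pixel band arithmetic and can be computed in a vectorized pass over the multispectral rasters. A minimal sketch for a few representative indices, assuming per-band reflectance arrays (the variable names `nir`, `red`, `green`, `reg` are illustrative, not from the paper's code):

```python
import numpy as np

def compute_indices(nir, red, green, reg, eps=1e-10):
    """Compute a few of the table's vegetation indices from per-band
    reflectance arrays scaled to [0, 1]; eps guards against division
    by zero over dark or masked pixels."""
    return {
        "NDVI":  (nir - red) / (nir + red + eps),          # index 1
        "GNDVI": (nir - green) / (nir + green + eps),      # index 2
        "SAVI":  1.5 * (nir - red) / (nir + red + 0.5),    # index 8
        "NDRE":  (nir - reg) / (nir + reg + eps),          # index 38
    }

# Toy 2x2 reflectance patches standing in for orthomosaic tiles:
nir = np.array([[0.6, 0.5], [0.4, 0.3]])
red = np.full((2, 2), 0.1)
green = np.full((2, 2), 0.15)
reg = np.full((2, 2), 0.3)
vis = compute_indices(nir, red, green, reg)
```

Stacking these index rasters per pixel yields the feature vectors that the classifiers below consume.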
| Model | Hyperparameter Optimization Range | Optimal Parameters |
|---|---|---|
| RF | n_estimators: range (1, 200, 5) min_samples_leaf: range (1, 10, 1) min_samples_split: range (2, 10, 1) max_depth: range (1, 30, 3) max_features: range (1, 10, 1) | n_estimators = 45 max_depth = 6 min_samples_leaf = 6 min_samples_split = 2 max_features = 1 random_state = 5 |
| SVM | kernel: [‘linear’, ‘poly’, ‘rbf’, ‘sigmoid’] C: [0.01, 0.1, 1, 10, 100] gamma: [0.125, 0.25, 0.5, 1, 2, 4, 8, 16, 32, 64, 128] | kernel = ‘poly’ C = 100; gamma = 4 |
| DT | criterion: ‘gini’, ‘entropy’ max_depth: [3, 5, 8, 15, 25, 30, None] min_samples_leaf: [1, 2, 5, 10] min_samples_split: [2, 5, 10, 15, 100] | criterion: ‘gini’ max_depth = 1 min_samples_leaf = 1 min_samples_split = 2 random_state = 5 |
| GBDT | n_estimators: range (1, 50, 2) max_depth: [2, 3, 4, 5, 6] min_samples_split: range (2, 20, 2) min_samples_leaf: range (1, 15, 1) max_features: range (1, 5, 1) subsample: [0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8] | n_estimators = 11 max_depth = 2 min_samples_split = 2 min_samples_leaf = 7 max_features = 4 subsample = 0.2 random_state = 5 |
| XGB | n_estimators: [30, 50, 100, 300, 500, 1000, 2000] max_depth: [1, 2, 3, 4, 5, 6, 7, 8] learning_rate: [0.01, 0.05, 0.1, 0.2, 0.3, 0.4, 0.5] gamma: [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1] reg_alpha: [0.0001, 0.001, 0.01, 0.1, 1, 100] reg_lambda: [0.0001, 0.001, 0.01, 0.1, 1, 100] min_child_weight: [2, 3, 4, 5, 6, 7, 8] colsample_bytree: [0.6, 0.7, 0.8, 0.9] subsample: [0.6, 0.7, 0.8, 0.9] scale_pos_weight = 1 | n_estimators = 8 max_depth = 2 learning_rate = 0.05 gamma = 1 reg_alpha = 0.0001 reg_lambda = 1 colsample_bytree = 0.75 min_child_weight = 1 subsample = 0.9 random_state = 5 |
| KNN | n_neighbors = range [1, 10] weights = [‘uniform’, ‘distance’] p = range [1, 5] | n_neighbors = 9 p = 3 weights = distance |
| LGBM | max_depth: range (2, 30, 1) num_leaves: range (2, 12, 1) min_data_in_leaf: range (1, 102, 10) max_bin: range (5, 256, 10) feature_fraction: [0.6, 0.7, 0.8, 0.9, 1.0] bagging_fraction: [0.6, 0.7, 0.8, 0.9, 1.0] bagging_freq: range (0,81, 10) lambda_l1: [1 × 10−5, 1 × 10−3, 1 × 10−1, 0.0, 0.1, 0.3, 0.5, 0.7, 0.9, 1.0] lambda_l2: [1 × 10−5, 1 × 10−3, 1 × 10−1, 0.0, 0.1, 0.3, 0.5, 0.7, 0.9, 1.0] min_split_gain: [0.0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0] n_estimators: range (1, 100, 5); learning_rate: [0.008, 0.01, 0.02, 0.03, 0.04, 0.06, 0.08, 0.1] | max_depth = 2 num_leaves = 4 min_data_in_leaf = 41 max_bin = 55 feature_fraction = 0.9 bagging_fraction = 0.6 bagging_freq = 10 lambda_l1 = 1 × 10−5 lambda_l2 = 1 × 10−5 min_split_gain = 0.4 n_estimators = 100 learning_rate = 0.01 random_state = 5 |
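The optimal parameters in the table result from searching the listed ranges. A minimal grid-search sketch for the RF row with scikit-learn, using a reduced slice of the table's ranges and placeholder data so it runs quickly (the real search would use the full ranges and the field-survey feature set):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

# Placeholder data standing in for the VI feature vectors and the
# four damage-class labels (0-3).
X, y = make_classification(n_samples=200, n_features=10, n_classes=4,
                           n_informative=6, random_state=5)

# Reduced slice of the RF ranges in the table, for a fast sketch.
param_grid = {
    "n_estimators": range(5, 60, 10),
    "max_depth": [2, 4, 6],
}
search = GridSearchCV(RandomForestClassifier(random_state=5),
                      param_grid, cv=5, scoring="accuracy", n_jobs=-1)
search.fit(X, y)
print(search.best_params_)
```

The same pattern, with the corresponding grids, covers the SVM, DT, GBDT, XGB, KNN, and LGBM rows.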
| Model | Accuracy | Macro Precision | Macro Recall | Macro F1 | Weighted Precision | Weighted Recall | Weighted F1 |
|---|---|---|---|---|---|---|---|
| RF | 0.778 | 0.901 | 0.581 | 0.612 | 0.844 | 0.778 | 0.744 |
| GBDT | 0.810 | 0.583 | 0.583 | 0.571 | 0.746 | 0.810 | 0.762 |
| SVM | 0.762 | 0.690 | 0.640 | 0.655 | 0.749 | 0.762 | 0.752 |
| XGB | 0.825 | 0.922 | 0.691 | 0.658 | 0.826 | 0.825 | 0.796 |
| DT | 0.794 | 0.578 | 0.567 | 0.557 | 0.737 | 0.794 | 0.754 |
| LGBM | 0.746 | 0.722 | 0.791 | 0.712 | 0.845 | 0.746 | 0.773 |
| KNN | 0.794 | 0.811 | 0.674 | 0.717 | 0.810 | 0.794 | 0.783 |
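The accuracy and the macro- and weighted-averaged precision, recall, and F1 in the table are standard multiclass metrics and can be reproduced from predicted labels with scikit-learn. A minimal sketch on toy labels standing in for the four damage classes:

```python
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

# Toy ground-truth and predicted damage classes (0-3), for illustration only.
y_true = [0, 0, 1, 1, 2, 2, 3, 3, 3]
y_pred = [0, 1, 1, 1, 2, 3, 3, 3, 2]

acc = accuracy_score(y_true, y_pred)
# Macro averaging weights every class equally; weighted averaging
# weights each class by its support.
macro_p, macro_r, macro_f1, _ = precision_recall_fscore_support(
    y_true, y_pred, average="macro", zero_division=0)
wtd_p, wtd_r, wtd_f1, _ = precision_recall_fscore_support(
    y_true, y_pred, average="weighted", zero_division=0)
print(f"accuracy={acc:.3f}  macro F1={macro_f1:.3f}  weighted F1={wtd_f1:.3f}")
```

On imbalanced damage stages the macro and weighted columns can diverge sharply, which is why the tables report both.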
| Base Model | Metamodel | Accuracy | Macro Precision | Macro Recall | Macro F1 | Weighted Precision | Weighted Recall | Weighted F1 |
|---|---|---|---|---|---|---|---|---|
| XGB | XGB | 0.825 | 0.922 | 0.691 | 0.658 | 0.826 | 0.825 | 0.796 |
| | GBDT | 0.571 | 0.190 | 0.333 | 0.242 | 0.326 | 0.571 | 0.416 |
| | DT | 0.825 | 0.922 | 0.691 | 0.658 | 0.826 | 0.825 | 0.796 |
| | KNN | 0.825 | 0.922 | 0.691 | 0.658 | 0.826 | 0.825 | 0.796 |
| | LR | 0.810 | 0.583 | 0.583 | 0.571 | 0.746 | 0.810 | 0.762 |
| GBDT | XGB | 0.794 | 0.578 | 0.567 | 0.557 | 0.737 | 0.794 | 0.745 |
| | GBDT | 0.571 | 0.190 | 0.333 | 0.242 | 0.326 | 0.571 | 0.416 |
| | DT | 0.794 | 0.578 | 0.567 | 0.557 | 0.737 | 0.794 | 0.745 |
| | KNN | 0.794 | 0.578 | 0.567 | 0.557 | 0.737 | 0.794 | 0.745 |
| | LR | 0.794 | 0.578 | 0.567 | 0.557 | 0.737 | 0.794 | 0.745 |
| DT | XGB | 0.794 | 0.578 | 0.567 | 0.557 | 0.737 | 0.794 | 0.745 |
| | GBDT | 0.571 | 0.190 | 0.333 | 0.242 | 0.326 | 0.571 | 0.416 |
| | DT | 0.778 | 0.718 | 0.672 | 0.685 | 0.805 | 0.778 | 0.781 |
| | KNN | 0.794 | 0.578 | 0.567 | 0.557 | 0.737 | 0.794 | 0.745 |
| | LR | 0.794 | 0.578 | 0.567 | 0.557 | 0.737 | 0.794 | 0.745 |
| KNN | XGB | 0.794 | 0.889 | 0.667 | 0.724 | 0.830 | 0.794 | 0.779 |
| | GBDT | 0.571 | 0.190 | 0.333 | 0.242 | 0.326 | 0.571 | 0.416 |
| | DT | 0.794 | 0.889 | 0.667 | 0.724 | 0.830 | 0.794 | 0.779 |
| | KNN | 0.794 | 0.889 | 0.667 | 0.724 | 0.830 | 0.794 | 0.779 |
| | LR | 0.794 | 0.889 | 0.667 | 0.724 | 0.830 | 0.794 | 0.779 |
| Base Model | Metamodel | Accuracy | Macro Precision | Macro Recall | Macro F1 | Weighted Precision | Weighted Recall | Weighted F1 |
|---|---|---|---|---|---|---|---|---|
| XGB + GBDT + DT | XGB | 0.825 | 0.922 | 0.631 | 0.658 | 0.866 | 0.825 | 0.796 |
| | GBDT | 0.571 | 0.190 | 0.333 | 0.242 | 0.327 | 0.571 | 0.416 |
| | DT | 0.825 | 0.922 | 0.631 | 0.658 | 0.866 | 0.825 | 0.796 |
| | KNN | 0.825 | 0.922 | 0.631 | 0.658 | 0.866 | 0.825 | 0.796 |
| | LR | 0.810 | 0.917 | 0.614 | 0.644 | 0.857 | 0.810 | 0.779 |
| XGB + GBDT + KNN | XGB | 0.825 | 0.922 | 0.631 | 0.658 | 0.866 | 0.825 | 0.796 |
| | GBDT | 0.571 | 0.190 | 0.333 | 0.242 | 0.327 | 0.571 | 0.416 |
| | DT | 0.857 | 0.933 | 0.726 | 0.782 | 0.886 | 0.857 | 0.847 |
| | KNN | 0.810 | 0.917 | 0.676 | 0.736 | 0.857 | 0.810 | 0.795 |
| | LR | 0.825 | 0.922 | 0.631 | 0.658 | 0.866 | 0.825 | 0.796 |
| GBDT + DT + KNN | XGB | 0.825 | 0.922 | 0.631 | 0.658 | 0.866 | 0.825 | 0.796 |
| | GBDT | 0.571 | 0.190 | 0.333 | 0.242 | 0.327 | 0.571 | 0.416 |
| | DT | 0.794 | 0.912 | 0.660 | 0.719 | 0.848 | 0.794 | 0.776 |
| | KNN | 0.778 | 0.907 | 0.612 | 0.664 | 0.840 | 0.778 | 0.753 |
| | LR | 0.762 | 0.902 | 0.595 | 0.646 | 0.832 | 0.762 | 0.764 |
| XGB + GBDT + DT + KNN | XGB | 0.841 | 0.928 | 0.710 | 0.767 | 0.841 | 0.859 | 0.830 |
| | GBDT | 0.571 | 0.190 | 0.333 | 0.242 | 0.327 | 0.571 | 0.416 |
| | DT | 0.857 | 0.933 | 0.726 | 0.782 | 0.886 | 0.857 | 0.847 |
| | KNN | 0.810 | 0.917 | 0.676 | 0.736 | 0.857 | 0.810 | 0.795 |
| | LR | 0.841 | 0.928 | 0.679 | 0.727 | 0.876 | 0.841 | 0.823 |
| Base Model | Metamodel | Accuracy | Macro Precision | Macro Recall | Macro F1 | Weighted Precision | Weighted Recall | Weighted F1 |
|---|---|---|---|---|---|---|---|---|
| XGB + GBDT | XGB | 0.810 | 0.583 | 0.583 | 0.571 | 0.746 | 0.810 | 0.762 |
| | GBDT | 0.571 | 0.190 | 0.333 | 0.242 | 0.326 | 0.571 | 0.416 |
| | DT | 0.825 | 0.922 | 0.691 | 0.658 | 0.826 | 0.825 | 0.796 |
| | KNN | 0.825 | 0.922 | 0.691 | 0.658 | 0.826 | 0.825 | 0.796 |
| | LR | 0.810 | 0.583 | 0.583 | 0.571 | 0.746 | 0.810 | 0.762 |
| XGB + DT | XGB | 0.825 | 0.922 | 0.691 | 0.658 | 0.826 | 0.825 | 0.796 |
| | GBDT | 0.571 | 0.190 | 0.333 | 0.242 | 0.326 | 0.571 | 0.416 |
| | DT | 0.825 | 0.922 | 0.691 | 0.658 | 0.826 | 0.825 | 0.796 |
| | KNN | 0.810 | 0.917 | 0.614 | 0.644 | 0.857 | 0.810 | 0.780 |
| | LR | 0.810 | 0.917 | 0.614 | 0.644 | 0.857 | 0.810 | 0.780 |
| XGB + KNN | XGB | 0.825 | 0.922 | 0.631 | 0.658 | 0.866 | 0.825 | 0.796 |
| | GBDT | 0.571 | 0.190 | 0.333 | 0.242 | 0.327 | 0.571 | 0.416 |
| | DT | 0.857 | 0.933 | 0.726 | 0.782 | 0.886 | 0.857 | 0.847 |
| | KNN | 0.841 | 0.928 | 0.710 | 0.767 | 0.876 | 0.841 | 0.830 |
| | LR | 0.825 | 0.922 | 0.631 | 0.658 | 0.866 | 0.825 | 0.796 |
| GBDT + DT | XGB | 0.794 | 0.578 | 0.567 | 0.557 | 0.737 | 0.794 | 0.745 |
| | GBDT | 0.571 | 0.190 | 0.333 | 0.242 | 0.326 | 0.571 | 0.416 |
| | DT | 0.778 | 0.718 | 0.678 | 0.685 | 0.805 | 0.778 | 0.781 |
| | KNN | 0.778 | 0.730 | 0.672 | 0.691 | 0.800 | 0.778 | 0.778 |
| | LR | 0.778 | 0.573 | 0.550 | 0.542 | 0.729 | 0.778 | 0.729 |
| GBDT + KNN | XGB | 0.746 | 0.541 | 0.524 | 0.514 | 0.693 | 0.746 | 0.696 |
| | GBDT | 0.571 | 0.190 | 0.333 | 0.242 | 0.326 | 0.571 | 0.416 |
| | DT | 0.794 | 0.889 | 0.667 | 0.724 | 0.830 | 0.794 | 0.779 |
| | KNN | 0.857 | 0.918 | 0.734 | 0.782 | 0.875 | 0.854 | 0.846 |
| | LR | 0.778 | 0.907 | 0.643 | 0.701 | 0.840 | 0.778 | 0.757 |
| DT + KNN | XGB | 0.778 | 0.884 | 0.619 | 0.668 | 0.821 | 0.778 | 0.756 |
| | GBDT | 0.571 | 0.190 | 0.333 | 0.242 | 0.327 | 0.571 | 0.416 |
| | DT | 0.794 | 0.912 | 0.660 | 0.719 | 0.848 | 0.794 | 0.776 |
| | KNN | 0.841 | 0.928 | 0.710 | 0.767 | 0.876 | 0.841 | 0.830 |
| | LR | 0.778 | 0.907 | 0.612 | 0.664 | 0.840 | 0.778 | 0.753 |
| Integration Method | Base Model | Metamodel | Accuracy | Macro Precision | Macro Recall | Macro F1 | Weighted Precision | Weighted Recall | Weighted F1 |
|---|---|---|---|---|---|---|---|---|---|
| Stacking (use_probas) | XGB + GBDT + DT + KNN | LR | 0.825 | 0.922 | 0.662 | 0.712 | 0.825 | 0.730 | 0.807 |
| Stacking (make_pipeline) | XGB + KNN | DT | 0.825 | 0.922 | 0.631 | 0.658 | 0.866 | 0.825 | 0.796 |
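The use_probas variant stacks on the base models' class probabilities rather than on their hard labels. A minimal sketch of that scheme with scikit-learn's StackingClassifier and placeholder data (the paper's exact pipeline and feature set are not reproduced here):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier, GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

# Placeholder features and four damage-class labels (0-3).
X, y = make_classification(n_samples=300, n_features=10, n_classes=4,
                           n_informative=6, random_state=5)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=5)

stack = StackingClassifier(
    estimators=[("gbdt", GradientBoostingClassifier(n_estimators=50,
                                                    random_state=5)),
                ("dt", DecisionTreeClassifier(random_state=5)),
                ("knn", KNeighborsClassifier())],
    final_estimator=LogisticRegression(max_iter=1000),
    stack_method="predict_proba",  # stack on class probabilities
    cv=5)                          # out-of-fold meta-features
stack.fit(X_tr, y_tr)
print(f"test accuracy: {stack.score(X_te, y_te):.3f}")
```

Cross-validated (out-of-fold) predictions for the meta-learner are what keep the stacked ensemble from simply memorizing its base models' training errors.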
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Fan, S.; He, Q.; Chen, Y.; Xu, X.; Guo, W.; Lu, Y.; Liu, J.; Qiao, H. Identification of Cotton Leaf Mite Damage Stages Using UAV Multispectral Images and a Stacked Ensemble Method. Agriculture 2025, 15, 2277. https://doi.org/10.3390/agriculture15212277