# Combining Spectral and Textural Information from UAV RGB Images for Leaf Area Index Monitoring in Kiwifruit Orchard


## Abstract

… R^{2} of 0.947 and 0.765, RMSE of 0.048 and 0.102, and nRMSE of 7.99% and 16.81%, respectively. Moreover, the RFR model (R^{2} = 0.972, RMSE = 0.035, nRMSE = 5.80%) exhibited the best accuracy in estimating LAI, followed by the SWR model (R^{2} = 0.765, RMSE = 0.102, nRMSE = 16.81%) and the univariate linear regression model (R^{2} = 0.736, RMSE = 0.108, nRMSE = 17.84%). It was concluded that estimation based on UAV spectral parameters combined with texture features provides an effective method for monitoring the kiwifruit growth process. This approach is expected to offer scientific guidance and practical methods for field kiwifruit management, allowing low-cost UAV remote sensing to realize large-area, high-quality monitoring of kiwifruit growth and thus providing a theoretical basis for kiwifruit growth investigation.

## 1. Introduction

## 2. Materials and Methods

### 2.1. Study Area Overview

### 2.2. Data Acquisition

#### 2.2.1. LAI Measurement of Kiwi Orchard

#### 2.2.2. UAV RGB Image Acquisition and Preprocessing

### 2.3. Methods

#### 2.3.1. Extraction of Spectral and Texture Features

#### 2.3.2. Model Calibration and Evaluation

The coefficient of determination (R^{2}), root mean square error (RMSE), and normalized root mean square error (nRMSE) were used to measure the predictive performance of each estimation model. Higher values of R^{2} and lower values of RMSE and nRMSE indicate a better fit and higher accuracy of the model in predicting LAI. In Formulas (1)–(3), ${y}_{i}$ is the measured value, $\overline{y}$ is the mean of the measured values, ${\widehat{y}}_{i}$ is the predicted value, and n is the number of samples. All statistical analysis was completed with the software R; more details on the machine learning algorithms and calibration methods mentioned above can be found in the corresponding library packages (http://www.r-project.org/, accessed on 12 February 2022).
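Formulas (1)–(3) are not reproduced in this extract. Under their standard definitions (with nRMSE taken as RMSE relative to the mean of the measured values, expressed in percent, an assumption consistent with the reported ranges), the three metrics can be sketched as:

```python
import numpy as np

def evaluate(y_true, y_pred):
    """Return R^2, RMSE and nRMSE (% of the measured mean) for model validation."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    ss_res = np.sum((y_true - y_pred) ** 2)          # residual sum of squares
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)   # total sum of squares
    r2 = 1.0 - ss_res / ss_tot                       # coefficient of determination
    rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))  # root mean square error
    nrmse = 100.0 * rmse / y_true.mean()             # normalized RMSE, in percent
    return r2, rmse, nrmse
```

These three values are exactly what the modeling and validation tables in Section 3 report for each model and growth stage.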

## 3. Results

### 3.1. Correlation between LAI and UAV RGB Image Parameters

### 3.2. LAI Modeling and Accuracy Verification

#### 3.2.1. Unitary Linear Model Construction and Precision Analysis

The R^{2} only reached 0.466 in IF, so the model should be applied prudently. To verify the applicability of the models, the validation set was used, and a fitting analysis was performed between the predicted and measured values. Figure 5 shows that, in general, the predicted values in the low-value interval were lower than the measured values, while in the high-value interval they were higher. The precision of the single-factor models was not sufficient for monitoring LAI, so multi-factor models needed to be established.
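Each single-factor model is a univariate quadratic of the form y = ax^{2} + bx + c in one spectral parameter (e.g. R or ExGR). A minimal sketch of such a fit, on synthetic data with illustrative coefficients (not the paper's measurements), could look like this:

```python
import numpy as np

# Synthetic stand-in for one spectral parameter (e.g. ExGR) and the measured LAI.
rng = np.random.default_rng(0)
x = rng.uniform(-50.0, 50.0, 60)                          # spectral-index values
lai = 0.0001 * x**2 + 0.03 * x + 4.2 + rng.normal(0.0, 0.05, 60)

a, b, c = np.polyfit(x, lai, deg=2)                       # least-squares quadratic fit
lai_hat = np.polyval([a, b, c], x)                        # fitted LAI, calibration set
rmse = float(np.sqrt(np.mean((lai - lai_hat) ** 2)))      # calibration RMSE
```

The same fitted coefficients would then be applied to the validation set to produce the predicted-versus-measured comparison shown in Figure 5.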

#### 3.2.2. LAI Estimation Models Established by Spectral Index Only

The modeling R^{2} reached 0.541–0.819, and the RMSE and nRMSE were 0.049–0.102 and 11.55–16.81%, respectively (Table 6). The verification R^{2} was 0.690–0.819, and the RMSE and nRMSE were 0.057–0.084 and 13.10–16.36%, respectively (Figure 6). Model accuracy thus improved to a certain extent after adopting the SWR model at the different growth stages. The analysis of the RFR model showed that the modeling R^{2} of each period was greater than or equal to 0.965, with the highest reaching roughly 0.973. In terms of the validation sets, the RFR model performed best at YF. Furthermore, the RFR model consistently outperformed the SWR model at each growth stage.
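The SWR models above were selected in R by minimizing AIC. A minimal Python analogue, assuming a Gaussian-likelihood AIC and a greedy forward search (the paper does not specify the exact search direction), can be sketched as:

```python
import numpy as np

def ols_aic(X, y):
    """AIC of an OLS fit (Gaussian likelihood, up to an additive constant)."""
    Xd = np.column_stack([np.ones(len(y)), X])      # design matrix with intercept
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    rss = np.sum((y - Xd @ beta) ** 2)              # residual sum of squares
    n, k = len(y), Xd.shape[1]
    return n * np.log(rss / n) + 2 * k              # AIC = n*ln(RSS/n) + 2k

def forward_stepwise(X, y, names):
    """Greedily add the predictor that lowers AIC most; stop when none does."""
    selected, remaining = [], list(range(X.shape[1]))
    best_aic, improved = np.inf, True
    while improved and remaining:
        improved = False
        scores = [(ols_aic(X[:, selected + [j]], y), j) for j in remaining]
        aic, j = min(scores)
        if aic < best_aic:
            best_aic, improved = aic, True
            selected.append(j)
            remaining.remove(j)
    return [names[j] for j in selected], best_aic
```

Run on the candidate spectral parameters, such a procedure returns the retained variable subset and its AIC, analogous to the "Spectral Parameters" and "AIC" columns of Table 6.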

#### 3.2.3. LAI Estimation Models Combined with Texture Features

Compared with the inversion based on spectral indices alone, the modeling R^{2} of the SWR model increased by at least 0.318, reaching 0.859, while the RMSE and nRMSE decreased by 0.034 and 6.56%, to 0.042 and 8.14%, respectively. Moreover, the modeling R^{2} values of the RFR model with integrated texture features were greater than or equal to 0.968 at each period, with RMSE and nRMSE as low as 0.032 and 5.30%. According to the validation results (Figure 7), the RFR model performed best in FE, with R^{2}, RMSE, and nRMSE of 0.829, 0.069, and 13.49%, respectively. After combining the spectral indices and texture information, the RFR model again outperformed the SWR model at each period.
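The paper fitted the RFR models in R. A scikit-learn sketch of the same combined spectral-plus-texture workflow is shown below; the feature columns and LAI values are synthetic stand-ins, and the column meanings in the comments are assumptions for illustration only:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 120
spectral = rng.uniform(0.0, 1.0, (n, 5))   # e.g. EXG, VARI, GRRI, GBRI, MGRVI
texture = rng.uniform(0.0, 1.0, (n, 4))    # e.g. MEA_R, VAR_R, HOM_G, ENT_B
X = np.hstack([spectral, texture])         # combined feature matrix (gamma + delta)
lai = 1.5 * spectral[:, 0] + 0.8 * texture[:, 0] + rng.normal(0.0, 0.05, n)

# Hold out a validation set, fit the random forest, and score it.
X_tr, X_te, y_tr, y_te = train_test_split(X, lai, test_size=0.3, random_state=0)
rfr = RandomForestRegressor(n_estimators=500, random_state=0).fit(X_tr, y_tr)
r2_validation = rfr.score(X_te, y_te)      # validation R^2
```

With real orchard data, `rfr.feature_importances_` would additionally indicate how much the texture columns contribute relative to the spectral indices.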

### 3.3. Model Selection and Inversion Mapping

## 4. Discussion

### 4.1. Feasibility of LAI Estimation by UAV RGB Images

### 4.2. Advantages of Estimation after Combining with Texture Features

### 4.3. Model Optimization Selection of LAI

## 5. Conclusions

The R^{2} was 0.820 with the SWR model in FE. Therefore, the new indices were suitable for monitoring kiwifruit growth, and it is strongly suggested that spectral and textural information be combined in the growth monitoring of kiwi orchards. Furthermore, using the RFR model significantly improved the predictive ability and accuracy of the model according to the R^{2}, RMSE, and nRMSE values. The verification results indicated that the prediction accuracy across the diverse growth stages was better when using the RFR model, and the validation accuracy in FE (R^{2} = 0.829) was the best. In conclusion, the UAV-based inversion technique combining spectral indices and texture features provides a cost-effective, fast, and efficient method for kiwifruit growth monitoring. It can also realize large-scale, high-quality monitoring of kiwifruit orchards, providing a theoretical basis for kiwi growth investigation.

## Supplementary Materials

## Author Contributions

## Funding

## Institutional Review Board Statement

## Informed Consent Statement

## Data Availability Statement

## Acknowledgments

## Conflicts of Interest

## References

- Tian, Y.; Huang, H.; Zhou, G.; Zhang, Q.; Tao, J.; Zhang, Y.; Lin, J. Aboveground mangrove biomass estimation in Beibu Gulf using machine learning and UAV remote sensing. Sci. Total Environ.
**2021**, 781, 146816. [Google Scholar] [CrossRef] - Kong, B.; Yu, H.; Du, R.; Wang, Q. Quantitative Estimation of Biomass of Alpine Grasslands Using Hyperspectral Remote Sensing. Rangel. Ecol. Manag.
**2019**, 72, 336–346. [Google Scholar] [CrossRef] - Ali, A.; Imran, M. Evaluating the potential of red edge position (REP) of hyperspectral remote sensing data for real time estimation of LAI & chlorophyll content of kinnow mandarin (Citrus reticulata) fruit orchards. Sci. Hortic.
**2020**, 267, 109326. [Google Scholar] [CrossRef] - Zhang, Y.; Hui, J.; Qin, Q.; Sun, Y.; Zhang, T.; Sun, H.; Li, M. Transfer-learning-based approach for leaf chlorophyll content estimation of winter wheat from hyperspectral data. Remote Sens. Environ.
**2021**, 267, 112724. [Google Scholar] [CrossRef] - Gano, B.; Dembele, J.S.B.; Ndour, A.; Luquet, D.; Beurier, G.; Diouf, D.; Audebert, A. Using UAV Borne, Multi-Spectral Imaging for the Field Phenotyping of Shoot Biomass, Leaf Area Index and Height of West African Sorghum Varieties under Two Contrasted Water Conditions. Agronomy
**2021**, 11, 850. [Google Scholar] - Zhang, J.; Li, M.; Sun, Z.; Liu, H.; Sun, H.; Yang, W. Chlorophyll Content Detection of Field Maize Using RGB-NIR Camera. IFAC-Paper
**2018**, 51, 700–705. [Google Scholar] [CrossRef] - Bendig, J.; Bolten, A.; Bennertz, S.; Broscheit, J.; Eichfuss, S.; Bareth, G. Estimating Biomass of Barley Using Crop Surface Models (CSMs) Derived from UAV-Based RGB Imaging. Remote Sens.
**2014**, 6, 10395–10412. [Google Scholar] - Wan, L.; Cen, H.; Zhu, J.; Zhang, J.; Zhu, Y.; Sun, D.; Du, X.; Zhai, L.; Weng, H.; Li, Y.; et al. Grain yield prediction of rice using multi-temporal UAV-based RGB and multispectral images and model transfer—Case study of small farmlands in the South of China. Agric. For. Meteorol.
**2020**, 291, 108096. [Google Scholar] [CrossRef] - Li, B.; Xu, X.; Zhang, L.; Han, J.; Bian, C.; Li, G.; Liu, J.; Jin, L. Above-ground biomass estimation and yield prediction in potato by using UAV-based RGB and hyperspectral imaging. ISPRS J. Photogramm. Remote Sens.
**2020**, 162, 161–172. [Google Scholar] [CrossRef] - Zheng, H.; Cheng, T.; Li, D.; Yao, X.; Tian, Y.; Cao, W.; Zhu, Y. Combining Unmanned Aerial Vehicle (UAV)-Based Multispectral Imagery and Ground-Based Hyperspectral Data for Plant Nitrogen Concentration Estimation in Rice. Front Plant Sci.
**2018**, 9, 936. [Google Scholar] [CrossRef] - Qiu, Z.; Ma, F.; Li, Z.; Xu, X.; Ge, H.; Du, C. Estimation of nitrogen nutrition index in rice from UAV RGB images coupled with machine learning algorithms. Comput. Electron. Agric.
**2021**, 189, 106421. [Google Scholar] [CrossRef] - Zhou, Y.; Lao, C.; Yang, Y.; Zhang, Z.; Chen, H.; Chen, Y.; Chen, J.; Ning, J.; Yang, N. Diagnosis of winter-wheat water stress based on UAV-borne multispectral image texture and vegetation indices. Agric. Water Manag.
**2021**, 256, 107076. [Google Scholar] [CrossRef] - Lama, G.F.C.; Crimaldi, M.; Pasquino, V.; Padulano, R.; Chirico, G.B. Bulk Drag Predictions of Riparian Arundo donax Stands through UAV-Acquired Multispectral Images. Water
**2021**, 13, 1333. [Google Scholar] - Taddia, Y.; Russo, P.; Lovo, S.; Pellegrinelli, A. Multispectral UAV monitoring of submerged seaweed in shallow water. Appl. Geomat.
**2020**, 12, 19–34. [Google Scholar] [CrossRef] [Green Version] - Fernández-Lozano, J.; Sanz-Ablanedo, E. Unraveling the Morphological Constraints on Roman Gold Mining Hydraulic Infrastructure in NW Spain. A UAV-Derived Photogrammetric and Multispectral Approach. Remote Sens.
**2021**, 13, 291. [Google Scholar] - Benos, L.; Tagarakis, A.C.; Dolias, G.; Berruto, R.; Kateris, D.; Bochtis, D. Machine Learning in Agriculture: A Comprehensive Updated Review. Sensors
**2021**, 21, 3758. [Google Scholar] - Sadeghifar, T.; Lama, G.F.C.; Sihag, P.; Bayram, A.; Kisi, O. Wave height predictions in complex sea flows through soft-computing models: Case study of Persian Gulf. Ocean Eng.
**2022**, 245, 110467. [Google Scholar] [CrossRef] - Hashim, W.; Eng, L.S.; Alkawsi, G.; Ismail, R.; Alkahtani, A.A.; Dzulkifly, S.; Baashar, Y.; Hussain, A. A Hybrid Vegetation Detection Framework: Integrating Vegetation Indices and Convolutional Neural Network. Symmetry
**2021**, 13, 2190. [Google Scholar] - Watson, D.J. Comparative Physiological Studies on the Growth of Field Crops: I. Variation in Net Assimilation Rate and Leaf Area between Species and Varieties, and within and between Years. Ann. Bot.
**1947**, 11, 41–76. [Google Scholar] - Negrón Juárez, R.I.; da Rocha, H.R.; e Figueira, A.M.S.; Goulden, M.L.; Miller, S.D. An improved estimate of leaf area index based on the histogram analysis of hemispherical photographs. Agric. For. Meteorol.
**2009**, 149, 920–928. [Google Scholar] [CrossRef] [Green Version] - Vose, J.M.; Barton, D.; Clinton, N.H.; Sullivan, P.V.B. Vertical leaf area distribution, light transmittance, and application of the Beer–Lambert Law in four mature hardwood stands in the southern Appalachians. Can. J. For. Res.
**1995**, 25, 1036–1043. [Google Scholar] - Wilhelm, W.W.; Ruwe, K.; Schlemmer, M.R. Comparison of three leaf area index meters in a corn canopy. Crop Sci.
**2000**, 40, 1179–1183. [Google Scholar] - Glatthorn, J.; Pichler, V.; Hauck, M.; Leuschner, C. Effects of forest management on stand leaf area: Comparing beech production and primeval forests in Slovakia. For. Ecol. Manag.
**2017**, 389, 76–85. [Google Scholar] [CrossRef] - Jiang, S.; Zhao, L.; Liang, C.; Hu, X.; Yaosheng, W.; Gong, D.; Zheng, S.; Huang, Y.; He, Q.; Cui, N. Leaf- and ecosystem-scale water use efficiency and their controlling factors of a kiwifruit orchard in the humid region of Southwest China. Agric. Water Manag.
**2022**, 260, 107329. [Google Scholar] [CrossRef] - Srinet, R.; Nandy, S.; Patel, N.R. Estimating leaf area index and light extinction coefficient using Random Forest regression algorithm in a tropical moist deciduous forest, India. Ecol. Inform.
**2019**, 52, 94–102. [Google Scholar] [CrossRef] - Ren, B.; Li, L.; Dong, S.; Liu, P.; Zhao, B.; Zhang, J. Photosynthetic Characteristics of Summer Maize Hybrids with Different Plant Heights. Agron. J.
**2017**, 109, 1454. [Google Scholar] - Hassanijalilian, O.; Igathinathane, C.; Doetkott, C.; Bajwa, S.; Nowatzki, J.; Haji Esmaeili, S.A. Chlorophyll estimation in soybean leaves infield with smartphone digital imaging and machine learning. Comput. Electron. Agric.
**2020**, 174, 105433. [Google Scholar] [CrossRef] - Lu, J.; Cheng, D.; Geng, C.; Zhang, Z.; Xiang, Y.; Hu, T. Combining plant height, canopy coverage and vegetation index from UAV-based RGB images to estimate leaf nitrogen concentration of summer maize. Biosyst. Eng.
**2021**, 202, 42–54. [Google Scholar] [CrossRef] - Raj, R.; Walker, J.P.; Pingale, R.; Nandan, R.; Naik, B.; Jagarlapudi, A. Leaf area index estimation using top-of-canopy airborne RGB images. Int. J. Appl. Earth Obs. Geoinf.
**2021**, 96, 102282. [Google Scholar] [CrossRef] - Shao, G.; Han, W.; Zhang, H.; Liu, S.; Wang, Y.; Zhang, L.; Cui, X. Mapping maize crop coefficient Kc using random forest algorithm based on leaf area index and UAV-based multispectral vegetation indices. Agric. Water Manag.
**2021**, 252, 106906. [Google Scholar] [CrossRef] - Guo, Z.-c.; Wang, T.; Liu, S.-l.; Kang, W.-p.; Chen, X.; Feng, K.; Zhang, X.-q.; Zhi, Y. Biomass and vegetation coverage survey in the Mu Us sandy land-based on unmanned aerial vehicle RGB images. Int. J. Appl. Earth Obs. Geoinf.
**2021**, 94, 102239. [Google Scholar] [CrossRef] - Li, Y.; Liu, H.; Ma, J.; Zhang, L. Estimation of leaf area index for winter wheat at early stages based on convolutional neural networks. Comput. Electron. Agric.
**2021**, 190, 106480. [Google Scholar] [CrossRef] - Yue, J.; Yang, G.; Tian, Q.; Feng, H.; Xu, K.; Zhou, C. Estimate of winter-wheat above-ground biomass based on UAV ultrahigh-ground-resolution image textures and vegetation indices. ISPRS J. Photogramm. Remote Sens.
**2019**, 150, 226–244. [Google Scholar] [CrossRef] - Flores, P.; Zhang, Z.; Igathinathane, C.; Jithin, M.; Naik, D.; Stenger, J.; Ransom, J.; Kiran, R. Distinguishing seedling volunteer corn from soybean through greenhouse color, color-infrared, and fused images using machine and deep learning. Ind. Crop. Prod.
**2021**, 161, 113223. [Google Scholar] [CrossRef] - Waheed, A.; Goyal, M.; Gupta, D.; Khanna, A.; Hassanien, A.E.; Pandey, H.M. An optimized dense convolutional neural network model for disease recognition and classification in corn leaf. Comput. Electron. Agric.
**2020**, 175, 105456. [Google Scholar] [CrossRef] - Maimaitijiang, M.; Sagan, V.; Sidike, P.; Maimaitiyiming, M.; Hartling, S.; Peterson, K.T.; Maw, M.J.W.; Shakoor, N.; Mockler, T.; Fritschi, F.B. Vegetation Index Weighted Canopy Volume Model (CVMVI) for soybean biomass estimation from Unmanned Aerial System-based RGB imagery. ISPRS J. Photogramm. Remote Sens.
**2019**, 151, 27–41. [Google Scholar] [CrossRef] - Guo, Y.; Fu, Y.H.; Chen, S.; Robin Bryant, C.; Li, X.; Senthilnath, J.; Sun, H.; Wang, S.; Wu, Z.; de Beurs, K. Integrating spectral and textural information for identifying the tasseling date of summer maize using UAV based RGB images. Int. J. Appl. Earth Obs. Geoinf.
**2021**, 102, 102435. [Google Scholar] [CrossRef] - Sumesh, K.C.; Ninsawat, S.; Som-ard, J. Integration of RGB-based vegetation index, crop surface model and object-based image analysis approach for sugarcane yield estimation using unmanned aerial vehicle. Comput. Electron. Agric.
**2021**, 180, 105903. [Google Scholar] [CrossRef] - Haralick, R.M.; Shanmugam, K.; Dinstein, I. Textural Features for Image Classification. IEEE Trans. Syst. Man Cybern.
**1973**, SMC-3, 610–621. [Google Scholar] - Laliberte, A.S.; Rango, A. Texture and Scale in Object-Based Analysis of Subdecimeter Resolution Unmanned Aerial Vehicle (UAV) Imagery. IEEE Trans. Geosci. Remote Sens.
**2009**, 47, 761–770. [Google Scholar] - Murray, H.; Lucieer, A.; Williams, R. Texture-based classification of sub-Antarctic vegetation communities on Heard Island. Int. J. Appl. Earth Obs. Geoinf.
**2010**, 12, 138–149. [Google Scholar] - Kelsey, K.C.; Neff, J.C. Estimates of Aboveground Biomass from Texture Analysis of Landsat Imagery. Remote Sens.
**2014**, 6, 6407–6422. [Google Scholar] - Sarker, L.R.; Nichol, J.E. Improved forest biomass estimates using ALOS AVNIR-2 texture indices. Remote Sens. Environ.
**2011**, 115, 968–977. [Google Scholar] - Chen, J.M.; Cihlar, J. Retrieving leaf area index of boreal conifer forests using Landsat TM images. Remote Sens. Environ.
**1996**, 55, 153–162. [Google Scholar] [CrossRef] - Torres-Sánchez, J.; Pena, J.M.; De Castro, A.I.; López-Granados, F. Multi-temporal mapping of the vegetation fraction in early-season wheat fields using images from UAV. Comput. Electron. Agric.
**2014**, 103, 104–113. [Google Scholar] - Soudani, K.; Fran?Ois, C.; Maire, G.L.; Dantec, V.L.; Dufrêne, E. Comparative analysis of IKONOS, SPOT, and ETM+ data for leaf area index estimation in temperate coniferous and deciduous forest stands. Remote Sens. Environ.
**2006**, 102, 161–175. [Google Scholar] - Verrelst, J.; Schaepman, M.E.; Koetz, B.; Kneubühler, M. Angular sensitivity analysis of vegetation indices derived from CHRIS/PROBA data. Remote Sens. Environ.
**2008**, 112, 2341–2353. [Google Scholar] - Sellaro, R.; Crepy, M.; Trupkin, S.A.; Karayekov, E.; Buchovsky, A.S.; Rossi, C.; Casal, J.J. Cryptochrome as a Sensor of the Blue/Green Ratio of Natural Radiation in Arabidopsis. Plant Physiol.
**2010**, 154, 401–409. [Google Scholar] - Bendig, J.; Yu, K.; Aasen, H.; Bolten, A.; Bennertz, S.; Broscheit, J.; Gnyp, M.L.; Bareth, G. Combining UAV-based plant height from crop surface models, visible, and near infrared vegetation indices for biomass monitoring in barley. Int. J. Appl. Earth Obs. Geoinf.
**2015**, 39, 79–87. [Google Scholar] [CrossRef] - Zhang, J.; Tian, H.; Wang, D.; Li, H.; Mouazen, A.M. A novel spectral index for estimation of relative chlorophyll content of sugar beet. Comput. Electron. Agric.
**2021**, 184, 106088. [Google Scholar] [CrossRef] - Wu, J.; Wang, D.; Bauer, M.E. Assessing broadband vegetation indices and QuickBird data in estimating leaf area index of corn and potato canopies. Field Crop. Res.
**2007**, 102, 33–42. [Google Scholar] [CrossRef] - Meyer, G.E.; Neto, J.C. Verification of color vegetation indices for automated crop imaging applications. Comput. Electron. Agric.
**2008**, 63, 282–293. [Google Scholar] [CrossRef] - Li, H.; Chen, Z.-x.; Jiang, Z.-w.; Wu, W.-b.; Ren, J.-q.; Liu, B.; Tuya, H. Comparative analysis of GF-1, HJ-1, and Landsat-8 data for estimating the leaf area index of winter wheat. J. Integr. Agric.
**2017**, 16, 266–285. [Google Scholar] [CrossRef] - Singh, A.; Ganapathysubramanian, B.; Singh, A.K.; Sarkar, S. Machine Learning for High-Throughput Stress Phenotyping in Plants. Trends Plant Sci.
**2016**, 21, 110–124. [Google Scholar] [CrossRef] [Green Version] - Cen, H.; Wan, L.; Zhu, J.; Li, Y.; Li, X.; Zhu, Y.; Weng, H.; Wu, W.; Yin, W.; Xu, C.; et al. Dynamic monitoring of biomass of rice under different nitrogen treatments using a lightweight UAV with dual image-frame snapshot cameras. Plant Methods
**2019**, 15, 32. [Google Scholar] [CrossRef] - Geipel, J.; Link, J.; Claupein, W. Combined Spectral and Spatial Modeling of Corn Yield Based on Aerial Images and Crop Surface Models Acquired with an Unmanned Aircraft System. Remote Sens.
**2014**, 6, 10335–10355. [Google Scholar] - Yamaguchi, T.; Tanaka, Y.; Imachi, Y.; Yamashita, M.; Katsura, K. Feasibility of Combining Deep Learning and RGB Images Obtained by Unmanned Aerial Vehicle for Leaf Area Index Estimation in Rice. Remote Sens.
**2021**, 13, 84. [Google Scholar] - Sun, B.; Wang, C.; Yang, C.; Xu, B.; Zhou, G.; Li, X.; Xie, J.; Xu, S.; Liu, B.; Xie, T.; et al. Retrieval of rapeseed leaf area index using the PROSAIL model with canopy coverage derived from UAV images as a correction parameter. Int. J. Appl. Earth Obs. Geoinf.
**2021**, 102, 102373. [Google Scholar] [CrossRef] - Adnan, M.; Abaid-ur-Rehman, M.; Latif, M.A.; Ahmad, N.; Akhter, N. Mapping wheat crop phenology and the yield using machine learning (ML). Int. J. Adv. Comput. Sci. Appl.
**2018**, 9, 301–306. [Google Scholar] - Liu, Y.; Heuvelink, G.B.M.; Bai, Z.; He, P.; Xu, X.; Ding, W.; Huang, S. Analysis of spatio-temporal variation of crop yield in China using stepwise multiple linear regression. Field Crop. Res.
**2021**, 264, 108098. [Google Scholar] [CrossRef] - Ta, N.; Chang, Q.; Zhang, Y. Estimation of Apple Tree Leaf Chlorophyll Content Based on Machine Learning Methods. Remote Sens.
**2021**, 13, 3902. [Google Scholar] [CrossRef]

**Figure 3.** Part of the parameters analyzed in the study. (**a**) R (DN value of the red channel) (IF); (**b**) MGRVI (Modified Green Red Vegetation Index); (**c**) MEA_R (mean of the red band).

**Figure 4.** Correlation between UAV image characteristic parameters and LAI at different growth stages. (**a**) Initial flowering stage (IF); (**b**) young fruit stage (YF); (**c**) fruit enlargement stage (FE).

**Figure 5.** Validation results of the single-factor models for kiwifruit LAI in each growth stage. (**a**) Initial flowering stage (IF); (**b**) young fruit stage (YF); (**c**) fruit enlargement stage (FE).

**Figure 6.** Validation results of the spectral-parameter models for SWR and RFR in each growth stage. (**a**) SWR model in the initial flowering stage (IF); (**b**) SWR model in the young fruit stage (YF); (**c**) SWR model in the fruit enlargement stage (FE); (**d**) RFR model in the initial flowering stage (IF); (**e**) RFR model in the young fruit stage (YF); (**f**) RFR model in the fruit enlargement stage (FE).

**Figure 7.** Validation results of the combined texture-feature models with SWR and RFR in each growth stage. (**a**) SWR model in the initial flowering stage (IF); (**b**) SWR model in the young fruit stage (YF); (**c**) SWR model in the fruit enlargement stage (FE); (**d**) RFR model in the initial flowering stage (IF); (**e**) RFR model in the young fruit stage (YF); (**f**) RFR model in the fruit enlargement stage (FE).

**Figure 8.** Spatial distribution of estimated kiwifruit LAI in the study area. (**a**) Initial flowering stage (IF); (**b**) young fruit stage (YF); (**c**) fruit enlargement stage (FE).

Growth Stages | Date | Number of Images
---|---|---
Initial flowering stage (IF) | 8 May | 145
Young fruit stage (YF) | 5 June | 144
Fruit enlargement stage (FE) | 8 July | 146

Parameters | Name | Formulas | Sources
---|---|---|---
R | DN value of Red Channel | $\mathrm{R} = \mathrm{DN}_{\mathrm{R}}$ | Conventional empirical parameters
G | DN value of Green Channel | $\mathrm{G} = \mathrm{DN}_{\mathrm{G}}$ |
B | DN value of Blue Channel | $\mathrm{B} = \mathrm{DN}_{\mathrm{B}}$ |
r | Normalized Redness Intensity | $\mathrm{r} = \mathrm{R}/(\mathrm{R}+\mathrm{G}+\mathrm{B})$ |
g | Normalized Greenness Intensity | $\mathrm{g} = \mathrm{G}/(\mathrm{R}+\mathrm{G}+\mathrm{B})$ |
b | Normalized Blueness Intensity | $\mathrm{b} = \mathrm{B}/(\mathrm{R}+\mathrm{G}+\mathrm{B})$ |
EXG | Excess Green Index | $\mathrm{EXG} = 2\mathrm{G}-\mathrm{R}-\mathrm{B}$ | [45]
VARI | Visible Atmospherically Resistant Index | $\mathrm{VARI} = (\mathrm{G}-\mathrm{R})/(\mathrm{G}+\mathrm{R}-\mathrm{B})$ | [46]
GRRI | Green Red Ratio Index | $\mathrm{GRRI} = \mathrm{G}/\mathrm{R}$ | [47]
GBRI | Green Blue Ratio Index | $\mathrm{GBRI} = \mathrm{G}/\mathrm{B}$ | [48]
RBRI | Red Blue Ratio Index | $\mathrm{RBRI} = \mathrm{R}/\mathrm{B}$ | [48]
RGBVI | Red Green Blue Vegetation Index | $\mathrm{RGBVI} = (\mathrm{G}^{2}-\mathrm{B}\mathrm{R})/(\mathrm{G}^{2}+\mathrm{B}\mathrm{R})$ | [49]
GLA | Green Leaf Algorithm | $\mathrm{GLA} = (2\mathrm{G}-\mathrm{R}-\mathrm{B})/(2\mathrm{G}+\mathrm{R}+\mathrm{B})$ | [50]
MGRVI | Modified Green Red Vegetation Index | $\mathrm{MGRVI} = (\mathrm{G}^{2}-{\mathrm{R}}^{2})/(\mathrm{G}^{2}+{\mathrm{R}}^{2})$ | [49]
WI | Woebbecke Index | $\mathrm{WI} = (\mathrm{G}-\mathrm{B})/(\mathrm{R}-\mathrm{G})$ | [51]
ExGR | Excess Green Red Index | $\mathrm{ExGR} = \mathrm{EXG}-1.4\mathrm{R}+\mathrm{G}$ | [52]
CIVE | Color Index of Vegetation | $\mathrm{CIVE} = 0.441\mathrm{R}-0.811\mathrm{G}+0.385\mathrm{B}+18.78745$ | [53]
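As a sketch, the visible-band indices in the table above can be computed per pixel from an RGB orthomosaic array. The function below is our NumPy illustration (not from the paper) using the standard definitions of these indices; a small epsilon is an added safeguard against division by zero:

```python
import numpy as np

def rgb_indices(img):
    """Compute several Table-2-style visible-band indices from an RGB array (H x W x 3)."""
    R, G, B = (img[..., k].astype(float) for k in range(3))
    eps = 1e-12                                        # guards against zero denominators
    total = R + G + B + eps
    r, g, b = R / total, G / total, B / total          # normalized intensities
    return {
        "r": r, "g": g, "b": b,
        "EXG":   2 * G - R - B,                        # Excess Green Index
        "GRRI":  G / (R + eps),                        # Green Red Ratio Index
        "MGRVI": (G**2 - R**2) / (G**2 + R**2 + eps),  # Modified Green Red Vegetation Index
        "RGBVI": (G**2 - B * R) / (G**2 + B * R + eps),
        "GLA":   (2 * G - R - B) / (2 * G + R + B + eps),
    }
```

Applied to a plot-level image clip, the spatial mean of each returned array would give the per-plot index values used for modeling.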

Parameters | Name | Formulas | Sources
---|---|---|---
MEA | Mean | ${\mathrm{MEA}}_{i} = \sum_{i,j=0}^{n-1} i\,{P}_{i,j}$, ${\mathrm{MEA}}_{j} = \sum_{i,j=0}^{n-1} j\,{P}_{i,j}$ | [39]
VAR | Variance | ${\mathrm{VAR}}_{i} = \sum_{i,j=0}^{n-1} {P}_{i,j}{(i-{\mathrm{MEA}}_{i})}^{2}$, ${\mathrm{VAR}}_{j} = \sum_{i,j=0}^{n-1} {P}_{i,j}{(j-{\mathrm{MEA}}_{j})}^{2}$ |
HOM | Homogeneity | $\mathrm{HOM} = \sum_{i,j=0}^{n-1} \frac{{P}_{i,j}}{1+{(i-j)}^{2}}$ |
CON | Contrast | $\mathrm{CON} = \sum_{i,j=0}^{n-1} {P}_{i,j}{(i-j)}^{2}$ |
DIS | Dissimilarity | $\mathrm{DIS} = \sum_{i,j=0}^{n-1} {P}_{i,j}\left|i-j\right|$ |
ENT | Entropy | $\mathrm{ENT} = -\sum_{i,j=0}^{n-1} {P}_{i,j}\,\mathrm{log}\,{P}_{i,j}$ |
ASM | Angular Second Moment | $\mathrm{ASM} = \sum_{i,j=0}^{n-1} {P}_{i,j}^{2}$ |
COR | Correlation | $\mathrm{COR} = \sum_{i,j=0}^{n-1} {P}_{i,j}\,\frac{(i-{\mathrm{MEA}}_{i})(j-{\mathrm{MEA}}_{j})}{\sqrt{{\mathrm{VAR}}_{i}\,{\mathrm{VAR}}_{j}}}$ |

Set Name | Variables | Methods for Combination
---|---|---
α | R, G, ExGR, B, b, RBRI, GBRI, CIVE, EXG, RGBVI | Spectral indices highly correlated with LAI in IF
β | VAR_G, VAR_R, MEA_R, MEA_G, VAR_B, MEA_B, CON_R, CON_G, CON_B, DIS_R, DIS_G, DIS_B, HOM_B, HOM_R, HOM_G, ASM_G | Texture features highly correlated with LAI in IF
γ | R, G, B, r, g, b, EXG, VARI, GRRI, GBRI, RGBVI, GLA, MGRVI, ExGR, CIVE | Spectral indices highly correlated with LAI in YF and FE
δ | MEA_R, VAR_R, HOM_R, DIS_R, ENT_R, ASM_R, COR_R, MEA_G, VAR_G, HOM_G, DIS_G, ENT_G, COR_G, MEA_B, VAR_B, HOM_B, DIS_B, ENT_B, ASM_B, COR_B | Texture features highly correlated with LAI in YF and FE

Growth Stages | Independent Variable | Modeling Equation | R^{2} | RMSE | nRMSE/%
---|---|---|---|---|---
IF | R | y = 0.0005x^{2} − 0.1028x + 5.2873 | 0.466 | 0.081 | 15.86
YF | ExGR | y = −0.00001784x^{2} + 0.00006079x + 1.004 | 0.719 | 0.061 | 14.22
FE | ExGR | y = 0.00006931x^{2} + 0.03275x + 4.215 | 0.736 | 0.108 | 17.84

Growth Stages | Modeling Method | Spectral Parameters | AIC | R^{2} | RMSE | nRMSE/%
---|---|---|---|---|---|---
IF | SWR | G, b, GBRI | −323.23 | 0.541 | 0.075 | 14.70
IF | RFR | α | - | 0.965 | 0.021 | 4.05
YF | SWR | R, G, r, g, VARI, GRRI, GBRI, RGBVI, GLA | −365.10 | 0.819 | 0.049 | 11.55
YF | RFR | γ | - | 0.973 | 0.019 | 4.42
FE | SWR | R, G, g, GRRI, GBRI, MGRVI | −278.64 | 0.765 | 0.102 | 16.81
FE | RFR | γ | - | 0.972 | 0.035 | 5.80

Growth Stages | Modeling Method | Spectral Parameters | AIC | R^{2} | RMSE | nRMSE/%
---|---|---|---|---|---|---
IF | SWR | R, G, B, b, GBRI, RBRI, RGBVI, VAR_R, HOM_R, CON_R, DIS_R, MEA_G, VAR_G, ASM_G, VAR_B, HOM_B, CON_B, DIS_B | −368.88 | 0.859 | 0.042 | 8.14
IF | RFR | α + β | - | 0.968 | 0.020 | 3.88
YF | SWR | R, G, B, g, VARI, GRRI, GBRI, RGBVI, GLA, MGRVI, MEA_R, VAR_R, DIS_R, ENT_R, ASM_R, COR_R, VAR_G, HOM_G, DIS_G, ENT_G, COR_G, MEA_B, VAR_B, HOM_B, DIS_B, ENT_B, ASM_B | −465.04 | 0.978 | 0.017 | 3.99
YF | RFR | γ + δ | - | 0.978 | 0.017 | 4.08
FE | SWR | R, B, r, g, VARI, GRRI, GBRI, RGBVI, GLA, MGRVI, VAR_R, ENT_R, COR_R, MEA_G, HOM_G, ENT_G, COR_G, MEA_B, HOM_B, ENT_B, ASM_B | −343.92 | 0.947 | 0.048 | 7.99
FE | RFR | γ + δ | - | 0.977 | 0.032 | 5.30

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Zhang, Y.; Ta, N.; Guo, S.; Chen, Q.; Zhao, L.; Li, F.; Chang, Q.
Combining Spectral and Textural Information from UAV RGB Images for Leaf Area Index Monitoring in Kiwifruit Orchard. *Remote Sens.* **2022**, *14*, 1063.
https://doi.org/10.3390/rs14051063
