Uncrewed Aerial Vehicle (UAV)-Based High-Throughput Phenotyping of Maize Silage Yield and Nutritive Values Using Multi-Sensory Feature Fusion and Multi-Task Learning with Attention Mechanism
Highlights
- Attention-based deep fusion improved the use of multi-sensor features while keeping them distinguishable; multi-task learning estimated multiple silage traits and outperformed baseline models.
- Hyperspectral data contributed the most to the model; LiDAR and RGB added complementary signals; a retrieval-based option can be used when hyperspectral data are not available.
- Multi-sensor, attention-based fusion is a practical route to UAV-based, non-destructive phenotyping.
- The trait estimates can be used to support breeding selection and field management while reducing reliance on lab measurements.
Abstract
1. Introduction
2. Materials
2.1. Study Region and Phenotypic Data Collection
2.2. Phenotypic Data Exploration
2.3. Multi-Sensory Data Collection and Pre-Processing
2.3.1. Hyperspectral Imagery
2.3.2. LiDAR Data
2.3.3. RGB Imagery
3. Methodology
3.1. Feature Extraction
3.1.1. Hyperspectral-Based Features
3.1.2. RGB-Based Features
3.1.3. LiDAR-Based Features
3.2. Proposed Model
3.3. Comparative Evaluation and Performance Metrics
3.4. Permutation Feature Importance
Algorithm 1: PFI Analysis Pseudo-code (Python-like)

    """
    hs_cols:    hyperspectral feature columns
    tx_cols:    textural feature columns
    morp_cols:  morphological feature columns
    struc_cols: structural feature columns
    int_cols:   LiDAR intensity feature columns
    date_x:     columns of features belonging to UAV survey date x
    """
    for data_modality in [hs_cols, tx_cols, morp_cols, struc_cols, int_cols]:
        get_importances(X, y, data_modality)

    for data_acquisition_date in [date_1, date_2, …]:
        get_importances(X, y, data_acquisition_date)

    def get_importances(X, y, columns_to_shuffle):
        """columns_to_shuffle: a sequence of column indices to shuffle jointly"""
        base_score = score_func(X, y)
        permuted_X = feature_shuffling(X, columns_to_shuffle)
        permuted_score = score_func(permuted_X, y)
        feature_importance = (base_score - permuted_score) / base_score
        return feature_importance

    def score_func(X, y):
        y_pre = MUSTA.predict(X)
        score = compute_weighted_tau(y, y_pre)
        return score
3.5. Retrieval-Augmented Quality Traits Estimation
Algorithm 2: Retrieval-Augmented Quality Traits Estimation Pseudo-code (Python-like)

    def retrieval_estimation(training_db, test_plot):
        """
        training_db: tabular data with all feature columns, i.e., the hyperspectral
                     features (hs_col) plus the textural, morphological, structural,
                     and LiDAR intensity features (none_hs_col).
        test_plot:   tabular data with the same columns as training_db, but with the
                     hyperspectral features left blank (zero-masked).
        """
        hs_retrieved = retrieve_hs_feature(training_db, test_plot)
        test_plot[hs_col] = hs_retrieved
        retrieval_estimation_values = MUSTA.predict(test_plot)
        return retrieval_estimation_values

    def retrieve_hs_feature(training_db, test_plot):
        analogous_plots_n = nearest_nb_search(training_db[none_hs_col],
                                              test_plot[none_hs_col], n_nbs=10)
        w_n = cos_similarity(analogous_plots_n[none_hs_col], test_plot[none_hs_col])
        hs_retrieved = sum(analogous_plots_n[hs_col] * w_n) / sum(w_n)
        return hs_retrieved
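A minimal sketch of the retrieval step with common libraries is given below. The pandas layout, the column lists `hs_cols`/`non_hs_cols`, and the `fitted_model` placeholder are assumptions made for illustration; only the neighbour search over non-hyperspectral features and the cosine-similarity-weighted averaging follow Algorithm 2.

```python
from sklearn.metrics.pairwise import cosine_similarity
from sklearn.neighbors import NearestNeighbors

def retrieve_hs_features(training_db, test_row, hs_cols, non_hs_cols, n_nbs=10):
    """Impute a test plot's hyperspectral features as the cosine-similarity-weighted
    average of the hyperspectral features of its nearest training plots in the
    non-hyperspectral feature space (the retrieve_hs_feature step of Algorithm 2).

    training_db : pandas DataFrame of training plots with all feature columns.
    test_row    : pandas Series for one test plot (hyperspectral columns zero-masked).
    """
    nn = NearestNeighbors(n_neighbors=n_nbs).fit(training_db[non_hs_cols].values)
    query = test_row[non_hs_cols].values.reshape(1, -1)
    _, idx = nn.kneighbors(query)                      # indices of the analogous plots
    neighbours = training_db.iloc[idx[0]]
    w = cosine_similarity(neighbours[non_hs_cols].values, query).ravel()
    return (neighbours[hs_cols].values * w[:, None]).sum(axis=0) / w.sum()

# Usage sketch (all names are placeholders):
# test_plot[hs_cols] = retrieve_hs_features(training_db, test_plot, hs_cols, non_hs_cols)
# y_hat = fitted_model.predict(test_plot.values.reshape(1, -1))
```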
4. Results
4.1. Model Comparison and Performance
4.1.1. Model Regression Performance
4.1.2. Model Classification Performance
4.2. Class Separability Analysis of Fused Features vs. Stacked Features
4.3. Feature Importance Analysis
4.3.1. Feature Importance on UAV Sensor Modalities
4.3.2. Feature Importance on UAV Survey Timing
4.4. Retrieval-Augmented Method Performance
5. Discussion
5.1. Advantages of MUSTA
5.2. Feature Importance Analysis
5.2.1. UAV Sensor Modality Contribution
5.2.2. UAV Survey Timing Contribution
5.3. Advantages of Retrieval-Based Method
5.4. Limitations and Future Work
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
Appendix A
- RGB Texture Features (20 features): the skimage Python library was used to compute the gray-level co-occurrence matrix (GLCM) at a distance of 1 pixel in four directions (0°, 45°, 90°, 135°). From these matrices, five properties were calculated: Contrast, Correlation, Energy, Homogeneity, and Dissimilarity, resulting in 20 texture features per plot (a minimal sketch follows this list).
- RGB Morphological Features (25 features): the uib_vfeatures Python library was used to automatically calculate 25 shape features quantifying the canopy shape from the cleaned canopy mask [50]: Solidity, CH Perimeter, CH Area, BB Area, Rectangularity, Min r, Max r, Feret, Breadth, Circularity, Roundness, Feret Angle, Eccentricity, Center, Sphericity, Aspect Ratio, Area equivalent, Perimeter equivalent, Equivalent ellipse area, Compactness, Area, Convexity, Shape, Perimeter, Bounding_box_area, and Shape Factor.
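A minimal sketch of the GLCM texture extraction described above is shown below. It assumes a single-channel 8-bit plot image as input (the masking and grayscale-conversion steps are omitted), and the `symmetric`/`normed` settings are choices made for this example rather than settings stated in the paper; the uib_vfeatures morphological features are not reproduced here.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops  # named greycomatrix/greycoprops in skimage < 0.19

def glcm_texture_features(gray_plot, levels=256):
    """20 GLCM texture features per plot: 5 properties x 4 directions, distance = 1 px."""
    angles = [0, np.pi / 4, np.pi / 2, 3 * np.pi / 4]          # 0, 45, 90, 135 degrees
    props = ["contrast", "correlation", "energy", "homogeneity", "dissimilarity"]
    glcm = graycomatrix(gray_plot, distances=[1], angles=angles,
                        levels=levels, symmetric=True, normed=True)
    # graycoprops returns one value per (distance, angle) pair -> 4 values per property
    return np.concatenate([graycoprops(glcm, p).ravel() for p in props])

# Example on a random 8-bit image; a real input would be the masked, grayscale plot clip.
feats = glcm_texture_features(np.random.randint(0, 256, (64, 64), dtype=np.uint8))
print(feats.shape)   # (20,)
```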
Appendix B
- LiDAR Structural Features (36 features) (a minimal sketch of a few of these descriptors follows this list):
  - a. The 10th through 99th percentiles of relative plant height (10 features).
  - b. Statistical moments describing the distribution of relative heights (5 features): standard_deviation, quadratic_mean, skewness, kurtosis, variation.
  - c. Canopy Cover (7 features): estimated as the ratio of canopy points to the total number of points above seven height thresholds (0.05, 0.10, 0.20, 0.30, 0.40, 0.50, and 0.75 times the 99th-percentile plot height) [66].
  - d. Canopy Volume (1 feature): the plot is divided into a grid of 8 × 8 cm cells. The volume is the sum of all cell volumes, where each cell's volume is its 8 × 8 cm area multiplied by the 95th-percentile relative height of the points within it [66].
  - e. Projected Leaf Area (PLA) (7 features): calculated similarly to canopy cover but multiplied by the cell resolution and total grid area to estimate an aggregated area [53].
  - f. Plant Area Index (PAI) (1 feature): calculated from voxelized LiDAR data using the voxel-based canopy profiling method proposed by Hosoi and Omasa, which provides an estimate of the total one-sided plant area per unit ground area [53].
  - g. Plant Area Density (PAD) (5 features): describes the vertical distribution of plant material within the canopy. PAD was calculated for vertical layers (from 0 to 4 m with a 10 cm resolution), providing a detailed profile of canopy structure [53].
- LiDAR Intensity Features (16 features)
  - a. The 10th through 99th percentiles of point intensity values (10 features).
  - b. Statistical moments describing the distribution of point intensity (5 features): standard_deviation, quadratic_mean, skewness, kurtosis, variation.
  - c. Point Cloud Statistics (1 feature): total number of non-ground points.
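For illustration, the sketch below computes a handful of the per-plot LiDAR descriptors listed above (height percentiles, height-distribution moments, canopy cover, and intensity percentiles) with NumPy and SciPy. The array layout, the `ground_z` argument, and the omission of the grid- and voxel-based features (canopy volume, PLA, PAI, PAD) are simplifications for this example.

```python
import numpy as np
from scipy import stats

def lidar_plot_features(xyz, intensity, ground_z=0.0):
    """Compute a subset of the per-plot LiDAR descriptors listed above.

    xyz       : (N, 3) array of non-ground points for one plot.
    intensity : (N,) array of return intensities.
    ground_z  : local ground elevation; heights are taken relative to it
                (ground removal itself, e.g. cloth-simulation filtering, is not shown).
    """
    z = xyz[:, 2] - ground_z
    pct = [10, 20, 30, 40, 50, 60, 70, 80, 90, 99]
    feats = {}

    # a. Relative-height percentiles
    feats.update({f"h_p{p}": float(np.percentile(z, p)) for p in pct})

    # b. Moments of the height distribution
    feats["h_std"] = float(z.std())
    feats["h_quad_mean"] = float(np.sqrt(np.mean(z ** 2)))
    feats["h_skew"] = float(stats.skew(z))
    feats["h_kurt"] = float(stats.kurtosis(z))
    feats["h_cv"] = float(stats.variation(z))

    # c. Canopy cover: fraction of points above thresholds scaled by the 99th-percentile height
    h99 = np.percentile(z, 99)
    for frac in (0.05, 0.10, 0.20, 0.30, 0.40, 0.50, 0.75):
        feats[f"cover_{frac:.2f}"] = float(np.mean(z > frac * h99))

    # Intensity percentiles (the intensity moments follow the same pattern as b.)
    feats.update({f"i_p{p}": float(np.percentile(intensity, p)) for p in pct})
    feats["n_points"] = int(len(z))
    return feats

# Example with random points standing in for one plot's point cloud.
rng = np.random.default_rng(0)
demo = lidar_plot_features(rng.uniform([0, 0, 0], [3, 3, 2.5], size=(5000, 3)),
                           rng.uniform(0, 255, size=5000))
print(len(demo))
```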
Appendix C
| Quality Values | Models | Metrics | ||||
|---|---|---|---|---|---|---|
| r | MAE | RMSE | ||||
| DM Yield (US ton/acre) | Single-Task Models | Ridge | 0.75 | 0.79 | 0.84 | 1.07 |
| LASSO | 0.73 | 0.78 | 0.82 | 1.06 | ||
| SVR | 0.65 | 0.72 | 0.93 | 1.20 | ||
| PLSR | 0.60 | 0.68 | 0.95 | 1.22 | ||
| Random Forest | 0.70 | 0.73 | 0.90 | 1.15 | ||
| Multi-Task Models | DNN | 0.74 | 0.78 | 0.82 | 1.05 | |
| 1D-CNN | 0.75 | 0.80 | 0.79 | 1.01 | ||
| Att-1D-CNN | 0.75 | 0.79 | 0.82 | 1.04 | ||
| Bi-LSTM | 0.70 | 0.74 | 0.92 | 1.16 | ||
| Att-Bi-LSTM | 0.70 | 0.77 | 0.87 | 1.09 | ||
| MUSTA | 0.79 | 0.82 | 0.76 | 0.95 | ||
| NDF (%) | Single-Task Models | Ridge | 0.29 | 0.33 | 2.04 | 2.65 |
| LASSO | 0.31 | 0.29 | 1.91 | 2.39 | ||
| SVR | 0.30 | 0.23 | 1.95 | 2.44 | ||
| PLSR | 0.21 | 0.21 | 1.95 | 2.44 | ||
| Random Forest | 0.42 | 0.36 | 1.86 | 2.34 | ||
| Multi-Task Models | DNN | 0.37 | 0.32 | 2.15 | 2.71 | |
| 1D-CNN | 0.29 | 0.30 | 2.03 | 2.55 | ||
| Att-1D-CNN | 0.35 | 0.34 | 1.90 | 2.41 | ||
| Bi-LSTM | 0.36 | 0.32 | 2.03 | 2.53 | ||
| Att-Bi-LSTM | 0.38 | 0.32 | 2.00 | 2.50 | ||
| MUSTA | 0.39 | 0.35 | 1.89 | 2.38 | ||
| ADF (%) | Single-Task Models | Ridge | 0.39 | 0.38 | 1.50 | 1.95 |
| LASSO | 0.47 | 0.38 | 1.40 | 1.76 | ||
| SVR | 0.36 | 0.33 | 1.43 | 1.80 | ||
| PLSR | 0.36 | 0.30 | 1.44 | 1.81 | ||
| Random Forest | 0.47 | 0.41 | 1.39 | 1.74 | ||
| Multi-Task Models | DNN | 0.45 | 0.35 | 1.60 | 2.02 | |
| 1D-CNN | 0.42 | 0.36 | 1.47 | 1.86 | ||
| Att-1D-CNN | 0.40 | 0.40 | 1.42 | 1.79 | ||
| Bi-LSTM | 0.37 | 0.36 | 1.51 | 1.89 | ||
| Att-Bi-LSTM | 0.41 | 0.34 | 1.50 | 1.88 | ||
| MUSTA | 0.51 | 0.41 | 1.39 | 1.76 | ||
| CP (%) | Single-Task Models | Ridge | 0.60 | 0.67 | 0.36 | 0.47 |
| LASSO | 0.62 | 0.65 | 0.38 | 0.48 | ||
| SVR | 0.64 | 0.67 | 0.36 | 0.46 | ||
| PLSR | 0.53 | 0.57 | 0.40 | 0.49 | ||
| Random Forest | 0.64 | 0.65 | 0.37 | 0.47 | ||
| Multi-Task Models | DNN | 0.69 | 0.70 | 0.35 | 0.44 | |
| 1D-CNN | 0.71 | 0.71 | 0.34 | 0.42 | ||
| Att-1D-CNN | 0.68 | 0.69 | 0.35 | 0.44 | ||
| Bi-LSTM | 0.62 | 0.67 | 0.36 | 0.45 | ||
| Att-Bi-LSTM | 0.60 | 0.66 | 0.37 | 0.46 | ||
| MUSTA | 0.68 | 0.68 | 0.35 | 0.44 | ||
| STARCH (%) | Single-Task Models | Ridge | 0.34 | 0.40 | 2.39 | 3.09 |
| LASSO | 0.38 | 0.35 | 2.31 | 2.89 | ||
| SVR | 0.34 | 0.32 | 2.35 | 2.93 | ||
| PLSR | 0.27 | 0.29 | 2.37 | 2.95 | ||
| Random Forest | 0.37 | 0.40 | 2.25 | 2.83 | ||
| Multi-Task Models | DNN | 0.40 | 0.36 | 2.55 | 3.23 | |
| 1D-CNN | 0.38 | 0.37 | 2.37 | 3.00 | ||
| Att-1D-CNN | 0.42 | 0.42 | 2.26 | 2.86 | ||
| Bi-LSTM | 0.40 | 0.39 | 2.37 | 3.01 | ||
| Att-Bi-LSTM | 0.42 | 0.37 | 2.35 | 2.96 | ||
| MUSTA | 0.42 | 0.40 | 2.28 | 2.89 | ||
| MILK2006 (US ton/acre) | Single-Task Models | Ridge | 0.63 | 0.71 | 1.61 | 2.04 |
| LASSO | 0.71 | 0.74 | 1.48 | 1.85 | ||
| SVR | 0.29 | 0.26 | 2.12 | 2.75 | ||
| PLSR | 0.52 | 0.63 | 1.67 | 2.15 | ||
| Random Forest | 0.63 | 0.67 | 1.61 | 2.06 | ||
| Multi-Task Models | DNN | 0.65 | 0.70 | 1.60 | 2.02 | |
| 1D-CNN | 0.68 | 0.72 | 1.52 | 1.91 | ||
| Att-1D-CNN | 0.65 | 0.72 | 1.54 | 1.94 | ||
| Bi-LSTM | 0.60 | 0.67 | 1.67 | 2.10 | ||
| Att-Bi-LSTM | 0.61 | 0.69 | 1.58 | 2.01 | ||
| MUSTA | 0.74 | 0.76 | 1.45 | 1.80 | ||
Appendix D
| Quality Values | Models | | F1top_c | Accuracyavg | Precisiontop_c | Recalltop_c |
|---|---|---|---|---|---|---|
| DM Yield | Multi-Task Models | DNN | 0.68 | 0.74 | 0.66 | 0.70 |
| 1D-CNN | 0.64 | 0.73 | 0.64 | 0.63 | ||
| Att-1D-CNN | 0.69 | 0.74 | 0.63 | 0.77 | ||
| Bi-LSTM | 0.65 | 0.72 | 0.58 | 0.75 | ||
| Att-Bi-LSTM | 0.68 | 0.73 | 0.62 | 0.75 | ||
| MUSTA | 0.70 | 0.76 | 0.69 | 0.72 | ||
| NDF | Multi-Task Models | DNN | 0.39 | 0.62 | 0.49 | 0.33 |
| 1D-CNN | 0.49 | 0.62 | 0.46 | 0.52 | ||
| Att-1D-CNN | 0.50 | 0.64 | 0.46 | 0.56 | ||
| Bi-LSTM | 0.44 | 0.62 | 0.46 | 0.42 | ||
| Att-Bi-LSTM | 0.44 | 0.61 | 0.45 | 0.43 | ||
| MUSTA | 0.47 | 0.61 | 0.43 | 0.51 | ||
| ADF | Multi-Task Models | DNN | 0.41 | 0.62 | 0.51 | 0.34 |
| 1D-CNN | 0.51 | 0.63 | 0.49 | 0.54 | ||
| Att-1D-CNN | 0.50 | 0.65 | 0.46 | 0.54 | ||
| Bi-LSTM | 0.47 | 0.63 | 0.50 | 0.45 | ||
| Att-Bi-LSTM | 0.45 | 0.61 | 0.46 | 0.43 | ||
| MUSTA | 0.51 | 0.64 | 0.48 | 0.55 | ||
| CP | Multi-Task Models | DNN | 0.64 | 0.70 | 0.61 | 0.67 |
| 1D-CNN | 0.66 | 0.72 | 0.66 | 0.65 | ||
| Att-1D-CNN | 0.62 | 0.70 | 0.61 | 0.62 | ||
| Bi-LSTM | 0.63 | 0.70 | 0.61 | 0.66 | ||
| Att-Bi-LSTM | 0.61 | 0.70 | 0.64 | 0.58 | ||
| MUSTA | 0.64 | 0.70 | 0.61 | 0.67 | ||
| STARCH | Multi-Task Models | DNN | 0.44 | 0.63 | 0.52 | 0.38 |
| 1D-CNN | 0.53 | 0.64 | 0.49 | 0.57 | ||
| Att-1D-CNN | 0.54 | 0.65 | 0.49 | 0.59 | ||
| Bi-LSTM | 0.53 | 0.64 | 0.51 | 0.54 | ||
| Att-Bi-LSTM | 0.47 | 0.62 | 0.48 | 0.47 | ||
| MUSTA | 0.52 | 0.64 | 0.50 | 0.54 | ||
| MILK2006 | Multi-Task Models | DNN | 0.59 | 0.70 | 0.62 | 0.55 |
| 1D-CNN | 0.59 | 0.69 | 0.59 | 0.58 | ||
| Att-1D-CNN | 0.64 | 0.70 | 0.56 | 0.74 | ||
| Bi-LSTM | 0.61 | 0.69 | 0.54 | 0.69 | ||
| Att-Bi-LSTM | 0.61 | 0.70 | 0.56 | 0.67 | ||
| MUSTA | 0.65 | 0.72 | 0.63 | 0.67 | ||
References
- Zhao, M.; Feng, Y.; Shi, Y.; Shen, H.; Hu, H.; Luo, Y.; Xu, L.; Kang, J.; Xing, A.; Wang, S.; et al. Yield and Quality Properties of Silage Maize and Their Influencing Factors in China. Sci. China Life Sci. 2022, 65, 1655–1666. [Google Scholar] [CrossRef] [PubMed]
- Martin, N.P.; Russelle, M.P.; Powell, J.M.; Sniffen, C.J.; Smith, S.I.; Tricarico, J.M.; Grant, R.J. Invited Review: Sustainable Forage and Grain Crop Production for the US Dairy Industry. J. Dairy Sci. 2017, 100, 9479–9494. [Google Scholar] [CrossRef] [PubMed]
- Stevenson, J.R.; Villoria, N.; Byerlee, D.; Kelley, T.; Maredia, M. Green Revolution Research Saved an Estimated 18 to 27 Million Hectares from Being Brought into Agricultural Production. Proc. Natl. Acad. Sci. USA 2013, 110, 8363–8368. [Google Scholar] [CrossRef] [PubMed]
- Bornowski, N.; Michel, K.J.; Hamilton, J.P.; Ou, S.; Seetharam, A.S.; Jenkins, J.; Grimwood, J.; Plott, C.; Shu, S.; Talag, J.; et al. Genomic Variation within the Maize Stiff-Stalk Heterotic Germplasm Pool. Plant Genome 2021, 14, e20114. [Google Scholar] [CrossRef]
- Jiang, M.; Ma, Y.; Khan, N.; Khan, M.Z.; Akbar, A.; Khan, R.U.; Kamran, M.; Khan, N.A. Effect of Spring Maize Genotypes on Fermentation and Nutritional Value of Whole Plant Maize Silage in Northern Pakistan. Fermentation 2022, 8, 587. [Google Scholar] [CrossRef]
- Perisic, M.; Perkins, A.; Lima, D.C.; de Leon, N.; Mitrovic, B.; Stanisavljevic, D. GEM Project-Derived Maize Lines Crossed with Temperate Elite Tester Lines Make for High-Quality, High-Yielding and Stable Silage Hybrids. Agronomy 2023, 13, 243. [Google Scholar] [CrossRef]
- Johnson, L.M.; Harrison, J.H.; Davidson, D.; Robutti, J.L.; Swift, M.; Mahanna, W.C.; Shinners, K. Corn Silage Management I: Effects of Hybrid, Maturity, and Mechanical Processing on Chemical and Physical Characteristics. J. Dairy Sci. 2002, 85, 833–853. [Google Scholar] [CrossRef]
- Lorenz, A.J.; Beissinger, T.M.; Silva, R.R.; de Leon, N. Selection for Silage Yield and Composition Did Not Affect Genomic Diversity Within the Wisconsin Quality Synthetic Maize Population. G3 Genes Genomes Genet. 2015, 5, 541–549. [Google Scholar] [CrossRef]
- Furbank, R.T.; Tester, M. Phenomics—Technologies to Relieve the Phenotyping Bottleneck. Trends Plant Sci. 2011, 16, 635–644. [Google Scholar] [CrossRef]
- Kung, L.; Shaver, R.D.; Grant, R.J.; Schmidt, R.J. Silage Review: Interpretation of Chemical, Microbial, and Organoleptic Components of Silages. J. Dairy Sci. 2018, 101, 4020–4033. [Google Scholar] [CrossRef]
- Buxton, D.R.; Muck, R.E.; Harrison, J.H. Silage Science and Technology; American Society of Agronomy, Inc.: Madison, WI, USA, 2015; ISBN 9780891182344. [Google Scholar]
- Cherney, J.H.; Parsons, D.; Cherney, D.J.R. A Method for Forage Yield and Quality Assessment of Tall Fescue Cultivars in the Spring. Crop Sci. 2011, 51, 2878–2885. [Google Scholar] [CrossRef]
- Norris, K.H.; Barnes, R.F.; Moore, J.E.; Shenk, J.S. Predicting Forage Quality by Infrared Reflectance Spectroscopy. J. Anim. Sci. 1976, 43, 889–897. [Google Scholar] [CrossRef]
- Varela, J.I.; Miller, N.D.; Infante, V.; Kaeppler, S.M.; de Leon, N.; Spalding, E.P. A Novel High-Throughput Hyperspectral Scanner and Analytical Methods for Predicting Maize Kernel Composition and Physical Traits. Food Chem. 2022, 391, 133264. [Google Scholar] [CrossRef] [PubMed]
- Starks, P.J.; Zhao, D.; Phillips, W.A.; Coleman, S.W. Development of Canopy Reflectance Algorithms for Real-Time Prediction of Bermudagrass Pasture Biomass and Nutritive Values. Crop Sci. 2006, 46, 927–934. [Google Scholar] [CrossRef]
- Hossain, M.E.; Kabir, M.A.; Zheng, L.; Swain, D.L.; McGrath, S.; Medway, J. Near-Infrared Spectroscopy for Analysing Livestock Diet Quality: A Systematic Review. Heliyon 2024, 10, e40016. [Google Scholar] [CrossRef]
- Hu, C.; Zhao, T.; Duan, Y.; Zhang, Y.; Wang, X.; Li, J.; Zhang, G. Visible-near Infrared Hyperspectral Imaging for Non-Destructive Estimation of Leaf Nitrogen Content under Water-Saving Irrigation in Protected Tomato Cultivation. Front. Plant Sci. 2025, 16, 1676457. [Google Scholar] [CrossRef]
- Geipel, J.; Bakken, A.K.; Jørgensen, M.; Korsaeth, A. Forage Yield and Quality Estimation by Means of UAV and Hyperspectral Imaging. Precis. Agric. 2021, 22, 1437–1463. [Google Scholar] [CrossRef]
- Feng, L.; Zhang, Z.; Ma, Y.; Sun, Y.; Du, Q.; Williams, P.; Drewry, J.; Luck, B. Multitask Learning of Alfalfa Nutritive Value From UAV-Based Hyperspectral Images. IEEE Geosci. Remote Sens. Lett. 2021, 19, 5506305. [Google Scholar] [CrossRef]
- Feng, L.; Zhang, Z.; Ma, Y.; Du, Q.; Williams, P.; Drewry, J.; Luck, B. Alfalfa Yield Prediction Using UAV-Based Hyperspectral Imagery and Ensemble Learning. Remote Sens. 2020, 12, 2028. [Google Scholar] [CrossRef]
- Hörtensteiner, S.; Matile, P. How Leaves Turn Yellow: Catabolism of Chlorophyll. In Plant Cell Death Processes; Academic Press: Cambridge, MA, USA, 2003; pp. 189–202. [Google Scholar] [CrossRef]
- Maimaitijiang, M.; Ghulam, A.; Sidike, P.; Hartling, S.; Maimaitiyiming, M.; Peterson, K.; Shavers, E.; Fishman, J.; Peterson, J.; Kadam, S.; et al. Unmanned Aerial System (UAS)-Based Phenotyping of Soybean Using Multi-Sensor Data Fusion and Extreme Learning Machine. ISPRS J. Photogramm. Remote Sens. 2017, 134, 43–58. [Google Scholar] [CrossRef]
- Feng, A.; Zhou, J.; Vories, E.D.; Sudduth, K.A.; Zhang, M. Yield Estimation in Cotton Using UAV-Based Multi-Sensor Imagery. Biosyst. Eng. 2020, 193, 101–114. [Google Scholar] [CrossRef]
- Zhang, M.; Zhou, J.; Sudduth, K.A.; Kitchen, N.R. Estimation of Maize Yield and Effects of Variable-Rate Nitrogen Application Using UAV-Based RGB Imagery. Biosyst. Eng. 2020, 189, 24–35. [Google Scholar] [CrossRef]
- Niu, Y.; Zhang, L.; Zhang, H.; Han, W.; Peng, X. Estimating Above-Ground Biomass of Maize Using Features Derived from UAV-Based RGB Imagery. Remote Sens. 2019, 11, 1261. [Google Scholar] [CrossRef]
- Lu, J.; Cheng, D.; Geng, C.; Zhang, Z.; Xiang, Y.; Hu, T. Combining Plant Height, Canopy Coverage and Vegetation Index from UAV-Based RGB Images to Estimate Leaf Nitrogen Concentration of Summer Maize. Biosyst. Eng. 2021, 202, 42–54. [Google Scholar] [CrossRef]
- Zhang, X.; Zhang, K.; Sun, Y.; Zhao, Y.; Zhuang, H.; Ban, W.; Chen, Y.; Fu, E.; Chen, S.; Liu, J.; et al. Combining Spectral and Texture Features of UAS-Based Multispectral Images for Maize Leaf Area Index Estimation. Remote Sens. 2022, 14, 331. [Google Scholar] [CrossRef]
- Han, W.; Sun, Y.; Xu, T.; Chen, X.; Su, K.O. Detecting Maize Leaf Water Status by Using Digital RGB Images. Int. J. Agric. Biol. Eng. 2014, 7, 45–53. [Google Scholar] [CrossRef]
- Lang, Q.; Zhiyong, Z.; Longsheng, C.; Hong, S.; Minzan, L.; Li, L.; Junyong, M. Detection of Chlorophyll Content in Maize Canopy from UAV Imagery. IFAC-Pap. 2019, 52, 330–335. [Google Scholar] [CrossRef]
- Shrestha, D.S.; Steward, B.L. Shape and Size Analysis of Corn Plant Canopies for Plant Population and Spacing Sensing. Appl. Eng. Agric. 2005, 21, 295–303. [Google Scholar] [CrossRef]
- Fan, J.; Zhou, J.; Wang, B.; de Leon, N.; Kaeppler, S.M.; Lima, D.C.; Zhang, Z. Estimation of Maize Yield and Flowering Time Using Multi-Temporal UAV-Based Hyperspectral Data. Remote Sens. 2022, 14, 3052. [Google Scholar] [CrossRef]
- Brūmelis, G.; Dauškane, I.; Elferts, D.; Strode, L.; Krama, T.; Krams, I. Estimates of Tree Canopy Closure and Basal Area as Proxies for Tree Crown Volume at a Stand Scale. Forests 2020, 11, 1180. [Google Scholar] [CrossRef]
- ten Harkel, J.; Bartholomeus, H.; Kooistra, L. Biomass and Crop Height Estimation of Different Crops Using UAV-Based LiDAR. Remote Sens. 2020, 12, 17. [Google Scholar] [CrossRef]
- Maesano, M.; Khoury, S.; Nakhle, F.; Firrincieli, A.; Gay, A.; Tauro, F.; Harfouche, A. UAV-Based LiDAR for High-Throughput Determination of Plant Height and Above-ground Biomass of the Bioenergy Grass Arundo Donax. Remote Sens. 2020, 12, 3464. [Google Scholar] [CrossRef]
- Li, X.; Liu, C.; Wang, Z.; Xie, X.; Li, D.; Xu, L. Airborne LiDAR: State-of-the-Art of System Design, Technology and Application. Meas. Sci. Technol. 2020, 32, 032002. [Google Scholar] [CrossRef]
- Ravi, R.; Hasheminasab, S.M.; Zhou, T.; Masjedi, A.; Quijano, K.; Flatt, J.E.; Crawford, M.; Habib, A. UAV-Based Multi-Sensor Multi-Platform Integration for High Throughput Phenotyping. In Proceedings of the Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV, Baltimore, MD, USA, 15–16 April 2019. [Google Scholar] [CrossRef]
- Wang, C.; Nie, S.; Xi, X.; Luo, S.; Sun, X. Estimating the Biomass of Maize with Hyperspectral and LiDAR Data. Remote Sens. 2017, 9, 11. [Google Scholar] [CrossRef]
- Stuth, J.; Jama, A.; Tolleson, D. Direct and Indirect Means of Predicting Forage Quality through near Infrared Reflectance Spectroscopy. Field Crops Res. 2003, 84, 45–56. [Google Scholar] [CrossRef]
- Shaver, R.D. Evaluating Corn Silage Quality for Dairy Cattle; University of Wisconsin—Madison Extension: Madison, WI, USA, 2007; pp. 1–11. [Google Scholar]
- LaForest, L.; Hasheminasab, S.M.; Zhou, T.; Flatt, J.E.; Habib, A. New Strategies for Time Delay Estimation during System Calibration for UAV-Based GNSS/INS-Assisted Imaging Systems. Remote Sens. 2019, 11, 1811. [Google Scholar] [CrossRef]
- Zhang, W.; Qi, J.; Wan, P.; Wang, H.; Xie, D.; Wang, X.; Yan, G. An Easy-to-Use Airborne LiDAR Data Filtering Method Based on Cloth Simulation. Remote Sens. 2016, 8, 501. [Google Scholar] [CrossRef]
- Woebbecke, D.M.; Meyer, G.E.; Von Bargen, K.; Mortensen, D.A. Color Indices for Weed Identification under Various Soil, Residue, and Lighting Conditions. Trans. Am. Soc. Agric. Eng. 1995, 38, 259–269. [Google Scholar] [CrossRef]
- Mardanisamani, S.; Maleki, F.; Kassani, S.H.; Rajapaksa, S.; Duddu, H.; Wang, M.; Shirtliffe, S.; Ryu, S.; Josuttes, A.; Zhang, T.; et al. Crop Lodging Prediction from UAV-Acquired Images of Wheat and Canola Using a DCNN Augmented with Handcrafted Texture Features. In Proceedings of the 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), Long Beach, CA, USA, 16–17 June 2019; pp. 2657–2664. [Google Scholar] [CrossRef]
- Kwak, G.H.; Park, N.W. Impact of Texture Information on Crop Classification with Machine Learning and UAV Images. Appl. Sci. 2019, 9, 643. [Google Scholar] [CrossRef]
- Böhler, J.E.; Schaepman, M.E.; Kneubühler, M. Optimal Timing Assessment for Crop Separation Using Multispectral Unmanned Aerial Vehicle (UAV) Data and Textural Features. Remote Sens. 2019, 11, 1780. [Google Scholar] [CrossRef]
- Duan, B.; Liu, Y.; Gong, Y.; Peng, Y.; Wu, X.; Zhu, R.; Fang, S. Remote Estimation of Rice LAI Based on Fourier Spectrum Texture from UAV Image. Plant Methods 2019, 15, 124. [Google Scholar] [CrossRef]
- Van Der Walt, S.; Schönberger, J.L.; Nunez-Iglesias, J.; Boulogne, F.; Warner, J.D.; Yager, N.; Gouillart, E.; Yu, T. Scikit-Image: Image Processing in Python. PeerJ 2014, 2, e453. [Google Scholar] [CrossRef]
- Liu, N.; Li, L.; Li, H.; Liu, Z.; Lu, Y.; Shao, L. Selecting Maize Cultivars to Regulate Canopy Structure and Light Interception for High Yield. Agron. J. 2022, 115, 770–780. [Google Scholar] [CrossRef]
- Song, Y.; Rui, Y.; Bedane, G.; Li, J. Morphological Characteristics of Maize Canopy Development as Affected by Increased Plant Density. PLoS ONE 2016, 11, e0154084. [Google Scholar] [CrossRef]
- Petrović, N.; Moyà-Alcover, G.; Jaume-i-Capó, A.; González-Hidalgo, M. Sickle-Cell Disease Diagnosis Support Selecting the Most Appropriate Machine Learning Method: Towards a General and Interpretable Approach for Cell Morphology Analysis from Microscopy Images. Comput. Biol. Med. 2020, 126, 104027. [Google Scholar] [CrossRef]
- Masjedi, A.; Crawford, M.M. Prediction of Sorghum Biomass Using Time Series UAV-Based Hyperspectral and Lidar Data. In Proceedings of the International Geoscience and Remote Sensing Symposium (IGARSS), Waikoloa, HI, USA, 26 September–2 October 2020; pp. 3912–3915. [Google Scholar] [CrossRef]
- Jin, S.; Su, Y.; Zhang, Y.; Song, S.; Li, Q.; Liu, Z.; Ma, Q.; Ge, Y.; Liu, L.L.; Ding, Y.; et al. Exploring Seasonal and Circadian Rhythms in Structural Traits of Field Maize from Lidar Time Series. Plant Phenomics 2021, 2021, 9895241. [Google Scholar] [CrossRef]
- Su, Y.; Wu, F.; Ao, Z.; Jin, S.; Qin, F.; Liu, B.; Pang, S.; Liu, L.; Guo, Q. Evaluating Maize Phenotype Dynamics under Drought Stress Using Terrestrial Lidar. Plant Methods 2019, 15, 11. [Google Scholar] [CrossRef]
- Arnqvist, J.; Freier, J.; Dellwik, E. Robust Processing of Airborne Laser Scans to Plant Area Density Profiles. Biogeosciences 2020, 17, 5939–5952. [Google Scholar] [CrossRef]
- Fonseca, A.E.; Westgate, M.E.; Grass, L.; Dornbos, D.L. Tassel Morphology as an Indicator of Potential Pollen Production in Maize. Crop Manag. 2003, 2, 1–15. [Google Scholar] [CrossRef]
- Vaswani, A.; Shazeer, N.; Parmar, N.; Uszkoreit, J.; Jones, L.; Gomez, A.N.; Kaiser, Ł.; Polosukhin, I. Attention Is All You Need. In Advances in Neural Information Processing Systems 30 (NIPS 2017); NeurIPS: San Diego, CA, USA, 2017; pp. 5999–6009. [Google Scholar]
- He, K.; Zhang, X.; Ren, S.; Sun, J. Deep Residual Learning for Image Recognition. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 770–778. [Google Scholar] [CrossRef]
- Pedregosa, F.; Varoquaux, G.; Gramfort, A.; Michel, V.; Thirion, B.; Grisel, O.; Blondel, M.; Prettenhofer, P.; Weiss, R.; Dubourg, V.; et al. Scikit-Learn: Machine Learning in Python. J. Mach. Learn. Res. 2011, 12, 2825–2830. [Google Scholar]
- Bernardo, R. Parental Selection, Number of Breeding Populations, and Size of Each Population in Inbred Development. Theor. Appl. Genet. 2003, 107, 1252–1256. [Google Scholar] [CrossRef]
- Vigna, S. A Weighted Correlation Index for Rankings with Ties. In Proceedings of the WWW 2015: 24th International Conference on World Wide Web, Florence, Italy, 18–22 May 2015; pp. 1166–1176. [Google Scholar] [CrossRef]
- Xie, C.; Yang, C. A Review on Plant High-Throughput Phenotyping Traits Using UAV-Based Sensors. Comput. Electron. Agric. 2020, 178, 105731. [Google Scholar] [CrossRef]
- Lauer, J. Record When a Field Tassels to Predict Corn Silage Harvest Date. Available online: https://ipcm.wisc.edu/blog/2013/07/record-when-a-field-tassels-to-predict-corn-silage-harvest-date/ (accessed on 1 June 2023).
- Hoffman, P.C.; Lundberg, K.M.; Bauman, L.M.; Shaver, R.D. The Effect of Maturity on NDF Digestibility. Focus Forage 2003, 5, 1–3. [Google Scholar]
- Hoffman, P.C.; Shaver, R.D.; Combs, D.K.; Undersander, D.J.; Bauman, L.M.; Seeger, T.K. Understanding NDF Digestibility of Forages. Focus Forage 2001, 3, 3–5. [Google Scholar]
- Zhang, J.; Wang, X.; Zhang, H.; Sun, H.; Liu, X. Retrieval-Based Neural Source Code Summarization. In Proceedings of the ICSE’20: ACM/IEEE 42nd International Conference on Software Engineering, Melbourne, Australia, 27 June–19 July 2020; pp. 1385–1397. [Google Scholar] [CrossRef]
- Masjedi, A.; Crawford, M.M.; Carpenter, N.R.; Tuinstra, M.R. Multi-Temporal Predictive Modelling of Sorghum Biomass Using Uav-Based Hyperspectral and Lidar Data. Remote Sens. 2020, 12, 3587. [Google Scholar] [CrossRef]
| Phenotypic Traits | Unit | Measuring Description |
|---|---|---|
| DM Yield | US ton/acre | Measured by weighing a fresh sample in the field, oven-drying it to a constant weight, weighing it again, and calculating plot-level DM yield from the resulting DM percentage. |
| CP | % | Measured by NIRS, expressed as a percentage of DM content. |
| Starch | % | Measured by NIRS, expressed as a percentage of DM content. |
| NDF | % | Measured by NIRS, expressed as a percentage of DM content. |
| ADF | % | Measured by NIRS, expressed as a percentage of DM content. |
| MILK2006 | US ton/acre | Milk yield per acre index, calculated from the NIRS-obtained quality values. |
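As a concrete illustration of the DM yield calculation described in the first row of the table above, the snippet below works through the plot-level arithmetic; all weights and yields are made-up numbers, not values from the study.

```python
# Hypothetical plot-level DM yield calculation (all numbers are made up).
fresh_yield_ton_per_acre = 30.0     # wet (as-harvested) silage yield of the plot
fresh_sample_g = 500.0              # field sample weight before drying
dry_sample_g = 180.0                # sample weight after drying to constant weight

dm_fraction = dry_sample_g / fresh_sample_g          # dry-matter content, 0.36
dm_yield = fresh_yield_ton_per_acre * dm_fraction    # 10.8 US ton/acre
print(f"DM = {dm_fraction:.0%}, DM yield = {dm_yield:.1f} US ton/acre")
```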
| Phenotypic Traits | Count | Mean | Std | CV (%) | Min | 33% | 50% | 66% | Max |
|---|---|---|---|---|---|---|---|---|---|
| DM Yield (US ton/acre) | 1240 | 11.37 | 1.67 | 14.70 | 5.22 | 10.73 | 11.37 | 11.96 | 16.80 |
| CP (%) | 1240 | 6.99 | 0.60 | 8.59 | 4.86 | 6.76 | 7.00 | 7.23 | 8.81 |
| Starch (%) | 1240 | 27.78 | 3.08 | 11.10 | 16.31 | 26.32 | 27.76 | 29.12 | 36.52 |
| NDF (%) | 1240 | 39.95 | 2.49 | 6.24 | 33.52 | 38.78 | 39.85 | 40.91 | 49.64 |
| ADF (%) | 1240 | 21.63 | 1.90 | 8.77 | 16.70 | 20.80 | 21.57 | 22.35 | 28.11 |
| MILK2006 (US ton/acre) | 1240 | 17.88 | 2.76 | 15.43 | 7.95 | 16.81 | 17.86 | 18.97 | 20.09 |
| Sensor | Description |
|---|---|
| Hyperspectral scanner | Headwall Nano-Hyperspec, 274 spectral channels ranging from 400 to 1000 nm, 2.2 nm bandwidth, 12.7 mm lens, 640 × 1 pixels, 7.4 µm pixel size |
| LiDAR unit | Velodyne Puck 16, 16 channels, 600 rotations per minute (RPM), 100 m maximum range |
| RGB camera | Sony Cyber-shot DSC-RX1R II, 3 channels, 35 mm lens, 7952 × 5304 pixels, 4.52 µm pixel size |
| Survey Date | DAS |
|---|---|
| 27 June 2021 | 53 |
| 2 July 2021 | 58 |
| 17 July 2021 | 73 |
| 30 July 2021 | 86 |
| 13 August 2021 | 100 |
| 21 August 2021 | 108 |
| 30 August 2021 | 117 |
| Quality Values | Metrics | |||
|---|---|---|---|---|
| r | MAE | RMSE | ||
| DM Yield | 0.67 (85%) | 0.72 (88%) | 0.92 (121%) | 1.16 (122%) |
| NDF | 0.41 (105%) | 0.33 (94%) | 1.90 (101%) | 2.39 (100%) |
| ADF | 0.49 (96%) | 0.38 (93%) | 1.42 (102%) | 1.78 (101%) |
| CP | 0.55 (81%) | 0.58 (85%) | 0.39 (111%) | 0.49 (111%) |
| Starch | 0.43 (102%) | 0.38 (95%) | 2.29 (100%) | 2.88 (100%) |
| MILK2006 | 0.67 (91%) | 0.69 (91%) | 1.59 (110%) | 2.00 (111%) |
| Quality Values | F1top_c | Accuracyavg | Precisiontop_c | Recalltop_c |
|---|---|---|---|---|
| DM Yield | 0.64 (91%) | 0.72 (95%) | 0.64 (93%) | 0.64 (89%) |
| NDF | 0.45 (95%) | 0.61 (100%) | 0.43 (100%) | 0.46 (90%) |
| ADF | 0.49 (96%) | 0.64 (100%) | 0.47 (98%) | 0.50 (91%) |
| CP | 0.57 (89%) | 0.67 (96%) | 0.56 (92%) | 0.58 (87%) |
| Starch | 0.47 (90%) | 0.62 (97%) | 0.47 (94%) | 0.47 (87%) |
| MILK2006 | 0.61 (94%) | 0.70 (97%) | 0.60 (95%) | 0.61 (91%) |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).