Estimation of Strawberry Canopy Volume in Unmanned Aerial Vehicle RGB Imagery Using an Object Detection-Based Convolutional Neural Network
Abstract
1. Introduction
2. Materials and Methods
2.1. Setup of Strawberry Field Experiment
2.2. Acquisition of UAV-Based RGB Images
2.3. Image Preprocessing for the Development of CNN Model
2.3.1. Resizing UAV Images Using YOLOv8n to Correct Flight Altitude Deviations from Manual Flight
2.3.2. Evaluation of the YOLOv8n Object Detection Model
2.4. Development of CNN Model for Canopy Volume Estimation
2.4.1. Construction and Training of CNN Model
2.4.2. Manual Measurement of Canopy Volume Used as Target Values for CNN
2.4.3. Manual Measurement of Canopy Fullness Level Used as Target Values for CNN
2.4.4. Mixture of Manually Measured Canopy Fullness and Canopy Volume Used as Target Values for CNN
2.4.5. Evaluation of the Developed Estimation Model
2.5. Generation of Canopy Volume Distribution Maps
3. Results
3.1. Performance of Object Detection Model
3.2. Canopy Volume Estimation Using CNN
3.2.1. Comparison of R² and RMSE Values Between Linear Regression Model Using Paraboloid Shape and CNN Model
3.2.2. Canopy Volume Estimation Using EPS Ball-Based Target Values Before and After Correction of Flight Altitude Deviations
3.2.3. Canopy Volume Estimation Using Target Values Converted from Canopy Fullness Levels
3.3. Distributions of Estimated Canopy Volume Using CNN
4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Heide, O.M. Photoperiod and temperature interactions in growth and flowering of strawberry. Physiol. Plant 1977, 40, 21–26. [Google Scholar] [CrossRef]
- Buelvas, R.M.; Adamchuk, V.I. Crop canopy measurement using laser and ultrasonic sensing integration. In Proceedings of the 2017 ASABE Annual International Meeting, Spokane, WA, USA, 16–19 July 2017. [Google Scholar] [CrossRef]
- Chen, Y.; Lee, W.S.; Gan, H.; Peres, N.; Fraisse, C.; Zhang, Y.; He, Y. Strawberry yield prediction based on a deep neural network using high-resolution aerial orthoimages. Remote Sens. 2019, 11, 1584. [Google Scholar] [CrossRef]
- Usha, K.; Singh, B. Potential applications of remote sensing in horticulture—A review. Sci. Hortic. 2013, 153, 71–83. [Google Scholar] [CrossRef]
- Underwood, J.P.; Hung, C.; Whelan, B.; Sukkarieh, S. Mapping almond orchard canopy volume, flowers, fruit and yield using lidar and vision sensors. Comput. Electron. Agric. 2016, 130, 83–96. [Google Scholar] [CrossRef]
- Guan, Z.; Abd-Elrahman, A.; Fan, Z.; Whitaker, V.M.; Wilkinson, B. Modeling strawberry biomass and leaf area using object-based analysis of high-resolution images. ISPRS J. Photogramm. Remote Sens. 2020, 163, 171–186. [Google Scholar] [CrossRef]
- Abd-Elrahman, A.; Guan, Z.; Dalid, C.; Whitaker, V.; Britt, K.; Wilkinson, B.; Gonzalez, A. Automated canopy delineation and size metrics extraction for strawberry dry weight modeling using raster analysis of high-resolution imagery. Remote Sens. 2020, 12, 3632. [Google Scholar] [CrossRef]
- Abd-Elrahman, A.; Wu, F.; Agehara, S.; Britt, K. Improving strawberry yield prediction by integrating ground-based canopy images in modeling approaches. ISPRS Int. J. Geoinf. 2021, 10, 239. [Google Scholar] [CrossRef]
- Brook, A.; Tal, Y.; Markovich, O.; Rybnikova, N. Canopy Volume as a Tool for Early Detection of Plant Drought and Fertilization Stress: Banana plant fine-phenotype. bioRxiv 2021. [Google Scholar] [CrossRef]
- Zhu, Z.; Kleinn, C.; Nölke, N. Assessing tree crown volume—A review. For. Int. J. For. Res. 2021, 94, 18–35. [Google Scholar] [CrossRef]
- Cruz, M.G.; Alexander, M.E.; Wakimoto, R.H. Assessing canopy fuel stratum characteristics in crown fire prone fuel types of western North America. Int. J. Wildland Fire 2003, 12, 39–50. [Google Scholar] [CrossRef]
- Moffett, M.W. What’s “Up”? A critical look at the basic terms of canopy biology. Biotropica 2000, 32, 569–596. [Google Scholar] [CrossRef]
- Vélez, S.; Vacas, R.; Martín, H.; Ruano-Rosa, D.; Álvarez, S. A novel technique using planar area and ground shadows calculated from UAV RGB imagery to estimate pistachio tree (Pistacia vera L.) canopy volume. Remote Sens. 2022, 14, 6006. [Google Scholar] [CrossRef]
- Coder, K.D. Crown shape factor & volumes. In Tree Biomechanics Series; Warnell School of Forestry and Natural Resources, University of Georgia: Athens, GA, USA, 2000; Volume 11, pp. 1–5. [Google Scholar]
- Thorne, M.S.; Skinner, Q.D.; Smith, M.A.; Rodgers, J.D.; Laycock, W.A.; Cerekci, S.A. Evaluation of a technique for measuring canopy volume of shrubs. Rangel. Ecol. Manag. 2002, 55, 235–241. [Google Scholar] [CrossRef]
- Rodrigues, J.D.; Moreira, A.S.; Stuchi, E.S.; Bassanezi, R.B.; Laranjeira, F.F.; Girardi, E.A. Huanglongbing incidence, canopy volume, and sprouting dynamics of ‘Valencia’ sweet orange grafted onto 16 rootstocks. Trop. Plant Pathol. 2020, 45, 611–619. [Google Scholar] [CrossRef]
- Tumbo, S.D.; Salyani, M.; Whitney, J.D.; Wheaton, T.A.; Miller, W.M. Investigation of laser and ultrasonic ranging sensors for measurements of citrus canopy volume. Appl. Eng. Agric. 2002, 18, 367. [Google Scholar] [CrossRef]
- Zaman, Q.U.; Salyani, M. Effects of foliage density and ground speed on ultrasonic measurement of citrus tree volume. Appl. Eng. Agric. 2004, 20, 173–178. [Google Scholar] [CrossRef]
- Zaman, Q.U.; Schumann, A.W. Performance of an ultrasonic tree volume measurement system in commercial citrus groves. Precis. Agric. 2005, 6, 467–480. [Google Scholar] [CrossRef]
- Llorens, J.; Gil, E.; Llop, J.; Escolà, A. Ultrasonic and LIDAR sensors for electronic canopy characterization in vineyards: Advances to improve pesticide application methods. Sensors 2011, 11, 2177–2194. [Google Scholar] [CrossRef]
- Barber, C.B.; Dobkin, D.P.; Huhdanpaa, H. The quickhull algorithm for convex hulls. ACM Trans. Math. Softw. 1996, 22, 469–483. [Google Scholar] [CrossRef]
- Qi, Y.; Dong, X.; Chen, P.; Lee, K.H.; Lan, Y.; Lu, X.; Zhang, Y. Canopy volume extraction of Citrus reticulate Blanco cv. Shatangju trees using UAV image-based point cloud deep learning. Remote Sens. 2021, 13, 3437. [Google Scholar] [CrossRef]
- Korhonen, L.; Vauhkonen, J.; Virolainen, A.; Hovi, A.; Korpela, I. Estimation of tree crown volume from airborne lidar data using computational geometry. Int. J. Remote Sens. 2013, 34, 7236–7248. [Google Scholar] [CrossRef]
- Fernández-Sarría, A.; Martínez, L.; Velázquez-Martí, B.; Sajdak, M.; Estornell, J.; Recio, J.A. Different methodologies for calculating crown volumes of Platanus hispanica trees using terrestrial laser scanner and a comparison with classical dendrometric measurements. Comput. Electron. Agric. 2013, 90, 176–185. [Google Scholar] [CrossRef]
- Colaço, A.F.; Trevisan, R.G.; Molin, J.P.; Rosell-Polo, J.R.; Escolà, A. Orange tree canopy volume estimation by manual and LiDAR-based methods. Adv. Anim. Biosci. 2017, 8, 477–480. [Google Scholar] [CrossRef]
- Edelsbrunner, H. Smooth surfaces for multi-scale shape representation. In Proceedings of the International Conference on Foundations of Software Technology and Theoretical Computer Science, Bangalore, India, 18–20 December 1995. [Google Scholar] [CrossRef]
- Kaufman, A.; Cohen, D.; Yagel, R. Volume graphics. Computer 1993, 26, 51–64. [Google Scholar] [CrossRef]
- Hess, C.; Härdtle, W.; Kunz, M.; Fichtner, A.; von Oheimb, G. A high-resolution approach for the spatiotemporal analysis of forest canopy space using terrestrial laser scanning data. Ecol. Evol. 2018, 8, 6800–6811. [Google Scholar] [CrossRef] [PubMed]
- Rueda-Ayala, V.P.; Peña, J.M.; Höglind, M.; Bengochea-Guevara, J.M.; Andújar, D. Comparing UAV-based technologies and RGB-D reconstruction methods for plant height and biomass monitoring on grass ley. Sensors 2019, 19, 535. [Google Scholar] [CrossRef]
- Jang, E.K.; Ahn, M. Estimation of single vegetation volume using 3D point cloud-based alpha shape and voxel. Ecol. Resil. Infrastruct. 2021, 8, 204–211. [Google Scholar] [CrossRef]
- DiFrancesco, P.M.; Bonneau, D.A.; Hutchinson, D.J. Computational geometry-based surface reconstruction for volume estimation: A case study on magnitude-frequency relations for a LiDAR-derived rockfall inventory. ISPRS Int. J. Geoinf. 2021, 10, 157. [Google Scholar] [CrossRef]
- Sze, V.; Chen, Y.H.; Yang, T.J.; Emer, J.S. Efficient processing of deep neural networks: A tutorial and survey. Proc. Inst. Electr. Electron. Eng. 2017, 105, 2295–2329. [Google Scholar] [CrossRef]
- Ma, J.; Li, Y.; Chen, Y.; Du, K.; Zheng, F.; Zhang, L.; Sun, Z. Estimating above ground biomass of winter wheat at early growth stages using digital images and deep convolutional neural network. Eur. J. Agron. 2019, 103, 117–129. [Google Scholar] [CrossRef]
- Gang, M.S.; Kim, H.J.; Kim, D.W. Estimation of greenhouse lettuce growth indices based on a two-stage CNN using RGB-D images. Sensors 2022, 22, 5499. [Google Scholar] [CrossRef]
- Geipel, J.; Link, J.; Claupein, W. Combined spectral and spatial modeling of corn yield based on aerial images and crop surface models acquired with an unmanned aircraft system. Remote Sens. 2014, 6, 10335–10355. [Google Scholar] [CrossRef]
- Yun, H.S.; Park, S.H.; Kim, H.J.; Lee, W.D.; Lee, K.D.; Hong, S.Y.; Jung, G.H. Use of unmanned aerial vehicle for multi-temporal monitoring of soybean vegetation fraction. J. Biosyst. Eng. 2016, 41, 126–137. [Google Scholar] [CrossRef]
- Kim, D.W.; Yun, H.S.; Jeong, S.J.; Kwon, Y.S.; Kim, S.G.; Lee, W.S.; Kim, H.J. Modeling and testing of growth status for Chinese cabbage and white radish with UAV-based RGB imagery. Remote Sens. 2018, 10, 563. [Google Scholar] [CrossRef]
- Maimaitijiang, M.; Sagan, V.; Sidike, P.; Maimaitiyiming, M.; Hartling, S.; Peterson, K.T.; Fritschi, F.B. Vegetation index weighted canopy volume model (CVMVI) for soybean biomass estimation from unmanned aerial system-based RGB imagery. ISPRS J. Photogramm. Remote Sens. 2019, 151, 27–41. [Google Scholar] [CrossRef]
- Tsouros, D.C.; Bibi, S.; Sarigiannidis, P.G. A review on UAV-based applications for precision agriculture. Information 2019, 10, 349. [Google Scholar] [CrossRef]
- Castro, W.; Marcato Junior, J.; Polidoro, C.; Osco, L.P.; Gonçalves, W.; Rodrigues, L.; Matsubara, E. Deep learning applied to phenotyping of biomass in forages with UAV-based RGB imagery. Sensors 2020, 20, 4802. [Google Scholar] [CrossRef]
- Liu, S.; Jin, X.; Nie, C.; Wang, S.; Yu, X.; Cheng, M.; Liu, Y. Estimating leaf area index using unmanned aerial vehicle data: Shallow vs. deep machine learning algorithms. Plant Physiol. 2021, 187, 1551–1576. [Google Scholar] [CrossRef] [PubMed]
- Zheng, C.; Abd-Elrahman, A.; Whitaker, V.M.; Dalid, C. Deep learning for strawberry canopy delineation and biomass prediction from high-resolution images. Plant Phenomics 2022, 2022, 9850486. [Google Scholar] [CrossRef]
- Valavanis, K.P.; Vachtsevanos, G.J. UAV Autonomy: Introduction; Springer: Dordrecht, The Netherlands, 2015; pp. 241–242. [Google Scholar] [CrossRef]
- Luo, X.; Wei, Z.; Jin, Y.; Wang, X.; Lin, P.; Wei, X.; Zhou, W. Fast Automatic Registration of UAV Images via Bidirectional Matching. Sensors 2023, 23, 8566. [Google Scholar] [CrossRef]
- Aicardi, I.; Nex, F.; Gerke, M.; Lingua, A.M. An image-based approach for the co-registration of multi-temporal UAV image datasets. Remote Sens. 2016, 8, 779. [Google Scholar] [CrossRef]
- Zitova, B.; Flusser, J. Image registration methods: A survey. Image Vis. Comput. 2003, 21, 977–1000. [Google Scholar] [CrossRef]
- Whitaker, V.M.; Peres, N.A.; Osorio, L.F.; Fan, Z.; do Nascimento Nunes, M.C.; Plotto, A.; Sims, C.A. ‘Florida Brilliance’ Strawberry. HortScience 2019, 54, 2073–2077. [Google Scholar] [CrossRef]
- EMCO CAL. Available online: https://www.emcocal.com/medallion-strawberry (accessed on 28 August 2024).
- Zhang, Q.; Liu, Y.; Gong, C.; Chen, Y.; Yu, H. Applications of deep learning for dense scenes analysis in agriculture: A review. Sensors 2020, 20, 1520. [Google Scholar] [CrossRef]
- Wang, Z.; Hua, Z.; Wen, Y.; Zhang, S.; Xu, X.; Song, H. E-YOLO: Recognition of estrus cow based on improved YOLOv8n model. Expert. Syst. Appl. 2024, 238, 122212. [Google Scholar] [CrossRef]
- Lin, T.Y.; Maire, M.; Belongie, S.; Hays, J.; Perona, P.; Ramanan, D.; Zitnick, C.L. Microsoft coco: Common objects in context. In Proceedings of the Computer Vision–ECCV 2014: 13th European Conference, Zurich, Switzerland, 6–12 September 2014. [Google Scholar] [CrossRef]
- Kaggle. Available online: https://www.kaggle.com/datasets/ultralytics/coco128 (accessed on 25 January 2024).
- Loshchilov, I.; Hutter, F. Decoupled weight decay regularization. arXiv 2017, arXiv:1711.05101. Available online: https://arxiv.org/abs/1711.05101 (accessed on 15 October 2024).
- Li, J.; Qiao, Y.; Liu, S.; Zhang, J.; Yang, Z.; Wang, M. An improved YOLOv5-based vegetable disease detection method. Comput. Electron. Agric. 2022, 202, 107345. [Google Scholar] [CrossRef]
- He, K.; Zhang, X.; Ren, S.; Sun, J. Identity mappings in deep residual networks. In Proceedings of the European Conference on Computer Vision, Amsterdam, The Netherlands, 8–16 October 2016. [Google Scholar] [CrossRef]
- Deng, J.; Dong, W.; Socher, R.; Li, L.J.; Li, K.; Li, F.-F. ImageNet: A large-scale hierarchical image database. In Proceedings of the 2009 IEEE Conference on Computer Vision and Pattern Recognition, Miami, FL, USA, 20–25 June 2009. [Google Scholar] [CrossRef]
- He, K.; Zhang, X.; Ren, S.; Sun, J. Deep residual learning for image recognition. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016. [Google Scholar] [CrossRef]
- Mosteller, F.; Tukey, J.W. Data Analysis, Including Statistics; Addison-Wesley: Boston, MA, USA, 1968; pp. 80–203. [Google Scholar]
- Verweij, R.J.; Higgins, S.I.; Bond, W.J.; February, E.C. Water sourcing by trees in a mesic savanna: Responses to severing deep and shallow roots. Environ. Exp. Bot. 2011, 74, 229–236. [Google Scholar] [CrossRef]
- Ongole, S.; Teegalapalli, K.; Byrapoghu, V.; Ratnam, J.; Sankaran, M. Functional traits predict tree-level phenological strategies in a mesic Indian savanna. Biotropica 2021, 53, 1432–1441. [Google Scholar] [CrossRef]
- Kattenborn, T.; Leitloff, J.; Schiefer, F.; Hinz, S. Review on Convolutional Neural Networks (CNN) in vegetation remote sensing. ISPRS J. Photogramm. Remote Sens. 2021, 173, 24–49. [Google Scholar] [CrossRef]
- Andujar, D.; Ribeiro, A.; Fernández-Quintanilla, C.; Dorado, J. Using depth cameras to extract structural parameters to assess the growth state and yield of cauliflower crops. Comput. Electron. Agric. 2016, 122, 67–73. [Google Scholar] [CrossRef]
- Sishodia, R.P.; Ray, R.L.; Singh, S.K. Applications of remote sensing in precision agriculture: A review. Remote Sens. 2020, 12, 3136. [Google Scholar] [CrossRef]
- Zaman, Q.U.; Schumann, A.W.; Miller, W.M. Variable rate nitrogen application in Florida citrus based on ultrasonically-sensed tree size. Appl. Eng. Agric. 2005, 21, 331–335. [Google Scholar] [CrossRef]
- Garcia-Ruiz, F.; Campos, J.; Llop-Casamada, J.; Gil, E. Assessment of map based variable rate strategies for copper reduction in hedge vineyards. Comput. Electron. Agric. 2023, 207, 107753. [Google Scholar] [CrossRef]
- Albornoz, V.M.; Araneda, L.C.; Ortega, R. Planning and scheduling of selective harvest with management zones delineation. Ann. Oper. Res. 2022, 316, 873–890. [Google Scholar] [CrossRef]
- Zhou, X.; Lee, W.S.; Ampatzidis, Y.; Chen, Y.; Peres, N.; Fraisse, C. Strawberry maturity classification from UAV and near-ground imaging using deep learning. Smart Agric. Technol. 2021, 1, 100001. [Google Scholar] [CrossRef]
- Zhou, C.; Lee, W.S.; Peres, N.; Kim, B.S.; Kim, J.H.; Moon, H.C. Strawberry flower and fruit detection based on an autonomous imaging robot and deep learning. In Proceedings of the 14th European Conference on Precision Agriculture, Bologna, Italy, 2–6 July 2023. [Google Scholar] [CrossRef]
| Specifications | Values |
|---|---|
| Inner diameter of the cylindrical case | 19.4 cm |
| Height of the cylindrical case | 10.1 cm |
| Volume of the cylindrical case | 2985.5 cm3 |
| Diameter of an EPS ball | 2 cm |
| Volume of an EPS ball | 4.19 cm3 |
| Total number of EPS balls filling the case | 240 pieces |
| Total volume of 240 EPS balls | 1005.6 cm3 |
| Space not filled by EPS balls (offset) | 1979.9 cm3 |
| Offset per ball | 8.25 cm3 |
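The tabulated volumes follow from basic geometry of the cylindrical case and the spherical EPS balls. A minimal sketch of the arithmetic (all inputs taken from the table; small differences from the tabulated totals arise from rounding the per-ball volume to 4.19 cm3 before multiplying):

```python
import math

# Cylindrical case dimensions (from the table)
inner_diameter_cm = 19.4
height_cm = 10.1
case_volume = math.pi * (inner_diameter_cm / 2) ** 2 * height_cm  # ~2985.5 cm^3

# Expanded polystyrene (EPS) balls used to fill the case
ball_diameter_cm = 2.0
ball_volume = (4 / 3) * math.pi * (ball_diameter_cm / 2) ** 3  # ~4.19 cm^3

n_balls = 240
balls_volume = n_balls * ball_volume        # total volume of the 240 balls
offset = case_volume - balls_volume         # space not filled by the balls
offset_per_ball = offset / n_balls          # ~8.25 cm^3, as in the table
```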
| Object | mAP50 | Precision (%) | Recall (%) | F1-Score | Average Frames per Second |
|---|---|---|---|---|---|
| All | 0.78 | 89.7 | 88.7 | 0.89 | 18.6 |
| Strawberry | 0.98 | 94.4 | 97.9 | 0.96 | - |
| Dead plant | 0.74 | 62.2 | 74.2 | 0.68 | - |
| Hole (missing plant) | 0.90 | 86.9 | 96.3 | 0.91 | - |
| Weed | 0.63 | 47.6 | 50.0 | 0.49 | - |
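The F1-scores in the table are consistent with the standard harmonic mean of precision and recall. A small check (values taken from the table above, not from any code released with the paper):

```python
def f1_score(precision_pct: float, recall_pct: float) -> float:
    """Harmonic mean of precision and recall, both given in percent."""
    p, r = precision_pct / 100, recall_pct / 100
    return 2 * p * r / (p + r)

# Per-class values from the detection results table
print(round(f1_score(94.4, 97.9), 2))  # Strawberry -> 0.96
print(round(f1_score(62.2, 74.2), 2))  # Dead plant -> 0.68
print(round(f1_score(47.6, 50.0), 2))  # Weed -> 0.49
```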
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Gang, M.-S.; Sutthanonkul, T.; Lee, W.S.; Liu, S.; Kim, H.-J. Estimation of Strawberry Canopy Volume in Unmanned Aerial Vehicle RGB Imagery Using an Object Detection-Based Convolutional Neural Network. Sensors 2024, 24, 6920. https://doi.org/10.3390/s24216920