Applications of 3D Reconstruction Techniques in Crop Canopy Phenotyping: A Review
Abstract
1. Introduction
2. Data Acquisition
2.1. Light Detection and Ranging
2.2. Multi-View Stereo
2.3. Camera
2.4. Multi-Sensor Fusion
3. Point Cloud Data Processing
3.1. Point Cloud Denoising
3.2. Point Cloud Registration
3.3. Segmentation and Reconstruction
3.3.1. Canopy
3.3.2. Stem and Leaf
3.3.3. Fruit
4. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Becker, S.; Fanzo, J. Population and food systems: What does the future hold? Popul. Environ. 2023, 45, 20. [Google Scholar] [CrossRef]
- Xing, Y.; Xie, Y.; Wang, X. Enhancing soil health through balanced fertilization: A pathway to sustainable agriculture and food security. Front. Microbiol. 2025, 16, 1536524. [Google Scholar] [CrossRef]
- Song, P.; Wang, J.; Guo, X.; Yang, W.; Zhao, C. High-throughput phenotyping: Breaking through the bottleneck in future crop breeding. Crop J. 2021, 9, 633–645. [Google Scholar] [CrossRef]
- Ninomiya, S. High-throughput field crop phenotyping: Current status and challenges. Breed. Sci. 2022, 72, 3–18. [Google Scholar] [CrossRef] [PubMed]
- Li, Y.; Wang, W.; Hu, C.; Yang, S.; Ma, C.; Wu, J.; Wang, Y.; Xu, Z.; Li, L.; Huang, Z.; et al. Ectopic Expression of a Maize Gene ZmDUF1645 in Rice Increases Grain Length and Yield, but Reduces Drought Stress Tolerance. Int. J. Mol. Sci. 2023, 24, 9794. [Google Scholar] [CrossRef] [PubMed]
- van Eeuwijk, F.A.; Bustos-Korts, D.; Millet, E.J.; Boer, M.P.; Kruijer, W.; Thompson, A.; Malosetti, M.; Iwata, H.; Quiroz, R.; Kuppe, C.; et al. Modelling strategies for assessing and increasing the effectiveness of new phenotyping techniques in plant breeding. Plant Sci. 2019, 282, 23–39. [Google Scholar] [CrossRef]
- Gracia-Romero, A.; Vergara-Díaz, O.; Thierfelder, C.; Cairns, J.E.; Kefauver, S.C.; Araus, J.L. Phenotyping Conservation Agriculture Management Effects on Ground and Aerial Remote Sensing Assessments of Maize Hybrids Performance in Zimbabwe. Remote Sens. 2018, 10, 349. [Google Scholar] [CrossRef]
- Esser, F.; Klingbeil, L.; Zabawa, L.; Kuhlmann, H. Quality Analysis of a High-Precision Kinematic Laser Scanning System for the Use of Spatio-Temporal Plant and Organ-Level Phenotyping in the Field. Remote Sens. 2023, 15, 1117. [Google Scholar] [CrossRef]
- Fei, S.; Hassan, M.A.; Xiao, Y.; Su, X.; Chen, Z.; Cheng, Q.; Duan, F.; Chen, R.; Ma, Y. UAV-based multi-sensor data fusion and machine learning algorithm for yield prediction in wheat. Precis. Agric. 2023, 24, 187–212. [Google Scholar] [CrossRef]
- Prey, L.; Hanemann, A.; Ramgraber, L.; Seidl-Schulz, J.; Noack, P.O. UAV-Based Estimation of Grain Yield for Plant Breeding: Applied Strategies for Optimizing the Use of Sensors, Vegetation Indices, Growth Stages, and Machine Learning Algorithms. Remote Sens. 2022, 14, 6345. [Google Scholar] [CrossRef]
- Tadesse, W.; Bishaw, Z.; Assefa, S. Wheat production and breeding in Sub-Saharan Africa: Challenges and opportunities in the face of climate change. Int. J. Clim. Change Strateg. Manag. 2019, 11, 696–715. [Google Scholar] [CrossRef]
- Chen, C.; Jia, Y.; Zhang, J.; Yang, L.; Wang, Y.; Kang, F. Development of a 3D point cloud reconstruction-based apple canopy liquid sedimentation model. J. Clean. Prod. 2024, 451, 142038. [Google Scholar] [CrossRef]
- Mochida, K.; Koda, S.; Inoue, K.; Hirayama, T.; Tanaka, S.; Nishii, R.; Melgani, F. Computer vision-based phenotyping for improvement of plant productivity: A machine learning perspective. Gigascience 2019, 8, giy153. [Google Scholar] [CrossRef] [PubMed]
- Zou, X.; Mõttus, M.; Tammeorg, P.; Torres, C.L.; Takala, T.; Pisek, J.; Mäkelä, P.; Stoddard, F.L.; Pellikka, P. Photographic measurement of leaf angles in field crops. Agric. For. Meteorol. 2014, 184, 137–146. [Google Scholar] [CrossRef]
- Qiu, R.; Wei, S.; Zhang, M.; Li, H.; Sun, H.; Liu, G.; Li, M. Sensors for measuring plant phenotyping: A review. Int. J. Agric. Biol. Eng. 2018, 11, 1–17. [Google Scholar] [CrossRef]
- Sun, S.; Li, C.; Chee, P.W.; Paterson, A.H.; Meng, C.; Zhang, J.; Ma, P.; Robertson, J.S.; Adhikari, J. High resolution 3D terrestrial LiDAR for cotton plant main stalk and node detection. Comput. Electron. Agric. 2021, 187, 106276. [Google Scholar] [CrossRef]
- Milella, A.; Marani, R.; Petitti, A.; Reina, G. In-field high throughput grapevine phenotyping with a consumer-grade depth camera. Comput. Electron. Agric. 2019, 156, 293–306. [Google Scholar] [CrossRef]
- Wang, Y.; Chen, Y. Non-Destructive Measurement of Three-Dimensional Plants Based on Point Cloud. Plants 2020, 9, 571. [Google Scholar] [CrossRef]
- Li, S.; Cui, Z.; Yang, J.; Wang, B. A Review of Optical-Based Three-Dimensional Reconstruction and Multi-Source Fusion for Plant Phenotyping. Sensors 2025, 25, 3401. [Google Scholar] [CrossRef]
- Qi, J.; Gao, F.; Wang, Y.; Zhang, W.; Yang, S.; Qi, K.; Zhang, R. Multiscale phenotyping of grain crops based on three-dimensional models: A comprehensive review of trait detection. Comput. Electron. Agric. 2025, 237, 110597. [Google Scholar] [CrossRef]
- Wu, S.; Hu, C.; Tian, B.; Huang, Y.; Yang, S.; Li, S.; Xu, S. A 3D reconstruction platform for complex plants using OB-NeRF. Front. Plant Sci. 2025, 16, 1449626. [Google Scholar] [CrossRef]
- Jin, S.; Sun, X.; Wu, F.; Su, Y.; Li, Y.; Song, S.; Xu, K.; Ma, Q.; Baret, F.; Jiang, D.; et al. Lidar sheds new light on plant phenomics for plant breeding and management: Recent advances and future prospects. ISPRS J. Photogramm. Remote Sens. 2021, 171, 202–223. [Google Scholar] [CrossRef]
- Ali, B.; Zhao, F.; Li, Z.; Zhao, Q.; Gong, J.; Wang, L.; Tong, P.; Jiang, Y.; Su, W.; Bao, Y.; et al. Sensitivity Analysis of Canopy Structural and Radiative Transfer Parameters to Reconstructed Maize Structures Based on Terrestrial LiDAR Data. Remote Sens. 2021, 13, 3751. [Google Scholar] [CrossRef]
- Jimenez-Berni, J.A.; Deery, D.M.; Rozas-Larraondo, P.; Condon, A.G.; Rebetzke, G.J.; James, R.A.; Bovill, W.D.; Furbank, R.T.; Sirault, X.R.R. High Throughput Determination of Plant Height, Ground Cover, and Above-Ground Biomass in Wheat with LiDAR. Front. Plant Sci. 2018, 9, 237. [Google Scholar] [CrossRef] [PubMed]
- Zhu, Y.; Sun, G.; Ding, G.; Zhou, J.; Wen, M.; Jin, S.; Zhao, Q.; Colmer, J.; Ding, Y.; Ober, E.S.; et al. Large-scale field phenotyping using backpack LiDAR and CropQuant-3D to measure structural variation in wheat. Plant Physiol. 2021, 187, 716–738. [Google Scholar] [CrossRef] [PubMed]
- Jiang, Y.; Li, C.; Takeda, F.; Kramer, E.A.; Ashrafi, H.; Hunter, J. 3D point cloud data to quantitatively characterize size and shape of shrub crops. Hortic. Res. 2019, 6, 43. [Google Scholar] [CrossRef]
- Walter, J.D.C.; Edwards, J.; McDonald, G.; Kuchel, H. Estimating Biomass and Canopy Height with LiDAR for Field Crop Breeding. Front. Plant Sci. 2019, 10, 1145. [Google Scholar] [CrossRef]
- Virlet, N.; Sabermanesh, K.; Sadeghi-Tehran, P.; Hawkesford, M.J. Field Scanalyzer: An automated robotic field phenotyping platform for detailed crop monitoring. Funct. Plant Biol. 2017, 44, 143–153. [Google Scholar] [CrossRef]
- Paulus, S. Measuring crops in 3D: Using geometry for plant phenotyping. Plant Methods 2019, 15, 103. [Google Scholar] [CrossRef]
- Wei, K.; Liu, S.; Chen, Q.; Huang, S.; Zhong, M.; Zhang, J.; Sun, H.; Wu, K.; Fan, S.; Ye, Z.; et al. Fast Multi-View 3D reconstruction of seedlings based on automatic viewpoint planning. Comput. Electron. Agric. 2024, 218, 108708. [Google Scholar] [CrossRef]
- Yan, P.; Feng, Y.; Han, Q.; Wu, H.; Hu, Z.; Kang, S. Revolutionizing crop phenotyping: Enhanced UAV LiDAR flight parameter optimization for wide-narrow row cultivation. Remote Sens. Environ. 2025, 320, 114638. [Google Scholar] [CrossRef]
- Ma, X.; Zhu, K.; Guan, H.; Feng, J.; Yu, S.; Liu, G. Calculation Method for Phenotypic Traits Based on the 3D Reconstruction of Maize Canopies. Sensors 2019, 19, 1201. [Google Scholar] [CrossRef] [PubMed]
- Forero, M.G.; Murcia, H.F.; Mendez, D.; Betancourt-Lozano, J. LiDAR Platform for Acquisition of 3D Plant Phenotyping Database. Plants 2022, 11, 2199. [Google Scholar] [CrossRef] [PubMed]
- Saha, K.K.; Weltzien, C.; Zude-Sasse, M. Non-destructive Leaf Area Estimation of Tomato Using Mobile LiDAR Laser Scanner. In Proceedings of the 1st IEEE International Workshop on Metrology for Agriculture and Forestry (IEEE MetroAgriFor), Trento-Bolzano, Italy, 3–5 November 2021; pp. 187–191. [Google Scholar] [CrossRef]
- Hu, F.; Lin, C.; Peng, J.; Wang, J.; Zhai, R. Rapeseed Leaf Estimation Methods at Field Scale by Using Terrestrial LiDAR Point Cloud. Agronomy 2022, 12, 2409. [Google Scholar] [CrossRef]
- Medic, T.; Manser, N.; Kirchgessner, N.; Roth, L. Towards Wheat Yield Estimation in Plant Breeding from Inhomogeneous LiDAR Point Clouds Using Stochastic Features. In Proceedings of the 5th International Society for Photogrammetry and Remote Sensing (ISPRS) Geospatial Week (GSW), Cairo, Egypt, 2–7 September 2023; pp. 741–747. [Google Scholar]
- Guo, Q.; Su, Y.; Hu, T.; Guan, H.; Jin, S.; Zhang, J.; Zhao, X.; Xu, K.; Wei, D.; Kelly, M.; et al. Lidar Boosts 3D Ecological Observations and Modelings: A Review and Perspective. IEEE Geosci. Remote Sens. Mag. 2021, 9, 232–257. [Google Scholar] [CrossRef]
- Murcia, H.F.; Tilaguy, S.; Ouazaa, S. Development of a Low-Cost System for 3D Orchard Mapping Integrating UGV and LiDAR. Plants 2021, 10, 2804. [Google Scholar] [CrossRef]
- Yan, W.; Guan, H.; Cao, L.; Yu, Y.; Li, C.; Lu, J. A Self-Adaptive Mean Shift Tree-Segmentation Method Using UAV LiDAR Data. Remote Sens. 2020, 12, 515. [Google Scholar] [CrossRef]
- Pan, Y.; Han, Y.; Wang, L.; Chen, J.; Meng, H.; Wang, G.; Zhang, Z.; Wang, S. 3D Reconstruction of Ground Crops Based on Airborne LiDAR Technology. IFAC-PapersOnLine 2019, 52, 35–40. [Google Scholar] [CrossRef]
- Young, S.N.; Kayacan, E.; Peschel, J.M. Design and field evaluation of a ground robot for high-throughput phenotyping of energy sorghum. Precis. Agric. 2019, 20, 697–722. [Google Scholar] [CrossRef]
- Yao, L.; van de Zedde, R.; Kowalchuk, G. Recent developments and potential of robotics in plant eco-phenotyping. Emerg. Top. Life Sci. 2021, 5, 289–300. [Google Scholar] [CrossRef]
- Huang, X.; Zheng, S.; Zhu, N. High-Throughput Legume Seed Phenotyping Using a Handheld 3D Laser Scanner. Remote Sens. 2022, 14, 431. [Google Scholar] [CrossRef]
- Malambo, L.; Popescu, S.C.; Horne, D.W.; Pugh, N.A.; Rooney, W.L. Automated detection and measurement of individual sorghum panicles using density-based clustering of terrestrial lidar data. ISPRS J. Photogramm. Remote Sens. 2019, 149, 1–13. [Google Scholar] [CrossRef]
- He, W.; Ye, Z.; Li, M.; Yan, Y.; Lu, W.; Xing, G. Extraction of soybean plant trait parameters based on SfM-MVS algorithm combined with GRNN. Front. Plant Sci. 2023, 14, 1181322. [Google Scholar] [CrossRef] [PubMed]
- Wu, S.; Wen, W.; Wang, Y.; Fan, J.; Wang, C.; Gou, W.; Guo, X. MVS-Pheno: A Portable and Low-Cost Phenotyping Platform for Maize Shoots Using Multiview Stereo 3D Reconstruction. Plant Phenomics 2020, 2020, 1848437. [Google Scholar] [CrossRef] [PubMed]
- Hasheminasab, S.M.; Zhou, T.; Habib, A. GNSS/INS-Assisted Structure from Motion Strategies for UAV-Based Imagery over Mechanized Agricultural Fields. Remote Sens. 2020, 12, 351. [Google Scholar] [CrossRef]
- Wu, S.; Wen, W.; Gou, W.; Lu, X.; Zhang, W.; Zheng, C.; Xiang, Z.; Chen, L.; Guo, X. A miniaturized phenotyping platform for individual plants using multi-view stereo 3D reconstruction. Front. Plant Sci. 2022, 13, 897746. [Google Scholar] [CrossRef]
- Sandhu, J.; Zhu, F.; Paul, P.; Gao, T.; Dhatt, B.K.; Ge, Y.; Staswick, P.; Yu, H.; Walia, H. PI-Plat: A high-resolution image-based 3D reconstruction method to estimate growth dynamics of rice inflorescence traits. Plant Methods 2019, 15, 162. [Google Scholar] [CrossRef]
- Schirrmann, M.; Hamdorf, A.; Garz, A.; Ustyuzhanin, A.; Dammer, K.-H. Estimating wheat biomass by combining image clustering with crop height. Comput. Electron. Agric. 2016, 121, 374–384. [Google Scholar] [CrossRef]
- Tolomelli, G.; Kothawade, G.S.; Chandel, A.K.; Manfrini, L.; Jacoby, P.; Khot, L.R. Aerial-RGB imagery based 3D canopy reconstruction and mapping of grapevines for precision management. In Proceedings of the IEEE International Workshop on Metrology for Agriculture and Forestry (MetroAgriFor), Perugia, Italy, 3–5 November 2022; pp. 203–207. [Google Scholar] [CrossRef]
- Matsuura, Y.; Zhang, H.; Nakao, K.; Qiong, C.; Firmansyah, I.; Kawai, S.; Yamaguchi, Y.; Maruyama, T.; Hayashi, H.; Nobuhara, H. High-precision plant height measurement by drone with RTK-GNSS and single camera for real-time processing. Sci. Rep. 2023, 13, 6329. [Google Scholar] [CrossRef]
- Yang, R.; He, Y.; Lu, X.; Zhao, Y.; Li, Y.; Yang, Y.; Kong, W.; Liu, F. 3D-based precise evaluation pipeline for maize ear rot using multi-view stereo reconstruction and point cloud semantic segmentation. Comput. Electron. Agric. 2024, 216, 108512. [Google Scholar] [CrossRef]
- Yang, L.; Zhang, L.; Dong, H.; Alelaiwi, A.; El Saddik, A. Evaluating and Improving the Depth Accuracy of Kinect for Windows v2. IEEE Sens. J. 2015, 15, 4275–4285. [Google Scholar] [CrossRef]
- Wasenmueller, O.; Stricker, D. Comparison of Kinect V1 and V2 Depth Images in Terms of Accuracy and Precision. In Proceedings of the 13th Asian Conference on Computer Vision (ACCV), Taipei, Taiwan, 20–24 November 2016; Volume 10117, pp. 34–45. [Google Scholar] [CrossRef]
- Guan, H.; Zhang, X.; Ma, X.; Zhuo, Z.; Deng, H. Recognition and phenotypic detection of maize stem and leaf at seedling stage based on 3D reconstruction technique. Opt. Laser Technol. 2025, 187, 112787. [Google Scholar] [CrossRef]
- Teng, X.; Zhou, G.; Wu, Y.; Huang, C.; Dong, W.; Xu, S. Three-Dimensional Reconstruction Method of Rapeseed Plants in the Whole Growth Period Using RGB-D Camera. Sensors 2021, 21, 4628. [Google Scholar] [CrossRef] [PubMed]
- Song, S.; Duan, J.; Yang, Z.; Zou, X.; Fu, L.; Ou, Z. A three-dimensional reconstruction algorithm for extracting parameters of the banana pseudo-stem. Optik 2019, 185, 486–496. [Google Scholar] [CrossRef]
- Jiang, Y.; Li, C.; Paterson, A.H.; Sun, S.; Xu, R.; Robertson, J. Quantitative Analysis of Cotton Canopy Size in Field Conditions Using a Consumer-Grade RGB-D Camera. Front. Plant Sci. 2018, 8, 2233. [Google Scholar] [CrossRef]
- Ma, X.; Zhu, K.; Guan, H.; Feng, J.; Yu, S.; Liu, G. High-Throughput Phenotyping Analysis of Potted Soybean Plants Using Colorized Depth Images Based on A Proximal Platform. Remote Sens. 2019, 11, 1085. [Google Scholar] [CrossRef]
- Moreno, H.; Rueda-Ayala, V.; Ribeiro, A.; Bengochea-Guevara, J.; Lopez, J.; Peteinatos, G.; Valero, C.; Andujar, D. Evaluation of Vineyard Cropping Systems Using on-Board RGB-Depth Perception. Sensors 2020, 20, 6912. [Google Scholar] [CrossRef]
- Gene-Mola, J.; Llorens, J.; Rosell-Polo, J.R.; Gregorio, E.; Arno, J.; Solanelles, F.; Martinez-Casasnovas, J.A.; Escola, A. Assessing the Performance of RGB-D Sensors for 3D Fruit Crop Canopy Characterization under Different Operating and Lighting Conditions. Sensors 2020, 20, 7072. [Google Scholar] [CrossRef]
- Esser, F.; Rosu, R.A.; Cornelißen, A.; Klingbeil, L.; Kuhlmann, H.; Behnke, S. Field Robot for High-Throughput and High-Resolution 3D Plant Phenotyping: Towards Efficient and Sustainable Crop Production. IEEE Robot. Autom. Mag. 2023, 30, 20–29. [Google Scholar] [CrossRef]
- Vázquez-Arellano, M.; Reiser, D.; Paraforos, D.S.; Garrido-Izard, M.; Burce, M.E.C.; Griepentrog, H.W. 3-D reconstruction of maize plants using a time-of-flight camera. Comput. Electron. Agric. 2018, 145, 235–247. [Google Scholar] [CrossRef]
- Ma, X.; Wei, B.; Guan, H.; Yu, S. A method of calculating phenotypic traits for soybean canopies based on three-dimensional point cloud. Ecol. Inform. 2022, 68, 101524. [Google Scholar] [CrossRef]
- Ma, X.; Wei, B.; Guan, H.; Cheng, Y.; Zhuo, Z. A method for calculating and simulating phenotype of soybean based on 3D reconstruction. Eur. J. Agron. 2024, 154, 127070. [Google Scholar] [CrossRef]
- Sun, G.; Wang, X. Three-Dimensional Point Cloud Reconstruction and Morphology Measurement Method for Greenhouse Plants Based on the Kinect Sensor Self-Calibration. Agronomy 2019, 9, 596. [Google Scholar] [CrossRef]
- Wang, F.; Ma, X.; Liu, M.; Wei, B. Three-Dimensional Reconstruction of Soybean Canopy Based on Multivision Technology for Calculation of Phenotypic Traits. Agronomy 2022, 12, 692. [Google Scholar] [CrossRef]
- Zhu, T.; Ma, X.; Guan, H.; Wu, X.; Wang, F.; Yang, C.; Jiang, Q. A calculation method of phenotypic traits based on three-dimensional reconstruction of tomato canopy. Comput. Electron. Agric. 2023, 204, 107515. [Google Scholar] [CrossRef]
- Song, P.; Li, Z.; Yang, M.; Shao, Y.; Pu, Z.; Yang, W.; Zhai, R. Dynamic detection of three-dimensional crop phenotypes based on a consumer-grade RGB-D camera. Front. Plant Sci. 2023, 14, 1097725. [Google Scholar] [CrossRef] [PubMed]
- Li, J.; Tang, L. Developing a low-cost 3D plant morphological traits characterization system. Comput. Electron. Agric. 2017, 143, 1–13. [Google Scholar] [CrossRef]
- Yang, D.; Yang, H.; Liu, D.; Wang, X. Research on automatic 3D reconstruction of plant phenotype based on Multi-View images. Comput. Electron. Agric. 2024, 220, 108866. [Google Scholar] [CrossRef]
- Xu, N.; Sun, G.; Bai, Y.; Zhou, X.; Cai, J.; Huang, Y. Global Reconstruction Method of Maize Population at Seedling Stage Based on Kinect Sensor. Agriculture 2023, 13, 348. [Google Scholar] [CrossRef]
- Tolgyessy, M.; Dekan, M.; Chovanec, L.; Hubinsky, P. Evaluation of the Azure Kinect and Its Comparison to Kinect V1 and Kinect V2. Sensors 2021, 21, 413. [Google Scholar] [CrossRef]
- Li, Y.; Liu, J.; Zhang, B.; Wang, Y.; Yao, J.; Zhang, X.; Fan, B.; Li, X.; Hai, Y.; Fan, X. Three-dimensional reconstruction and phenotype measurement of maize seedlings based on multi-view image sequences. Front. Plant Sci. 2022, 13, 974339. [Google Scholar] [CrossRef]
- Guan, H.; Liu, M.; Ma, X.; Yu, S. Three-Dimensional Reconstruction of Soybean Canopies Using Multisource Imaging for Phenotyping Analysis. Remote Sens. 2018, 10, 1206. [Google Scholar] [CrossRef]
- Qiao, G.; Zhang, Z.; Niu, B.; Han, S.; Yang, E. Plant stem and leaf segmentation and phenotypic parameter extraction using neural radiance fields and lightweight point cloud segmentation networks. Front. Plant Sci. 2025, 16, 1491170. [Google Scholar] [CrossRef]
- Chen, Y.; Xiao, K.; Gao, G.; Zhang, F. High-fidelity 3D reconstruction of peach orchards using a 3DGS-Ag model. Comput. Electron. Agric. 2025, 234, 110225. [Google Scholar] [CrossRef]
- Rusu, R.B.; Marton, Z.C.; Blodow, N.; Dolha, M.; Beetz, M. Towards 3D Point cloud based object maps for household environments. Robot. Auton. Syst. 2008, 56, 927–941. [Google Scholar] [CrossRef]
- Zhu, B.; Zhang, Y.; Sun, Y.; Shi, Y.; Ma, Y.; Guo, Y. Quantitative estimation of organ-scale phenotypic parameters of field crops through 3D modeling using extremely low altitude UAV images. Comput. Electron. Agric. 2023, 210, 107910. [Google Scholar] [CrossRef]
- Sun, Y.; Miao, L.; Zhao, Z.; Pan, T.; Wang, X.; Guo, Y.; Xin, D.; Chen, Q.; Zhu, R. An Efficient and Automated Image Preprocessing Using Semantic Segmentation for Improving the 3D Reconstruction of Soybean Plants at the Vegetative Stage. Agronomy 2023, 13, 2388. [Google Scholar] [CrossRef]
- Zhou, L.; Sun, G.; Li, Y.; Li, W.; Su, Z. Point cloud denoising review: From classical to deep learning-based approaches. Graph. Models 2022, 121, 101140. [Google Scholar] [CrossRef]
- Li, Z.; Pan, W.; Wang, S.; Tang, X.; Hu, H. A point cloud denoising network based on manifold in an unknown noisy environment. Infrared Phys. Technol. 2023, 132, 104735. [Google Scholar] [CrossRef]
- Zermas, D.; Morellas, V.; Mulla, D.; Papanikolopoulos, N. 3D model processing for high throughput phenotype extraction—The case of corn. Comput. Electron. Agric. 2020, 172, 105047. [Google Scholar] [CrossRef]
- Li, W.; Tang, B.; Hou, Z.; Wang, H.; Bing, Z.; Yang, Q.; Zheng, Y. Dynamic Slicing and Reconstruction Algorithm for Precise Canopy Volume Estimation in 3D Citrus Tree Point Clouds. Remote Sens. 2024, 16, 2142. [Google Scholar] [CrossRef]
- Chantrapornchai, C.; Srijan, P. On the 3D point clouds-palm and coconut trees data set extraction and their usages. BMC Res. Notes 2023, 16, 363. [Google Scholar] [CrossRef] [PubMed]
- Chen, L.; Yuan, Y.; Song, S. Hierarchical Denoising Method of Crop 3D Point Cloud Based on Multi-view Image Reconstruction. In Proceedings of the 11th IFIP WG 5.14 International Conference on Computer and Computing Technologies in Agriculture (CCTA), Jilin, China, 12–15 August 2017; Volume 545, pp. 416–427. [Google Scholar] [CrossRef]
- Yu, S.; Yan, X.; Jia, T.; Qiu, D.; Hu, D. Binocular structured light-based 3D reconstruction for morphological measurements of apples. Postharvest Biol. Technol. 2024, 213, 112952. [Google Scholar] [CrossRef]
- Rakotosaona, M.-J.; La Barbera, V.; Guerrero, P.; Mitra, N.J.; Ovsjanikov, M. POINTCLEANNET: Learning to Denoise and Remove Outliers from Dense Point Clouds. Comput. Graph. Forum 2020, 39, 185–203. [Google Scholar] [CrossRef]
- Wang, X.; Fan, X.; Zhao, D. PointFilterNet: A Filtering Network for Point Cloud Denoising. IEEE Trans. Circuits Syst. Video Technol. 2023, 33, 1276–1290. [Google Scholar] [CrossRef]
- Seppanen, A.; Ojala, R.; Tammi, K. 4DenoiseNet: Adverse Weather Denoising from Adjacent Point Clouds. IEEE Robot. Autom. Lett. 2023, 8, 456–463. [Google Scholar] [CrossRef]
- Wang, X.; Cui, W.; Xiong, R.; Fan, X.; Zhao, D. FCNet: Learning Noise-Free Features for Point Cloud Denoising. IEEE Trans. Circuits Syst. Video Technol. 2023, 33, 6288–6301. [Google Scholar] [CrossRef]
- Sheng, H.; Li, Y. Denoising point clouds with fewer learnable parameters. Comput.-Aided Des. 2024, 172, 103708. [Google Scholar] [CrossRef]
- Mao, A.; Du, Z.; Wen, Y.-H.; Xuan, J.; Liu, Y.-J. PD-Flow: A Point Cloud Denoising Framework with Normalizing Flows. In Proceedings of the 17th European Conference on Computer Vision (ECCV), Tel Aviv, Israel, 23–27 October 2022; Volume 13663, pp. 398–415. [Google Scholar] [CrossRef]
- Duan, C.; Chen, S.; Kovacevic, J. 3D Point Cloud Denoising via Deep Neural Network Based Local Surface Estimation. In Proceedings of the 44th IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Brighton, UK, 12–17 May 2019; pp. 8553–8557. [Google Scholar] [CrossRef]
- Xin, B.; Sun, J.; Bartholomeus, H.; Kootstra, G. 3D data-augmentation methods for semantic segmentation of tomato plant parts. Front. Plant Sci. 2023, 14, 1045545. [Google Scholar] [CrossRef]
- Wang, W.; Liu, X.; Zhou, H.; Wei, L.; Deng, Z.; Murshed, M.; Lu, X. Noise4Denoise: Leveraging noise for unsupervised point cloud denoising. Comput. Vis. Media 2024, 10, 659–669. [Google Scholar] [CrossRef]
- Yang, H.; Wang, X.; Sun, G. Three-Dimensional Morphological Measurement Method for a Fruit Tree Canopy Based on Kinect Sensor Self-Calibration. Agronomy 2019, 9, 741. [Google Scholar] [CrossRef]
- Magistri, F.; Chebrolu, N.; Stachniss, C. Segmentation-Based 4D Registration of Plants Point Clouds for Phenotyping. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA, 25–29 October 2020; pp. 2433–2439. [Google Scholar] [CrossRef]
- Chebrolu, N.; Magistri, F.; Labe, T.; Stachniss, C. Registration of spatio-temporal point clouds of plants for phenotyping. PLoS ONE 2021, 16, e0247243. [Google Scholar] [CrossRef]
- Wu, D.; Yu, L.; Ye, J.; Zhai, R.; Duan, L.; Liu, L.; Wu, N.; Geng, Z.; Fu, J.; Huang, C.; et al. Panicle-3D: A low-cost 3D-modeling method for rice panicles based on deep learning, shape from silhouette, and supervoxel clustering. Crop J. 2022, 10, 1386–1398. [Google Scholar] [CrossRef]
- Giang, T.T.H.; Ryoo, Y.-J. Pruning Points Detection of Sweet Pepper Plants Using 3D Point Clouds and Semantic Segmentation Neural Network. Sensors 2023, 23, 4040. [Google Scholar] [CrossRef]
- Yao, J.; Wang, W.; Fu, H.; Deng, Z.; Cui, G.; Wang, S.; Wang, D.; She, W.; Cao, X. Automated measurement of field crop phenotypic traits using UAV 3D point clouds and an improved PointNet+. Front. Plant Sci. 2025, 16, 1654232. [Google Scholar] [CrossRef] [PubMed]
- Zhou, J.; Fu, X.; Zhou, S.; Zhou, J.; Ye, H.; Nguyen, H.T. Automated segmentation of soybean plants from 3D point cloud using machine learning. Comput. Electron. Agric. 2019, 162, 143–153. [Google Scholar] [CrossRef]
- Kok, E.; Wang, X.; Chen, C. Obscured tree branches segmentation and 3D reconstruction using deep learning and geometrical constraints. Comput. Electron. Agric. 2023, 210, 107884. [Google Scholar] [CrossRef]
- Fasiolo, D.T.; Pichierri, A.; Sivilotti, P.; Scalera, L. An analysis of the effects of water regime on grapevine canopy status using a UAV and a mobile robot. Smart Agric. Technol. 2023, 6, 100344. [Google Scholar] [CrossRef]
- Rosell, J.R.; Sanz, R. A review of methods and applications of the geometric characterization of tree crops in agricultural activities. Comput. Electron. Agric. 2012, 81, 124–141. [Google Scholar] [CrossRef]
- Liu, F.; Hu, P.; Zheng, B.; Duan, T.; Zhu, B.; Guo, Y. A field-based high-throughput method for acquiring canopy architecture using unmanned aerial vehicle images. Agric. For. Meteorol. 2021, 296, 108231. [Google Scholar] [CrossRef]
- Valluvan, A.B.; Raj, R.; Pingale, R.; Jagarlapudi, A. Canopy height estimation using drone-based RGB images. Smart Agric. Technol. 2023, 4, 100145. [Google Scholar] [CrossRef]
- Kothawade, G.S.; Chandel, A.K.; Schrader, M.J.; Rathnayake, A.P.; Khot, L.R. High throughput canopy characterization of a commercial apple orchard using aerial RGB imagery. In Proceedings of the 1st IEEE International Workshop on Metrology for Agriculture and Forestry (IEEE MetroAgriFor), Trento-Bolzano, Italy, 3–5 November 2021; pp. 177–181. [Google Scholar]
- Dharni, J.S.; Dhatt, B.K.; Paul, P.; Gao, T.; Awada, T.; Bacher, H.; Peleg, Z.; Staswick, P.; Hupp, J.; Yu, H.; et al. A non-destructive approach for measuring rice panicle-level photosynthetic responses using 3D-image reconstruction. Plant Methods 2022, 18, 126. [Google Scholar] [CrossRef] [PubMed]
- Malekabadi, A.J.; Khojastehpour, M. Optimization of stereo vision baseline and effects of canopy structure, pre-processing and imaging parameters for 3D reconstruction of trees. Mach. Vis. Appl. 2022, 33, 87. [Google Scholar] [CrossRef]
- Chen, J.; Jiao, Y.; Jin, F.; Qin, X.; Ning, Y.; Yang, M.; Zhan, Y. Plant Sam Gaussian Reconstruction (PSGR): A High-Precision and Accelerated Strategy for Plant 3D Reconstruction. Electronics 2025, 14, 2291. [Google Scholar] [CrossRef]
- Ando, R.; Ozasa, Y.; Guo, W. Robust Surface Reconstruction of Plant Leaves from 3D Point Clouds. Plant Phenomics 2021, 2021, 3184185. [Google Scholar] [CrossRef]
- Chen, H.; Liu, S.; Wang, C.; Wang, C.; Gong, K.; Li, Y.; Lan, Y. Point Cloud Completion of Plant Leaves under Occlusion Conditions Based on Deep Learning. Plant Phenomics 2023, 5, 0117. [Google Scholar] [CrossRef]
- Badhan, S.; Desai, K.; Dsilva, M.; Sonkusare, R.; Weakey, S. Real-Time Weed Detection using Machine Learning and Stereo-Vision. In Proceedings of the 6th International Conference for Convergence in Technology (I2CT), Maharashtra, India, 2–4 April 2021. [Google Scholar] [CrossRef]
- Gu, W.; Wen, W.; Wu, S.; Zheng, C.; Lu, X.; Chang, W.; Xiao, P.; Guo, X. 3D Reconstruction of Wheat Plants by Integrating Point Cloud Data and Virtual Design Optimization. Agriculture 2024, 14, 391. [Google Scholar] [CrossRef]
- Lu, L.; Ma, J.; Qu, S. Value of Virtual Reality Technology in Image Inspection and 3D Geometric Modeling. IEEE Access 2020, 8, 139070–139083. [Google Scholar] [CrossRef]
- Tross, M.C.; Gaillard, M.; Zweiner, M.; Miao, C.; Grove, R.J.; Li, B.; Benes, B.; Schnable, J.C. 3D reconstruction identifies loci linked to variation in angle of individual sorghum leaves. PeerJ 2021, 9, e12628. [Google Scholar] [CrossRef]
- Das Choudhury, S.; Maturu, S.; Samal, A.; Stoerger, V.; Awada, T. Leveraging Image Analysis to Compute 3D Plant Phenotypes Based on Voxel-Grid Plant Reconstruction. Front. Plant Sci. 2020, 11, 521431. [Google Scholar] [CrossRef]
- Kliman, R.; Huang, Y.; Zhao, Y.; Chen, Y. Toward an Automated System for Nondestructive Estimation of Plant Biomass. Plant Direct 2025, 9, e70043. [Google Scholar] [CrossRef]
- Cai, Z.; Jin, C.; Xu, J.; Yang, T. Measurement of Potato Volume with Laser Triangulation and Three-Dimensional Reconstruction. IEEE Access 2020, 8, 176565–176574. [Google Scholar] [CrossRef]
- Zapata, N.T.; Tsoulias, N.; Saha, K.K.; Zude-Sasse, M. Fourier analysis of LiDAR scanned 3D point cloud data for surface reconstruction and fruit size estimation. In Proceedings of the IEEE International Workshop on Metrology for Agriculture and Forestry (MetroAgriFor), Perugia, Italy, 3–5 November 2022; pp. 197–202. [Google Scholar] [CrossRef]
- Wang, F.; Li, F.; Mohan, V.; Dudley, R.; Gu, D.; Bryant, R. An unsupervised automatic measurement of wheat spike dimensions in dense 3D point clouds for field application. Biosyst. Eng. 2022, 223, 103–114. [Google Scholar] [CrossRef]

| Sensors | Platforms | Objects | Measured Parameters | Bundled Software | References | 
|---|---|---|---|---|---|
| ZEB1 | Handheld device, Unmanned ground vehicle | Blueberry bushes | Canopy size | GeoSLAM Cloud | [26] | 
| Focus S70 | Tripod | Rapeseed leaves | Planting density, Number of plants | PCL version 1.11.1, Visual Studio 2022 | [35] | 
| RigelScan Elite | Handheld device | Soybeans, Peas, Black beans, Red beans, Mung beans | 34 traits (shapes, edge feature, etc.) | Geomagic Studio | [43] | 
| LMS4121R-13000 | Slide rail | Maize seedlings | Plant height, Volume, Classification of stalks and leaves | SOPAS ET, CloudCompare | [33] | 
| Focus X330 | Agricultural spray tractor | Sorghum panicles | Plant height, Panicle length, Panicle width | CloudCompare (version 2.7.0), FUSION/LDV (version 3.60) | [44] | 
| Zoller + Froehlich Profiler 9012 A | Unmanned ground robot | Maize, Soybeans, Potatoes, Wheat, Sugar beets | LAI, LAD, Plant height | CloudCompare | [8] | 
| Focus 3D S120 TLS | Suspension system | Wheat | Yield | CloudCompare, FARO Scene V2021.1.0., Open3D | [36] | 
| RPLIDAR A2 | Unmanned aerial vehicle | Wheat, American mints | Plant height | MATLAB 2018a | [40] | 
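The LiDAR workflows above typically separate the ground from the vegetation before deriving canopy height. As a minimal, illustrative sketch (not the pipeline of any cited study), the snippet below fits a RANSAC ground plane with Open3D (version 0.13 or later assumed) and takes a height percentile of the remaining points; the file name and threshold values are hypothetical.

```python
import numpy as np
import open3d as o3d  # assumes Open3D >= 0.13

# Load a terrestrial/UAV LiDAR scan of a plot (file name is hypothetical)
pcd = o3d.io.read_point_cloud("plot_scan.ply")

# Fit the ground plane with RANSAC and split ground vs. canopy points
plane, ground_idx = pcd.segment_plane(distance_threshold=0.02,
                                      ransac_n=3, num_iterations=1000)
canopy = pcd.select_by_index(ground_idx, invert=True)

# Canopy height as a robust percentile of the point-to-plane distance
a, b, c, d = plane
pts = np.asarray(canopy.points)
dist = np.abs(pts @ np.array([a, b, c]) + d) / np.linalg.norm([a, b, c])
print(f"Estimated canopy height: {np.percentile(dist, 99):.3f} m")
```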

| Names | Equipment | Objects | Measured Parameters | Bundled Software | References | 
|---|---|---|---|---|---|
| MVS-Pheno V1 | Canon 77D cameras, FARO Focus3D X120, Fastrak 3D Digitizer | Higher crops (maize, etc.) | Plant height (R2 = 0.99), Leaf width (R2 = 0.87), Leaf area (R2 = 0.93) | Agisoft Photoscan Professional, PCL | [46] | 
| MVS-Pheno V2 | Canon 77D cameras, Fastrak 3D Digitizer | Lower crops (wheat, etc.), Seedling plants, Plant organs | Plant height (R2 = 0.999), Leaf length (R2 = 0.995), Leaf width (R2 = 0.969) | OpenGL, PCL, OpenMVG, OpenMVS, ContextCapture v.4.4.9, CloudCompare | [48] | 
| PI-Plat | Sony α6500 cameras, Epson Expression 12000XL | Rice panicles, Grain new branches, New leaves | Number of seeds, Color | MATLAB | [49] | 
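These platforms build on structure-from-motion/multi-view stereo, whose geometric core is triangulating corresponding image points seen from calibrated views. The sketch below illustrates only that two-view triangulation step with OpenCV on a synthetic camera pair and a single synthetic 3D point; it is not the reconstruction pipeline of MVS-Pheno or PI-Plat.

```python
import numpy as np
import cv2

# Two synthetic pinhole cameras P = K [R|t]; in a real SfM/MVS pipeline these
# come from calibration and pose estimation (e.g., COLMAP, OpenMVG).
K = np.array([[600.0, 0, 320], [0, 600.0, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])               # reference view
P2 = K @ np.hstack([np.eye(3), np.array([[-0.1], [0], [0]])])   # 10 cm baseline

# Project one synthetic canopy point into both views
X = np.array([0.05, 0.02, 1.5, 1.0])
x1 = (P1 @ X)[:2] / (P1 @ X)[2]
x2 = (P2 @ X)[:2] / (P2 @ X)[2]

# Triangulate the correspondence back to 3D (homogeneous output)
Xh = cv2.triangulatePoints(P1, P2, x1.reshape(2, 1), x2.reshape(2, 1))
print((Xh[:3] / Xh[3]).ravel())   # ~ [0.05, 0.02, 1.5]
```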

| Methods | Platforms | Accuracy | Direct Point Cloud Output | Price (USD) | Scalability | Environmental Robustness | Scale of Analysis | References | 
|---|---|---|---|---|---|---|---|---|
| LiDAR | Unmanned aerial vehicle, Unmanned ground vehicle, Fixed platform | mm | Yes | $10,000~$50,000 | Moderate; limited by platform weight and cost, but suitable for various scales. | Strong; performs well under different lighting and weather conditions. | Can be applied from plant to field scale depending on platform (handheld, UAV, UGV). | [8,27,43,48,49,50,51] | 
| MVS | Unmanned aerial vehicle, Fixed platform | mm~cm | No | $1000~$10,000 | High; easy to deploy in greenhouse and indoor environments. | Weak; performance degrades under strong sunlight or reflective surfaces. | Mostly used for single-plant or small-scale canopy studies. | [21,45,52,53,72] | 
| Depth camera + SDK | Unmanned ground vehicle, Fixed platform | mm~cm | Yes | $500~$5000 | High; adaptable for large-scale or field phenotyping. | Moderate; affected by lighting and weather variations. | Suitable for plot or field-level analysis. | [54,55,56,57,65,66,73] | 
| Multi-Sensor Fusion System | Unmanned aerial vehicle, Unmanned ground vehicle, Fixed platform | mm~cm | Yes | >$50,000 | Moderate; complexity limits scalability but enhances versatility. | Strong; maintains stability across variable environments. | Applicable from plant to field scale, depending on sensor configuration. | [58,59,60] | 
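Depth cameras appear above as delivering point clouds directly because every depth pixel can be back-projected through the pinhole model using the intrinsics reported by the sensor SDK. A minimal sketch, with synthetic depth values and hypothetical intrinsics:

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a metric depth image (H x W) into an N x 3 point cloud."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    pts = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]          # drop invalid (zero-depth) pixels

# Synthetic example; real fx, fy, cx, cy come from the camera SDK.
cloud = depth_to_points(np.random.uniform(0.5, 2.0, (480, 640)),
                        fx=600.0, fy=600.0, cx=320.0, cy=240.0)
print(cloud.shape)   # (307200, 3)
```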

| Classifications | Algorithms | Objects | Effects | References | 
|---|---|---|---|---|
| Obvious outliers | Elevation Values + RGB Attribute | Rapeseed plants, Cotton canopy | Set a fixed height; suitable for flat ground and flowerpots; enhance color features to improve segmentation accuracy | [35,59] | 
| | Adaptive Threshold | Blueberry shrubs | Adjust height-histogram gradient dynamically according to terrain conditions; suitable for complex field environments | [26] | 
| | Bounding Box | Soybean plants | Use box selection to remove large background interference and extract target objects | [65] | 
| | Cloth Simulation Filter | Apple trees | Extract ground model from point cloud data; separate crown point cloud data | [12] | 
| Non-obvious outliers | Random Sample Consensus | Soybeans, Peas, Black beans, Red beans, Mung beans | Set a threshold to distinguish between internal and external points; multiple fitting to find the model | [43] | 
| | K-Nearest Neighbors + Plane Layered Fitting | Leaves of monocots and dicots | Calculate point cloud density to identify noise points; calculate distance from point to fitting plane to filter out noise points based on Interquartile Range | [80] | 
| | Density-Based Spatial Clustering of Applications with Noise + Fast Bilateral Filter | Rice and cucumber plants | Remove noise of different scales; adjust parameters according to shape and noise distribution of different crops | [87] | 
| | Statistical Outlier Removal + Density-Based Spatial Clustering of Applications with Noise | Shrub canopy | Remove isolated noise points and clusters of non-target objects (weeds) in turn | [26] | 
| | Gaussian Filter + Statistical Outlier Removal + Voxel Filter | Rapeseed leaves | Ensure smoothness and consistency of point cloud; optimize point cloud density and improve operating efficiency | [35] | 
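Several of the combinations above, such as Statistical Outlier Removal followed by DBSCAN clustering, can be prototyped with off-the-shelf tools. A minimal sketch using Open3D (version 0.13 or later assumed; the file names and parameter values are hypothetical):

```python
import numpy as np
import open3d as o3d  # assumes Open3D >= 0.13

pcd = o3d.io.read_point_cloud("canopy_raw.ply")   # hypothetical input

# 1) Statistical outlier removal: drop points whose mean neighbour distance
#    deviates strongly from the global average (non-obvious, sparse noise)
pcd, _ = pcd.remove_statistical_outlier(nb_neighbors=20, std_ratio=2.0)

# 2) DBSCAN clustering: keep the largest cluster and discard small detached
#    clusters such as weeds or background fragments (obvious outliers)
labels = np.asarray(pcd.cluster_dbscan(eps=0.03, min_points=10))
if (labels >= 0).any():
    main = np.bincount(labels[labels >= 0]).argmax()
    pcd = pcd.select_by_index(np.where(labels == main)[0])

o3d.io.write_point_cloud("canopy_clean.ply", pcd)
```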

| Algorithms | Objects | Accuracy | References | 
|---|---|---|---|
| Calibration + Iterative Closest Point | Apple trees | Canopy height (ARE = 2.5%), Canopy width (ARE = 3.6%), Canopy thickness (ARE = 3.2%) | [98] | 
| Refined Iterative Closest Point | Plants at early growth stages | Plant height (MAE = 0.392 cm), Leaf length (MAE = 0.2537 cm), Leaf width (MAE = 0.2676 cm), Leaf area (MAE = 0.957 cm2) | [18] | 
| Principal Component Analysis + Iterative Closest Point | Soybean plants | Side view: Plant height (R2 = 0.9890), Green index (R2 = 0.6059), Top view: Plant height (R2 = 0.9936), Green index (R2 = 0.8864) | [76] | 
| Random Sample Consensus + Iterative Closest Point | Soybean plants | Model matching accuracy > 0.9 | [81] | 
| Intrinsic Shape Signatures-Coherent Point Drift + Iterative Closest Point | Soybean plants | RMSE = 0.0107~0.0128 cm, Registration error and time are decreased | [66] | 
| Hungarian Method + Non-rigid Registration | Tomatoes, Maize | Tomatoes: F1-score = 97.73, Maize: F1-score = 98.76 | [99] | 
| Hidden Markov Model + Non-rigid Registration | Tomatoes, Maize | Tomatoes: AP = 89%, Maize: AP = 97% | [100] | 
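Most of the registration pipelines above pair a coarse alignment (PCA, RANSAC, keypoint matching) with Iterative Closest Point refinement. The sketch below shows only the point-to-point ICP step with Open3D (version 0.13 or later assumed; file names, voxel size, and correspondence distance are hypothetical):

```python
import numpy as np
import open3d as o3d  # assumes Open3D >= 0.13

# Two partial views of the same canopy (file names are hypothetical)
src = o3d.io.read_point_cloud("view_a.ply")
tgt = o3d.io.read_point_cloud("view_b.ply")
src, tgt = src.voxel_down_sample(0.005), tgt.voxel_down_sample(0.005)

# Point-to-point ICP from an identity initial guess; in practice the result
# of a coarse alignment is passed in place of np.eye(4)
result = o3d.pipelines.registration.registration_icp(
    src, tgt, 0.02, np.eye(4),
    o3d.pipelines.registration.TransformationEstimationPointToPoint())
print("fitness:", result.fitness, "inlier RMSE:", result.inlier_rmse)

# Merge the aligned views into a single cloud
merged = src.transform(result.transformation) + tgt
```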

| Model/Method | Crop/Target | Performance (Metric & Value) | Reference | 
|---|---|---|---|
| Geometric Contraction + MST + DBSCAN | Cotton (stalk & node extraction) | R2 = 0.94; MAPE = 5.1% | [16] | 
| Random Interception Node + RANSAC–Kalman | Maize (stem & leaf structure) | AP (LAI) = 92.5%; Height = 89.2%; Leaf length = 74.8% | [84] | 
| U-Net | Sweet pepper (pruning node detection) | Mean error = 4.1–6.2 mm | [102] | 
| DeepLabv3+ | Soybean (leaf segmentation) | IoU = 0.92–0.95 | [103] | 
| U-Net++ | Maize (organ segmentation) | Overall accuracy = 0.90–0.94 | [104] | 
| PF-Net | Chinese flowering cabbage (leaf completion) | RMSE = 6.79 cm2 | [105] | 
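The entries in this table are full segmentation architectures trained on labeled plant point clouds. As a minimal illustration of how per-point organ labels are predicted from raw coordinates, the PyTorch sketch below defines a tiny PointNet-style network; it is not one of the cited models, and the class count and tensor shapes are arbitrary.

```python
import torch
import torch.nn as nn

class TinyPointSeg(nn.Module):
    """Minimal PointNet-style per-point segmentation head (illustration only)."""
    def __init__(self, num_classes=3):           # e.g. stem / leaf / background
        super().__init__()
        self.local = nn.Sequential(               # per-point features
            nn.Conv1d(3, 64, 1), nn.BatchNorm1d(64), nn.ReLU(),
            nn.Conv1d(64, 128, 1), nn.BatchNorm1d(128), nn.ReLU())
        self.head = nn.Sequential(                # fuse local + global context
            nn.Conv1d(256, 128, 1), nn.BatchNorm1d(128), nn.ReLU(),
            nn.Conv1d(128, num_classes, 1))

    def forward(self, xyz):                       # xyz: (B, 3, N) coordinates
        feat = self.local(xyz)                    # (B, 128, N) local features
        glob = feat.max(dim=2, keepdim=True)[0]   # global max-pooled descriptor
        glob = glob.expand(-1, -1, xyz.shape[2])
        return self.head(torch.cat([feat, glob], dim=1))  # (B, classes, N)

logits = TinyPointSeg()(torch.randn(2, 3, 1024))  # random cloud -> per-point logits
```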

| Classifications | Algorithms | Objects | Effects | Results | References | 
|---|---|---|---|---|---|
| Canopy | Euclidean Clustering + Voxel Profiling | Apple tree canopy | Separate trunk/branch point clouds; voxelize canopy for calculating LAD/LAI to estimate pesticide deposition | Pesticide deposition model: R2 = 0.92 | [12] | 
| | Poisson Surface Reconstruction | Maize canopy | Generate smooth polygonal meshes; analyze vertices to estimate canopy height | Canopy height: R2 = 0.86 | [109] | 
| | Spatial analysis + Convex/Concave Hull | Cotton canopy | Use position info to distinguish weeds; detect canopy boundary to estimate height/volume | Canopy height: R2 = 0.87, RMSE = 0.04 m; Canopy volume: R2 = 0.87 | [59] | 
| | Density-Based Spatial Clustering of Applications with Noise + Convex Hull | Grape canopy | Recognize and segment canopy; create convex polygon to estimate volume | Canopy volume: r = 0.64 | [51] | 
| | Dynamic Slicing and Reconstruction | Citrus tree canopy | Reconstruct canopy shape; calculate volume by dynamic slicing and iterative optimization | Canopy volume: R2 = 0.794 | [85] | 
| Stem and Leaf | Geometric Contraction + Topological Thinning + Minimum Spanning Tree + Density-Based Spatial Clustering of Applications with Noise | Cotton plants | Perform 3D skeleton extraction; detect main stalk and nodes in turn; extract nodes and stalk length | Main stalk length: R2 = 0.94, MAPE = 4.3%; Node counting: R2 = 0.70, MAPE = 5.1% | [16] | 
| | Random Interception Node + Skeletonization + Random Sample Consensus + Kalman Filter | Maize plants | Extract topological structure; extract stems and leaves after skeletonization | LAI: AP = 92.5%; Plant height: AP = 89.2%; Leaf length: AP = 74.8% | [84] | 
| | Distance-field-based Segmentation Pipeline + Median Normalized-Vectors Growth Segmentation | Soybean plants | Segment stems and leaves by the distance-field segmentation method; extract stems by the region-growing method | OA = 0.7619~0.9064 | [66] | 
| | Semantic Segmentation Neural Network | Sweet pepper plants | Combine 3D point clouds and semantic segmentation to detect stems/leaves and reconstruct their 3D structures; use semantic labels to mark pruning nodes | Average error between pruning points: 4.1~6.2 mm | [102] | 
| | Leaf Shape Decomposition | Soybean and sugar beet leaves | Decompose leaf shape into flat and twisted parts to improve reconstruction robustness | No obvious distortion, protrusion or missing parts after reconstruction: R2 > 0.91 | [114] | 
| | PF-Net + Delaunay 2.5D Triangulation | Chinese flowering cabbages | Complete point cloud using PF-Net; reconstruct surface using triangulation | RMSE = 6.79 cm2, AvRE = 8.82% | [115] | 
| | Virtual Design Technology | Wheat plants | Design 3D models; obtain optimal model by iterative modification | Height: R2 = 0.80, NRMSE = 0.10; Crown width: R2 = 0.73, NRMSE = 0.12; Leaf area: R2 = 0.90, NRMSE = 0.08 | [117] | 
| | Voxel Carving + Skeletonization + Machine Learning Classifier | Sorghum plants | Reconstruct 3D structure; identify and classify leaves using a machine learning classifier | Small-scale experiment: r = 0.98; Single leaf: r = 0.86 | [119] | 
| | 3DPhenoMV + Voxel Overlapping Consistency Check + Clustering | Maize plants | Separate stems, leaves, and leaf clusters using voxel-based mesh reconstruction and clustering | Reconstruction of the 3D voxel grid of a single maize plant from 10 side views: time = 3.5 min | [120] | 
| Fruit | Euclidean Clustering Segmentation + Principal Component Analysis + Symmetrical 3D Reconstruction + Poisson Surface Reconstruction | Soybeans, Peas, Black beans, Red beans, Mung beans | Segment and extract seeds using clustering; complete 3D shape model using symmetry | MAE: sub-millimeter; MRE < 3%; R2 > 0.9983 | [43] | 
| | α-shape Algorithm + Fourier Series Reference Shape Generation | Apples | Reconstruct surface; estimate fruit size using a Fourier analysis-based method | RMSE = 5.85 mm | [123] | 
| | Point Cloud Slicing and B-spline Curve Fitting + Local Vertex Search and Interpolation + α-shape Algorithm | Potatoes | Use the fitting method to reduce point cloud dispersion in the interpolation area; repair missing areas at the top and bottom; generate 3D model | Volume: AvRE = −0.08%, MaxRE = 2.17%, MinRE = 0.03%; Mass: AvRE = 0.48%, MaxRE = 2.92%, MinRE = 0.12% | [122] | 
| | Adaptive K-means Based on Dynamic Perspective + Random Sample Consensus | Wheat spikes | Adjust k-means parameters dynamically; fit spike shape | Number: AvRE = 16.27%, Height: AvRE = 5.24%, Width: AvRE = 12.38% | [124] | 
| | SegNet + Ray Tracing + Supervoxel Clustering + Shape from Silhouette | Rice panicles | Process the 2D and 3D segmentation in turn to reconstruct complex plant structures non-destructively | Average IoU = 0.95; Time cost: about 26 min per plant | [101] | 
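Once the canopy has been segmented, simple geometric descriptors such as convex-hull volume and canopy height can be computed directly from the points. A minimal sketch with NumPy and SciPy (the input file is hypothetical and assumed to hold segmented canopy points in metres):

```python
import numpy as np
from scipy.spatial import ConvexHull

pts = np.loadtxt("canopy_points.xyz")      # N x 3 segmented canopy points

hull = ConvexHull(pts)                     # 3D convex hull of the canopy
canopy_volume = hull.volume                # m^3
canopy_area = hull.area                    # hull surface area, m^2
canopy_height = pts[:, 2].max() - pts[:, 2].min()
print(canopy_volume, canopy_area, canopy_height)
```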
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).