Plant Sam Gaussian Reconstruction (PSGR): A High-Precision and Accelerated Strategy for Plant 3D Reconstruction
Abstract
1. Introduction
2. Related Work
3. Methods
3.1. Accurate Plant Background Segmentation Based on Grounding DINO and the Segment Anything Model
3.2. Camera Pose Estimation and Sparse Point Cloud Generation
3.3. Gaussian Splatting-Based Plant Reconstruction
3.3.1. Initialization of 3D Gaussian Points
3.3.2. Splatting and Rendering in Plant Reconstruction
3.3.3. Density Control of 3D Gaussian Points
3.4. 3D–2D Projection-Guided Background Segmentation Optimization
4. Experiments
4.1. Plant and Image Acquisition
4.2. Experimental Environment and Evaluation Metrics
4.3. Ablation Study
4.4. Visualization Experimental Results of Plants
4.4.1. Foreground Segmentation of Plants
4.4.2. Comparison Between 3D Gaussians and PSGR
4.4.3. Comparative Visualization of Plants in Various Scenes
5. Discussion
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
| Experiment Setup | Removed Component | PSNR | Time Cost (s) |
|---|---|---|---|
| Baseline | Grounding SAM and 3D–2D-guided | 29.72 | 4130 |
| No Grounding SAM | Grounding SAM | 30.56 | 4377 |
| No 3D–2D-guided | 3D–2D-guided | 31.87 | 2992 |
| Full Model | None | 33.25 | 3026 |
Indoor scenes:

| Plant | Method | Loss | MAE | PSNR |
|---|---|---|---|---|
| Grey Star Ctenanthe | 3DGS-7000 | 0.0174 | 0.017 | 36.70 |
| Grey Star Ctenanthe | PSGR-7000 | 0.0106 | 0.007 | 36.73 |
| Grey Star Ctenanthe | 3DGS-30000 | 0.0145 | 0.012 | 39.08 |
| Grey Star Ctenanthe | PSGR-30000 | 0.0094 | 0.006 | 38.57 |
| Snake Plant | 3DGS-7000 | 0.0198 | 0.021 | 36.49 |
| Snake Plant | PSGR-7000 | 0.0142 | 0.006 | 38.07 |
| Snake Plant | 3DGS-30000 | 0.0118 | 0.009 | 38.91 |
| Snake Plant | PSGR-30000 | 0.0098 | 0.002 | 39.79 |
| Money Tree | 3DGS-7000 | 0.0207 | 0.011 | 29.07 |
| Money Tree | PSGR-7000 | 0.0184 | 0.003 | 31.23 |
| Money Tree | 3DGS-30000 | 0.0158 | 0.009 | 34.00 |
| Money Tree | PSGR-30000 | 0.0135 | 0.002 | 32.75 |
Outdoor scenes:

| Plant | Method | Loss | MAE | PSNR |
|---|---|---|---|---|
| Grey Star Ctenanthe | 3DGS-7000 | 0.0412 | 0.024 | 30.83 |
| Grey Star Ctenanthe | PSGR-7000 | 0.0195 | 0.015 | 29.03 |
| Grey Star Ctenanthe | 3DGS-30000 | 0.0344 | 0.016 | 32.25 |
| Grey Star Ctenanthe | PSGR-30000 | 0.0171 | 0.012 | 30.11 |
| Snake Plant | 3DGS-7000 | 0.0523 | 0.032 | 30.13 |
| Snake Plant | PSGR-7000 | 0.0374 | 0.013 | 29.15 |
| Snake Plant | 3DGS-30000 | 0.0386 | 0.015 | 35.17 |
| Snake Plant | PSGR-30000 | 0.0249 | 0.009 | 34.42 |
| Money Tree | 3DGS-7000 | 0.0784 | 0.019 | 29.35 |
| Money Tree | PSGR-7000 | 0.0233 | 0.009 | 29.02 |
| Money Tree | 3DGS-30000 | 0.0627 | 0.012 | 31.46 |
| Money Tree | PSGR-30000 | 0.0216 | 0.007 | 30.98 |
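For reference, the MAE and PSNR columns in the tables above follow the standard image-comparison definitions (the Loss column is the method's training loss and is not reproduced here). A minimal NumPy sketch, assuming rendered views and ground-truth images are float arrays normalized to [0, 1]:

```python
import numpy as np

def psnr(rendered, reference, max_val=1.0):
    # Peak signal-to-noise ratio in dB: 10 * log10(max_val^2 / MSE).
    a = np.asarray(rendered, dtype=np.float64)
    b = np.asarray(reference, dtype=np.float64)
    mse = np.mean((a - b) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(max_val ** 2 / mse)

def mae(rendered, reference):
    # Mean absolute per-pixel error.
    a = np.asarray(rendered, dtype=np.float64)
    b = np.asarray(reference, dtype=np.float64)
    return float(np.mean(np.abs(a - b)))
```

For example, two uniform images differing by 0.1 everywhere give MAE 0.1 and PSNR 20 dB, since MSE = 0.01 and 10·log10(1/0.01) = 20.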
Training time (s) by initial point cloud size:

| Method | 5 k | 10 k | 15 k | 25 k | 30 k | 50 k |
|---|---|---|---|---|---|---|
| 3DGS-7000 | 235 | 263 | 288 | 352 | 403 | 846 |
| PSGR-7000 | 197 | 217 | 234 | 281 | 312 | 613 |
| 3DGS-30000 | 1034 | 1352 | 1731 | 2373 | 2599 | 4130 |
| PSGR-30000 | 875 | 1125 | 1420 | 1908 | 2042 | 3026 |
| Time Reduction (%) | 15.4 | 16.8 | 18.0 | 19.6 | 21.4 | 26.7 |
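The time-reduction row can be reproduced from the two 30000-iteration rows above, as 100 · (t_3DGS − t_PSGR) / t_3DGS per initial point cloud size (note the final entry evaluates to 26.7 with this formula):

```python
# Timings (seconds) copied from the table: columns are 5k..50k initial points.
gs_30000   = [1034, 1352, 1731, 2373, 2599, 4130]  # 3DGS-30000 training time
psgr_30000 = [ 875, 1125, 1420, 1908, 2042, 3026]  # PSGR-30000 training time

# Percentage reduction of PSGR relative to 3DGS, rounded to one decimal.
reduction = [round(100.0 * (g - p) / g, 1) for g, p in zip(gs_30000, psgr_30000)]
print(reduction)  # [15.4, 16.8, 18.0, 19.6, 21.4, 26.7]
```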
| Init Points | Method | Param Count (M) | .pth Size (MB) | Inference Time (s) | .ply Size (MB) |
|---|---|---|---|---|---|
| 10 k | 3DGS | 22.3 | 144.0 | 0.091 | 49.7 |
| 10 k | PSGR | 24.1 | 63.4 | 0.065 | 22.1 |
| 25 k | 3DGS | 22.6 | 190.0 | 0.108 | 65.4 |
| 25 k | PSGR | 24.5 | 95.4 | 0.078 | 32.8 |
| 50 k | 3DGS | 23.1 | 396.0 | 0.129 | 136.0 |
| 50 k | PSGR | 24.8 | 211.0 | 0.087 | 72.9 |
| Plant | Scene | Metric | PSGR | Instant-NGP | NeRF |
|---|---|---|---|---|---|
| Grey Star Ctenanthe | Indoor | PSNR | 38.57 | 31.9 | 29.5 |
| Grey Star Ctenanthe | Indoor | Time | 8.03 | 2.15 | 374 |
| Grey Star Ctenanthe | Outdoor | PSNR | 30.11 | 28.7 | 26.6 |
| Grey Star Ctenanthe | Outdoor | Time | 8.95 | 2.45 | 385 |
| Snake Plant | Indoor | PSNR | 39.79 | 35.5 | 29.8 |
| Snake Plant | Indoor | Time | 6.95 | 2.11 | 377 |
| Snake Plant | Outdoor | PSNR | 34.42 | 31.8 | 28.4 |
| Snake Plant | Outdoor | Time | 7.45 | 2.24 | 390 |
| Money Tree | Indoor | PSNR | 32.75 | 28.6 | 25.3 |
| Money Tree | Indoor | Time | 8.23 | 2.52 | 400 |
| Money Tree | Outdoor | PSNR | 30.98 | 27.0 | 24.8 |
| Money Tree | Outdoor | Time | 9.22 | 2.24 | 395 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Chen, J.; Jiao, Y.; Jin, F.; Qin, X.; Ning, Y.; Yang, M.; Zhan, Y. Plant Sam Gaussian Reconstruction (PSGR): A High-Precision and Accelerated Strategy for Plant 3D Reconstruction. Electronics 2025, 14, 2291. https://doi.org/10.3390/electronics14112291