Generating Multispectral Point Clouds for Digital Agriculture
Abstract
1. Introduction
2. Materials
Field Data Collection
3. Methods
3.1. Data Acquisition
3.2. Filtering Noise from the LiDAR Point Cloud
3.3. Fusion of Data from Sensors
3.3.1. Visibility Maps
Clipping Viewpoints in the Point Cloud
Labelling Occluded Points
3.3.2. Assigning Spectral Values to the Point Cloud
Subdivision of the Point Cloud
3.3.3. Radiometric Discrepancies
4. Results
4.1. Bundle Adjustment
4.2. Point Cloud Registration
4.3. Point Cloud Clipping
4.4. Labelling Occluded Points
4.5. Colouring
4.6. Complete Model
5. Discussion
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
| Abbreviation | Definition |
|---|---|
| AABB | Axis-Aligned Bounding Boxes |
| BV | Bounding Volume |
| DEM | Digital Elevation Model |
| DN | Digital Number |
| EOP | Exterior Orientation Parameter |
| FoV | Field of View |
| GCP | Ground Control Point |
| HPR | Hidden Points Removal |
| IOP | Interior Orientation Parameter |
| LAI | Leaf Area Index |
| LiDAR | Light Detection and Ranging |
| MLS | Moving Least Squares |
| OBB | Oriented Bounding Box |
| PCL | Point Cloud Library |
| PSOSU | Pixel Size in Object Space Units |
| RGB | Red, Green, Blue |
| TLS | Terrestrial Laser Scanner |
| VFC | Viewing Frustum Culling |
References
- Xie, P.; Du, R.; Ma, Z.; Cen, H. Generating 3D Multispectral Point Clouds of Plants with Fusion of Snapshot Spectral and RGB-D Images. Plant Phenomics 2023, 5, 40.
- Ruoppa, L.; Oinonen, O.; Taher, J.; Lehtomäki, M.; Takhtkeshha, N.; Kukko, A.; Kaartinen, H.; Hyyppä, J. Unsupervised Deep Learning for Semantic Segmentation of Multispectral LiDAR Forest Point Clouds. ISPRS J. Photogramm. Remote Sens. 2025, 228, 694–722.
- Mirbod, O.; Choi, D.; Schueller, J.K. From Simulation to Field Validation: A Digital Twin-Driven Sim2real Transfer Approach for Strawberry Fruit Detection and Sizing. AgriEngineering 2025, 7, 81.
- Rivera, G.; Porras, R.; Florencia, R.; Sánchez-Solís, J.P. LiDAR Applications in Precision Agriculture for Cultivating Crops: A Review of Recent Advances. Comput. Electron. Agric. 2023, 207, 107737.
- Gharibi, H.; Habib, A. True Orthophoto Generation from Aerial Frame Images and LiDAR Data: An Update. Remote Sens. 2018, 10, 581.
- Seitz, S.M.; Dyer, C.R. Photorealistic Scene Reconstruction by Voxel Coloring. Int. J. Comput. Vis. 1999, 35, 151–173.
- Hudak, A.T.; Lefsky, M.A.; Cohen, W.B.; Berterretche, M. Integration of Lidar and Landsat ETM+ Data for Estimating and Mapping Forest Canopy Height. Remote Sens. Environ. 2002, 82, 397–416.
- Popescu, S.; Wynne, R. Seeing the Trees in the Forest: Using Lidar and Multispectral Data Fusion with Local Filtering and Variable Window Size for Estimating Tree Height. Photogramm. Eng. Remote Sens. 2004, 70, 589–604.
- Mundt, J.T.; Streutker, D.R.; Glenn, N.F. Mapping Sagebrush Distribution Using Fusion of Hyperspectral and Lidar Classifications. Photogramm. Eng. Remote Sens. 2006, 72, 47–54.
- Geerling, G.W.; Labrador-Garcia, M.; Clevers, J.; Ragas, A.M.J.; Smits, A.J.M. Classification of Floodplain Vegetation by Data Fusion of Spectral (CASI) and LiDAR Data. Int. J. Remote Sens. 2007, 28, 4263–4284.
- Anderson, J.E.; Plourde, L.C.; Martin, M.E.; Braswell, B.H.; Smith, M.-L.; Dubayah, R.O.; Hofton, M.A.; Blair, J.B. Integrating Waveform Lidar with Hyperspectral Imagery for Inventory of a Northern Temperate Forest. Remote Sens. Environ. 2008, 112, 1856–1870.
- Packalén, P.; Suvanto, A.; Maltamo, M. A Two Stage Method to Estimate Species-Specific Growing Stock. Photogramm. Eng. Remote Sens. 2009, 75, 1451–1460.
- Kampe, T.U.; Johnson, B.R.; Kuester, M.A.; Keller, M. NEON: The First Continental-Scale Ecological Observatory with Airborne Remote Sensing of Vegetation Canopy Biochemistry and Structure. J. Appl. Remote Sens. 2010, 4, 043510.
- Agrowing Development Team. Available online: https://agrowing.com/ (accessed on 3 September 2025).
- Tommaselli, A.M.G.; Berveglieri, A.; Imai, N.N.; Santos, G.H.; Moriya, E.A.S.; Watanabe, F.S.Y.; Neto, L.S. Geometric Performance of a Camera with Single Sensor and Multiple Heads. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2020, 43, 389–396.
- Okhrimenko, M.; Coburn, C.; Hopkinson, C. Investigating Multispectral Lidar Radiometry: An Overview of the Experimental Framework. In Proceedings of the IGARSS 2018—2018 IEEE International Geoscience and Remote Sensing Symposium, Valencia, Spain, 22–27 July 2018; pp. 8745–8748.
- Crombez, N.; Caron, G.; Mouaddib, E. 3D Point Cloud Model Colorization by Dense Registration of Digital Images. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2015, 40, 123–130.
- Nielsen, M.Ø. True Orthophoto Generation. Master’s Thesis, Technical University of Denmark, Kgs. Lyngby, Denmark, 2004.
- Eisemann, M.; Magnor, M. Filtered Blending and Floating Textures: Ghosting-Free Projective Texturing with Multiple Images; TU Braunschweig: Braunschweig, Germany, 2007.
- Zhang, H.; Manocha, D.; Hudson, T.; Hoff, K.E. Visibility Culling Using Hierarchical Occlusion Maps. In Proceedings of the 24th Annual Conference on Computer Graphics and Interactive Techniques, Los Angeles, CA, USA, 3–8 August 1997; ACM Press/Addison-Wesley Publishing Co.: New York, NY, USA, 1997; pp. 77–88.
- Assarsson, U.; Moller, T. Optimized View Frustum Culling Algorithms for Bounding Boxes. J. Graph. Tools 2000, 5, 9–22.
- Blinn, J.F. Backface Culling Snags (Rendering Algorithm). IEEE Comput. Graph. Appl. 1993, 13, 94–97.
- Cohen-Or, D.; Chrysanthou, Y.L.; Silva, C.T.; Durand, F. A Survey of Visibility for Walkthrough Applications. IEEE Trans. Vis. Comput. Graph. 2003, 9, 412–431.
- Catmull, E.E. A Subdivision Algorithm for Computer Display of Curved Surfaces. Ph.D. Thesis, The University of Utah, Salt Lake City, UT, USA, 1974.
- Fuchs, H.; Kedem, Z.M.; Naylor, B.F. On Visible Surface Generation by a Priori Tree Structures. In Proceedings of the 7th Annual Conference on Computer Graphics and Interactive Techniques, Seattle, WA, USA, 14–18 July 1980; pp. 124–133.
- Appel, A. Some Techniques for Shading Machine Renderings of Solids. In Proceedings of the AFIPS ’68 (Spring): Spring Joint Computer Conference, Atlantic City, NJ, USA, 30 April–2 May 1968; pp. 37–45.
- Whitted, T. An Improved Illumination Model for Shaded Display. In Proceedings of the 6th Annual Conference on Computer Graphics and Interactive Techniques, Chicago, IL, USA, 8–10 August 1979; p. 14.
- Katz, S.; Tal, A.; Basri, R. Direct Visibility of Point Sets. In Proceedings of the SIGGRAPH ’07: ACM SIGGRAPH 2007 Papers, San Diego, CA, USA, 5–9 August 2007; Volume 26, p. 11.
- Rusu, R.B.; Cousins, S. 3D Is Here: Point Cloud Library (PCL). In Proceedings of the 2011 IEEE International Conference on Robotics and Automation, Shanghai, China, 9–13 May 2011; pp. 1–4.
- Hu, F.; Lin, C.; Peng, J.; Wang, J.; Zhai, R. Rapeseed Leaf Estimation Methods at Field Scale by Using Terrestrial LiDAR Point Cloud. Agronomy 2022, 12, 2409.
- Alexa, M.; Behr, J.; Cohen-Or, D.; Fleishman, S.; Levin, D.; Silva, C.T. Computing and Rendering Point Set Surfaces. IEEE Trans. Vis. Comput. Graph. 2003, 9, 3–15.
- Shen, C.; O’Brien, J.F.; Shewchuk, J.R. Interpolating and Approximating Implicit Surfaces from Polygon Soup. In Computer Graphics Proceedings, Annual Conference Series; Association for Computing Machinery: New York, NY, USA, 2004; pp. 896–904.
- Fleishman, S.; Cohen-Or, D.; Silva, C.T. Robust Moving Least-Squares Fitting with Sharp Features. ACM Trans. Graph. 2005, 24, 544–552.
- Jenke, P.; Wand, M.; Bokeloh, M.; Schilling, A.; Straßer, W. Bayesian Point Cloud Reconstruction. Comput. Graph. Forum 2006, 25, 379–388.
- Bishop, L.; Eberly, D.; Whitted, T.; Finch, M.; Shantz, M. Designing a PC Game Engine. IEEE Comput. Graph. Appl. 1998, 18, 46–53.
- DirectModel: 1.0 Specification; Hewlett Packard Company: Corvallis, OR, USA, 1990.
- Krishnan, A. Point Cloud Library (PCL). Available online: https://pointclouds.org/documentation/frustum__culling_8h_source.html (accessed on 20 January 2024).
- Argueta, C. Filtering a Point Cloud to Match the Field of View of the Camera. Available online: https://medium.com/@kidargueta/filtering-a-point-cloud-to-match-the-field-of-view-of-the-camera-ccab0d189b58 (accessed on 20 January 2024).
- Katz, S.; Tal, A. On the Visibility of Point Clouds. In Proceedings of the IEEE International Conference on Computer Vision, Santiago, Chile, 7–13 December 2015; pp. 1350–1358.
- MathWorks. Computing the Convex Hull—MATLAB & Simulink. Available online: https://www.mathworks.com/help/matlab/math/computing-the-convex-hull.html (accessed on 21 January 2024).
- Norberto, I.S.; Faria Junior, C.D.S.; Tommaselli, A.M.G.; Galo, M.; Silva, R.M. MLP-Based Classification of Multispectral Point Clouds for Digital Agriculture. In Proceedings of the ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Dubai, United Arab Emirates, 6–11 April 2025.
- Reji, J.; Nidamanuri, R.R. Deep Learning Based Fusion of LiDAR Point Cloud and Multispectral Imagery for Crop Classification Sensitive to Nitrogen Level. In Proceedings of the 2023 International Conference on Machine Intelligence for GeoAnalytics and Remote Sensing (MIGARS), Hyderabad, India, 27–29 January 2023; IEEE: Piscataway, NJ, USA, 2023; pp. 1–4.
| Parameter | Coffee Tree Dataset | Apple Tree Dataset |
|---|---|---|
| Number of images | 50 | 35 |
| Camera-to-object distance | 1.8 m | 2 m |
| Camera orientation | Horizontal and frontal | Horizontal and convergent |
| Exposure settings | 1/400 s (850 nm); 1/200 s (other bands) | 1/500 s |
| LiDAR resolution (pt/m²) | 814 pt/m² | 102.1 pt/m², 203.9 pt/m², 407.8 pt/m² and 814 pt/m² |
| Weather conditions | Sunny (wind: 4.1 km/h) | Clear sky (wind: 11.6 km/h) * |
| Number of plants | 1 coffee tree | 1 apple tree |
| Sampling intervals (LiDAR) | One single station in front of the plant | Four stations surrounding the plant |
| Sampling intervals (camera) | 5 height levels at 10 lateral positions with 0.35 m base | Convergent acquisition with a base length of approximately 1.8 m |
| Parameter | Description | Value | Source |
|---|---|---|---|
| Camera pose | Perspective centre | (X0, Y0 rotated by −90°, Z0) | Exterior orientation parameters |
| FoV | Horizontal and vertical | H: 25°; V: 25° | Camera specification |
| Near plane distance | Minimum distance from the camera to the frustum | 1 m (coffee); 1 m (apple) | >0 m to define frustum shape [37] |
| Far plane distance | Maximum distance from the camera to the frustum | 4 m (coffee); 3 m (apple) | Empirically defined based on scene depth |
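The frustum parameters in the table above (perspective centre, 25° horizontal and vertical FoV, near/far plane distances) are the inputs of a viewing-frustum culling (VFC) step such as PCL's `FrustumCulling` filter. As a minimal illustration of the underlying point-in-frustum test, the sketch below assumes points already expressed in camera coordinates with the x axis as the viewing direction and a symmetric FoV; the function name and the sample points are hypothetical, not from the paper.

```python
import math

def in_frustum(point, fov_deg=25.0, near=1.0, far=4.0):
    """Return True if a point (in camera coordinates, x = viewing
    direction) lies inside a symmetric viewing frustum."""
    x, y, z = point
    if not (near <= x <= far):          # reject outside near/far planes
        return False
    # Half-extent of the frustum cross-section at depth x
    half = math.tan(math.radians(fov_deg) / 2.0) * x
    return abs(y) <= half and abs(z) <= half  # inside the four side planes

# Keep only the points a given image can actually see
cloud = [(2.0, 0.1, -0.2), (5.0, 0.0, 0.0), (2.0, 1.5, 0.0)]
visible = [p for p in cloud if in_frustum(p)]
```

In practice the clipping is applied per image, with each camera's exterior orientation parameters used to transform the cloud into that camera's frame before the test.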
| Sub-Images (Bands) | X (mm) | Y (mm) | Z (mm) |
|---|---|---|---|
| S1 (570 nm) | 0.70 | 0.84 | 0.53 |
| S2 (525 nm) | 0.65 | 0.84 | 0.67 |
| S3 (850 nm) | 0.80 | 0.75 | 0.58 |
| S4 (550 nm) | 0.68 | 0.85 | 0.67 |
| S5 (490 nm) | 0.84 | 0.63 | 0.70 |
| S6 (560 nm) | 0.82 | 0.87 | 0.45 |
| Resolution (pt/m²) | MAE XY (mm) | MAE Z (mm) |
|---|---|---|
| 102.1 | 4.8 | 1.3 |
| 203.9 | 3.5 | 1.1 |
| 407.8 | 4.6 | 0.6 |
| 814 | 2.0 | 0.4 |
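Registration errors of the kind reported above are obtained by averaging absolute discrepancies between corresponding check points in the registered clouds. A minimal sketch, assuming matched point pairs are already available and that the XY component combines both planimetric axes into a single horizontal distance (an assumption about how the table's values were aggregated):

```python
import math

def mae_xy_z(pairs):
    """Mean absolute error of matched point pairs, split into a
    planimetric (XY) component and a vertical (Z) component,
    in the same units as the input coordinates."""
    n = len(pairs)
    mae_xy = sum(math.hypot(a[0] - b[0], a[1] - b[1]) for a, b in pairs) / n
    mae_z = sum(abs(a[2] - b[2]) for a, b in pairs) / n
    return mae_xy, mae_z

# Hypothetical check-point pairs in metres
pairs = [((0.0, 0.0, 0.0), (0.003, 0.004, 0.001)),
         ((1.0, 1.0, 1.0), (1.000, 1.000, 1.001))]
xy, z = mae_xy_z(pairs)  # multiply by 1000 to report in mm
```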
| Sub-Images (Bands) | RMSE X (mm) | RMSE Y (mm) | RMSE Z (mm) |
|---|---|---|---|
| S1 (570 nm) | 9.7 | 9.9 | 4.8 |
| S2 (525 nm) | 4.5 | 25.6 | 5.9 |
| S3 (850 nm) | 28.6 | 17.3 | 5.6 |
| S4 (550 nm) | 5.1 | 5.8 | 2.1 |
| S5 (490 nm) | 5.5 | 8.3 | 4.7 |
| S6 (560 nm) | 9.2 | 18.3 | 4.8 |
| Sub-Images (Bands) | RMSE X (mm) | RMSE Y (mm) | RMSE Z (mm) |
|---|---|---|---|
| S1 (570 nm) | 4.1 | 0.6 | 1.7 |
| S2 (525 nm) | 4.0 | 2.7 | 2.5 |
| S3 (850 nm) | 6.3 | 6.5 | 5.0 |
| S4 (550 nm) | 3.5 | 2.7 | 1.9 |
| S5 (490 nm) | 5.8 | 2.9 | 1.4 |
| S6 (560 nm) | 4.3 | 2.9 | 2.8 |
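The per-band, per-axis RMSE figures in the two tables above follow the standard root-mean-square formulation over checkpoint residuals. A short sketch, assuming the residuals for one band are available as (dx, dy, dz) triples; the sample values are illustrative only:

```python
import math

def rmse_per_axis(residuals):
    """Root-mean-square error along each axis for a list of
    (dx, dy, dz) residual triples."""
    n = len(residuals)
    return tuple(math.sqrt(sum(r[i] ** 2 for r in residuals) / n)
                 for i in range(3))

# Hypothetical residuals for one spectral band, in metres
residuals = [(0.003, -0.004, 0.001), (-0.003, 0.004, -0.001)]
rmse_x, rmse_y, rmse_z = rmse_per_axis(residuals)
```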
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Norberto, I.S.; Tommaselli, A.M.G.; Shimabukuro, M.H. Generating Multispectral Point Clouds for Digital Agriculture. AgriEngineering 2025, 7, 407. https://doi.org/10.3390/agriengineering7120407

