BLAINDER—A Blender AI Add-On for Generation of Semantically Labeled Depth-Sensing Data
Abstract
1. Introduction
1.1. Related Work
- The sensor system is directly integrated into the Blender software and thus benefits from its wide range of options and tools for creating and animating virtual worlds.
- In addition to LiDAR, we are the first to implement Sonar simulation at this level of accuracy, taking into account the specific properties of water bodies and Sonar characteristics.
- Besides labeling point clouds for LiDAR and Sonar, our tool can generate image annotations and leverage Blender’s rendering capabilities to produce labeled image data. The export scope of the synthetic data exceeds that of all publications mentioned above.
1.2. Structure of This Article
2. Fundamentals of Depth-Sensing
2.1. Light Detection and Ranging (LiDAR)
2.2. Sound Navigation and Ranging (Sonar)
2.2.1. Spreading of Waterborne Sound
2.3. Interaction of Light and Sound with Matter
2.3.1. Reflection
2.3.2. Refraction
2.3.3. Measurement Errors Induced by Reflection and Refraction
2.3.4. Atmospheric Influences (Dust, Rain, Fog, Snow) on LiDAR
2.3.5. Random Measurement Error
3. Modules of the Implementation
3.1. Scene Construction/Virtual Environments
- To vary a specific scene, worlds can be generated completely or mostly procedurally. This is particularly useful for scenes containing a lot of nature, such as landscapes or vegetation (see Figure 4a).
- Worlds can be semi-static, i.e., an essential part remains static as time advances (e.g., buildings, illustrated in Figure 4b using the example of an airport), while only a smaller part of the scene varies (e.g., specific aircraft models).
- By using animations, variation can be created within a topologically constant scene (translation of the sensor, movement of figures, physics simulation); see Section 3.1.2.
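The procedural variation described above can be sketched in plain Python. This is a stdlib-only illustration under assumptions: in the actual add-on, placements would be applied to objects through Blender's `bpy` API, and the function and field names here (`vary_scene`, `Placement`) are hypothetical.

```python
import random
from dataclasses import dataclass

@dataclass
class Placement:
    """Pose for one sampled scene object (would be applied via bpy object transforms)."""
    model: str
    x: float
    y: float
    rotation_deg: float

def vary_scene(models, n_objects, extent, seed):
    """Sample object placements for one variant of a semi-static scene.

    The static part (e.g., airport buildings) is left untouched; only the
    sampled objects (e.g., specific aircraft models) change between variants.
    """
    rng = random.Random(seed)  # fixed seed -> reproducible scene variants
    return [
        Placement(
            model=rng.choice(models),
            x=rng.uniform(-extent, extent),
            y=rng.uniform(-extent, extent),
            rotation_deg=rng.uniform(0.0, 360.0),
        )
        for _ in range(n_objects)
    ]

# The same seed reproduces the same variant; a new seed yields a new scene.
variant_a = vary_scene(["A320", "B737"], n_objects=5, extent=100.0, seed=42)
variant_b = vary_scene(["A320", "B737"], n_objects=5, extent=100.0, seed=42)
```

Seeding the generator is what makes large labeled datasets reproducible: each variant can be regenerated exactly from its seed.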
3.1.1. Semi-Static Scenes
3.1.2. Animations and Physics Simulation
3.2. Virtual Sensors
3.2.1. Predefined Sensors
3.2.2. Adding Noise
3.3. Signal Processing and Physical Effects
3.3.1. Sound Profile in Water
3.3.2. Modeling Surface Properties with Materials
3.4. Semantic Labeling: Object Category
3.5. Data Export
- point position in space (X-, Y- and Z-coordinate)
- semantic label
- intensity of the measuring point
- color of the object surface
- distance between sensor and object surface
- Single frames: For each animation step, a separate point cloud is created. Since the LAS and CSV formats do not support hierarchical structures, one file per time step is generated. In the HDF format, by contrast, each time step is represented by one row in the data set. Note that row lengths may vary, so a variable-length array must be used as the data type.
- Summarized: All animation steps are likewise simulated separately, but the resulting data are merged into a single point cloud and stored in one file.
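The two export modes can be sketched for the CSV case using only the standard library. This is an illustrative sketch, not the add-on's actual exporter: the column names follow the field list above but are assumptions, and files are returned as in-memory strings for clarity.

```python
import csv
import io

# Field order follows the export list above (hypothetical column names).
FIELDS = ["x", "y", "z", "label", "intensity", "r", "g", "b", "distance"]

def write_frame_csv(points):
    """Serialize one set of measurement points to CSV text."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(FIELDS)
    writer.writerows(points)
    return buf.getvalue()

def export_single_frames(frames):
    """'Single frames' mode: one CSV document per animation step,
    since CSV cannot hold a hierarchical structure."""
    return {f"frame_{i:04d}.csv": write_frame_csv(pts)
            for i, pts in enumerate(frames)}

def export_summarized(frames):
    """'Summarized' mode: all steps merged into one point cloud, one file."""
    merged = [p for pts in frames for p in pts]
    return {"pointcloud.csv": write_frame_csv(merged)}
```

For HDF5, where the number of points per time step varies, the same idea would presumably use a variable-length dtype (e.g., `h5py.vlen_dtype`) so that each animation step occupies one row of a single data set.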
4. Results
4.1. Depth Cameras and Range Scanners (LiDAR, ToF)
4.2. Semantically Labeled 2D Images
4.3. Animations
4.4. Sound Navigation and Ranging (Sonar)
5. Evaluation
5.1. Validation of Measurements
5.2. Runtime Performance
5.2.1. Number of Measurement Points
5.2.2. Number of Objects
5.2.3. Weather Simulation
5.2.4. Comparison to Similar Applications
6. Conclusions and Outlook
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Acknowledgments
Conflicts of Interest
Abbreviations
Abbreviation | Meaning
---|---
AI | Artificial Intelligence
BRDF | bidirectional reflectance distribution function
BSDF | bidirectional scattering distribution function
BTDF | bidirectional transmittance distribution function
CSG | Constructive Solid Geometry
FOV | field of view
HDF | hierarchical data format
ICP | Iterative Closest Point
LiDAR | Light Detection and Ranging
ML | Machine Learning
Radar | Radio Detection and Ranging
Sonar | Sound Navigation and Ranging
ToF | Time-of-Flight
Name | Category | Type
---|---|---
Generic LiDAR | LiDAR | rotating
Generic Sonar | Sonar | sideScan
Velodyne UltraPuck | LiDAR | rotating
Velodyne AlphaPuck | LiDAR | rotating
Microsoft Kinect v1 (default mode) | ToF | static
Microsoft Kinect v1 (near mode) | ToF | static
Microsoft Kinect v2 (default mode) | ToF | static
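The predefined sensors in the table above could be represented as a small preset registry. This is a hedged sketch: the key names, the `SensorPreset` record, and the lookup helper are illustrative assumptions, not the add-on's actual data model; only the name/category/type values come from the table.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SensorPreset:
    """Minimal preset record mirroring the table above."""
    name: str
    category: str   # "LiDAR", "Sonar", or "ToF"
    scan_type: str  # "rotating", "sideScan", or "static"

PRESETS = {
    "generic_lidar":       SensorPreset("Generic LiDAR", "LiDAR", "rotating"),
    "generic_sonar":       SensorPreset("Generic Sonar", "Sonar", "sideScan"),
    "velodyne_ultra_puck": SensorPreset("Velodyne UltraPuck", "LiDAR", "rotating"),
    "velodyne_alpha_puck": SensorPreset("Velodyne AlphaPuck", "LiDAR", "rotating"),
    "kinect_v1_default":   SensorPreset("Microsoft Kinect v1 (default mode)", "ToF", "static"),
    "kinect_v1_near":      SensorPreset("Microsoft Kinect v1 (near mode)", "ToF", "static"),
    "kinect_v2_default":   SensorPreset("Microsoft Kinect v2 (default mode)", "ToF", "static"),
}

def presets_by_category(category):
    """Filter presets, e.g., to list all ToF cameras in a selection UI."""
    return [p for p in PRESETS.values() if p.category == category]
```

A registry like this keeps device-specific parameters out of the simulation core: selecting a preset configures the virtual sensor without touching the ray-casting code.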
Share and Cite
Reitmann, S.; Neumann, L.; Jung, B. BLAINDER—A Blender AI Add-On for Generation of Semantically Labeled Depth-Sensing Data. Sensors 2021, 21, 2144. https://doi.org/10.3390/s21062144