A Framework of Wearable Sensor-System Development for Urban 3D Modeling
Abstract
1. Introduction
1.1. Previous Studies
1.2. Purpose of Study
2. Design and Production of Wearable Sensor System
2.1. Platform and Sensors
2.2. Design of Wearable Sensor System
- Criterion 1. One LiDAR sensor is installed to acquire data about the surrounding environment without gaps, while the other LiDAR is installed horizontally with respect to the ground to enable data fusion using algorithms such as SLAM (Simultaneous Localization and Mapping) or ICP (Iterative Closest Point); a minimal ICP sketch is given after this list.
- Criterion 2. For efficient data acquisition, the overlap between the FOVs (fields of view) of cameras of the same type is minimized, so that a single shot covers a wider area.
- Criterion 3. The FOVs of the optical cameras and the LiDARs overlap sufficiently to enable efficient data fusion between the different sensor types.
- Criterion 4. Every sensor is positioned so that its FOV overlaps sufficiently with the FOV of at least one other sensor, which is necessary for accurate system calibration.
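As referenced in Criterion 1, overlapping scans from the two LiDARs can be registered with ICP. The following is a minimal point-to-point ICP sketch in Python (NumPy/SciPy); the function names and the nearest-neighbour/SVD formulation are illustrative assumptions, not the fusion pipeline actually used in the paper.

```python
# Minimal point-to-point ICP sketch (illustrative; the published system's
# actual SLAM/ICP pipeline is not specified in this excerpt).
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(src, dst):
    """Least-squares rigid transform (Kabsch/SVD) mapping src onto dst."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                      # guards against reflections
    return R, c_dst - R @ c_src

def icp(src, dst, iters=30, tol=1e-6):
    """Align src (e.g., horizontal-LiDAR scan) onto dst (the other scan)."""
    tree = cKDTree(dst)
    R_tot, t_tot = np.eye(3), np.zeros(3)
    cur, prev_err = src.copy(), np.inf
    for _ in range(iters):
        dist, idx = tree.query(cur)         # nearest-neighbour correspondences
        R, t = best_rigid_transform(cur, dst[idx])
        cur = cur @ R.T + t
        R_tot, t_tot = R @ R_tot, R @ t_tot + t
        if abs(prev_err - dist.mean()) < tol:
            break
        prev_err = dist.mean()
    return R_tot, t_tot
```

Given two overlapping N × 3 scans `src` and `dst`, `icp(src, dst)` returns the rotation and translation that register the first onto the second.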
3. Calibration Framework for Wearable Sensor Systems
3.1. Sensor-Bundle Configuration
3.2. Point-Plane Correspondence-Based System Calibration for Sensor Bundles
3.2.1. Point-Plane Correspondence of the Proposed System Calibration Method
3.2.2. Application of the Proposed System Calibration Method
3.2.3. Advantages of the Proposed System Calibration Method
- First, the proposed methodology requires only information about the target plane from the point cloud. It avoids an additional complicated process of accurately extracting the edge points or lines of the target plane and is less affected by point-cloud density (a least-squares sketch of the point-plane idea follows this list).
- Second, arbitrarily determined initial GCPs (ground control points) refined through the BA (bundle adjustment) process are used instead of previously measured GCP coordinates. This eliminates the prior fieldwork of measuring GCP coordinates with a total station or similar instrument, reducing the time required for system calibration.
- Third, because the proposed method is based on BA using the collinearity condition, it is easy to implement with various existing open-source codes.
- Fourth, even when sensor data are acquired for only part of the target plane, information on the entire target can still be used for calibration.
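As mentioned in the first advantage, the core constraint is the distance from transformed points to known planes. The sketch below recovers six ROPs (relative orientation parameters) by minimizing point-to-plane distances over three orthogonal target planes with scipy.optimize.least_squares; the Euler-angle convention, the synthetic setup, and all names are assumptions for illustration, and the paper's actual formulation is a full BA with collinearity equations rather than this reduced problem.

```python
# Illustrative point-to-plane calibration sketch (assumed reduced
# formulation; the paper embeds this constraint in a bundle adjustment).
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

# Three mutually orthogonal target planes n.x = d (enough to fix all 6 DOF).
planes = [(np.array([1.0, 0.0, 0.0]), 2.0),
          (np.array([0.0, 1.0, 0.0]), 2.0),
          (np.array([0.0, 0.0, 1.0]), 2.0)]

def sample_plane(n, d, m, rng):
    """Sample m points on the plane n.x = d (n is a unit axis here)."""
    basis = np.eye(3)[np.abs(n) < 0.5]      # two axes spanning the plane
    return d * n + rng.uniform(-1, 1, (m, 2)) @ basis

def residuals(params, clouds):
    """Point-to-plane distances after applying the candidate ROPs.
    params = (tx, ty, tz, omega, phi, kappa), angles in radians;
    rotation convention assumed: R = Rz(kappa) @ Ry(phi) @ Rx(omega)."""
    t = params[:3]
    R = Rotation.from_euler('xyz', params[3:]).as_matrix()
    return np.concatenate(
        [(pts @ R.T + t) @ n - d for (n, d), pts in zip(planes, clouds)])

rng = np.random.default_rng(1)
true = np.array([0.10, -0.20, 0.05, 0.02, -0.01, 0.03])   # "unknown" ROPs
R_true = Rotation.from_euler('xyz', true[3:]).as_matrix()
# Points as the mis-aligned sensor would record them (inverse transform).
clouds = [(sample_plane(n, d, 150, rng) - true[:3]) @ R_true
          for n, d in planes]

sol = least_squares(residuals, x0=np.zeros(6), args=(clouds,))
print(np.round(sol.x, 4))   # recovers `true` without pre-surveyed GCPs
```

Note the property that motivates the method: the optimizer needs only the plane parameters, never surveyed coordinates of individual points on the target.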
3.3. Global Optimization for Wearable System
3.4. Test Bed Configuration and Data Acquisition for Calibration
4. Evaluation of System Calibration Results
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
| Sensor | Model | Specifications | |
|---|---|---|---|
| Camera | Lens: Sennex DSL315 | Projection model | Fish-eye lens (equisolid-angle projection) |
| | Body: Chameleon3 5.0 MP Color USB3 Vision | Image size (pixel size) | (0.00345 mm) |
| LiDAR | Velodyne VLP-16 | Channel number | 16 |
| | | FOV, horizontal | 360° |
| | | FOV, vertical | 30° (±15°) |
| | | Resolution, horizontal | 0.5° (10 Hz) |
| | | Resolution, vertical | 2° |
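To make the angular resolutions in the table concrete, a short arithmetic sketch converts them into the spacing between neighbouring returns on a surface; the 10 m range is an assumed example value, not a figure from the paper.

```python
# Approximate spacing between neighbouring VLP-16 returns at a given range,
# using the angular resolutions from the table (small-angle arc length).
import math

def spacing_m(range_m, step_deg):
    """Arc length between adjacent returns at the given range."""
    return range_m * math.radians(step_deg)

r = 10.0                                                # assumed range to a facade
print(f"horizontal: {spacing_m(r, 0.5) * 100:.1f} cm")  # ~8.7 cm
print(f"vertical:   {spacing_m(r, 2.0) * 100:.1f} cm")  # ~34.9 cm
```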
| X0 (mm) | Y0 (mm) | Z0 (mm) | ω (°) | φ (°) | κ (°) |
|---|---|---|---|---|---|
| −232.89 | −121.40 | −16.28 | 89.28 | 79.66 | −87.85 |
| −226.61 | −281.42 | −406.70 | −157.07 | 100.42 | 97.56 |
| 210.83 | −122.16 | −18.32 | 273.90 | −100.73 | 274.78 |
| 214.32 | −281.23 | −404.43 | 36.17 | −80.89 | 98.75 |
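Each row of ROPs above defines one rigid transform. Assuming the common photogrammetric convention R = Rz(κ)·Ry(φ)·Rx(ω) (the exact convention is not stated in this excerpt, so verify against the paper), a row converts to a 4 × 4 homogeneous matrix as follows:

```python
# Build a 4x4 rigid transform from one ROP row (rotation convention
# assumed: R = Rz(kappa) @ Ry(phi) @ Rx(omega); confirm with the source).
import numpy as np
from scipy.spatial.transform import Rotation

def rop_to_matrix(x0_mm, y0_mm, z0_mm, omega_deg, phi_deg, kappa_deg):
    T = np.eye(4)
    T[:3, :3] = Rotation.from_euler(
        'xyz', [omega_deg, phi_deg, kappa_deg], degrees=True).as_matrix()
    T[:3, 3] = np.array([x0_mm, y0_mm, z0_mm]) / 1000.0  # mm -> m
    return T

# First row of the table above:
print(rop_to_matrix(-232.89, -121.40, -16.28, 89.28, 79.66, -87.85))
```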
Standard deviation of the ROPs and RMS residuals of the image points:

| σX0 (mm) | σY0 (mm) | σZ0 (mm) | σω (°) | σφ (°) | σκ (°) | RMS residuals (pixel) |
|---|---|---|---|---|---|---|
| 2.25 | 3.67 | 2.57 | 0.0033 | 0.0011 | 0.0033 | 0.56 |
| 0.77 | 1.50 | 2.28 | 0.0058 | 0.0008 | 0.0058 | 0.48 |
| 2.78 | 4.55 | 4.11 | 0.0089 | 0.0019 | 0.0081 | 0.59 |
| 4.95 | 4.86 | 4.89 | 0.0117 | 0.0008 | 0.0119 | 0.55 |
Correlation coefficients among the ROPs (each table shows two 6 × 6 matrices side by side):

| | X0 | Y0 | Z0 | ω | φ | κ | X0 | Y0 | Z0 | ω | φ | κ |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| X0 | - | −0.27 | 0.08 | −0.28 | 0.79 | 0.21 | - | 0.19 | 0.17 | −0.26 | 0.02 | 0.28 |
| Y0 | −0.27 | - | −0.15 | 0.95 | 0.10 | −0.84 | 0.19 | - | −0.08 | −0.86 | −0.42 | 0.87 |
| Z0 | 0.08 | −0.15 | - | −0.20 | 0.36 | −0.02 | 0.17 | −0.08 | - | −0.37 | 0.92 | 0.37 |
| ω | −0.28 | 0.95 | −0.20 | - | 0.06 | −0.79 | −0.26 | −0.86 | −0.37 | - | −0.02 | −0.79 |
| φ | 0.79 | 0.10 | 0.36 | 0.06 | - | −0.20 | 0.02 | −0.42 | 0.92 | −0.02 | - | 0.01 |
| κ | 0.21 | −0.84 | −0.02 | −0.79 | −0.20 | - | 0.28 | 0.87 | 0.37 | −0.79 | 0.01 | - |

| | X0 | Y0 | Z0 | ω | φ | κ | X0 | Y0 | Z0 | ω | φ | κ |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| X0 | - | 0.03 | −0.05 | 0.23 | −0.01 | 0.23 | - | −0.01 | 0.02 | 0.02 | 0.08 | 0.02 |
| Y0 | 0.03 | - | 0.21 | 0.01 | 0.90 | −0.08 | −0.01 | - | 0.00 | −0.55 | −0.10 | −0.54 |
| Z0 | −0.05 | 0.21 | - | −0.89 | 0.21 | −0.91 | 0.02 | 0.00 | - | −0.08 | 0.51 | −0.07 |
| ω | 0.23 | 0.01 | −0.89 | - | −0.01 | 0.78 | 0.02 | −0.55 | −0.08 | - | −0.05 | 0.78 |
| φ | −0.01 | 0.90 | 0.21 | −0.01 | - | −0.07 | 0.08 | −0.10 | 0.51 | −0.05 | - | −0.05 |
| κ | 0.23 | −0.08 | −0.91 | 0.78 | −0.07 | - | 0.02 | −0.54 | −0.07 | 0.78 | −0.05 | - |
| X0 (mm) | Y0 (mm) | Z0 (mm) | ω (°) | φ (°) | κ (°) |
|---|---|---|---|---|---|
| −232.81 | −121.88 | −16.13 | 89.28 | 79.66 | −87.85 |
| 210.51 | −121.09 | −18.66 | 273.90 | −100.73 | 274.78 |
| 2.49 | −340.55 | 422.79 | 61.28 | 1.05 | −0.26 |
| X0 (mm) | Y0 (mm) | Z0 (mm) | ω (°) | φ (°) | κ (°) |
|---|---|---|---|---|---|
| 0.00 | −350.00 | 400.00 | 60.00 | 0.00 | 0.00 |
| 10.50 | −341.00 | 415.00 | 59.10 | 1.49 | −1.15 |
| ROPs | Location | | | | | | |
|---|---|---|---|---|---|---|---|
| | 1 | −3.11 | 29.08 | 34.82 | −1.48 | 13.85 | 16.58 |
| | 2 | −1.23 | 12.17 | 15.32 | −0.82 | 8.11 | 10.21 |
| | 3 | 18.80 | 18.87 | 20.12 | 4.47 | 4.49 | 4.79 |
| | 4 | −1.70 | 16.40 | 20.19 | −0.55 | 5.29 | 6.51 |
| | Overall | 4.10 | 17.71 | 20.87 | 0.41 | 7.94 | 9.52 |
| | 1 | 46.43 | 46.44 | 50.34 | 22.11 | 22.11 | 23.97 |
| | 2 | 20.24 | 23.00 | 28.61 | 13.49 | 15.33 | 19.07 |
| | 3 | 49.81 | 49.81 | 50.68 | 11.86 | 11.86 | 12.07 |
| | 4 | 26.78 | 28.86 | 34.04 | 8.64 | 9.31 | 10.98 |
| | Overall | 34.30 | 35.68 | 39.57 | 14.02 | 14.65 | 16.52 |
| | 1 | −20.18 | 20.33 | 23.37 | −9.61 | 9.68 | 11.13 |
| | 2 | −34.43 | 34.43 | 35.94 | −22.95 | 22.95 | 23.96 |
| | 3 | 117.44 | 117.44 | 119.00 | 27.96 | 27.96 | 28.33 |
| | 4 | −30.87 | 30.90 | 32.80 | −9.95 | 9.97 | 10.58 |
| | Overall | 12.01 | 55.12 | 56.98 | −3.64 | 17.64 | 18.50 |