Radar and OpenStreetMap-Aided Consistent Trajectory Estimation in Canopy-Occluded Environments
Highlights
- A spatial–temporal enhancement strategy for radar SLAM reduced absolute trajectory error (ATE) by up to 55.23% compared with baseline frameworks on canopy-occluded routes.
- Incorporating OpenStreetMap (OSM) priors with adaptive constraint weighting achieved a minimum trajectory error of 1.83 m.
- Integrating millimeter-wave radar sensing with spatial–temporal optimization and open-source maps substantially improves positioning robustness in real-world signal-degraded environments.
Abstract
1. Introduction
- Propose short- and long-term temporal enhancement strategies for radar SLAM, including Temporal Density Context (TDC)-based descriptor stabilization in the short term and long-term loop-closure pruning to address perceptual aliasing.
- Introduce spatial enhancement via adaptive map constraints, integrating OpenStreetMap (OSM) priors with confidence-based weight adjustment for improved global consistency, especially under sparse loop-closure conditions in GNSS-denied environments.
- Develop a coarse-to-fine radar trajectory estimation framework that fuses public maps with temporally enhanced radar odometry, validated through extensive experiments in urban and campus environments under GNSS outages and varying levels of canopy occlusion.
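The adaptive map-constraint idea above can be illustrated with a toy 2D pose graph: odometry factors chain consecutive positions, while sparse map-matched anchor points are weighted by a matching confidence. A minimal sketch (positions only, no headings; the data, confidence values, and the confidence-to-weight mapping are all illustrative assumptions, not the paper's formulation):

```python
import numpy as np
from scipy.optimize import least_squares

def residuals(x_flat, odom, priors, prior_w):
    """Stack odometry residuals and confidence-weighted map-prior residuals."""
    x = x_flat.reshape(-1, 2)
    res = []
    # Odometry factors: consecutive displacements should match measured deltas.
    for i, d in enumerate(odom):
        res.append(x[i + 1] - x[i] - d)
    # Adaptive map factors: anchor positions weighted by matching confidence.
    for (i, p), w in zip(priors, prior_w):
        res.append(w * (x[i] - p))
    return np.concatenate(res)

# Simulated straight-line trajectory with constant lateral odometry drift (assumed data).
true = np.stack([np.arange(6, dtype=float), np.zeros(6)], axis=1)
odom = np.diff(true, axis=0) + np.array([0.0, 0.1])   # +0.1 m lateral drift per step
priors = [(0, true[0]), (5, true[5])]                  # sparse map-matched anchors
conf = np.array([0.9, 0.6])                            # matching confidence in [0, 1]
prior_w = conf / 0.5                                   # confidence -> weight (assumed mapping)

x0 = np.vstack([true[0], true[0] + np.cumsum(odom, axis=0)])  # dead-reckoned initialization
sol = least_squares(residuals, x0.ravel(), args=(odom, priors, prior_w))
est = sol.x.reshape(-1, 2)
```

The optimized trajectory pulls the drifting dead-reckoning back toward the anchors, distributing the correction along the chain; lowering an anchor's confidence weakens its pull correspondingly.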
2. Related Works
2.1. Radar-Based Localization and Mapping
2.2. Radar Loop Closure Detection
2.3. Pose Graph Optimization with Prior Map Integration
3. Methods
3.1. Radar Odometry
3.2. Temporally Enhanced Loop Closure Detection
3.2.1. Short-Term Enhancement for Temporal Density Context Generation
3.2.2. Long-Term Enhancement for Loop Outlier Removal
3.3. Pose Graph Optimization with Adaptive Map Factors
3.3.1. Adaptive Map Factors
3.3.2. Loop Closure Factors
4. Results
4.1. Experimental Setup
4.2. Trajectory Evaluation
4.3. Real-World Experiments
4.4. Ablation and Real-Time Performance Analysis
4.4.1. Short-Term Enhancement
4.4.2. Long-Term Enhancement
4.4.3. Adaptive Map Factors
5. Discussion
5.1. Analysis of Trajectory Accuracy and Consistency
5.2. Generalization in Complex Real-World Scenarios
5.3. Impact of Modules and Computational Efficiency
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
| Method | DCC01 | DCC02 | DCC03 | KAIST01 | KAIST02 | KAIST03 | RIV01 | RIV02 | RIV03 |
|---|---|---|---|---|---|---|---|---|---|
| SC-LeGO-LOAM | 29.34 m | 9.70 m | 5.29 m | 49.27 m | 32.10 m | 31.16 m | NA | 96.46 m | NA |
| Yeti | 46.31 m | 18.89 m | 37.89 m | 84.01 m | 42.92 m | 6.08 m | 94.36 m | 18.63 m | 21.75 m |
| RadarSLAM | 12.89 m | 9.88 m | 3.92 m | 6.87 m | 6.03 m | 4.11 m | 9.03 m | 7.05 m | 10.74 m |
| SC-ORORA | 22.58 m | 24.34 m | 17.61 m | 4.94 m | 4.06 m | 2.14 m | 13.11 m | 18.37 m | 12.30 m |
| LSRL | - | - | - | 7.00 m | 6.00 m | 6.20 m | 6.70 m | 5.70 m | NA |
| Pure GNSS | 7.44 m | 10.61 m | 3.60 m | 2.83 m | 2.76 m | 3.37 m | 3.85 m | 3.01 m | 2.57 m |
| OSM | 7.00 m | 9.54 m | 4.79 m | 5.98 m | 7.00 m | 7.40 m | 12.97 m | 11.41 m | 10.90 m |
| Ours | 6.12 m | 8.68 m | 7.19 m | 4.39 m | 3.39 m | 1.83 m | 5.00 m | 3.91 m | 12.93 m |

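Per-sequence errors like those above are typically reported as ATE RMSE: the estimated trajectory is rigidly aligned to ground truth, then the root-mean-square of the remaining position residuals is taken. A minimal numpy sketch of that metric (Kabsch alignment without scale; trajectory association and scale handling are simplifications, not this paper's exact evaluation code):

```python
import numpy as np

def ate_rmse(gt, est):
    """ATE RMSE: rigidly align est to gt (Kabsch, no scale), then RMSE of residuals."""
    mu_g, mu_e = gt.mean(axis=0), est.mean(axis=0)
    G, E = gt - mu_g, est - mu_e                 # centered point sets
    H = E.T @ G                                  # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    D = np.diag([1.0] * (gt.shape[1] - 1) + [d])
    R = Vt.T @ D @ U.T                           # rotation mapping est frame to gt frame
    aligned = E @ R.T + mu_g
    return float(np.sqrt(np.mean(np.sum((gt - aligned) ** 2, axis=1))))

# Sanity check: a rigidly transformed copy of the trajectory has ~zero ATE.
rng = np.random.default_rng(1)
gt = rng.normal(size=(100, 2))
theta = 0.7
Rm = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
est = gt @ Rm.T + np.array([3.0, -2.0])
```

Because the alignment removes any global rigid offset, ATE reflects only the internal (non-rigid) deformation of the estimated trajectory.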
| The Starlake Building | The Youyi Square |
|---|---|
| 0.55 m | 2.08 m |

| Noise Level | ATE (m) |
|---|---|
| No noise | 6.41 |
| +5 m random noise | 10.60 |
| +10 m random noise | 10.30 |
| +20 m random noise | 11.10 |
| +50 m random noise | 10.11 |
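The bounded degradation in the noise table (ATE roughly plateaus from +5 m to +50 m prior noise) is the behavior one would expect from confidence-based down-weighting: as prior noise grows, map factors contribute less and the estimate falls back on odometry. A toy single-position illustration (all numbers and the 1/(1 + noise) weight schedule are assumptions for illustration, not the paper's weighting function):

```python
import numpy as np

# Drifted odometry fix vs. a map prior corrupted by a large error.
true_pos = np.array([10.0, 5.0])
odom_pos = true_pos + np.array([0.8, -0.6])      # ~1 m dead-reckoning drift
map_pos = true_pos + np.array([30.0, -20.0])     # illustrative large map-prior error

def fuse(w_map, w_odom=1.0):
    """Weighted fusion of the odometry and map-prior position estimates."""
    return (w_odom * odom_pos + w_map * map_pos) / (w_odom + w_map)

noise_bound = 50.0                               # assumed noise magnitude on the prior
w_adaptive = 1.0 / (1.0 + noise_bound)           # assumed schedule: distrust noisy priors
err_adaptive = np.linalg.norm(fuse(w_adaptive) - true_pos)
err_fixed = np.linalg.norm(fuse(1.0) - true_pos)
```

With the adaptive weight the fused error stays close to the odometry drift, whereas a fixed equal weight lets the corrupted prior dominate.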
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license.
Share and Cite
Tang, Y.; Li, B.; Zhong, H.; Yan, M.; Jiang, S.; Zhou, J. Radar and OpenStreetMap-Aided Consistent Trajectory Estimation in Canopy-Occluded Environments. Remote Sens. 2026, 18, 70. https://doi.org/10.3390/rs18010070