Vision-Aided Velocity Estimation in GNSS Degraded or Denied Environments
Abstract
1. Introduction
2. Materials and Methods
2.1. Notation
- $\mathbf{p}_i^c$: the $i$-th cluster position measured in the camera frame.
- $\hat{\mathbf{p}}^n$: the estimated landmark position in the navigation frame.
- $R_c^b$: the body-camera rotation matrix.
- $\mathbf{r}_{CB}$: the vector from the camera frame origin to $B$, the body frame origin.
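Chaining these quantities yields the landmark position in the navigation frame. The following is an illustrative sketch (the symbols, including the assumed body attitude $R_b^n$ and body position $\mathbf{p}_B^n$, are ours; the original notation was not preserved in this version, and $\mathbf{r}_{CB}$ is taken to be expressed in body axes):

$$
\hat{\mathbf{p}}^n = R_b^n\left(R_c^b\,\mathbf{p}_i^c - \mathbf{r}_{CB}\right) + \mathbf{p}_B^n
$$

That is, the cluster position is rotated from the camera frame to the body frame, shifted by the camera-body lever arm, and then mapped into the navigation frame using the estimated attitude and position.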
2.2. Vision Segment
- Features of the current image (k) are compared with those of the clusters formed at the previous iteration (k−1) that have not yet become a temporary cluster; if enough features match, a cluster is formed and added to the set of temporary clusters.
- Features of the current image are compared with those of the set of temporary clusters; temporary clusters are clusters observed in the past but not yet promoted to landmarks. If a temporary cluster is again in view, a counter is increased; when it passes a predefined threshold, the cluster becomes a landmark.
- Clusters of the current image (at time k) are compared, via their HDBSCAN descriptors, with the features of all previously found landmarks that are potentially in view. This allows landmarks to be recognized, their structure to be refined by adding previously unseen features, and their positions in the camera frame to be determined.
Algorithm 1. Cluster Handling Algorithm.
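The matching and promotion steps above can be sketched as follows. This is a hypothetical rendition, not the paper's implementation: the matcher, thresholds, and data layout are all assumptions, and a real pipeline would compare ORB descriptors with distance or ratio tests.

```python
MATCH_THRESHOLD = 10       # min. feature matches to consider a cluster re-observed (assumed)
PROMOTION_THRESHOLD = 5    # sightings needed before a temporary cluster becomes a landmark (assumed)

def count_matches(features_a, features_b):
    # Toy matcher: counts shared feature identifiers. A real system would
    # match ORB descriptors by Hamming distance with a ratio test.
    return len(set(features_a) & set(features_b))

def handle_clusters(current_features, previous_clusters, temporary, landmarks):
    # Step 1: clusters formed at k-1 that re-match at k become temporary clusters.
    new_temporary = []
    for cluster in previous_clusters:
        if count_matches(current_features, cluster["features"]) >= MATCH_THRESHOLD:
            cluster["sightings"] = 1
            new_temporary.append(cluster)
    # Step 2: temporary clusters observed again accumulate sightings; once the
    # counter passes the threshold, the cluster is promoted to a landmark.
    still_temporary = []
    for cluster in temporary:
        if count_matches(current_features, cluster["features"]) >= MATCH_THRESHOLD:
            cluster["sightings"] += 1
        if cluster["sightings"] >= PROMOTION_THRESHOLD:
            landmarks.append(cluster)
        else:
            still_temporary.append(cluster)
    return still_temporary + new_temporary, landmarks
```

Step 3 (matching the current clusters against known landmarks via their HDBSCAN descriptors) would follow the same pattern, additionally merging previously unseen features into the matched landmark.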
2.3. Sequential Kalman Filter
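The sequential formulation processes the components of the measurement vector one scalar at a time, which replaces the matrix inversion of the batch Kalman update with scalar divisions. A minimal NumPy sketch of this standard sequential update, assuming a diagonal measurement noise covariance (not the paper's exact implementation):

```python
import numpy as np

def sequential_kf_update(x, P, z, H, R_diag):
    """Sequential Kalman measurement update.

    Valid when the measurement noise covariance is diagonal (R_diag holds its
    diagonal), so the vector update decomposes into scalar updates.
    """
    x = x.copy()
    P = P.copy()
    for i in range(len(z)):
        h = H[i]                       # 1 x n measurement row
        S = h @ P @ h + R_diag[i]      # scalar innovation variance
        K = (P @ h) / S                # Kalman gain (n-vector), no inversion
        x = x + K * (z[i] - h @ x)     # state correction with scalar innovation
        P = P - np.outer(K, h @ P)     # covariance update: (I - K h) P
    return x, P
```

With a diagonal R, the result is identical to the batch update, so the sequential form trades nothing but obtains a cheaper, inversion-free implementation.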
3. Experiments and Results
3.1. First Experiment: Linear Motion
3.2. Second Experiment: Planar Motion
3.3. Third Experiment: Loop Closures
3.4. Fourth Experiment: Corner Cases
4. Limitations and Future Works
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Appendix A

- The difference between the navigation errors at times $k$ and $k-1$.
- The difference between the camera (vision-pipeline) measurement errors at times $k$ and $k-1$; this error is bounded under the realistic assumption that the camera error does not diverge.
- The two remaining terms, which together model the effect of the attitude estimation error on the cluster position; since the attitude estimation error can be considered bounded, these terms are also bounded.
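The boundedness argument can be summarized as follows, with illustrative symbols since the original notation was lost in this version: $\delta\mathbf{n}_k$ the navigation error, $\delta\mathbf{c}_k$ the camera measurement error, and $\boldsymbol{\epsilon}_{att,k}$ the attitude-induced terms at time $k$. By the triangle inequality,

$$
\|\Delta\mathbf{e}_k\| \le \|\delta\mathbf{n}_k - \delta\mathbf{n}_{k-1}\| + \|\delta\mathbf{c}_k - \delta\mathbf{c}_{k-1}\| + \|\boldsymbol{\epsilon}_{att,k}\|,
$$

and since each term on the right-hand side is bounded by the assumptions above, the total error $\Delta\mathbf{e}_k$ is bounded as well.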
References
- Lee, D.; Jung, M.; Yang, W.; Kim, A. Lidar odometry survey: Recent advancements and remaining challenges. Intell. Serv. Robot. 2024, 17, 95–118. [Google Scholar] [CrossRef]
- Horn, B.K.; Schunck, B.G. Determining optical flow. Artif. Intell. 1981, 17, 185–203. [Google Scholar] [CrossRef]
- Zhang, J.; Huang, Z.; Zhu, X.; Guo, F.; Sun, C.; Zhan, Q.; Shen, R. LOFF: LiDAR and Optical Flow Fusion Odometry. Drones 2024, 8, 411. [Google Scholar] [CrossRef]
- Grabe, V.; Bülthoff, H.H.; Giordano, P.R. On-board velocity estimation and closed-loop control of a quadrotor UAV based on optical flow. In Proceedings of the 2012 IEEE International Conference on Robotics and Automation, St. Paul, MN, USA, 14–18 May 2012; IEEE: Piscataway, NJ, USA, 2012; pp. 491–497. [Google Scholar]
- Xu, H.; Zhang, J.; Cai, J.; Rezatofighi, H.; Tao, D. Gmflow: Learning optical flow via global matching. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, New Orleans, LA, USA, 18–24 June 2022; pp. 8121–8130. [Google Scholar]
- Mebarki, R.; Lippiello, V.; Siciliano, B. Vision-based and IMU-aided scale factor-free linear velocity estimator. Auton. Robot. 2017, 41, 903–917. [Google Scholar] [CrossRef]
- Van Nam, D.; Danh, P.T.; Park, C.H.; Kim, G.W. Fusion consistency for industrial robot navigation: An integrated SLAM framework with multiple 2D LiDAR-visual-inertial sensors. Comput. Electr. Eng. 2024, 120, 109607. [Google Scholar]
- Jaimez, M.; Monroy, J.G.; Gonzalez-Jimenez, J. Planar odometry from a radial laser scanner. A range flow-based approach. In Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 16–21 May 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 4479–4485. [Google Scholar]
- Spies, H.; Jähne, B.; Barron, J.L. Range flow estimation. Comput. Vis. Image Underst. 2002, 85, 209–231. [Google Scholar] [CrossRef]
- Shahbeigi, S.; Robinson, J.; Donzella, V. A Novel Score-Based LiDAR Point Cloud Degradation Analysis Method. IEEE Access 2024, 12, 22671–22686. [Google Scholar] [CrossRef]
- Ziębiński, A.; Biernacki, P. How Accurate Can 2D LiDAR Be? A Comparison of the Characteristics of Calibrated 2D LiDAR Systems. Sensors 2025, 25, 1211. [Google Scholar] [CrossRef] [PubMed]
- Li, F.; Liu, S.; Zhao, X.; Zhang, L. Real-Time 2-D Lidar Odometry Based on ICP. Sensors 2021, 21, 7162. [Google Scholar] [CrossRef] [PubMed]
- Segal, A.; Haehnel, D.; Thrun, S. Generalized-ICP. Robot. Sci. Syst. 2009, 2, 435. [Google Scholar]
- Kulmer, D.; Tahiraj, I.; Chumak, A.; Lienkamp, M. Multi-LiCa: A Motion-and Targetless Multi-LiDAR-to-LiDAR Calibration Framework. In Proceedings of the 2024 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI), Pilsen, Czech Republic, 4–6 September 2024; IEEE: Piscataway, NJ, USA, 2024; pp. 1–7. [Google Scholar]
- Kumar, N.S.P.; Chandra, G.N.; Sekhar, P.; Sola, R.; KG, M. Comparison of 2D & 3D LiDAR SLAM Algorithms Based on Performance Metrics. In Proceedings of the 2025 International Conference on Innovation in Computing and Engineering (ICE), Greater Noida, India, 28 February–1 March 2025; IEEE: Piscataway, NJ, USA, 2025; pp. 1–6. [Google Scholar]
- Rublee, E.; Rabaud, V.; Konolige, K.; Bradski, G. ORB: An efficient alternative to SIFT or SURF. In Proceedings of the 2011 International Conference on Computer Vision, Barcelona, Spain, 6–13 November 2011; IEEE: Piscataway, NJ, USA, 2011; pp. 2564–2571. [Google Scholar]
- McInnes, L.; Healy, J.; Astels, S. hdbscan: Hierarchical density based clustering. J. Open Source Softw. 2017, 2, 205. [Google Scholar] [CrossRef]
- Kalman, R.E. A New Approach to Linear Filtering and Prediction Problems. J. Basic Eng. 1960, 82, 35–45. [Google Scholar] [CrossRef]
- Kettner, A.M.; Paolone, M. Sequential discrete Kalman filter for real-time state estimation in power distribution systems: Theory and implementation. IEEE Trans. Instrum. Meas. 2017, 66, 2358–2370. [Google Scholar] [CrossRef]
- Vergara, F.; Ryals, A.D.; Arenella, A.; Pollini, L. Complementing Human Perception in Remote Site Exploration using Augmented Reality—A Proof of Concept. In Proceedings of the AIAA SCITECH 2024 Forum, Orlando, FL, USA, 8–12 January 2024; p. 0317. [Google Scholar]
- Merriaux, P.; Dupuis, Y.; Boutteau, R.; Vasseur, P.; Savatier, X. A Study of Vicon System Positioning Performance. Sensors 2017, 17, 1591. [Google Scholar] [CrossRef] [PubMed]
- Zhou, B.; Zheng, C.; Wang, Z.; Zhu, F.; Cai, Y.; Zhang, F. FAST-LIVO2 on Resource-Constrained Platforms: LiDAR-Inertial-Visual Odometry with Efficient Memory and Computation. IEEE Robot. Autom. Lett. 2025, 10, 7931–7938. [Google Scholar] [CrossRef]
- Aldegheri, S.; Bombieri, N.; Bloisi, D.D.; Farinelli, A. Data flow ORB-SLAM for real-time performance on embedded GPU boards. In Proceedings of the 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Macau, China, 3–8 November 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 5370–5375. [Google Scholar]
- Reif, K.; Gunther, S.; Yaz, E.; Unbehauen, R. Stochastic stability of the discrete-time extended Kalman filter. IEEE Trans. Autom. Control 1999, 44, 714–728. [Google Scholar] [CrossRef]

| Matrix | Diagonal Elements |
|---|---|
| Q | |
| Algorithm | Mean err. $v_x$ | Mean err. $v_y$ | Std. $v_x$ | Std. $v_y$ | MSE |
|---|---|---|---|---|---|
| SKF | 0.0035847 | 0.0052489 | 0.028217 | 0.042968 | 0.0026815 |
| G-ICP | 0.0007491 | −0.0081371 | 0.026469 | 0.15074 | 0.023478 |
| RF2O | 0.00060497 | −0.0079874 | 0.023929 | 0.048903 | 0.0030267 |
| Algorithm | Mean err. $v_x$ | Mean err. $v_y$ | Std. $v_x$ | Std. $v_y$ | MSE |
|---|---|---|---|---|---|
| SKF | 0.00026773 | −0.0010225 | 0.024098 | 0.032205 | 0.0016186 |
| G-ICP | −0.0084467 | 0.00418 | 0.058429 | 0.10314 | 0.014138 |
| RF2O | 0.0010359 | −0.0014562 | 0.019923 | 0.038378 | 0.0018725 |
(a) First occlusion

| Algorithm | Mean err. $v_x$ | Mean err. $v_y$ | Std. $v_x$ | Std. $v_y$ | MSE |
|---|---|---|---|---|---|
| SKF | 0.0088627 | −0.0091305 | 0.051895 | 0.08442 | 0.081612 |
| G-ICP | −0.058464 | −0.042017 | 0.20318 | 0.31524 | 0.26984 |
| RF2O | −0.0050858 | 0.011968 | 0.12108 | 0.14796 | 0.16283 |

(b) Second occlusion

| Algorithm | Mean err. $v_x$ | Mean err. $v_y$ | Std. $v_x$ | Std. $v_y$ | MSE |
|---|---|---|---|---|---|
| SKF | −0.031128 | 0.015867 | 0.053684 | 0.067324 | 0.075745 |
| G-ICP | −0.13207 | 0.0029035 | 0.35275 | 0.22401 | 0.24328 |
| RF2O | −0.01784 | 0.022458 | 0.12309 | 0.11719 | 0.13059 |

(c) Third occlusion

| Algorithm | Mean err. $v_x$ | Mean err. $v_y$ | Std. $v_x$ | Std. $v_y$ | MSE |
|---|---|---|---|---|---|
| SKF | −0.017734 | 0.0050868 | 0.040857 | 0.097087 | 0.089323 |
| G-ICP | −0.049228 | −0.040686 | 0.17659 | 0.22846 | 0.17242 |
| RF2O | −0.016114 | −0.010856 | 0.080844 | 0.17644 | 0.145 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2026 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license.
Share and Cite
Serio, P.; Ryals, A.D.; Piana, F.; Gentilini, L.; Pollini, L. Vision-Aided Velocity Estimation in GNSS Degraded or Denied Environments. Sensors 2026, 26, 786. https://doi.org/10.3390/s26030786