Article

Robust Multi-Modal Factor Graph Optimization for Distributed Collaborative LiDAR–Visual–Inertial SLAM

1 School of Mechanical Engineering, Hubei University of Technology, Wuhan 430068, China
2 Hubei Key Laboratory of Modern Manufacturing Quality Engineering, Hubei University of Technology, Wuhan 430068, China
* Author to whom correspondence should be addressed.
Appl. Sci. 2026, 16(10), 4677; https://doi.org/10.3390/app16104677
Submission received: 2 April 2026 / Revised: 5 May 2026 / Accepted: 6 May 2026 / Published: 9 May 2026
(This article belongs to the Section Robotics and Automation)

Abstract

To address accuracy and reliability challenges in simultaneous localization and mapping (SLAM) systems under extreme conditions, this paper presents LIVE-SLAM, a tightly coupled LiDAR–inertial–visual framework. The technical core integrates three components: a LiDAR Probabilistic Feature Extraction (LPFE) module that reduces frontend overhead by retaining only high-confidence features; an adaptive confidence-based weighting strategy in the backend optimization that dynamically balances multi-modal residuals during sensor degradation; and a Visual Redundancy Removal (VRR)-based hybrid loop closure mechanism that mitigates perceptual aliasing. Evaluation on the KITTI benchmark and challenging real-world datasets demonstrates that our multi-sensor fusion effectively prevents the tracking failures typical of single-sensor systems. Specifically, compared to the LVI-SAM framework, frontend runtime is reduced by 49% and backend efficiency is improved by 25% in complex urban sequences. Furthermore, our approach achieves an average RMSE improvement of 35.3% over FAST-LIO2 and LIO-SAM in diverse real-world scenarios, particularly in environments with geometric degradation and lighting variations. These findings confirm the system's superior real-time efficiency and global localization accuracy on both standard benchmarks and in complex practical applications.
Keywords: fusion algorithm; front-end odometry; back-end optimization; degradation identification; loop detection; SLAM
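The adaptive confidence-based weighting described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; the function names, the normalization scheme, and the example confidence values are all assumptions chosen for illustration. The idea shown is the general one the abstract states: when one modality degrades (e.g., the camera under poor lighting), its confidence drops and its residuals contribute less to the joint optimization cost.

```python
def adaptive_weights(confidences, eps=1e-6):
    """Normalize per-modality confidence scores into residual weights.

    `confidences` maps a modality name to a non-negative score;
    the returned weights sum to (approximately) 1.
    """
    total = sum(confidences.values()) + eps
    return {m: c / total for m, c in confidences.items()}


def weighted_cost(residuals, weights):
    """Confidence-weighted sum-of-squares cost over all modalities."""
    return sum(
        weights[m] * sum(r * r for r in res)
        for m, res in residuals.items()
    )


# Hypothetical example: low light degrades the visual front end,
# so its confidence (and thus its weight) is reduced.
residuals = {"lidar": [0.1, -0.2], "visual": [0.5, 0.4], "imu": [0.05]}
confidences = {"lidar": 0.9, "visual": 0.2, "imu": 0.9}

w = adaptive_weights(confidences)
cost = weighted_cost(residuals, w)
```

In a real factor-graph backend the weights would typically enter as information (inverse-covariance) matrices on each residual block rather than as scalar multipliers, but the down-weighting effect on degraded modalities is the same.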

Share and Cite

MDPI and ACS Style

Xu, W.; Liu, S.; Chen, R.; Du, S.; Wang, Y. Robust Multi-Modal Factor Graph Optimization for Distributed Collaborative LiDAR–Visual–Inertial SLAM. Appl. Sci. 2026, 16, 4677. https://doi.org/10.3390/app16104677


