Review

Multi-Camera Simultaneous Localization and Mapping for Unmanned Systems: A Survey

Guoyan Wang, Likun Wang, Jun He, Yanwen Jiang, Qiming Qi and Yueshang Zhou

1 College of Electronic Science and Technology, National University of Defense Technology, Changsha 410073, China
2 School of Artificial Intelligence and Robotics, Hunan University, Changsha 410012, China
* Author to whom correspondence should be addressed.
Electronics 2026, 15(3), 602; https://doi.org/10.3390/electronics15030602
Submission received: 20 December 2025 / Revised: 17 January 2026 / Accepted: 21 January 2026 / Published: 29 January 2026

Abstract

Autonomous navigation in unmanned systems increasingly relies on robust perception and mapping capabilities in large-scale, dynamic, and unstructured environments. Multi-camera simultaneous localization and mapping (MCSLAM) has emerged as a promising solution owing to its wider field-of-view coverage, redundancy, and robustness compared with single-camera systems. However, deploying MCSLAM introduces several technical challenges that remain insufficiently addressed in the existing literature, including the high dimensionality of multi-view visual data; the computational cost of multi-view geometry and large-scale bundle adjustment; and strict requirements on camera calibration, temporal synchronization, and geometric consistency across heterogeneous viewpoints. This survey provides a comprehensive review of recent advances in MCSLAM for unmanned systems, categorizing existing approaches by system configuration, field-of-view overlap, calibration strategy, and optimization framework. We further analyze common failure modes, evaluate representative algorithms, and identify emerging research trends toward scalable, real-time, and uncertainty-aware MCSLAM in complex operational environments.
Keywords: multi-camera simultaneous localization and mapping; vision sensors; intrinsic and extrinsic calibration; pose and depth estimation; mapping; deep learning; datasets

Share and Cite

MDPI and ACS Style

Wang, G.; Wang, L.; He, J.; Jiang, Y.; Qi, Q.; Zhou, Y. Multi-Camera Simultaneous Localization and Mapping for Unmanned Systems: A Survey. Electronics 2026, 15, 602. https://doi.org/10.3390/electronics15030602


Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
