The EDI Multi-Modal Simultaneous Localization and Mapping Dataset (EDI-SLAM)
Abstract
1. Introduction
- The EDI courtyard and its surroundings—an urban landscape, a mix of structured built-up scenery and some dense vegetation, notable for poor-quality GNSS positioning data due to occlusions and multipath scattering;
- An open-field landscape, with high-quality GNSS data but segments that make frame-to-frame tracking difficult due to a lack of reliable tracking features;
- A forest road network—long, straight tracks with intermittent GNSS data and highly repetitive scenery, recorded on a vehicle.
2. Background
2.1. Conventions
2.2. SLAM Datasets
- NTU VIRAL (A Visual–Inertial–Ranging–Lidar Dataset for Autonomous Aerial Vehicles (2022) [15]) uses a similar sensor setup (LiDAR, IMU, a pair of cameras) and also employs an external ground truth annotation method. Unlike ours, their platform carries two LiDAR scanners mounted in mutually perpendicular planes, a configuration more important on an aerial vehicle, but their tracks do not include GNSS data. Their ground truth localization system is not designed for long, open-ended trajectories; instead, it uses a set of stationary ultra-wideband (UWB) radars to continuously localize the drone within a confined working area.
- The Multi-Vehicle Stereo Event Camera Dataset: An Event Camera Dataset for 3D Perception (2018) [16] is, as the name suggests, primarily intended for the development of systems that use event cameras. Although it treats the LiDAR primarily as a ground truth estimation tool, it nevertheless provides both LiDAR and GNSS data in addition to the primary stream of event camera and grayscale images. Furthermore, some of its trajectories are on the kilometer scale in length, enabling use in target applications similar to ours.
- The TUM-VI [19] visual odometry benchmark provided the inspiration for our portable sensor package collection method. However, it (alongside many other popular datasets collected in a similar manner) does not include LiDAR data. Moreover, it only provides GNSS-independent ground truth annotation at the starts and ends of trajectories, where motion tracking equipment can be deployed.
- The Malaga 2009 dataset [20] also consists of calibrated LiDAR, camera, and GNSS readings, with the GNSS data used to provide a ground truth trajectory. Furthermore, its authors conduct a much more extensive analysis of ground truth uncertainty metrics. The greatest practical differences between their approach and ours are the platform, collection environment, and purpose: we use a hand-held sensor package for most recordings, which is often lower, slower, and less stable (particularly in orientation); we collect data in less structured and/or GNSS-inhibited environments; and we target the evaluation of systems that directly fuse GNSS data, necessitating GNSS-independent ground truth measurements.
3. Materials and Methods
3.1. Hardware Setup
- Ouster OS1 rev7 32-line mechanical LiDAR, which outputs point clouds at 10 Hz and IMU data at 100 Hz.
- Basler Dart 1920-160uc global shutter cameras with 4 mm lenses. These are used with a software trigger for stereo applications, produce a center-cropped image, and were configured to collect RGB images at 30 frames per second for the recordings.
- Xsens MTi-680G RTK GNSS-IMU navigation unit (occluded in the image).
- Intel NUC compact form factor general-purpose computer, running Ubuntu Linux 20.04 LTS and ROS1 Noetic, to serve as the ROS master and perform data collection;
- Intel RealSense L515 depth camera (not used for this application);
- Reflective markers for testing and calibration with the OptiTrack motion tracking system;
- A voltage regulator that can supply the LiDAR and PC from a 6S Li-Po battery or direct current power supply.
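As a quick consistency check on a recorded track, the nominal rates listed above (10 Hz point clouds, 100 Hz LiDAR IMU data, 30 fps per camera) can be compared against a bag's actual per-topic statistics. The following is a minimal sketch using the standard ROS1 rosbag Python API; the file name edi_slam_track.bag is a placeholder, and the topic names match those in the topic table in Section 4.1:

```python
import rosbag

# Nominal sensor rates from the hardware description above.
NOMINAL_HZ = {
    "/points": 10.0,                  # Ouster OS1 point clouds
    "/imu_lidar": 100.0,              # Ouster built-in IMU
    "/camera_left/image_raw": 30.0,   # Basler Dart, software trigger
    "/camera_right/image_raw": 30.0,
}

with rosbag.Bag("edi_slam_track.bag") as bag:
    info = bag.get_type_and_topic_info()
    for topic, nominal in NOMINAL_HZ.items():
        stats = info.topics.get(topic)
        if stats is None:
            print(f"{topic}: missing from bag")
            continue
        # frequency is rosbag's empirical estimate; None if inestimable.
        measured = stats.frequency or 0.0
        print(f"{topic}: {stats.message_count} msgs, "
              f"{measured:.1f} Hz measured vs {nominal:.1f} Hz nominal")
```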
3.2. Calibration
3.3. Collection
3.4. Ground Truth Measurement
4. Structure, Contents, and Usage
4.1. The Dataset
4.2. Use in Evaluation
5. Conclusions and Future Work
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Abbreviations
Abbreviation | Meaning |
---|---|
EDI | Institute of Electronics and Computer Science (Latvia) |
SLAM | Simultaneous Localization and Mapping |
ENU | East–North–Up surface-aligned coordinate system |
GNSS | Global Navigation Satellite System |
IMU | Inertial Measurement Unit |
ROS | Robot Operating System |
LiDAR | Light Detection and Ranging |
RGB | Red–Green–Blue images |
ECEF | Earth-centered, Earth-fixed Euclidean frame |
RTK | Real-Time Kinematic GNSS |
ATE | Absolute Trajectory Error |
RPE | Relative Pose Error |
RMSE | Root Mean Squared Error |
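ATE, RPE, and RMSE above are the standard trajectory evaluation metrics popularized by the TUM RGB-D benchmark (Sturm et al. [32]). As a minimal sketch of how ATE RMSE can be computed, the estimated trajectory is first rigidly aligned to the ground truth with the closed-form Umeyama/Horn method, and the residuals are then aggregated; timestamp association between the two trajectories is assumed to have been done already, and both inputs are Nx3 position arrays:

```python
import numpy as np

def align_umeyama(est, gt):
    """Best-fit rotation R and translation t mapping est onto gt (Nx3 arrays)."""
    mu_e, mu_g = est.mean(axis=0), gt.mean(axis=0)
    # SVD of the cross-covariance gives the optimal rotation (Kabsch/Umeyama).
    u, _, vt = np.linalg.svd((gt - mu_g).T @ (est - mu_e))
    s = np.eye(3)
    s[2, 2] = np.sign(np.linalg.det(u @ vt))  # guard against reflections
    r = u @ s @ vt
    return r, mu_g - r @ mu_e

def ate_rmse(est, gt):
    """Absolute Trajectory Error (RMSE) after rigid alignment."""
    r, t = align_umeyama(est, gt)
    residuals = gt - (est @ r.T + t)
    return np.sqrt((residuals ** 2).sum(axis=1).mean())

# Example: a noisy, rotated copy of a synthetic ground truth path.
rng = np.random.default_rng(0)
gt = np.cumsum(rng.normal(size=(100, 3)), axis=0)
theta = 0.3
rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
est = gt @ rz.T + rng.normal(scale=0.05, size=gt.shape)
print(f"ATE RMSE: {ate_rmse(est, gt):.3f} m")
```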
References
1. Racinskis, P.; Ārents, J.; Greitans, M. Constructing Maps for Autonomous Robotics: An Introductory Conceptual Overview. Electronics 2023, 12, 2925.
2. Hess, W.; Kohler, D.; Rapp, H.; Andor, D. Real-time loop closure in 2D LIDAR SLAM. In Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 16–21 May 2016; pp. 1271–1278.
3. Keller, M.; Lefloch, D.; Lambers, M.; Izadi, S.; Weyrich, T.; Kolb, A. Real-Time 3D Reconstruction in Dynamic Scenes Using Point-Based Fusion. In Proceedings of the 2013 International Conference on 3D Vision, Seattle, WA, USA, 29 June–1 July 2013; pp. 1–8.
4. Mur-Artal, R.; Montiel, J.M.M.; Tardós, J.D. ORB-SLAM: A Versatile and Accurate Monocular SLAM System. IEEE Trans. Robot. 2015, 31, 1147–1163.
5. Campos, C.; Elvira, R.; Rodríguez, J.J.G.; Montiel, J.M.M.; Tardós, J.D. ORB-SLAM3: An Accurate Open-Source Library for Visual, Visual–Inertial, and Multimap SLAM. IEEE Trans. Robot. 2020, 37, 1874–1890.
6. Cho, Y. Awesome SLAM Datasets. Available online: https://github.com/youngguncho/awesome-slam-datasets (accessed on 27 October 2024).
7. Racinskis, P.; Arents, J.; Greitans, M. Annotating SLAM data sets with Apriltag markers. In Proceedings of the 2024 10th International Conference on Automation, Robotics and Applications (ICARA), Athens, Greece, 22–24 February 2024; pp. 438–442.
8. Wang, J.; Olson, E. AprilTag 2: Efficient and robust fiducial detection. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Republic of Korea, 9–14 October 2016.
9. ROS.org. ROS/Introduction. Available online: http://wiki.ros.org/ROS/Introduction (accessed on 31 October 2024).
10. Macenski, S.; Foote, T.; Gerkey, B.; Lalancette, C.; Woodall, W. Robot Operating System 2: Design, architecture, and uses in the wild. Sci. Robot. 2022, 7, eabm6074.
11. EPSG. WGS84—World Geodetic System 1984, Used in GPS. Available online: https://epsg.io/4326 (accessed on 27 October 2024).
12. EPSG. WGS84—Cartesian. Available online: https://epsg.io/4978 (accessed on 27 October 2024).
13. Xsens Technologies B.V. MTi Filter Profiles. Available online: https://base.movella.com/s/article/MTi-Filter-Profiles-1605869708823 (accessed on 27 October 2024).
14. Xsens Technologies B.V. MTi Family Reference Manual. Available online: https://www.xsens.com/hubfs/Downloads/Manuals/MTi_familyreference_manual.pdf (accessed on 27 October 2024).
15. Nguyen, T.M.; Yuan, S.; Cao, M.; Lyu, Y.; Nguyen, T.H.; Xie, L. NTU VIRAL: A Visual-Inertial-Ranging-Lidar Dataset, From an Aerial Vehicle Viewpoint. Int. J. Robot. Res. 2022, 41, 270–280.
16. Zhu, A.Z.; Thakur, D.; Özaslan, T.; Pfrommer, B.; Kumar, V.; Daniilidis, K. The Multivehicle Stereo Event Camera Dataset: An Event Camera Dataset for 3D Perception. IEEE Robot. Autom. Lett. 2018, 3, 2032–2039.
17. Geiger, A.; Lenz, P.; Stiller, C.; Urtasun, R. Vision meets Robotics: The KITTI Dataset. Int. J. Robot. Res. 2013, 32, 1231–1237.
18. Liao, Y.; Xie, J.; Geiger, A. KITTI-360: A Novel Dataset and Benchmarks for Urban Scene Understanding in 2D and 3D. IEEE Trans. Pattern Anal. Mach. Intell. 2022, 45, 3292–3310.
19. Schubert, D.; Goll, T.; Demmel, N.; Usenko, V.C.; Stückler, J.; Cremers, D. The TUM VI Benchmark for Evaluating Visual-Inertial Odometry. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 1680–1687.
20. Blanco, J.L.; Moreno, F.A.; González, J. A Collection of Outdoor Robotic Datasets with Centimeter-Accuracy Ground Truth. Auton. Robot. 2009, 27, 327–351.
21. Ouster, Inc. Official ROS1/ROS2 Drivers for Ouster Sensors. Available online: https://github.com/ouster-lidar/ouster-ros/tree/master (accessed on 28 October 2024).
22. jiminghe. Xsens MTi ROS Driver and Ntrip Client. Available online: https://github.com/jiminghe/Xsens_MTi_ROS_Driver_and_Ntrip_Client (accessed on 28 October 2024).
23. Basler A.G. Pylon SDKs. Available online: https://www.baslerweb.com/en/software/pylon/sdk/ (accessed on 28 October 2024).
24. ROS.org. A ROS-Driver for Basler Cameras. Available online: http://wiki.ros.org/pylon_camera (accessed on 28 October 2024).
25. Furgale, P.T.; Rehder, J.; Siegwart, R.Y. Unified temporal and spatial calibration for multi-sensor systems. In Proceedings of the 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, Tokyo, Japan, 3–7 November 2013; pp. 1280–1286.
26. EPSG. LKS-92/Latvia TM. Available online: https://epsg.io/3059 (accessed on 29 October 2024).
27. Latvijas Ģeotelpiskās Informācijas Aģentūra. Latvian Quasi-Geoid Model. Available online: https://www.lgia.gov.lv/en/latvian-quasi-geoid-model (accessed on 29 October 2024).
28. Terzakis, G.; Lourakis, M.I.A. A Consistently Fast and Globally Optimal Solution to the Perspective-n-Point Problem. In Proceedings of the European Conference on Computer Vision, Glasgow, UK, 23–28 August 2020.
29. Itseez. Open Source Computer Vision Library. 2015. Available online: https://github.com/itseez/opencv (accessed on 31 October 2024).
30. EDI. EDI-SLAM Data. 2024. Available online: http://edi.lv/EDI-SLAM_dataset (accessed on 31 October 2024).
31. Creative Commons. Attribution-NonCommercial-ShareAlike 4.0 International. Available online: https://creativecommons.org/licenses/by-nc-sa/4.0/ (accessed on 1 November 2024).
32. Sturm, J.; Engelhard, N.; Endres, F.; Burgard, W.; Cremers, D. A benchmark for the evaluation of RGB-D SLAM systems. In Proceedings of the 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems, Abu Dhabi, United Arab Emirates, 23–27 October 2012; pp. 573–580.
33. Shoemake, K. Animating rotation with quaternion curves. In Proceedings of the 12th Annual Conference on Computer Graphics and Interactive Techniques, San Francisco, CA, USA, 22–26 July 1985.
Topic | Message Type | Frame ID/Notes |
---|---|---|
/points | sensor_msgs/msg/PointCloud2 | os_sensor |
/gnss | sensor_msgs/msg/NavSatFix | |
/imu_lidar | sensor_msgs/msg/Imu | os_imu |
/imu_xsens | sensor_msgs/msg/Imu | xsens |
/gnss_pose | geometry_msgs/msg/PoseStamped | lat, lon, alt, ENU |
/camera_left/image_raw | sensor_msgs/msg/Image | cam0 |
/camera_right/image_raw | sensor_msgs/msg/Image | cam1 |
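The /gnss_pose topic carries the GNSS fix re-expressed as a metric pose, i.e., latitude, longitude, and altitude mapped into a local ENU frame (see the ENU and ECEF entries in the Abbreviations). Below is a minimal sketch of that standard conversion, geodetic to ECEF on the WGS84 ellipsoid and then ECEF to ENU about a chosen origin; the use of the first fix as the ENU origin and the sample coordinates are illustrative assumptions, not necessarily the dataset's exact convention:

```python
import numpy as np

# WGS84 ellipsoid constants (EPSG:4326 geodetic / EPSG:4978 Cartesian).
A = 6378137.0                # semi-major axis, m
F = 1.0 / 298.257223563      # flattening
E2 = F * (2.0 - F)           # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, alt_m):
    """Convert a WGS84 geodetic coordinate to an ECEF point in meters."""
    lat, lon = np.radians(lat_deg), np.radians(lon_deg)
    n = A / np.sqrt(1.0 - E2 * np.sin(lat) ** 2)  # prime vertical radius
    x = (n + alt_m) * np.cos(lat) * np.cos(lon)
    y = (n + alt_m) * np.cos(lat) * np.sin(lon)
    z = (n * (1.0 - E2) + alt_m) * np.sin(lat)
    return np.array([x, y, z])

def ecef_to_enu(point_ecef, origin_geodetic):
    """Express an ECEF point in the ENU frame anchored at origin_geodetic."""
    lat0, lon0, alt0 = origin_geodetic
    lat, lon = np.radians(lat0), np.radians(lon0)
    # Rotation from ECEF axes to local East-North-Up axes at the origin.
    r = np.array([
        [-np.sin(lon),                np.cos(lon),               0.0],
        [-np.sin(lat) * np.cos(lon), -np.sin(lat) * np.sin(lon), np.cos(lat)],
        [ np.cos(lat) * np.cos(lon),  np.cos(lat) * np.sin(lon), np.sin(lat)],
    ])
    return r @ (point_ecef - geodetic_to_ecef(lat0, lon0, alt0))

# Example: offset of a second fix relative to the first, in meters ENU.
origin = (56.9496, 24.1052, 10.0)  # hypothetical first fix (near Riga)
print(ecef_to_enu(geodetic_to_ecef(56.9497, 24.1053, 10.0), origin))
```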
Track | LiDAR | Images (per Camera) | l, m | v, m/s | size, GB |
---|---|---|---|---|---|
courtyard_gt | 5599 | 17,152 | | | 42.9 |
saga_gt | 7905 | 24,344 | | | 60.9 |
ropazi_gt | 11,673 | 35,375 | | | 88.8 |
courtyard_no_gt | 4366 | 13,458 | - | - | 33.6 |
saga_no_gt | 3320 | 10,315 | - | - | 25.8 |
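The message counts also imply approximate recording durations via the nominal sensor rates from Section 3.1; for example, 5599 point clouds at 10 Hz correspond to roughly 560 s. A small sketch of that arithmetic, using the counts from the table above:

```python
# Approximate recording duration per track from message counts and rates.
LIDAR_HZ, CAMERA_FPS = 10.0, 30.0

tracks = {  # track: (lidar_scans, images_per_camera)
    "courtyard_gt": (5599, 17152),
    "saga_gt": (7905, 24344),
    "ropazi_gt": (11673, 35375),
    "courtyard_no_gt": (4366, 13458),
    "saga_no_gt": (3320, 10315),
}

for name, (scans, images) in tracks.items():
    # The two estimates should roughly agree if no messages were dropped.
    print(f"{name}: ~{scans / LIDAR_HZ:.0f} s by LiDAR count, "
          f"~{images / CAMERA_FPS:.0f} s by camera count")
```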