A Low-Cost and Robust Multi-Sensor Data Fusion Scheme for Heterogeneous Multi-Robot Cooperative Positioning in Indoor Environments
Abstract
1. Introduction
2. Methods
2.1. Integration Architecture
2.2. Coordinate Frames Definition and Transformation
2.2.1. Definition of Global and Local Coordinate Frames
2.2.2. Transformation of Visual Coordinate Frames
2.3. Model Design
2.3.1. Odometer and Gyroscope Integration (Step 1—Self Positioning)
2.3.2. Distributed Data Fusion of Visual Positioning Subsystem and Laser Observation Subsystem (Step 2—Collaborative Positioning)
- Inertial navigation system state derivation
- Visual target detection and localization
- Laser observation and localization
2.3.3. Joint Filtering Algorithm (Step 3—Joint Filtering)
- Completing the solution of the information allocation factors β1, β2. The information allocation factors βi satisfy the information conservation principle, which means β1 + … + βn + βm = 1, where βm represents the information distribution coefficient of the main filter; the values of β1, …, βn directly impact the performance of joint filtering. Typically, larger coefficients are assigned to sensors with higher measurement accuracy. However, since the external sensor subsystems employed in this study may experience sudden measurement errors such as drift, a fixed coefficient is not appropriate. Hence, we employ an eigenvalue decomposition of the true variance Pi to calculate the value of βi in real time. Pi can be decomposed as Pi = Ui Λi Ui^T, where Λi = diag(λi1, …, λik) collects the eigenvalues of Pi; the value of βi is then deduced from these eigenvalues (one possible allocation rule is sketched after this list).
- Initialize the global state estimate and its covariance matrix, and assign the information to each sub-filter and the main filter in proportion to the information factor βi.
- Apply the time update simultaneously to each sub-filter and the main filter, allocating the common process noise to each sub-filter according to the information factor βi.
- Each sub-filter uses its local observation Zi for the measurement correction; the corrected local estimates are then fused back into the global estimate, as the sketch below walks through.
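To make the steps above concrete, the following minimal Python sketch runs one cycle of a federated (joint) filter. It is an illustration under our own assumptions, not the paper's exact formulation: the allocation rule beta_from_covariances (shares inversely proportional to the largest eigenvalue of each sub-filter covariance, with the main filter's share βm set to 0) and the closing information-weighted fusion step are standard federated-filter choices substituted for details omitted here.

```python
import numpy as np

def beta_from_covariances(P_list):
    # Hypothetical eigenvalue-based allocation rule: sub-filter i receives an
    # information share inversely proportional to the largest eigenvalue of
    # its covariance P_i, so a degraded sensor (e.g., a drifting visual fix)
    # is down-weighted automatically. Shares satisfy sum(beta_i) = 1, i.e.,
    # the information conservation principle with beta_m = 0.
    w = np.array([1.0 / np.linalg.eigvalsh(P).max() for P in P_list])
    return w / w.sum()

def joint_filter_cycle(x_g, P_g, F, Q, subfilters):
    """One federated-filter cycle: allocate -> predict -> correct -> fuse.

    subfilters: list of dicts with keys
      P_prev -- the sub-filter covariance from the previous cycle,
      H, R, z -- local observation matrix, noise covariance, observation Z_i.
    """
    betas = beta_from_covariances([s["P_prev"] for s in subfilters])
    locals_ = []
    for beta, s in zip(betas, subfilters):
        # Step 1: information allocation, P_i^{-1} = beta_i * P_g^{-1}.
        x_i, P_i = x_g.copy(), P_g / beta
        # Step 2: time update; common process noise split as Q_i = Q / beta_i.
        x_i = F @ x_i
        P_i = F @ P_i @ F.T + Q / beta
        # Step 3: measurement update with the local observation Z_i.
        K = P_i @ s["H"].T @ np.linalg.inv(s["H"] @ P_i @ s["H"].T + s["R"])
        x_i = x_i + K @ (s["z"] - s["H"] @ x_i)
        P_i = (np.eye(len(x_g)) - K @ s["H"]) @ P_i
        locals_.append((x_i, P_i))
    # Fusion: information-weighted combination of the local estimates,
    # P_g = (sum_i P_i^{-1})^{-1} and x_g = P_g * sum_i (P_i^{-1} x_i).
    P_g = np.linalg.inv(sum(np.linalg.inv(P) for _, P in locals_))
    x_g = P_g @ sum(np.linalg.inv(P) @ x for x, P in locals_)
    return x_g, P_g, betas
```

Because the betas are recomputed from the previous covariances on every cycle, a sub-filter whose uncertainty suddenly grows is automatically starved of information share in the next cycle, which is the behavior the adaptive scheme is designed to achieve.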
3. Analysis and Correction of Practical Errors
3.1. Visual Target Detection Delay
3.2. Network Communication Latency
3.3. Observation Masking
3.4. Requirements of the Filtering Process for the Observed Values
4. Results and Discussion
4.1. Test Platform
4.2. Test Scenarios
4.3. Test Results and Discussion
4.3.1. Odometer and Gyroscope Information Fusion
4.3.2. Joint Filter Positioning
4.3.3. Observation Occlusion Error Handling
5. Conclusions
- (1) Leveraging current technology [28], we aim to enhance and optimize our scheme by addressing the observation errors that arise in the information fusion process and the time-delay errors under diverse formation configurations, varying numbers of robots, and different communication conditions. This challenging work will further refine the existing architectural scheme.
- (2) We will examine the impact of applying the existing algorithms in environments with height changes and develop corresponding 3D positioning information fusion schemes, which should enable a more accurate estimation of trip energy consumption to support advanced future applications.
- (3) Based on the accurate position information of member robots and obstacles, we will further investigate formation planning, holding, and control algorithms for autonomous mobile robot groups in combination with currently available algorithmic models [29,30], and propose optimally efficient solutions for different task scenarios.
- (4) We will study the interaction strategies of multi-robot systems under specific topologies and design transmission strategies for the information in the interaction network (e.g., sensor data, communication protocols, control information, and images) to meet the requirements of formation missions for high communication performance.
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Lupton, T.; Sukkarieh, S. Visual-Inertial-Aided Navigation for High-Dynamic Motion in Built Environments Without Initial Conditions. IEEE Trans. Robot. 2011, 28, 61–76.
- Kelly, J.; Sukhatme, G.S. Visual-Inertial Sensor Fusion: Localization, Mapping and Sensor-to-Sensor Self-calibration. Int. J. Robot. Res. 2010, 30, 56–79.
- Jones, E.S.; Soatto, S. Visual-inertial navigation, mapping and localization: A scalable real-time causal approach. Int. J. Robot. Res. 2011, 30, 407–430.
- Shen, S. Autonomous Navigation in Complex Indoor and Outdoor Environments with Micro Aerial Vehicles; University of Pennsylvania ProQuest Dissertations Publishing: Philadelphia, PA, USA, 2014.
- Lu, Y.; Lee, J.; Yeh, S.H.; Cheng, H.M.; Chen, B.; Song, D. Sharing Heterogeneous Spatial Knowledge: Map Fusion Between Asynchronous Monocular Vision and Lidar or Other Prior Inputs. Robot. Res. 2019, 10, 727–741.
- Howard, A.; Matarić, M.J.; Sukhatme, G.S. Localization for Mobile Robot Teams: A Distributed MLE Approach. Exp. Robot. VIII 2003, 5, 146–155.
- Sun, R.; Yang, Y.; Chiang, K.-W.; Duong, T.-T.; Lin, K.-Y.; Tsai, G.-J. Robust IMU/GPS/VO Integration for Vehicle Navigation in GNSS Degraded Urban Areas. IEEE Sens. J. 2020, 20, 10110–10122.
- Anbu, N.A.; Jayaprasanth, D. Integration of Inertial Navigation System with Global Positioning System using Extended Kalman Filter. In Proceedings of the International Conference on Smart Systems and Inventive Technology (ICSSIT), Tirunelveli, India, 27–29 November 2019; pp. 789–794.
- Mourikis, A.I.; Roumeliotis, S.I. A Multi-State Constraint Kalman Filter for Vision-aided Inertial Navigation. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Rome, Italy, 10–14 April 2007; pp. 3565–3572.
- Dou, L.; Li, M.; Li, Y.; Zhao, Q.Y.; Li, J.; Wang, Z. A novel artificial bee colony optimization algorithm for global path planning of multi-robot systems. In Proceedings of the IEEE International Conference on Robotics and Biomimetics, Bali, Indonesia, 5–10 December 2014.
- Howard, A.; Matarić, M.J.; Sukhatme, G.S. Localization for mobile robot teams using maximum likelihood estimation. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Lausanne, Switzerland, 30 September–4 October 2002.
- Chhatpar, S.R.; Branicky, M.S. Particle filtering for localization in robotic assemblies with position uncertainty. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Edmonton, AB, Canada, 2–6 August 2005.
- Burgard, W.; Cremers, A.B.; Fox, D.; Hähnel, D.; Lakemeyer, G.; Schulz, D.; Steiner, W.; Thrun, S. Experiences with an interactive museum tour-guide robot. Artif. Intell. 1999, 114, 3–55.
- Fox, D.; Burgard, W.; Kruppa, H.; Thrun, S. A Probabilistic Approach to Collaborative Multi-Robot Localization. Auton. Robots 2000, 8, 325–344.
- Rekleitis, I.M.; Dudek, G.; Milios, E.E. Multi-robot cooperative localization: A study of trade-offs between efficiency and accuracy. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Lausanne, Switzerland, 30 September–4 October 2002.
- Ning, B.; Han, Q.L.; Zuo, Z.; Jin, J.; Zheng, J. Collective Behaviors of Mobile Robots Beyond the Nearest Neighbor Rules With Switching Topology. IEEE Trans. Cybern. 2017, 48, 1577–1590.
- Shucker, B.; Murphey, T.; Bennett, J.K. A method of cooperative control using occasional non-local interactions. In Proceedings of the 2006 IEEE International Conference on Robotics and Automation (ICRA), Orlando, FL, USA, 15–19 May 2006.
- Garcia, M.A.; Solanas, A. 3D simultaneous localization and modeling from stereo vision. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), New Orleans, LA, USA, 26 April–1 May 2004.
- Kannala, J.; Heikkilä, J.; Brandt, S.S. Geometric Camera Calibration. Wiley Encycl. Comput. Sci. Eng. 2008, 13, 1–20.
- Zong, G.; Deng, L.; Wang, W. Robust localization algorithms for outdoor mobile robot. J. Beijing Univ. Aeronaut. Astronaut. 2007, 33, 454–458.
- Wang, C.Y.; Bochkovskiy, A.; Liao, H.Y. Scaled-YOLOv4: Scaling Cross Stage Partial Network. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Nashville, TN, USA, 20–25 June 2021; pp. 13029–13038.
- Wang, C.Y.; Liao, H.Y.; Wu, Y.H.; Chen, P.Y.; Hsieh, J.W.; Yeh, I.H. CSPNet: A New Backbone That Can Enhance Learning Capability of CNN. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, Seattle, WA, USA, 13–19 June 2020; pp. 390–391.
- Hu, G.X.; Yang, Z.; Hu, L.; Huang, L.; Han, J.M. Small Object Detection with Multiscale Features. Int. J. Digit. Multimed. Broadcast. 2018, 2018, 4546896.
- Jilkov, V.P.; Angelova, D.S.; Semerdjiev, T.A. Design and comparison of mode-set adaptive IMM algorithms for maneuvering target tracking. IEEE Trans. Aerosp. Electron. Syst. 1999, 35, 343–350.
- Wang, L.; Liu, Y.H.; Wan, J.W.; Shao, J.X. Multi-Robot Cooperative Localization Based on Relative Bearing. Chin. J. Sens. Actuators 2007, 20, 794–799.
- Wang, J. Multi-Robot Collaborative SLAM Algorithm Research. Master’s Thesis, Xi’an University of Technology, Xi’an, China, 2023; pp. 47–63.
- Zhu, K.; Wen, Z.; Di, S.; Zhang, F.; Guo, G.; Kang, H. Observability and Co-Positioning Accuracy Analysis of Multi-Robot Systems. Telecommun. Eng. 2023, 51, 4–8.
- Chiang, K.W.; Duong, T.T.; Liao, J.K. The Performance Analysis of a Real-Time Integrated INS/GPS Vehicle Navigation System with Abnormal GPS Measurement Elimination. Sensors 2013, 13, 10599–10622.
- Yang, S.; Li, T.; Shi, Q.; Bai, W.; Wu, Y. Artificial Potential-Based Formation Control with Collision and Obstacle Avoidance for Second-order Multi-Agent Systems. In Proceedings of the 2020 7th International Conference on Information, Cybernetics, and Computational Social Systems (ICCSS), Guangzhou, China, 13–15 November 2021.
- Lu, Q.; Han, Q.L.; Zhang, B.; Liu, D.; Liu, S. Cooperative Control of Mobile Sensor Networks for Environmental Monitoring: An Event-Triggered Finite-Time Control Scheme. IEEE Trans. Cybern. 2016, 47, 4134–4147.
Sub-Process | Time (ms) | Description |
---|---|---|
Camera imaging | ≤5 | Imaging of the target object; pre-processing, scaling, and normalization of the input image. |
Target detection | 20–25 | Feature-pyramid-based target prediction using a classifier trained with a convolutional neural network. |
Laser positioning | ≤5 | Runs in parallel with the two processes above. |
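The budget above means a visual fix lags real time by roughly 25–30 ms once imaging and detection are summed. One standard compensation, sketched below under our own assumptions (the function compensate_visual_delay and the odometry-buffer format are illustrative, not the paper's interface), is to timestamp each frame and dead-reckon the delayed fix forward to the present using buffered velocity samples.

```python
import numpy as np

def compensate_visual_delay(fix_xy, t_capture, t_now, odom_buffer):
    """Dead-reckon a delayed visual position fix forward to the current time.

    fix_xy      -- (x, y) of the detected target at image-capture time.
    odom_buffer -- list of (t, vx, vy) velocity samples, oldest first, in the
                   same frame as fix_xy (an assumption of this sketch).
    """
    xy = np.asarray(fix_xy, dtype=float)
    prev_t = t_capture
    for t, vx, vy in odom_buffer:
        if t <= t_capture:
            continue                       # sample predates the image
        dt = min(t, t_now) - prev_t        # integrate only inside the window
        xy = xy + dt * np.array([vx, vy])  # zero-order-hold dead reckoning
        prev_t = min(t, t_now)
        if t >= t_now:
            break
    return xy

# A 30 ms pipeline delay (<=5 ms imaging + 20-25 ms detection) at the maximum
# test speed of 800 mm/s displaces the apparent target by up to ~24 mm; this
# correction removes that offset before the fix enters the joint filter.
fix_now = compensate_visual_delay(
    fix_xy=(1.00, 2.00), t_capture=0.000, t_now=0.030,
    odom_buffer=[(0.010, 0.8, 0.0), (0.020, 0.8, 0.0), (0.030, 0.8, 0.0)])
# fix_now ≈ [1.024, 2.0]
```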
Serial Number | Equipment Name | Main Sensors and Information Processing Units | Robot Function Description |
---|---|---|---|
1 | Main robot (Pioneer 3-AT) | Odometer and gyroscope for self-positioning; camera and laser rangefinder for the visual detection and laser observation subsystems; onboard processing unit. | Leads the formation, performs self-positioning, observes the member robots, and runs the joint filter. |
2 | Member robot (AmigoBot) | Odometer and gyroscope; onboard controller with network communication. | Follows in formation, performs self-positioning, and exchanges state information with the main robot over the network. |
Test Design | Test Case 1 | Test Case 2 | Test Case 3 |
---|---|---|---|
Test scenario description | Circular motion of a single robot | The main robot leads the members in an S-curve movement in formation at a fixed relative position | The member robots follow the main robot in a circular motion along the same trajectory in formation |
Environment | Empty indoor environment; the member robots move in formation with the main robot along a designated trajectory | | |
Fusion domain | Positioning domain in an open indoor environment; robot movement speed ≤ 800 mm/s | | |
Total time duration | 50 s | 120 s | 120 s |
Competing solutions | Odo only; Odo/Gyroscope | Odo/Gyroscope; joint filtering algorithm (proposed) | Odo/Gyroscope; joint filtering algorithm (proposed, with observation masking present) |
Reference | The reference trajectory is obtained from a high-precision integrated navigation unit (HGuide n580) and post-processed with NovAtel Inertial Explorer (IE) 8.9 commercial software. | | |