Multi-Domain Robot Swarm for Industrial Mapping and Asset Monitoring: Technical Challenges and Solutions
Abstract
1. Introduction
Main Contributions
- A UAV–UGV collaborative framework for industrial inspection and asset monitoring, supporting real-time coordination between aerial and ground robots via event-triggered TCP/IP communication in ROS and Simulink environments.
- Implementation of a LiDAR-based SLAM system on a UGV (TurtleBot 2) for real-time map generation and autonomous navigation in cluttered indoor environments. The SLAM Toolbox was fine-tuned for mapping accuracy and validated through RViz and Gazebo simulations.
- Design of a vision-based autonomous UAV landing system using AprilTags, with pose estimation supported by OptiTrack ground-truth validation. The system combines visual tracking with onboard Proportional-Integral-Derivative (PID) and Proportional-Integral-Velocity (PIV) control loops to enable landing on a mobile UGV platform.
- Development of an end-to-end digital meter gauge reading pipeline that combines YOLOv5 for object detection, U-Net for region segmentation, and PaddleOCR for numerical value extraction, achieving 87.5% accuracy on real-world UAV-captured images.
- Establishment of a hardware-in-the-loop (HIL) simulation environment for validating swarm behavior, vision algorithms, and inspection routines under realistic constraints. The Quanser Quarc platform enabled direct deployment of Simulink models to UAV hardware.
- Experimental validation in both simulation and physical lab environments, demonstrating the system’s feasibility for multi-domain robotic collaboration under real-world conditions, including communication latency, sensor noise, and environmental variability.
2. Related Work
3. System Overview
4. Experimental Setup
4.1. Selection of UAV System
4.2. Preliminary Modification to the Standard Robot
4.3. Lab Setup
- SLAM performance in obstacle-rich environments,
- Autonomous navigation and obstacle avoidance by the UGV,
- Precise UAV landing using vision-based detection,
- System coordination under event-triggered socket communication,
- AI-based meter reading using real imagery captured in the lab.
5. Development and System Implementation
5.1. UGV Mapping and Navigation
- Map resolution: 0.05 m per grid cell, balancing accuracy and computational load.
- Update rate: Adjusted for real-time responsiveness during continuous scan processing.
- Sensor input: Real-time laser scan data from the onboard RPLiDAR sensor.
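The 0.05 m grid resolution above fixes how metric LiDAR returns map onto occupancy-grid cells. A minimal sketch of that conversion follows; the function names, the scanner range limits, and the default pose are illustrative assumptions, not the paper's implementation:

```python
import math

RESOLUTION = 0.05  # metres per grid cell, matching the SLAM Toolbox setting above

def world_to_cell(x, y, origin_x=0.0, origin_y=0.0, resolution=RESOLUTION):
    """Map a metric point (x, y) to occupancy-grid indices (col, row)."""
    return (int(math.floor((x - origin_x) / resolution)),
            int(math.floor((y - origin_y) / resolution)))

def scan_to_cells(ranges, angle_min, angle_increment, pose=(0.0, 0.0, 0.0)):
    """Project laser-scan range returns into the grid cells hit by each beam endpoint."""
    px, py, ptheta = pose
    cells = []
    for i, r in enumerate(ranges):
        if not (0.1 < r < 12.0):  # drop out-of-range returns (RPLiDAR-like limits, assumed)
            continue
        a = ptheta + angle_min + i * angle_increment
        cells.append(world_to_cell(px + r * math.cos(a), py + r * math.sin(a)))
    return cells
```

At this resolution a 1.0 m return along the x-axis lands twenty cells from the robot, which is the accuracy/compute trade-off the bullet list describes.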
5.2. UAV Vision-Based Landing and Localization
5.3. Communication Between Mobile Robot and Drone System
Algorithm 1: Event-Based Communication between UGV and UAV via GCS
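The event-triggered TCP/IP exchange of Algorithm 1 can be sketched as follows. This is a minimal illustration only: the length-prefixed JSON wire format, the event names, and the helper functions are assumptions, not the message protocol actually used between the UGV, UAV, and GCS:

```python
import json
import socket

def encode_event(event, payload):
    """Serialise an event message as length-prefixed JSON (wire format assumed here)."""
    body = json.dumps({"event": event, "payload": payload}).encode("utf-8")
    return len(body).to_bytes(4, "big") + body

def decode_event(data):
    """Inverse of encode_event: strip the 4-byte length prefix and parse the JSON body."""
    length = int.from_bytes(data[:4], "big")
    msg = json.loads(data[4:4 + length].decode("utf-8"))
    return msg["event"], msg["payload"]

def notify_gcs(host, port, event, payload):
    """Event-triggered send: a TCP connection is opened only when an event fires,
    rather than streaming state continuously."""
    with socket.create_connection((host, port), timeout=2.0) as sock:
        sock.sendall(encode_event(event, payload))
```

For example, the UGV might call `notify_gcs(gcs_host, 5000, "ugv_arrived", {"x": 1.2, "y": 0.4})` when it reaches a waypoint, and the GCS would then trigger the UAV's inspection routine.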
5.4. Vision-Based Landing
5.4.1. Vision-Based Localization and Tag Detection
5.4.2. Motion Capture Integration and Control Architecture
5.4.3. System Implementation, Safety, and Autonomous Landing Logic
Algorithm 2: Vision-Based UAV Landing
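The landing logic of Algorithm 2 couples tag-tracking PID loops with a descend-only-when-aligned rule. A hedged sketch of one control step is shown below; the gains, tolerances, and descent rate are placeholders, not the tuned values used on the QDrone:

```python
class PID:
    """Discrete PID controller (gains here are illustrative, not the paper's tuned values)."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def update(self, error):
        self.integral += error * self.dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv

def landing_step(tag_offset_xy, altitude, pid_x, pid_y,
                 descent_rate=0.2, align_tol=0.05):
    """One control step: drive the lateral offset to the AprilTag toward zero,
    and command descent only while the UAV is centred within align_tol metres."""
    vx = pid_x.update(-tag_offset_xy[0])
    vy = pid_y.update(-tag_offset_xy[1])
    aligned = abs(tag_offset_xy[0]) < align_tol and abs(tag_offset_xy[1]) < align_tol
    vz = -descent_rate if aligned and altitude > 0.0 else 0.0
    return vx, vy, vz
```

The key safety property is that `vz` stays zero whenever the tag offset exceeds the alignment tolerance, so the vehicle never descends while drifting off the moving UGV platform.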
5.5. AI-Driven Meter Gauge Reading
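The gauge-reading pipeline chains three stages: YOLOv5 detects the meter display, U-Net segments the digit region, and PaddleOCR extracts the numeric value. A minimal sketch of that composition is given below; the `detect`, `segment`, and `ocr` callables are stand-ins for those models, and the digit-filtering step is an assumption about post-processing:

```python
def read_meter(image, detect, segment, ocr):
    """Three-stage gauge-reading flow: detect display -> segment digits -> OCR value.
    detect/segment/ocr are injected stand-ins for YOLOv5, U-Net, and PaddleOCR."""
    boxes = detect(image)              # candidate display bounding boxes
    if not boxes:
        return None                    # no meter found in the frame
    region = segment(image, boxes[0])  # digit region within the best detection
    text = ocr(region)                 # raw OCR string, may contain units/noise
    digits = "".join(ch for ch in text if ch.isdigit() or ch == ".")
    return float(digits) if digits else None
```

Injecting the three stages as callables keeps the pipeline testable without GPU model weights; in deployment each stand-in would wrap the corresponding trained network.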
6. Results and Discussion
6.1. UGV Mapping and Navigation Performance
6.2. UAV Flight Control and Real-Time Performance
6.3. Vision-Based UAV Landing Accuracy
6.4. Digital Meter Reading via AI and OCR
The reading accuracy is computed as Accuracy = (N_correct / N_total) × 100%, where:
- N_correct = number of images with correctly extracted digits.
- N_total = total number of images.
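A minimal sketch of this accuracy computation follows; the counts in the example are illustrative ratios that yield 87.5%, not the paper's actual image tallies:

```python
def extraction_accuracy(n_correct, n_total):
    """Accuracy (%) = images with correctly extracted digits / total images x 100."""
    if n_total == 0:
        raise ValueError("no images evaluated")
    return 100.0 * n_correct / n_total

# Any ratio equal to 7/8 (e.g. 28 of 32 images, illustrative counts) gives 87.5%.
```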
6.5. YOLO-Based Object Detection Performance
7. Conclusions, Limitations, and Future Work
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Bouachir, O.; Aloqaily, M.; Al Ridhawi, I.; Alfandi, O.; Salameh, H.B. UAV-assisted vehicular communication for densely crowded environments. In Proceedings of the NOMS 2020–2020 IEEE/IFIP Network Operations and Management Symposium, Budapest, Hungary, 20–24 April 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 1–4. [Google Scholar]
- Lee, H.; Yoon, J.; Jang, M.S.; Park, K.J. A robot operating system framework for secure UAV communications. Sensors 2021, 21, 1369. [Google Scholar] [CrossRef] [PubMed]
- Mahdoui, N.; Frémont, V.; Natalizio, E. Communicating multi-UAV system for cooperative SLAM-based exploration. J. Intell. Robot. Syst. 2020, 98, 325–343. [Google Scholar] [CrossRef]
- Javaid, A.; Alduais, A.; Shullar, M.H.; Baroudi, U.; Alnaser, M. Monocular-Based collision avoidance system for unmanned aerial vehicle. IET Smart Cities 2024, 6, 1–9. [Google Scholar] [CrossRef]
- Javaid, A.; Syed, M.A.; Baroudi, U. A machine learning based method for object detection and localization using a monocular rgb camera equipped drone. In Proceedings of the 2023 International Wireless Communications and Mobile Computing (IWCMC), Marrakesh, Morocco, 19–23 June 2023; IEEE: Piscataway, NJ, USA, 2023; pp. 1–6. [Google Scholar]
- Eltayeb, A.; Ouerdane, F.; Ibnouf, A.; Javaid, A.; Aremu, B.M.; Abubaker, A.; Abdel-Nasser, M.; Sattar, K.A.; El Ferik, S.; Al-Nasser, M. Challenges in Multi-domain Robot Swarm for Industrial Mapping and Asset Monitoring. Transp. Res. Procedia 2025, 84, 534–542. [Google Scholar] [CrossRef]
- Jiang, X.; Wang, F.; Zheng, R.; Liu, H.; Huo, Y.; Peng, J.; Tian, L.; Barsoum, E. VIPS-Odom: Visual-Inertial Odometry Tightly-Coupled with Parking Slots for Autonomous Parking. arXiv 2024, arXiv:2407.05017. [Google Scholar] [CrossRef]
- Lashkari, B.; Rezazadeh, J.; Farahbakhsh, R.; Sandrasegaran, K. Crowdsourcing and sensing for indoor localization in IoT: A review. IEEE Sens. J. 2018, 19, 2408–2434. [Google Scholar] [CrossRef]
- Aloqaily, M.; Bouachir, O.; Al Ridhawi, I.; Tzes, A. An Adaptive UAV Positioning Model for Sustainable Smart Transportation. Sustain. Cities Soc. 2022, 78, 103617. [Google Scholar] [CrossRef]
- Brandão, A.S.; Sarcinelli-Filho, M.; Carelli, R. Leader-following Control of a UAV-UGV Formation. In Proceedings of the 2013 16th International Conference on Advanced Robotics (ICAR), Montevideo, Uruguay, 25–29 November 2013; pp. 1–6. [Google Scholar] [CrossRef]
- Wang, Y.; Zou, Z.; Hu, X.; Gan, Z.; Liu, L. Robust Robot Formation Control Based on Streaming Communication and Leader-Follower Approach. In Proceedings of the 2024 International Wireless Communications and Mobile Computing (IWCMC), Ayia Napa, Cyprus, 27–31 May 2024; pp. 567–572. [Google Scholar] [CrossRef]
- Zhang, Y.; Liu, Z.; Wang, F.; Chen, Z. Rotating Consensus for Leader-Follower Multi-Agent Systems with a Smart Leader. In Proceedings of the 2023 42nd Chinese Control Conference (CCC), Tianjin, China, 24–26 July 2023; pp. 5719–5724. [Google Scholar] [CrossRef]
- Li, Y.; Zhu, X. Design and Testing of Cooperative Motion Controller for UAV-UGV System. Mechatron. Intell. Transp. Syst. 2022, 1, 12–23. [Google Scholar] [CrossRef]
- Pillai, B.M.; Suthakorn, J. Challenges for Novice Developers in Rough Terrain Rescue Robots: A Survey on Motion Control Systems. J. Control Sci. Eng. 2019, 2019, 2135914. [Google Scholar] [CrossRef]
- Gao, J.; Liu, Q.; Chen, H.; Deng, H.; Zhang, L.; Sun, L.; Huang, J. Digital Battle: A Three-Layer Distributed Simulation Architecture for Heterogeneous Robot System Collaboration. Drones 2024, 8, 156. [Google Scholar] [CrossRef]
- Ramasubramanian, A.K.; Mathew, R.; Kelly, M.; Hargaden, V.; Papakostas, N. Digital twin for human-robot collaboration in manufacturing: Review and outlook. Appl. Sci. 2022, 12, 4811. [Google Scholar] [CrossRef]
- Bringmann, E.; Krämer, A. Model-Based Testing of Automotive Systems. In Proceedings of the 2008 International Conference on Software Testing, Verification, and Validation, Lillehammer, Norway, 9–11 April 2008; IEEE: Piscataway, NJ, USA, 2008. [Google Scholar] [CrossRef]
- Rizk, Y.; Awad, M.; Tunstel, E.W. Cooperative Heterogeneous Multi-Robot Systems: A Survey. ACM Comput. Surv. 2019, 52, 29. [Google Scholar] [CrossRef]
- Chen, Y.; Tang, J.; Jiang, C.; Zhu, L.; Lehtomäki, M.; Kaartinen, H.; Chen, R. The accuracy comparison of three simultaneous localization and mapping (SLAM)-based indoor mapping technologies. Sensors 2018, 18, 3228. [Google Scholar] [CrossRef]
- Sobczak, Ł.; Filus, K.; Domańska, J.; Domański, A. Comparison of Different Hardware Configurations for 2D SLAM Techniques Based on Google Cartographer Use Case. Res. Sq. 2021. [Google Scholar] [CrossRef]
- Zhang, L.; Deng, J. Deep Compressed Communication and Application in Multi-Robot 2D-Lidar SLAM: An Intelligent Huffman Algorithm. Sensors 2024, 24, 3154. [Google Scholar] [CrossRef]
- Zhang, H.; Chen, X.; Lu, H.; Xiao, J. Distributed and Collaborative Monocular Simultaneous Localization and Mapping for Multi-Robot Systems in Large-Scale Environments. Int. J. Adv. Robot. Syst. 2018, 2018, 1–20. [Google Scholar] [CrossRef]
- Shah, D.; Sridhar, A.; Dashora, N.; Stachowicz, K.; Black, K.; Hirose, N.; Levine, S. ViNT: A Foundation Model for Visual Navigation. arXiv 2023, arXiv:2306.14846. [Google Scholar] [CrossRef]
- Xing, J.; Cioffi, G.; Hidalgo-Carrió, J.; Scaramuzza, D. Autonomous Power Line Inspection with Drones via Perception-Aware MPC. arXiv 2023, arXiv:2304.00959. [Google Scholar] [CrossRef]
- Xing, J.; Romero, A.; Bauersfeld, L.; Scaramuzza, D. Bootstrapping Reinforcement Learning with Imitation for Vision-Based Agile Flight. In Proceedings of the 8th Annual Conference on Robot Learning, Munich, Germany, 6 November 2024; Available online: https://openreview.net/forum?id=bt0PX0e4rE (accessed on 27 July 2025).
- Hanover, D.; Loquercio, A.; Bauersfeld, L.; Romero, A.; Penicka, R.; Song, Y.; Cioffi, G.; Kaufmann, E.; Scaramuzza, D. Autonomous Drone Racing: A Survey. IEEE Trans. Robot. 2024, 40, 3044–3067. [Google Scholar] [CrossRef]
- Yeom, S. Special Issue on Intelligent Image Processing and Sensing for Drones. Drones 2024, 8, 87. [Google Scholar] [CrossRef]
- Osmani, K.; Schulz, D. Comprehensive Investigation of Unmanned Aerial Vehicles (UAVs): An In-Depth Analysis of Avionics Systems. Sensors 2024, 24, 3064. [Google Scholar] [CrossRef]
- Wu, X.; Shi, X.; Jiang, Y.; Gong, J. A High-Precision Automatic Pointer Meter Reading System in Low-Light Environment. Sensors 2021, 21, 4891. [Google Scholar] [CrossRef] [PubMed]
- Hong, Q.; Ding, Y.; Lin, J.; Wang, M.; Wei, Q.; Wang, X.; Zeng, M. Image-Based Automatic Watermeter Reading under Challenging Environments. Sensors 2021, 21, 434. [Google Scholar] [CrossRef] [PubMed]
- Li, Z.; Zhou, Y.; Sheng, Q.; Chen, K.; Huang, J. A High-Robust Automatic Reading Algorithm of Pointer Meters Based on Text Detection. Sensors 2020, 20, 5946. [Google Scholar] [CrossRef] [PubMed]
- Fang, H.; Ming, Z.Q.; Zhou, Y.F.; Li, H.Y.; Li, J. Meter recognition algorithm for equipment inspection robot. Autom. Instrum. 2013, 28, 10–14. [Google Scholar]
- Bannari, A.; Selouani, A.; Nehari, W.; Rhinane, H.; El-Ghmari, A.; Oulbacha, S. Accuracy Evaluation of Dems Derived from UAV-Data Acquired Over a Narrow-Deep Valley in the Middle-Atlas Mountain. In Proceedings of the IGARSS 2024—2024 IEEE International Geoscience and Remote Sensing Symposium, Athens, Greece, 7–12 July 2024. [Google Scholar]
- Quanser. QDrone 2: Your New Research Flying Computer. Quanser Blog. 2024. Available online: https://www.quanser.com/blog/artificial-intelligence/qdrone-2-your-new-research-flying-computer/ (accessed on 27 July 2025).
- Mourtzis, D.; Angelopoulos, J.; Panopoulos, N. Unmanned Aerial Vehicle (UAV) Path Planning and Control Assisted by Augmented Reality (AR): The Case of Indoor Drones. Int. J. Prod. Res. 2023, 62, 3361–3382. [Google Scholar] [CrossRef]
- Macenski, S.; Jambrecic, I. SLAM Toolbox: SLAM for the dynamic world. J. Open Source Softw. 2021, 6, 2783. [Google Scholar] [CrossRef]
- Stachniss, C. Robotic Mapping and Exploration; Springer: Berlin/Heidelberg, Germany, 2009; Volume 55. [Google Scholar]
- Lluvia, I.; Lazkano, E.; Ansuategi, A. Active mapping and robot exploration: A survey. Sensors 2021, 21, 2445. [Google Scholar] [CrossRef]
- Iqbal, J.; Xu, R.; Sun, S.; Li, C. Simulation of an autonomous mobile robot for LiDAR-based in-field phenotyping and navigation. Robotics 2020, 9, 46. [Google Scholar] [CrossRef]
- Rivera, Z.B.; De Simone, M.C.; Guida, D. Unmanned ground vehicle modelling in Gazebo/ROS-based environments. Machines 2019, 7, 42. [Google Scholar] [CrossRef]
- Bonelli, N.; Vigna, F.D.; Fais, A.; Lettieri, G.; Procissi, G. Programming socket-independent network functions with nethuns. ACM SIGCOMM Comput. Commun. Rev. 2022, 52, 35–48. [Google Scholar] [CrossRef]
- Feng, Y.; Zhang, C.; Baek, S.; Rawashdeh, S.; Mohammadi, A. Autonomous Landing of a UAV on a Moving Platform Using Model Predictive Control. Drones 2018, 2, 34. [Google Scholar] [CrossRef]
- Ouerdane, F.; Mysorewala, M.F. Visual Servoing of a 3 DOF Hover Quadcopter using 2D Markers. In Proceedings of the 2024 IEEE 33rd International Symposium on Industrial Electronics (ISIE), Ulsan, Republic of Korea, 18–21 June 2024; pp. 1–6. [Google Scholar] [CrossRef]
- Abubakar, A.; Farah, M.; Alsheikh, M.; Saif, A.-W.A. Optimal Altitude Range for Image Collection Drone from Agricultural Areas with Image Resolution and Power Consumption Constraints. In Proceedings of the 2023 20th International Multi-Conference on Systems, Signals & Devices (SSD), Mahdia, Tunisia, 20–23 February 2023; pp. 689–693. [Google Scholar] [CrossRef]
- Matthews, M.T.; Yi, S. Neural network based adaptive flight control of UAVs. In Proceedings of the SoutheastCon 2021, Atlanta, GA, USA, 10–13 March 2021; IEEE: Piscataway, NJ, USA, 2021. [Google Scholar]
- Vissers, J.L.P. A Matlab/Simulink-ROS Architecture for Dual Arm Control. Bachelor’s Thesis, Eindhoven University of Technology, Eindhoven, The Netherlands, 2022. [Google Scholar]
- Cui, G.; Du, G.; Quan, Q.; Xi, Z.; Liu, Y.; Cai, K.-Y. An Input-Dependent Safety Measure and Fail-Safe Strategy for Multicopters. Artif. Intell. Auton. Syst. 2024. [Google Scholar] [CrossRef]
- Probierz, E.; Bartosiak, N.; Wojnar, M.; Skowroński, K.; Gałuszka, A.; Grzejszczak, T. Application of Tiny-ML Methods for Face Recognition in Social Robotics Using OhBot Robots. In Proceedings of the 2022 26th International Conference on Methods and Models in Automation and Robotics (MMAR), Międzyzdroje, Poland, 22–25 August 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 146–151. [Google Scholar] [CrossRef]
- Cheng, Q.; Wang, H.; Zhu, B.; Shi, Y.; Xie, B. A Real-Time UAV Target Detection Algorithm Based on Edge Computing. Drones 2023, 7, 95. [Google Scholar] [CrossRef]
- Tappert, C.C.; Suen, C.Y.; Wakahara, T. The state of the art in online handwriting recognition. IEEE Trans. Pattern Anal. Mach. Intell. 1990, 12, 787–808. [Google Scholar] [CrossRef]
- Srivastava, S.; Verma, A.; Sharma, S. Optical Character Recognition Techniques: A Review. In Proceedings of the 2022 IEEE International Students’ Conference on Electrical, Electronics and Computer Science (SCEECS), Bhopal, India, 19–20 February 2022; pp. 1–6. [Google Scholar] [CrossRef]
- Beerten, T. OCR Comparison: Tesseract Versus EasyOCR Versus PaddleOCR Versus MMOCR. Medium. 2023. Available online: https://toon-beerten.medium.com/ocr-comparison-tesseract-versus-easyocr-vs-paddleocr-vs-mmocr-a362d9c79e66 (accessed on 27 July 2025).
- Du, Y.; Li, C.; Guo, R.; Yin, X.; Liu, W.; Zhou, J.; Bai, Y.; Yu, Z.; Yang, Y.; Dang, Q.; et al. PP-OCR: A Practical Ultra Lightweight OCR System. arXiv 2020, arXiv:2009.09941. [Google Scholar] [CrossRef]
- Peng, X.; Chen, Y.; Cai, X.; Liu, J. An Improved YOLOv7-Based Model for Real-Time Meter Reading with PConv and Attention Mechanisms. Sensors 2024, 24, 3549. [Google Scholar] [CrossRef]
| Advantages | Challenges | Field of Application |
|---|---|---|
| Author(s) | Focus of the Study | Key Technologies/Approaches | Application/Outcome |
|---|---|---|---|
| Gao et al. (2024) | Distributed simulation architecture for heterogeneous robot collaboration | Three-layer simulation architecture | Facilitated multi-agent coordination for collaborative tasks |
| Sobczak et al. (2021) | Optimizing hardware configurations for SLAM tasks | Google Cartographer SLAM, LiDAR, IMUs, odometry | Improved decontamination processes through multi-sensor integration |
| Wu et al. (2021) | Automatic pointer meter gauge reading in low-light environments | Vision algorithms | Enhanced precision for meter gauge reading in substations |
| Hong et al. (2021) | Automatic water meter gauge reading in challenging environments | Image-based processing | Reliable meter gauge reading under diverse conditions |
| Li et al. (2020) | Robust automatic pointer meter gauge reading | Text detection algorithms | Increased reading accuracy in high-voltage environments |
| Rizk et al. (2019) | Survey of cooperative heterogeneous MRS | Task decomposition, coalition formation, task allocation | Framework for designing multi-robot systems to handle complex tasks |
| Rizk et al. (2019) | Automation of complex tasks using heterogeneous MRS | Task complexity handling through coordination strategies | Defined cooperation levels for loosely/tightly coupled tasks |
| Chen et al. (2018) | SLAM-based indoor mapping in complex environments | LiDAR, depth cameras | High-accuracy mapping in corridors and libraries |
| Fang et al. (2013) | Meter gauge recognition for inspection robots | Robotic meter gauge recognition algorithm | Safer and more efficient inspection in industrial settings |
| Nguyen et al. (2010) | Autonomous control and SLAM in unmanned ground vehicles | SLAM algorithms, autonomous navigation | Enhanced simultaneous localization and mapping for UGVs |
| Feature | DJI Mavic 2 Pro | Quanser QDrone 2 |
|---|---|---|
| Primary Focus | Consumer-grade, high-end photography/videography | Research and Education, control systems development |
| Payload Capacity | Higher (e.g., can carry larger cameras) | Lower, optimized for control experiments |
| Flight Time | Generally longer | May have shorter flight times depending on payload and flight conditions |
| Camera Quality | Excellent image and video quality, advanced camera features | Varies with the camera module used; the default Intel RealSense D435 RGBD camera provides 1080p at 30 FPS |
| Software/SDK | Proprietary DJI software, limited access to low-level control | Comprehensive SDK and development tools (e.g., Quanser Quarc) for research and control algorithm development |
| Research Suitability | Limited for advanced control algorithms and research due to closed-source nature and limited access to low-level control | Highly suitable for research due to open-source nature, access to low-level control, and availability of development tools |
| Evaluation Metric | YOLOv5 |
|---|---|
| Precision | 0.95 |
| Recall | 0.85 |
| mAP@0.5 | 0.85 |
| mAP@0.5:0.95 | 0.5 |
| Accuracy | 0.84 |
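The table reports precision and recall separately; their harmonic mean (F1 score) is a common single-number summary not listed above, and can be derived from the tabulated values. A minimal sketch:

```python
def f1_score(precision, recall):
    """F1 score: harmonic mean of precision and recall."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# With the tabulated precision 0.95 and recall 0.85, F1 is approximately 0.897.
```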
Share and Cite
Ouerdane, F.; Abubaker, A.; Aremu, M.B.; Abdel-Nasser, M.; Eltayeb, A.; Sattar, K.A.; Javaid, A.; Ibnouf, A.; El Ferik, S.; Alnasser, M. Multi-Domain Robot Swarm for Industrial Mapping and Asset Monitoring: Technical Challenges and Solutions. Sensors 2025, 25, 6295. https://doi.org/10.3390/s25206295