IMUC: Edge–End–Cloud Integrated Multi-Unmanned System Payload Management and Computing Platform
Abstract
1. Introduction
2. Platform Architecture
2.1. Device Layer
2.2. Transport Layer
2.3. Cloud Layer
3. Software Architecture
4. Related Work
4.1. Data Exchange between Edge, Device, and Cloud Platforms
4.1.1. Camera Data
4.1.2. Lidar Data
4.1.3. Video Data
4.1.4. Location Data
4.1.5. Command Data
4.2. Active Perception Based on Change Detection in Complex Scenes
- Efficiency improvement: The detection capabilities of the Global Remote Sensing Real-time Monitoring and Point-to-Point Update Cloud Platform let us quickly obtain information about changes in the task area, eliminating extensive searching and blind sampling and saving time and resources;
- Task optimization: Real-time location information for the changed areas enables targeted data acquisition and processing of those regions, so changes in the task area are captured more accurately and the quality and effectiveness of data acquisition improve;
- Timely response: Because the cloud platform continuously monitors the task area, our platform receives change notifications promptly and can respond quickly, adjusting the task execution strategy in real time to meet evolving task requirements.
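A minimal sketch of the notification-driven retasking described above; the message fields and task schema here are illustrative assumptions, not the platform's actual interface:

```python
# Sketch: react to a change-detection notification by queuing a targeted task.
def handle_change_notification(note: dict, mission_queue: list) -> list:
    """Insert a targeted acquisition task for the changed region.

    `note` is assumed to carry the changed region's bounds and a detection
    timestamp; real payloads from the monitoring cloud platform will differ.
    """
    task = {
        "type": "targeted_acquisition",
        "bounds": note["bounds"],          # (lon_min, lat_min, lon_max, lat_max)
        "detected_at": note["timestamp"],
        "priority": 0,                     # changed areas jump the queue
    }
    # Lower priority value = served first; changed areas precede routine surveys.
    return sorted(mission_queue + [task], key=lambda t: t["priority"])

queue = [{"type": "routine_survey", "priority": 5}]
note = {"bounds": (116.30, 39.98, 116.31, 39.99),
        "timestamp": "2023-06-03T09:00:00Z"}
queue = handle_change_notification(note, queue)
```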
4.3. Rapid Online Fusion Processing of Multi-Sensor Data
4.3.1. Image Online Stitching
4.3.2. Multi-Source Laser Data Fusion
4.4. Cloud-Based Unmanned System Payload Management and Monitoring
4.4.1. Multi-Unmanned System Payload Management
- Device management: The platform supports connecting various types of devices, including drones, unmanned vehicles, and robotic dogs. For each device, we enter its model, IP address, and basic information such as the task name and the operators involved; this information is stored automatically in the server’s database and synchronized to the device information panel. This mechanism lets us systematically track critical device information: the model identifies and differentiates device types, the IP address locates the device precisely on the network, and the task name and operator fields record and trace device usage. It ensures the accuracy and traceability of device operations, provides comprehensive control over connectivity status and task assignment, and, through server-side storage synchronized with the panel, gives users real-time awareness of device status and attributes, offering more convenient device management;
- Group control settings: Group control settings let users customize the parameters for data transmission across multiple unmanned devices before a mission starts. These include the main control IP; the laser data and camera data topics for each unmanned device; display colors for laser data from different sources; the waypoint planning target; the remote control target; and the maximum linear and angular velocities. Once set, the values are updated in the server parameters. Coloring laser data by source makes information from different data sources easier to distinguish and visualize; the waypoint planning target precisely specifies which device navigates a mission; and switching the remote control target, together with the velocity limits, gives flexible control over device motion. For the collaborative work of multiple unmanned devices, group control settings improve work efficiency and reduce costs and resource inputs.
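The parameters enumerated above can be grouped into a single settings object. The dataclass below is a hypothetical sketch; the field names are our own choices, not the platform's actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class GroupControlSettings:
    """Per-mission group control parameters (illustrative names)."""
    master_ip: str
    laser_topics: dict = field(default_factory=dict)    # device -> laser topic
    camera_topics: dict = field(default_factory=dict)   # device -> camera topic
    laser_colors: dict = field(default_factory=dict)    # device -> RGB for its points
    waypoint_target: str = ""                           # device receiving waypoints
    remote_control_target: str = ""                     # device under the joystick
    max_linear_vel: float = 1.0                         # m/s
    max_angular_vel: float = 0.8                        # rad/s

settings = GroupControlSettings(
    master_ip="192.168.1.10",
    laser_topics={"ugv_1": "/ugv_1/points", "dog_1": "/dog_1/points"},
    laser_colors={"ugv_1": (255, 0, 0), "dog_1": (0, 128, 255)},
    waypoint_target="ugv_1",
    remote_control_target="dog_1",
)
```

In a setup like this, the object would be serialized once to the server parameters before the mission, and every device-facing component reads from it rather than from ad hoc per-device configuration.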
4.4.2. Device Monitoring
- Equipment tracking: The platform receives real-time location information returned by unmanned devices and updates each device’s movement trajectory on the image map, so users can clearly see the activity trajectory of every device. Trajectories are automatically color-coded per device, making it easy to distinguish and identify each device’s path. This real-time, intuitive display of location helps users monitor and analyze device movement; by observing movement patterns and trajectories, users gain a better understanding of each device’s activity range, which matters for mission supervision, location analysis, and path optimization;
- Remote control of devices: Under normal communication conditions, the platform can remotely control the movement of unmanned devices via a virtual joystick built with NippleJS, a JavaScript library for creating touch-based virtual joystick interfaces. This allows temporary takeover of unmanned vehicles when they cannot navigate autonomously or encounter risks; the vector controller provided by NippleJS drives the device’s angular velocity and linear velocity simultaneously;
- Device waypoint planning control: The waypoint planning feature lets users define waypoints on the map interface. The platform reads the control target specified in the group control settings and sends the waypoints to the designated unmanned devices; on receiving the instructions, the devices autonomously explore and navigate to the waypoints, collecting real-time scene data from their surroundings and transmitting it back to the platform. Users can flexibly plan waypoints to meet the specific requirements of a task, and the devices follow the predefined waypoint sequence while gathering crucial environmental information. This remote approach improves data collection efficiency and reduces the complexity and risks of manual operation.
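The joystick-to-motion mapping can be sketched as follows. NippleJS reports a normalized direction vector on the browser side; the sign conventions and clamping below are our assumptions for illustration, not the platform's exact mapping:

```python
def joystick_to_twist(vx: float, vy: float, max_lin: float, max_ang: float):
    """Map a normalized joystick vector (vx right, vy forward, each in [-1, 1])
    to clamped (linear, angular) velocities honoring the group-control limits."""
    linear = max(-max_lin, min(max_lin, vy * max_lin))    # push forward -> forward
    angular = max(-max_ang, min(max_ang, -vx * max_ang))  # push right -> clockwise
    return linear, angular

# Full forward deflection with a 1.0 m/s linear limit and 0.8 rad/s angular limit:
lin, ang = joystick_to_twist(0.0, 1.0, max_lin=1.0, max_ang=0.8)
```

On the real platform the resulting pair would be published as a velocity command to the device selected as the remote control target.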
4.5. Cloud Route Planning and Massive Data Visualization
4.5.1. Cloud Route Planning
Algorithm 1. Global Route Planning Algorithm
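As a rough, generic illustration of global route ordering over user-specified waypoints (a simple nearest-neighbor sketch, not a reconstruction of the paper's Algorithm 1):

```python
import math

def plan_route(start, waypoints):
    """Greedy nearest-neighbor ordering of 2D waypoints from `start`.

    Purely illustrative: real global route planners also weigh obstacles,
    no-fly zones, and energy budgets, which are out of scope here.
    """
    remaining = list(waypoints)
    route, pos = [], start
    while remaining:
        # Always fly to the closest unvisited waypoint next.
        nxt = min(remaining, key=lambda p: math.dist(pos, p))
        remaining.remove(nxt)
        route.append(nxt)
        pos = nxt
    return route

route = plan_route((0, 0), [(5, 5), (1, 0), (2, 2)])
```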
4.5.2. Massive Data Visualization
5. Experiment and Comparison
5.1. Hardware Platform
5.2. Lab Environment
5.3. Experiment Procedure
5.4. Experimental Results and Analysis
5.5. Platform Performance Testing
5.6. Platform Comparison
6. Discussion
- Technical Challenges Faced: First, we are highly concerned with data transmission speed in practical operations, since it directly determines the platform’s real-time performance; in the field of unmanned systems, optimizing transmission speed is crucial for efficient and intelligent task execution. At present, real-time performance depends heavily on the quality of the network environment. Our current solution is to compress and simplify data as much as possible to reduce transmission time, but this has limits; as communication technology develops, faster and more stable networks and protocols will meet the demand for high-capacity data transmission. Second, web-based rendering of large-scale 3D data remains slow. Octree indexes and level-of-detail (LOD) techniques improve rendering speed, but as data volumes keep growing, more advanced techniques and algorithms are needed to sustain browsing of massive 3D data. We are exploring cloud computing and distributed computing for this workload: distributing rendering tasks across multiple computing nodes for parallel processing accelerates rendering and improves the system’s real-time capability, and the elasticity and scalability of cloud resources let us adjust computing capacity dynamically to match the scale and complexity of the 3D geospatial data;
- Integration of Artificial Intelligence and Machine Learning: Integrating artificial intelligence (AI) and machine learning (ML) technologies is important for both data transmission and processing. They enable intelligent task allocation and scheduling: based on the nature and priority of tasks and the status and capabilities of the robotic devices, algorithms can autonomously decide which robots receive which tasks and in what execution order, maximizing task completion efficiency. In real-time analysis of raw data, they can extract useful information and features, filter noise and outlier data, and optimize compression and storage to improve processing efficiency and performance. We will research these technologies in depth and apply them to the platform to achieve more efficient and intelligent applications;
- Applications in the Surveying and Mapping Industry: Through this platform, the surveying and mapping industry can collect and process geographic data more efficiently and accurately, with significant value and impact. By automating data collection, analysis, and visualization, the platform simplifies the surveying and mapping workflow, improves efficiency and productivity, and reduces manual, time-consuming work. Figure 20 shows the application directions of the surveying and mapping industry and their main technical routes. The platform already brings practical help to the industry: in emergency disaster rescue, for example, it supports remote operations and visualizes the disaster site, quickly reconstructing field conditions while improving safety by reducing direct contact with hazardous environments and complex terrain. Applying the platform in this industry may still face obstacles. First, data quality and security: the industry demands high accuracy and high resolution, so appropriate data quality control measures are needed during collection, together with stronger data security and privacy protection to ensure data integrity. Second, transmitting and visualizing multi-sensor data in complex surveying scenarios: as scenarios grow more complex, more sensors are deployed, which complicates data integration and visualization and requires continuous iteration of data transformation, standardization, and integration tools to fuse multi-source data.
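The compress-and-simplify strategy mentioned under the technical challenges can be illustrated with a toy point-stream codec; the centimeter quantization and wire layout below are our own choices for illustration, not the platform's actual format:

```python
import struct
import zlib

def pack_points(points, precision=100):
    """Quantize XYZ points (meters) to 1/precision units and deflate them.

    A toy instance of the compress-and-simplify strategy: quantization
    simplifies the data, zlib removes redundancy before transmission.
    """
    raw = b"".join(
        struct.pack("<3i", *(round(c * precision) for c in p)) for p in points
    )
    return zlib.compress(raw)

def unpack_points(blob, precision=100):
    """Inverse of pack_points: inflate and de-quantize back to float tuples."""
    raw = zlib.decompress(blob)
    return [tuple(c / precision for c in xyz)
            for xyz in struct.iter_unpack("<3i", raw)]

# 1000 repetitive points compress far below the 12 bytes/point raw size.
pts = [(1.234, -0.567, 0.891)] * 1000
blob = pack_points(pts)
```

The trade-off is the one noted in the discussion: quantization is lossy (here, centimeter resolution) and compression adds CPU cost at both ends, so the scheme only helps when the network, not the processor, is the bottleneck.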
7. Conclusions
- Our platform adopts a batch-streaming hybrid data transmission mechanism, enabling the transmission and exchange of heterogeneous payload data from multiple sources while ensuring real-time and complete delivery. Both structured and unstructured data can be transmitted and exchanged efficiently and reliably through this platform;
- By enhancing the management and monitoring capabilities of multiple unmanned systems in the cloud, our platform ensures safety during actual operations. This includes, but is not limited to, real-time tracking of the geographical positions of unmanned devices, monitoring the operational status of devices, and receiving real-time feedback from alarm systems. Additionally, the platform enables fast online fusion processing and collaborative perception of data from multiple sensors;
- Our platform achieves multiple functionalities through the visual design of cloud-based task and route planning. These functionalities include multi-layer switching, editing and visual computation of payload parameters, offline playback of archived data, and statistical analysis of massive data. Users can efficiently perform route planning, adjust payload parameters, view archived data, and conduct data analysis through a clean and intuitive interface. This greatly enhances the convenience and efficiency of operating unmanned systems.
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
| Number | Date | Experiment Duration | Laser Data (MB) | Camera Data (MB) | GNSS Data (MB) | Laboratory Equipment |
| --- | --- | --- | --- | --- | --- | --- |
| 1 | 3 June 2023 | 9 min 5 s | 263.67 | 190.84 | 0.0515 | UGVs |
| 2 | 5 June 2023 | 10 min 19 s | 275.53 | 205.08 | 0.0544 | UGVs |
| 3 | 8 June 2023 | 8 min 25 s | 236.72 | 182.47 | 0.0491 | UGVs |
| 4 | 9 June 2023 | 15 min 12 s | 436.41 | 323.33 | 0.0826 | UGVs |
| 5 | 12 June 2023 | 13 min 21 s | 388.90 | 279.62 | 0.0779 | UGVs |
| Number | Date | Experiment Duration | Camera Data (MB) | GNSS Data (MB) | Laboratory Equipment |
| --- | --- | --- | --- | --- | --- |
| 1 | 2 June 2023 | 5 min 1 s | 64.492 | 0.0291 | UAV |
| 2 | 5 June 2023 | 4 min 21 s | 56.124 | 0.0266 | UAV |
| 3 | 7 June 2023 | 6 min 11 s | 81.761 | 0.0366 | UAV |
| 4 | 13 June 2023 | 8 min 36 s | 112.322 | 0.0513 | UAV |
| 5 | 15 June 2023 | 5 min 2 s | 66.157 | 0.0313 | UAV |
| Number | Number of Unmanned Devices | Requests | Fails | Median (ms) | Average (ms) | Min (ms) | Max (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- |
| 1 | 5 | 132,437 | 0 | 2 | 2 | 1 | 14 |
| 2 | 10 | 162,549 | 0 | 3 | 3 | 2 | 13 |
| 3 | 15 | 148,449 | 0 | 5 | 5 | 1 | 18 |
| 4 | 20 | 147,630 | 0 | 7 | 6 | 3 | 29 |
| 5 | 25 | 153,998 | 0 | 8 | 8 | 2 | 41 |
| 6 | 30 | 167,087 | 0 | 9 | 9 | 4 | 72 |
| CPU (%) | Memory (GB) | GPU (%) |
| --- | --- | --- |
| <10 | <1 | <55 |
| Laser Data | Camera Data | Other Instruction Data |
| --- | --- | --- |
| 0.52 MB/s | 0.36 MB/s | 0.12 MB/s |
Tang, J.; Zhong, R.; Zhang, R.; Zhang, Y. IMUC: Edge–End–Cloud Integrated Multi-Unmanned System Payload Management and Computing Platform. Drones 2024, 8, 19. https://doi.org/10.3390/drones8010019