
Mission Chain Driven Unmanned Aerial Vehicle Swarms Cooperation for the Search and Rescue of Outdoor Injured Human Targets

1 Department of Military Biomedical Engineering, Air Force Military Medical University, Xi’an 710032, China
2 Shaanxi Provincial Key Laboratory of Bioelectromagnetic Detection and Intelligent Perception, Xi’an 710032, China
3 Drug and Instrument Supervisory & Test Station of PLA Xining Joint Logistics Support Center, Lanzhou 730050, China
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Drones 2022, 6(6), 138; https://doi.org/10.3390/drones6060138
Submission received: 4 May 2022 / Revised: 26 May 2022 / Accepted: 26 May 2022 / Published: 28 May 2022

Abstract
A novel cooperative strategy for distributed unmanned aerial vehicle (UAV) swarms with different functions, namely the mission chain-driven UAV swarms cooperation method, is proposed to enable the fast search and timely rescue of injured human targets in wide-area outdoor environments. First, a UAV-camera unit detects suspected human targets using improved deep learning technology. The target location information is then shared over a self-organizing network, and a dedicated bio-radar-UAV unit is dispatched to recheck for survivors through a respiratory-characteristic detection algorithm. Finally, driven by the location and vital-sign status of the injured, a nearby emergency-UAV unit performs the corresponding medical emergency mission, such as dropping emergency supplies. Experimental results show that this strategy can autonomously and effectively identify human targets in outdoor environments, and that the target detection, target sensing, and medical emergency mission chain is completed successfully in the cooperative working mode, which is meaningful for future search-rescue missions for outdoor injured human targets.

1. Introduction

After natural disasters, wars, and other public safety events, complex environments pose severe challenges for the search for the wounded. The wide extent of the affected area makes the search inefficient, so the optimal rescue window is often missed. Moreover, if rescuers can obtain the location and life status of the injured in a timely manner, the rescue outcome improves considerably. At present, equipment for finding the wounded mainly comprises individual (wearable) search devices and search UAVs. Common individual search equipment includes chest straps, wristbands, and handheld devices, and the vital signs monitored mainly include breathing, heart rate, and blood oxygen [1]. Such equipment has two deficiencies: first, it must be distributed in advance and can hinder the wearer's movement; second, when it is damaged by impact, fire, etc., the accuracy of the collected life information is degraded. To overcome the limitations of wearable technology, researchers can markedly improve the search efficiency for outdoor injured people by using an unmanned aerial vehicle (UAV) equipped with various sensors to detect the injured across large distressed areas [2].
Multirotor UAVs offer strong manoeuvrability, hovering stability, and flexibility and are largely unaffected by terrain and landforms. Equipped with sensors, they are widely used in power line inspection, remote sensing, and disaster rescue. For example, images acquired by a UAV's visible-light camera can be processed by methods such as semantic segmentation to extract the power lines, which is more efficient than traditional manual inspection [3]. With a UAV-based multispectral camera system, ground targets can be screened and specific features of the spectral image extracted for target recognition [4]. UAV-based searching systems have therefore been adopted in various search-rescue environments, such as maritime distress and post-earthquake scenarios, effectively reducing task risk and improving rescue efficiency [5,6,7].
However, a single UAV is limited by its endurance and communication range and struggles to complete large-scale tasks. UAV swarm technology allows the working modes of the above scenarios to be extended and deepened [8,9]. Constrained by limited battery power and by the irregular distribution of the wounded, a single UAV's search and rescue capability in a rescue mission is weak. If an information-sharing network is established among multiple UAVs to coordinate UAVs with different functions, rescue efficiency improves. It is therefore necessary to establish such a sharing network in advance so that UAV swarms can combine searching for and sensing the wounded [10].
Among existing UAV networking solutions, centralized and distributed approaches dominate [11,12,13]. A centralized solution adopts a one-to-many communication module that returns the status of each UAV to a ground workstation, which processes the data and coordinates the swarm. A distributed solution has each UAV carry an onboard edge device that processes UAV images or external sensor data in real time and directly controls the UAV's state to coordinate the cluster. Each has advantages and disadvantages depending on the application scenario: the former requires no additional payload, but the high bandwidth needed to transfer data to the ground station delays the system response; the latter improves response speed, but the onboard computing power is lower than that of ground-station equipment.
The construction of shared networks makes collaboration between UAVs with different functions possible. In addition, the real-time search for injured outdoor human targets in the jungle requires a high-performance object-detection algorithm. Many object-detection algorithms based on deep convolutional neural networks (CNNs) have been proposed to extract target features and improve detection accuracy. They fall into two categories. One is the regression-based single-stage type, which directly uses a CNN to extract features and predict object class and location. The other is the two-stage type, which first generates preselected boxes (region proposals) that may contain the objects of interest and then uses a CNN to classify each box. Two-stage algorithms include R-CNN and Faster R-CNN [14]; typical single-stage algorithms include the single-shot detector (SSD) [15] and You Only Look Once (YOLO) [16]. Although single-stage networks are less accurate than two-stage networks, they outperform them in processing speed. Consequently, single-stage networks deployed on Nvidia edge devices can be optimized through TensorRT to meet real-time and accuracy requirements [17], which greatly improves inference speed for vehicle recognition. For this reason, a TensorRT-optimized YOLO is suitable for, and thus adopted in, our searching task for outdoor injured human targets.
Subsequently, after the UAV cluster obtains the detection results for injured outdoor personnel, the survival status of each target must be judged in time to provide detailed data support for the rescue plan. Acquiring the target's vital signs is a convincing and direct way to do so, and it serves as the trigger for the emergency UAV to drop the corresponding medical emergency supplies.
In this paper, a novel mission chain-driven UAV swarms cooperation method is proposed to improve the performance of UAV swarms in search-rescue tasks for injured outdoor human targets and to overcome the low endurance and low efficiency of individual UAVs. It combines a long-range radio (LoRa) self-organizing network, machine vision, bio-radar, and medical emergency response. Human targets are thus screened twice via sensors of different modalities, providing effective rescue guidance for ground search and rescue personnel.

2. System Design

The UAV swarm collaboration system, shown in Figure 1, consists of the LoRa self-organizing network, human target detection, and remote sensing of breath signals. In one mission execution, the system first uses the LoRa self-organizing network to establish information sharing between UAVs. Second, the onboard edge device (Nvidia Jetson Nano) runs an object-detection algorithm on the images captured by the camera in real time. Then, the UAV equipped with a bio-radar further perceives the breath signals of the remotely sensed target. Finally, UAVs carrying emergency relief supplies deliver medical supplies to the targets. A schematic diagram of the collaboration between UAVs with different functions is shown in Figure 2.

2.1. Main Hardware of the System

2.1.1. UAV Swarms

For UAV swarms that collaborate on missions outdoors, excellent manoeuvrability, flexibility, and hover stability are essential. For better communication and control, we chose a quadcopter UAV as the delivery platform; it cooperates with the Jetson Nano to fly autonomously and combines external modules such as communication modules and cameras into a multifunctional intelligent UAV system. The camera has 12.4 million pixels and provides images suitable for target detection from a height of 70 metres. This program-based control design greatly improves the intelligent control performance of the UAV cluster and reduces manpower expenditure in search and rescue missions. Figure 3 shows the appearance of these UAV swarms.

2.1.2. Bio-Radar Module and First Aid Kit

The bio-radar sensor (model: JC122-3.3UA6) was custom-developed. When a person breathes, the minute displacement of the thoracic cavity reflects the microwave emitted by the radar sensor and generates an echo signal. According to the Doppler effect [18,19], there is a phase difference between the transmitted radar beam and the echo. The relationship between the phase variation and the chest displacement can be expressed as:
$$\Delta\theta(t) = \frac{4\pi}{\lambda}\,\Delta x(t)$$
where $\Delta x(t)$ is the chest displacement, $\lambda$ is the wavelength of the bio-radar, and $\Delta\theta(t)$ is the phase change introduced by the breathing activity of the human subject.
After analogue amplification and filtering, the respiration waveform can be obtained, and observing this waveform allows the physiological condition of the person to be judged. The bio-radar operates at a wavelength of 1.25 cm and emits continuous waves with a maximum transmission power of 1 mW. Taking advantage of its wide sensing range (horizontal angle: ±60°, vertical angle: ±16°), the respiration signal of the tested target is detected by throwing multiple bio-radars at different angles and distances.
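As an illustration, the sketch below inverts the phase equation above to recover chest displacement and band-passes it to the breathing band. Only the 1.25 cm wavelength comes from the text; the 100 Hz sampling rate and the 0.1–0.5 Hz filter band are illustrative assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

WAVELENGTH_M = 0.0125  # 1.25 cm, as stated for the bio-radar
FS_HZ = 100            # assumed sampling rate of the radar baseband output

def phase_to_displacement(delta_theta):
    """Invert delta_theta(t) = (4*pi/lambda) * delta_x(t) for chest displacement."""
    return delta_theta * WAVELENGTH_M / (4 * np.pi)

def extract_respiration(delta_theta, low_hz=0.1, high_hz=0.5):
    """Band-pass the displacement to a typical adult breathing band (~6-30 breaths/min)."""
    x = phase_to_displacement(delta_theta)
    b, a = butter(4, [low_hz, high_hz], btype="band", fs=FS_HZ)
    return filtfilt(b, a, x)

# Example: a simulated 0.25 Hz (15 breaths/min) chest motion of ~5 mm amplitude
t = np.arange(0, 30, 1 / FS_HZ)
theta = (4 * np.pi / WAVELENGTH_M) * 0.005 * np.sin(2 * np.pi * 0.25 * t)
resp = extract_respiration(theta + 0.001 * np.random.randn(t.size))
```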
The appearance of the outdoor first aid kit is shown in Figure 4. The kit is equipped with commonly used medicines and equipment, including scissors, fixing belts, tourniquets, cotton wool, various dressings, haemostatic powder, quick-acting rescue pills, hypertension drugs, nitroglycerine, gastrointestinal medicine, malaria medicine, cold medicine, cough medicine, etc.

2.1.3. Control and Information Processing Center

Jetson Nano is a powerful embedded device produced by Nvidia. It contains a 128-core Maxwell-architecture graphics processing unit (GPU) and achieves a good balance of power consumption, volume, and price. The officially reported frame rate of Tiny YOLO running on the Jetson Nano after TensorRT acceleration is 38 FPS. Combining small size with adequate computing power, the Jetson Nano meets the needs of onboard suspected-target detection.
The LoRa chip we used for networking is the ATK-LoRa-SX1278. Based on spread-spectrum technology, it supports ultralong-distance wireless transmission with low power consumption and many networking nodes. The Jetson Nano configures the LoRa channel, baud rate, and other parameters to build the communication network between UAVs, as sketched below.
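A minimal configuration sketch follows. The serial port, baud rate, and the AT command strings are assumptions for illustration; the exact command set must be taken from the ATK-LoRa-SX1278 manual. Only the idea of configuring the module over a serial link from the Jetson Nano follows the text.

```python
import serial  # pyserial

def configure_lora(port="/dev/ttyUSB0", baud=115200):
    """Hypothetical configuration sequence for the LoRa module over serial."""
    with serial.Serial(port, baud, timeout=1) as s:
        for cmd in (b"AT\r\n",               # liveness check (assumed command)
                    b"AT+ADDR=01,01\r\n",    # node address (assumed command)
                    b"AT+WLRATE=23,5\r\n"):  # channel and air rate (assumed command)
            s.write(cmd)
            print(cmd.strip(), b"->", s.readline().strip())

configure_lora()
```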
This shared network further enables secondary screening of suspected targets by the UAV-borne radar. Based on the open-source DroneKit-Python control library, we developed programs for the autonomous flight of UAVs to shared point locations; a minimal sketch follows. The UAV that performs life-form detection throws the perception module equipped with bio-radar near the target to obtain the target's respiration signal.
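The sketch below illustrates guided flight to a shared target point with the DroneKit-Python API; the connection string, baud rate, and altitude are placeholders rather than our actual mission parameters.

```python
import time
from dronekit import connect, VehicleMode, LocationGlobalRelative

# Connect to the flight controller (connection string is a placeholder)
vehicle = connect("/dev/ttyACM0", wait_ready=True, baud=57600)

def goto_shared_point(lat, lon, alt_m=20):
    """Arm, take off, and fly to a (lat, lon) point received over the network."""
    vehicle.mode = VehicleMode("GUIDED")
    vehicle.armed = True
    while not vehicle.armed:        # wait for the flight controller to arm
        time.sleep(0.5)
    vehicle.simple_takeoff(alt_m)   # climb to working altitude
    while vehicle.location.global_relative_frame.alt < 0.95 * alt_m:
        time.sleep(0.5)
    vehicle.simple_goto(LocationGlobalRelative(lat, lon, alt_m))
```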

2.2. Mission Chain Driven UAV Swarms Cooperation Algorithm

Based on the above platform, and targeting the real-time search and rescue of injured human targets in a wide-area environment, we propose a cooperative process driven by the mission chain and based on functional differences. The collaboration process of the UAV swarm involves three functional types of UAV: detect, sense, and supply. Schematic diagrams of the three are shown in Figure 5.
(1)
UAV-camera-based suspected human target detection. After the UAVs receive the take-off command, they automatically form a formation, fly to the mission point, and perform the search task along a "zigzag" pattern (a waypoint-generation sketch for this pattern follows this list).
(2)
UAV-radar-based human target reconfirmation. We wrote a Python script that obtains the location of the UAV when a target is detected and shares the location of the injured with the sensing UAV through the self-organizing network. The sensing UAV autonomously flies near the injured person and throws a sensing module to further acquire the target's breath signal.
(3)
Medical emergency through the emergency UAV. Finally, once target survival is determined, the UAVs that deliver emergency rescue supplies provide the support necessary to keep the wounded alive.
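As a sketch of item (1), the helper below generates a "zigzag" (lawnmower) list of waypoints from a corner coordinate; the row spacing, row length, and the flat-earth metre-to-degree conversion are illustrative assumptions, not the parameters used in our missions.

```python
import math

def zigzag_waypoints(lat0, lon0, rows, row_spacing_m, row_length_m):
    """Generate a 'zigzag' (lawnmower) search pattern as (lat, lon) waypoints
    from a corner point, using the small-area flat-earth approximation."""
    dlat = row_spacing_m / 111_320.0  # metres per degree of latitude
    dlon = row_length_m / (111_320.0 * math.cos(math.radians(lat0)))
    waypoints = []
    for i in range(rows):
        lat = lat0 + i * dlat
        ends = [(lat, lon0), (lat, lon0 + dlon)]
        if i % 2:                     # reverse the sweep on every other row
            ends.reverse()
        waypoints.extend(ends)
    return waypoints

# Example: 5 rows, 20 m apart, each 200 m long (coordinates are placeholders)
wps = zigzag_waypoints(34.2655, 108.9542, rows=5, row_spacing_m=20, row_length_m=200)
```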

2.2.1. UAV-Camera-Based Suspected Human Target Detection

(1)
YOLOv4-tiny
To meet the real-time and accuracy requirements of target recognition and detection on airborne edge devices, we compared the recognition performance of the YOLO series (YOLOv3, YOLOv4, and YOLOv4-tiny) on low-contrast targets in grass. YOLOv4-tiny is a simplified, lightweight version of YOLOv4. The overall network has 38 layers, uses three residual units, adopts LeakyReLU as the activation function, performs classification and regression of the target on two feature layers, and merges the effective feature layers with a feature pyramid network (FPN). The feature structure of YOLOv4-tiny is shown in Figure 6.
(2)
K-means clustering methods
In object recognition and detection, the network model learns the target category from multiple features and must also learn the position and size of the target in the image. Therefore, algorithms such as the Faster region-based convolutional neural network (Faster R-CNN), SSD, and YOLO preset a group of reference boxes of different sizes and aspect ratios that cover almost all positions of the image, so that the width and height of targets in the dataset can be matched and target boxes computed faster and better. These are called anchors in Faster R-CNN and prior (default) boxes in SSD. Starting with YOLOv2, the YOLO series also uses an anchor mechanism. The difference is that in Faster R-CNN the anchors are set manually, so appropriate anchors must be preset for the target sizes of each dataset, whereas YOLO applies the k-means clustering algorithm to the bounding boxes of the training set; finding appropriately sized anchors in this way solves the problem very well. A sketch of this clustering is given below.
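The following compact sketch clusters labelled box sizes with the 1 − IoU distance used in the YOLO papers; the number of clusters and the (width, height) box format are assumptions for illustration.

```python
import numpy as np

def iou_wh(boxes, anchors):
    """IoU between (w, h) pairs, assuming boxes and anchors share a common centre."""
    inter = np.minimum(boxes[:, None, 0], anchors[None, :, 0]) * \
            np.minimum(boxes[:, None, 1], anchors[None, :, 1])
    union = boxes[:, 0:1] * boxes[:, 1:2] + (anchors[:, 0] * anchors[:, 1])[None, :] - inter
    return inter / union

def kmeans_anchors(boxes, k=6, iters=100, seed=0):
    """Cluster dataset box sizes with distance d = 1 - IoU, as in the YOLO papers."""
    rng = np.random.default_rng(seed)
    anchors = boxes[rng.choice(len(boxes), k, replace=False)].astype(float)
    for _ in range(iters):
        nearest = np.argmax(iou_wh(boxes, anchors), axis=1)  # highest IoU = smallest distance
        for j in range(k):
            if np.any(nearest == j):
                anchors[j] = boxes[nearest == j].mean(axis=0)
    return anchors[np.argsort(anchors.prod(axis=1))]  # sort by area, small to large

# boxes: N x 2 array of labelled (width, height) pairs from the training set
# anchors = kmeans_anchors(boxes, k=6)
```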
(3)
TensorRT Acceleration
When a deep learning model is actually deployed, running it in its original training framework is relatively inefficient, so Nvidia developed the TensorRT inference library for its own GPUs. TensorRT is a C++ inference framework that runs on Nvidia's various GPU hardware platforms. A model trained in PyTorch, TensorFlow, or another framework can be converted to the TensorRT format and then executed with the TensorRT inference engine, improving its running speed on Nvidia GPUs.
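The paper does not detail its exact conversion pipeline; one common route, sketched below under that assumption, exports the trained network to ONNX and builds a serialized engine with Nvidia's trtexec tool on the Jetson itself (file names are placeholders).

```python
import subprocess

# Build a serialized TensorRT engine from an ONNX export of the trained network.
# trtexec ships with TensorRT on the Jetson; file names below are placeholders.
subprocess.run(
    ["trtexec",
     "--onnx=yolov4-tiny.onnx",       # exported model (placeholder name)
     "--saveEngine=yolov4-tiny.trt",  # serialized engine used at inference time
     "--fp16"],                       # half precision for higher FPS on Jetson
    check=True,
)
```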

2.2.2. UAV-Radar-Based Human Target Reconfirmation

After receiving the location information of the suspected target, the UAV equipped with bio-radar flies near the target, collects the suspected target's breath signal by throwing the sensing device, determines whether the target is still alive, and transmits this information onward. Two important problems must be solved in perceiving the life information of the wounded: acquiring the information and transmitting it over long distances. The system is therefore designed modularly, comprising three modules: a life-sensing device, an air communication device, and a ground-station processing terminal, as shown in Figure 7.
The life-sensing device uses bio-radar to collect the casualty's respiration signal without contact. Multiple life-sensing devices, each integrating a positioning module, bio-radar, and communication module, are thrown at the target point (Figure 8) to collect the casualty's position and respiration signal. The information is processed by the Stm32 controller and sent to the air communication device in data packets over LoRa. A ground-to-air communication relay is used to extend the perception range. The ground-station processing terminal is responsible for visualizing the life and location information of the wounded, big-data analysis, injury-level assessment, and other operations, providing effective information for searchers.
The air communication device is carried on the UAV. It links the life-sensing device and the ground-station processing terminal, using LoRa and radio to transmit data. The life-sensing device raises an alarm autonomously after detecting a life signal and simultaneously transmits the life information of the wounded to the airborne communication terminal, whose wireless transmitter then forwards the information to the ground-station processing terminal; a hypothetical packet layout is sketched below. A physical diagram of the air communication device is shown in Figure 9.
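The on-air packet format is defined by the Stm32 firmware and is not given in the paper; the sketch below shows one plausible fixed-size layout carrying a node ID, position, respiration rate, and alive flag, purely as an assumption.

```python
import struct

# Hypothetical fixed-size report packet for the life-sensing device:
# node id (uint8), latitude (double), longitude (double),
# breaths per minute (float), alive flag (uint8); little-endian.
PKT_FMT = "<BddfB"

def pack_report(node_id, lat, lon, resp_rate, alive):
    return struct.pack(PKT_FMT, node_id, lat, lon, resp_rate, alive)

def unpack_report(payload):
    node_id, lat, lon, resp_rate, alive = struct.unpack(PKT_FMT, payload)
    return {"node": node_id, "lat": lat, "lon": lon,
            "resp_rate": resp_rate, "alive": bool(alive)}

# Example round trip
pkt = pack_report(1, 34.2655, 108.9542, 15.2, True)
print(unpack_report(pkt))
```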

3. Experiments and Discussion

To verify the actual performance of the UAV swarm, the experiment was divided into four parts:
(1)
Multi-UAV cooperation: the communication distance of the ad hoc network and task coordination based on functional differences were mainly tested.
(2)
Human target detection: the accuracy and speed of the YOLOv4-tiny object recognition and detection algorithm were tested.
(3)
Human target reconfirmation: the accuracy of the sensing device in acquiring the respiration signal of human targets was tested and analysed.
(4)
Medical emergency through the emergency UAV.

3.1. Multi-UAV Cooperation

First, we tested the communication distance of the LoRa ad hoc network: the maximum communication distance of a single node was measured in a relatively open outdoor environment by varying the distance between the LoRa transmitting and receiving ends. Signal strength and packet loss were tested at 1000 m, 1500 m, and 1600 m: communication was stable at 1000 m without data loss, the packet loss rate was 5% at 1500 m, and it exceeded 50% at 1600 m.
Second, the communication distance between the air communication device and the ground station was tested in the same way, and the link remained stable up to 1500 m. By combining the two communication modules, the communication range is effectively extended with stable quality. In subsequent testing, the communication protocol will be improved to enhance the anti-jamming performance of the system's wireless communication in different environments.
Multi-UAV cooperation comprised two parts: a four-UAV formation and three-UAV cooperation. We conducted an outdoor formation test of four UAVs to verify whether multiple UAVs could respond to ground-workstation commands in real time. As shown in Figure 10a, after receiving the take-off command from the ground workstation, the four UAVs flew to the vicinity of the target area in a "one-line" formation and began the search mission. The test results show that the formation effect is good, the action feedback is accurate and timely, and the planned mission area can be covered.
Then, we carried out a simulated search for outdoor injured people on the playground, with a working group of two search UAVs and one sensing UAV. The two search UAVs sent the simulated position of the wounded to the sensing UAV during the coverage search mission; upon receiving the position, the sensing UAV quickly flew to the point and dropped the respiration-signal perception module. As shown in Figure 10b, the cooperation of the three UAVs was initially realized, although the success rate of transmitting and receiving information (such as the target's position) among them reached only 65%. We analysed the cause as external interference affecting the stability of the communication. In future work, we will improve the communication protocol to increase its stability and continue to improve this success rate.

3.2. Human Target Detection

3.2.1. Experimental Design and Configuration

This experiment mainly simulates the identification and detection of the wounded in an outdoor low-contrast environment. It involves a training platform for the deep learning model and a platform for UAV target identification and detection; the former trains on the data with greater computing power, thereby providing sound training and testing baselines for the airborne system of the latter.
The main experimental process comprises dataset construction, model training, and algorithm deployment. First, 2130 images of targets at different heights and positions were collected, and the wounded targets in these images were labelled. The data were divided into a training set and a test set at a ratio of 7:3 (a split sketch is given below), and the precision and recall on the test set were calculated after training the network. The main parameters of the deep learning platform used for training are shown in Table 1.
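A minimal sketch of the 7:3 split follows, writing Darknet-style train.txt/test.txt image lists; the directory layout and random seed are assumptions for illustration.

```python
import random
from pathlib import Path

# Reproducible 7:3 split of the labelled images, matching the ratio in the text.
images = sorted(Path("dataset/images").glob("*.jpg"))  # placeholder directory
random.Random(42).shuffle(images)
cut = int(0.7 * len(images))
train, test = images[:cut], images[cut:]

# Darknet consumes plain-text lists of image paths for training and testing.
Path("train.txt").write_text("\n".join(map(str, train)))
Path("test.txt").write_text("\n".join(map(str, test)))
```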

3.2.2. Results and Analysis

Based on the above dataset and experimental configuration, the dataset was preprocessed by operations such as labelling and normalization; an initial learning rate of 0.001 and a decay parameter of 0.0005 gave the best training effect. Using mean average precision at an IoU threshold of 50% (mAP50) and detection speed in frames per second (FPS) as evaluation indicators, we compared the detection performance of the original YOLOv4-tiny, the clustered YOLOv4-tiny, and YOLOv4-tiny after clustering and TensorRT acceleration.
As shown in Table 2, compared with the original YOLOv4-tiny, the detection accuracy of YOLOv4-tiny after clustering the anchor boxes with the k-means algorithm increased by 23.69 percentage points, while the detection speed remained unchanged. The clustered YOLOv4-tiny uses the pre-clustered anchor box sizes when predicting targets, which improves the detection effect. After TensorRT acceleration, the detection accuracy of the clustered YOLOv4-tiny decreased by 3.86 percentage points while the detection speed increased by 171%. Compared with the original YOLOv4-tiny, the detection speed was thus improved while maintaining a higher detection accuracy. The results show that the algorithm detects low-contrast targets well from heights below 70 m; as shown in Figure 11, it can accurately detect the wounded in trenches and under weed cover and can meet the needs of UAV target detection.

3.3. Human Target Reconfirmation

After a suspected human target has been detected by the UAV camera, the next necessary mission is to confirm whether it is a living human body and to sense the corresponding physiological state.
Therefore, once the video detects a suspected target, the drone is triggered and the bio-radar module is precisely dropped near the human target. Because the electromagnetic waves emitted by the bio-radar can detect the regular chest-wall movement caused by human respiration, respiratory-characteristic analysis of the radar echoes can determine whether the suspected target is a human body. In addition, previous research by our group showed that after injury (blood loss, heat loss), human respiratory radar echoes exhibit morphologically specific characteristics that can be used to assess the physiological state of survivors. Whether the survivor's physical state is good, bad, or critical (dying) directly represents the urgency with which the survivor needs treatment.
To test the effectiveness of detecting human respiration signals with a thrown bio-radar, we tested its detection distance and angle separately. Ideally, it is assumed that the radar can be thrown accurately to a position 0.5 m in front of the human chest. The effective detection distance for the respiration signal was determined by increasing the distance in 0.5 m steps. Then, with the human target at the centre and 0 degrees defined as facing the chest, the detection effect at different viewing angles to the chest wall (0, 45, 90, 135, and 180 degrees) was tested at 0.5 m. Finally, the detection effect at different angles was tested at different distances.
As in the respiratory detection result of a typical example (0 degrees, 2 m) shown in Figure 12, a strong and regular respiratory signal was clearly acquired, and only weak, chaotic noise remained when the human target left, demonstrating that the respiration feature acquired by the bio-radar can convincingly confirm a human presence. Moreover, the statistical results over the different detection positions (angles and distances) show that human respiration can be detected from most viewing angles within a certain distance. Even the detection failure at 90 degrees can be overcome by throwing two or more radar modules or by using our proposed omnidirectional bio-radar integrated module [20].
Furthermore, our previous studies verified that the injured human body passes through four typical physiological stages, including normal, transitional, and agonal stages, and that the bio-radar can effectively detect and distinguish these stages from certain signal features combined with machine learning methods [21]. Consequently, it can help rescuers determine the most appropriate rescue strategy for the emergency UAV swarm and ultimately help save more lives.

3.4. Medical Emergencies through the Emergency UAV

The degree of urgency and the priority of medical treatment depend mainly on the degree of injury and the physiological state. These medical emergency UAVs therefore perform the appropriate treatment mission according to the physiological-state evaluation derived from the detected respiratory radar signal. As shown in Figure 13, this type of UAV is equipped with a release switch and an outdoor first-aid kit for autonomously dropping medical supplies. Bandages and haemostatic drugs can give the injured person timely treatment for trauma, while quick-acting rescue pills, cold medicine, and nitroglycerine can relieve certain sudden acute symptoms, effectively gaining more rescue time for those in distress.

4. Conclusions

Targeting the challenging mission chain of searching for, sensing, and delivering emergency treatment to injured human targets in a wide-area outdoor jungle environment, a novel cooperative strategy for distributed UAV swarms is proposed. In this way, the multi-UAV network works collaboratively to perform quick search, accurate sensing, and timely medical treatment, providing a strong basis for planning rescue programs, reducing the difficulty of searching for the wounded, and effectively improving search and rescue efficiency. Combining casualty search with technologies such as drones, machine vision, and bio-radar has the potential to transform traditional search and rescue methods. Admittedly, the collaborative search and rescue of the outdoor injured based on distributed drone swarms still needs continuous improvement, for example in the accuracy of physiological-state evaluation; nevertheless, it promises to lay a foundation for the intelligent search and rescue of the injured in the new era.

Author Contributions

Conceptualization, G.L. and J.W.; methodology, M.Z.; software, T.L.; investigation, Y.J. and J.X.; data curation, Z.L.; writing—original draft preparation, Y.C.; writing—review and editing, F.Q.; visualization, Y.C.; supervision, G.L.; funding acquisition, G.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Key Research and Development Program of Shaanxi (2021ZDLGY09-07 and 2022SF-482). The APC was funded by 2021ZDLGY09-07.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

UAV: Unmanned Aerial Vehicle
DL: Deep Learning
CNN: Convolutional Neural Network
YOLO: You Only Look Once
RCNN: Region-CNN
LoRa: Long-Range Radio
GPU: Graphics Processing Unit
LeakyReLU: Leaky Rectified Linear Unit
FPN: Feature Pyramid Network
SSD: Single Shot MultiBox Detector
mAP: Mean Average Precision
FPS: Frames Per Second

References

1. Jayasekera, S.; Hensel, E.; Robinson, R. Feasibility Assessment of Wearable Respiratory Monitors for Ambulatory Inhalation Topography. Int. J. Environ. Res. Public Health 2021, 18, 2990.
2. Yeom, S. Moving People Tracking and False Track Removing with Infrared Thermal Imaging by a Multirotor. Drones 2021, 5, 65.
3. Zhao, W.; Dong, Q.; Zuo, Z. A Method Combining Line Detection and Semantic Segmentation for Power Line Extraction from Unmanned Aerial Vehicle Images. Remote Sens. 2022, 14, 1367.
4. Qi, F.; Zhu, M.; Li, Z.; Lei, T.; Xia, J.; Zhang, L.; Yan, Y.; Wang, J.; Lu, G. Automatic Air-to-Ground Recognition of Outdoor Injured Human Targets Based on UAV Bimodal Information: The Explore Study. Appl. Sci. 2022, 12, 3457.
5. Wang, S.; Han, Y.; Chen, J.; Zhang, Z.; Du, N. A deep-learning-based sea search and rescue algorithm by UAV remote sensing. In Proceedings of the 2018 IEEE CSAA Guidance, Navigation and Control Conference (GNCC), Xiamen, China, 10–12 August 2018; pp. 1–5.
6. Ding, J.; Zhang, J.; Zhan, Z.; Tang, X.; Wang, X. A Precision Efficient Method for Collapsed Building Detection in Post-Earthquake UAV Images Based on the Improved NMS Algorithm and Faster R-CNN. Remote Sens. 2022, 14, 663.
7. Pedersen, C.B.; Nielsen, K.G.; Rosenkrands, K.; Vasegaard, A.E.; Nielsen, P.; El Yafrani, M. A GRASP-Based Approach for Planning UAV-Assisted Search and Rescue Missions. Sensors 2022, 22, 275.
8. Liu, H.; Ge, J.; Wang, Y.; Li, J.; Ding, K.; Zhang, Z.; Guo, Z.; Li, W.; Lan, J. Multi-UAV Optimal Mission Assignment and Path Planning for Disaster Rescue Using Adaptive Genetic Algorithm and Improved Artificial Bee Colony Method. Actuators 2021, 11, 4.
9. Qin, B.; Zhang, D.; Tang, S.; Wang, M. Distributed Grouping Cooperative Dynamic Task Assignment Method of UAV Swarm. Appl. Sci. 2022, 12, 2865.
10. Hildmann, H.; Kovacs, E.; Saffre, F.; Isakovic, A.F. Nature-Inspired Drone Swarming for Real-Time Aerial Data-Collection Under Dynamic Operational Constraints. Drones 2019, 3, 71.
11. Camarillo-Escobedo, R.; Flores, J.L.; Marin-Montoya, P.; García-Torales, G.; Camarillo-Escobedo, J.M. Smart Multi-Sensor System for Remote Air Quality Monitoring Using Unmanned Aerial Vehicle and LoRaWAN. Sensors 2022, 22, 1706.
12. Davoli, L.; Pagliari, E.; Ferrari, G. Hybrid LoRa-IEEE 802.11s Opportunistic Mesh Networking for Flexible UAV Swarming. Drones 2021, 5, 26.
13. Okulski, M.; Ławryńczuk, M. How Much Energy Do We Need to Fly with Greater Agility? Energy Consumption and Performance of an Attitude Stabilization Controller in a Quadcopter Drone: A Modified MPC vs. PID. Energies 2022, 15, 1380.
14. Ren, S.; He, K.; Girshick, R.; Sun, J. Faster R-CNN: Towards real-time object detection with region proposal networks. IEEE Trans. Pattern Anal. Mach. Intell. 2017, 39, 1137–1149.
15. Liu, W.; Anguelov, D.; Erhan, D.; Szegedy, C.; Reed, S.; Fu, C.Y.; Berg, A.C. SSD: Single shot multibox detector. In Proceedings of the European Conference on Computer Vision, Amsterdam, The Netherlands, 8–16 October 2016; pp. 21–37.
16. Redmon, J.; Divvala, S.; Girshick, R.; Farhadi, A. You only look once: Unified, real-time object detection. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 26 June–1 July 2016; pp. 779–788.
17. Li, X.; He, B.; Ding, K.; Guo, W.; Huang, B.; Wu, L. Wide-Area and Real-Time Object Search System of UAV. Remote Sens. 2022, 14, 1234.
18. Kathuria, N.; Seet, B.-C. 24 GHz Flexible Antenna for Doppler Radar-Based Human Vital Signs Monitoring. Sensors 2021, 21, 3737.
19. Gouveia, C.; Vieira, J.; Pinho, P. A Review on Methods for Random Motion Detection and Compensation in Bio-Radar Systems. Sensors 2019, 19, 604.
20. Li, C.; Chen, F.; Qi, F.; Liu, M.; Li, Z.; Liang, F.; Jing, X.; Lu, G.; Wang, J. Searching for survivors through random human-body movement outdoors by continuous-wave radar array. PLoS ONE 2016, 11, e0152201.
21. Ma, Y.; Wang, P.; Xue, H.; Liang, F.; Qi, F.; Lev, H.; Yu, X.; Wang, J.; Zhang, Y. Non-contact vital states identification of trapped living bodies using ultra-wideband bio-radar. IEEE Access 2020, 9, 6550–6559.
Figure 1. UAV swarm’s collaboration system.
Figure 2. Collaboration between UAVs with different functions.
Figure 3. UAV swarms for carrying real-time target detection and sensing systems.
Figure 4. Main configuration of the outdoor first-aid kit.
Figure 5. Schematic diagram of UAV swarms.
Figure 6. YOLOv4-tiny feature structure diagram.
Figure 7. Schematic diagram of sensing life.
Figure 8. Design of the life-sensing device: (a) Stm32F103; (b) Positioning module; (c) LoRa; (d) Bio-radar; (e) System power switch.
Figure 9. Design of the air communication device on the UAV: (a) Nvidia Jetson Nano; (b) Usb-ttl; (c) Usb-ttl; (d) Radio module; (e) LoRa; (f) Stm32F103; (g) Battery.
Figure 10. (a) Real-time formation test of four UAVs; (b) Target detection experiment based on UAV swarms with different functions.
Figure 11. Human target detection: (a) Human target in the weed cover; (b) Human target in the gully.
Figure 12. Respiration signal detection: (a) Different distances and angles; (b) Respiration signal waveform.
Figure 13. Emergency UAV for emergency supply.
Table 1. Key Parameters of the System Setup.

Parameter | Configuration
CPU | Intel i7-11800H
GPU | Nvidia RTX 3050
System | Windows 10 / Ubuntu 20.04
Accelerate environment | CUDA 11.4, cuDNN 8.2
Training framework | Darknet
Table 2. Comparison of experimental results.

Model | mAP50 | FPS
YOLOv4-tiny (original) | 67.38% | 14
Clustered YOLOv4-tiny | 91.07% | 14
YOLOv4-tiny after clustering and TensorRT acceleration | 87.21% | 38
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
