EffFeu Project: Towards Mission-Guided Application of Drones in Safety and Security Environments
Abstract
1. Introduction
2. Project
3. Architecture
4. UAS System Details and Results
4.1. Object Recognition
4.1.1. Architecture of Object Recognition Module
- Pyramid net: augments a standard convolutional backbone with a top-down pathway and lateral connections, so that the network efficiently constructs a rich, multi-scale feature pyramid from a single-resolution input image. The LateralBlock [24], ThreeWayBlock [25], and TransferConnectionBlock [26] variants have been implemented.
- Anchor boxes: associate every feature-map cell with a set of default bounding boxes of different dimensions and aspect ratios. These are chosen manually for each application.
- Post-processing: merges the detections from all pyramid levels and applies non-maximum suppression to yield the final results.
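The anchor-box tiling and non-maximum suppression steps above can be sketched in plain Python. This is an illustrative sketch, not the project's implementation; the function names, stride, scales, and IoU threshold are assumed example values.

```python
from math import sqrt

def make_anchors(fm_h, fm_w, stride, scales, ratios):
    """Tile default boxes over every cell of one feature map.

    Each cell centre gets one box per (scale, ratio) pair; boxes are
    returned as (x1, y1, x2, y2) in input-image pixel coordinates.
    """
    boxes = []
    for i in range(fm_h):
        for j in range(fm_w):
            cx, cy = (j + 0.5) * stride, (i + 0.5) * stride
            for s in scales:
                for r in ratios:
                    w, h = s * sqrt(r), s / sqrt(r)
                    boxes.append((cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2))
    return boxes

def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def nms(detections, iou_thresh=0.5):
    """Greedy non-maximum suppression.

    detections: list of (box, score). Keeps the highest-scoring box and
    drops every remaining box that overlaps a kept box by > iou_thresh.
    """
    dets = sorted(detections, key=lambda d: d[1], reverse=True)
    kept = []
    for box, score in dets:
        if all(iou(box, k[0]) <= iou_thresh for k in kept):
            kept.append((box, score))
    return kept
```

In a single-shot detector one such anchor grid is typically generated per pyramid level (smaller strides for small objects), and NMS is applied once to the merged, score-filtered detections from all levels.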
4.1.2. Implementation and Experiments
4.1.3. Results on Drone Dataset
4.2. Localisation and Navigation in the Transition of Indoor and Outdoor Environments
4.2.1. Experiments
4.2.2. Discussion
4.3. Mission-Guided Control
4.3.1. Decision-Making and Planning
4.3.2. End-User and UI Integration
4.3.3. Milestone Scenario
4.3.4. Evaluation: Dynamic Scenario
5. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
- Skorput, P.; Mandzuka, S.; Vojvodic, H. The use of Unmanned Aerial Vehicles for forest fire monitoring. In Proceedings of the 2016 International Symposium ELMAR, Zadar, Croatia, 12–14 September 2016; pp. 93–96.
- Twidwell, D.; Allen, C.R.; Detweiler, C.; Higgins, J.; Laney, C.; Elbaum, S. Smokey comes of age: Unmanned aerial systems for fire management. Front. Ecol. Environ. 2016, 14, 333–339.
- Meiboom, M.; Andert, F.; Batzdorfer, S.; Schulz, H.; Inninger, W.; Rieser, A. Untersuchungen zum Einsatz von UAVs bei der Lawinenrettung; Deutsche Gesellschaft für Luft- und Raumfahrt-Lilienthal-Oberth eV: Bonn, Germany, 2014.
- De Cubber, G.; Doroftei, D.; Serrano, D.; Chintamani, K.; Sabino, R.; Ourevitch, S. The EU-ICARUS project: Developing assistive robotic tools for search and rescue operations. In Proceedings of the 2013 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), Linkoping, Sweden, 21–26 October 2013; pp. 1–4.
- Kuntze, H.B.; Frey, C.W.; Tchouchenkov, I.; Staehle, B.; Rome, E.; Pfeiffer, K.; Wenzel, A.; Wöllenstein, J. SENEKA-sensor network with mobile robots for disaster management. In Proceedings of the 2012 IEEE Conference on Technologies for Homeland Security (HST), Waltham, MA, USA, 13–15 November 2012; pp. 406–410.
- Daniel, K.; Dusza, B.; Lewandowski, A.; Wietfeld, C. AirShield: A system-of-systems MUAV remote sensing architecture for disaster response. In Proceedings of the 2009 3rd Annual IEEE Systems Conference, Vancouver, BC, Canada, 23–26 March 2009; pp. 196–200.
- Mekki, S.; Kamoun, M. ANCHORS, an UAV Assisted Integrated Approach to Crisis Management. 2014. Available online: www.agence-nationale-recherche.fr/fileadmin/documents/2014/wisg/actes/ANCHORS.pdf (accessed on 10 April 2016).
- Ramchurn, S.D.; Wu, F.; Jiang, W.; Fischer, J.E.; Reece, S.; Roberts, S.; Rodden, T.; Greenhalgh, C.; Jennings, N.R. Human–Agent collaboration for disaster response. Auton. Agents Multi-Agent Syst. 2016, 30, 82–111.
- Hrabia, C.E.; Hessler, A.; Xu, Y.; Brehmer, J.; Albayrak, S. EffFeu Project: Efficient Operation of Unmanned Aerial Vehicles for Industrial Fire Fighters. In Proceedings of the 4th ACM Workshop on Micro Aerial Vehicle Networks, Systems, and Applications, Munich, Germany, 15–18 June 2018; ACM: New York, NY, USA, 2018; pp. 33–38.
- He, K.; Gkioxari, G.; Dollár, P.; Girshick, R.B. Mask R-CNN. arXiv, 2017; arXiv:1703.06870.
- Lin, T.; Maire, M.; Belongie, S.J.; Bourdev, L.D.; Girshick, R.B.; Hays, J.; Perona, P.; Ramanan, D.; Dollár, P.; Zitnick, C.L. Microsoft COCO: Common Objects in Context. arXiv, 2014; arXiv:1405.0312.
- Redmon, J.; Farhadi, A. YOLO9000: Better, Faster, Stronger. arXiv, 2016; arXiv:1612.08242.
- Liu, W.; Anguelov, D.; Erhan, D.; Szegedy, C.; Reed, S.E.; Fu, C.; Berg, A.C. SSD: Single Shot MultiBox Detector. arXiv, 2015; arXiv:1512.02325.
- Du, D.; Qi, Y.; Yu, H.; Yang, Y.; Duan, K.; Li, G.; Zhang, W.; Huang, Q.; Tian, Q. The Unmanned Aerial Vehicle Benchmark: Object Detection and Tracking. In Computer Vision—ECCV 2018; Ferrari, V., Hebert, M., Sminchisescu, C., Weiss, Y., Eds.; Springer International Publishing: Cham, Switzerland, 2018; pp. 375–391.
- Zhu, P.; Wen, L.; Bian, X.; Haibin, L.; Hu, Q. Vision Meets Drones: A Challenge. arXiv, 2018; arXiv:1804.07437.
- Jia, Y.; Shelhamer, E.; Donahue, J.; Karayev, S.; Long, J.; Girshick, R.; Guadarrama, S.; Darrell, T. Caffe: Convolutional Architecture for Fast Feature Embedding. arXiv, 2014; arXiv:1408.5093.
- Facebook. PyTorch. 2018. Available online: http://pytorch.org/ (accessed on 6 November 2017).
- Lin, T.; Goyal, P.; Girshick, R.B.; He, K.; Dollár, P. Focal Loss for Dense Object Detection. arXiv, 2017; arXiv:1708.02002.
- Simonyan, K.; Zisserman, A. Very Deep Convolutional Networks for Large-Scale Image Recognition. arXiv, 2014; arXiv:1409.1556.
- He, K.; Zhang, X.; Ren, S.; Sun, J. Deep Residual Learning for Image Recognition. arXiv, 2015; arXiv:1512.03385.
- Xie, S.; Girshick, R.B.; Dollár, P.; Tu, Z.; He, K. Aggregated Residual Transformations for Deep Neural Networks. arXiv, 2016; arXiv:1611.05431.
- Hu, J.; Shen, L.; Sun, G. Squeeze-and-Excitation Networks. arXiv, 2018; arXiv:1709.01507.
- Huang, J.; Rathod, V.; Sun, C.; Zhu, M.; Korattikara, A.; Fathi, A.; Fischer, I.; Wojna, Z.; Song, Y.; Guadarrama, S.; et al. Speed/accuracy trade-offs for modern convolutional object detectors. arXiv, 2016; arXiv:1611.10012.
- Lin, T.; Dollár, P.; Girshick, R.B.; He, K.; Hariharan, B.; Belongie, S.J. Feature Pyramid Networks for Object Detection. arXiv, 2016; arXiv:1612.03144.
- Lee, K.; Choi, J.; Jeong, J.; Kwak, N. Residual Features and Unified Prediction Network for Single Stage Detection. arXiv, 2017; arXiv:1707.05031.
- Zhang, S.; Wen, L.; Bian, X.; Lei, Z.; Li, S.Z. Single-Shot Refinement Neural Network for Object Detection. arXiv, 2017; arXiv:1711.06897.
- Dai, J.; Qi, H.; Xiong, Y.; Li, Y.; Zhang, G.; Hu, H.; Wei, Y. Deformable Convolutional Networks. arXiv, 2017; arXiv:1703.06211.
- Liu, S.; Huang, D.; Wang, Y. Receptive Field Block Net for Accurate and Fast Object Detection. arXiv, 2017; arXiv:1711.07767.
- Loshchilov, I.; Hutter, F. Fixing Weight Decay Regularization in Adam. arXiv, 2017; arXiv:1711.05101.
- Micikevicius, P.; Narang, S.; Alben, J.; Diamos, G.F.; Elsen, E.; García, D.; Ginsburg, B.; Houston, M.; Kuchaiev, O.; Venkatesh, G.; et al. Mixed Precision Training. arXiv, 2017; arXiv:1710.03740.
- Hrabia, C.E.; Berger, M.; Hessler, A.; Wypler, S.; Brehmer, J.; Matern, S.; Albayrak, S. An autonomous companion UAV for the SpaceBot Cup competition 2015. In Robot Operating System (ROS)—The Complete Reference (Volume 2); Springer International Publishing: Berlin/Heidelberg, Germany, 2017.
- Mur-Artal, R.; Montiel, J.M.M.; Tardós, J.D. ORB-SLAM: A Versatile and Accurate Monocular SLAM System. IEEE Trans. Robot. 2015, 31, 1147–1163.
- Mur-Artal, R.; Tardós, J.D. ORB-SLAM2: An Open-Source SLAM System for Monocular, Stereo and RGB-D Cameras. IEEE Trans. Robot. 2017, 33, 1255–1262.
- Lucas, B.D.; Kanade, T. An Iterative Image Registration Technique with an Application to Stereo Vision (DARPA). In Proceedings of the 1981 DARPA Image Understanding Workshop, Vancouver, BC, Canada, 24–28 August 1981; pp. 121–130.
- Echeverria, G.; Lassabe, N.; Degroote, A.; Lemaignan, S. Modular open robots simulation engine: MORSE. In Proceedings of the 2011 IEEE International Conference on Robotics and Automation (ICRA), Shanghai, China, 9–13 May 2011; pp. 46–51.
- Carpin, S. Fast and accurate map merging for multi-robot systems. Auton. Robot. 2008, 25, 305–316.
- Martínez, J.L.; González, J.; Morales, J.; Mandow, A.; García-Cerezo, A.J. Genetic and ICP laser point matching for 2D mobile robot motion estimation. J. Field Robot. 2006, 23, 21–34.
- Arun, K.S.; Huang, T.S.; Blostein, S.D. Least-Squares Fitting of Two 3-D Point Sets. IEEE Trans. Pattern Anal. Mach. Intell. 1987, 9, 698–700.
- Hrabia, C.E.; Wypler, S.; Albayrak, S. Towards Goal-driven Behaviour Control of Multi-Robot Systems. In Proceedings of the 2017 3rd International Conference on Control, Automation and Robotics (ICCAR), Nagoya, Japan, 24–26 April 2017; pp. 166–173.
- Hoffmann, J. The Metric-FF Planning System: Translating "Ignoring Delete Lists" to Numeric State Variables. J. Artif. Intell. Res. 2003, 20, 291–341.
- Hrabia, C.E.; Kaiser, T.K.; Albayrak, S. Combining self-organisation with decision-making and planning. In Multi-Agent Systems and Agreement Technologies; Springer: Berlin/Heidelberg, Germany, 2017; pp. 385–399.
- Hrabia, C.E.; Lehmann, P.M.; Battjbuer, N.; Hessler, A.; Albayrak, S. Applying robotic frameworks in a simulated multi-agent contest. Ann. Math. Artif. Intell. 2018.
- Krakowczyk, D.; Wolff, J.; Ciobanu, A.; Meyer, D.J.; Hrabia, C.E. Developing a Distributed Drone Delivery System with a Hybrid Behavior Planning System. In Proceedings of the Joint German/Austrian Conference on Artificial Intelligence (Künstliche Intelligenz), Berlin, Germany, 24–28 September 2018; Springer: Berlin/Heidelberg, Germany, 2018; pp. 107–114.
Backbone Net | No. of Parameters | Inference Time | mAP@0.5 | mAP@0.5:0.95 |
---|---|---|---|---|
mobilenet v2 | 12 M | 110 ms | 0.4884 | 0.2986 |
vgg16 | 47 M | 200 ms | 0.4911 | 0.3094 |
resnet50 | 47 M | 160 ms | 0.5211 | 0.3346 |
resnet101 | 72 M | 180 ms | 0.5343 | 0.3476 |
resnext101_32x4d | 72 M | 210 ms | 0.5348 | 0.3501 |
se_resnext50_32x4d | 56 M | 200 ms | 0.5257 | 0.3334 |
se_resnext101_32x4d | 76 M | 240 ms | 0.5405 | 0.3553 |
senet154 | 142 M | 370 ms | 0.5461 | 0.3619 |
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Hrabia, C.-E.; Hessler, A.; Xu, Y.; Seibert, J.; Brehmer, J.; Albayrak, S. EffFeu Project: Towards Mission-Guided Application of Drones in Safety and Security Environments. Sensors 2019, 19, 973. https://doi.org/10.3390/s19040973