FFAU—Framework for Fully Autonomous UAVs
Abstract
1. Introduction
2. Framework for Fully Autonomous UAVs
2.1. Perception
- Camera Node: Receives data from the camera and publishes it in a sensor image message format. This may seem like a rather simple node, but it can be extremely complex: ROS passes images in its own message format, and many developers bridge to image processing libraries such as OpenCV [34] (see the sketch after this list);
- Odometry Node: Estimates the UAV position relative to its starting point. This can be done using the UAV motion sensor data, performing estimations via visual odometry, or using a fusion algorithm that combines both. The data from this node is published as navigation odometry messages;
- IMU Node: The Inertial Measurement Unit (IMU) node is responsible for handling the IMU sensors (accelerometer, gyroscope, magnetometer, and barometer) and periodically publishes IMU sensor messages to the ROS network;
- GNSS Node: Obtains data from the Global Navigation Satellite System (GNSS) and periodically publishes navigation messages on the ROS network.
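As a minimal sketch of the first node in this list, the following Python snippet grabs frames with OpenCV and republishes them as ROS sensor image messages via cv_bridge. The topic name, device index, and rate are illustrative assumptions, not the framework's actual configuration.

```python
#!/usr/bin/env python
# Minimal Camera Node sketch: grab frames with OpenCV and
# republish them as sensor_msgs/Image on the ROS network.
import cv2
import rospy
from cv_bridge import CvBridge
from sensor_msgs.msg import Image

def camera_node(device=0, rate_hz=30):
    rospy.init_node("camera_node")
    pub = rospy.Publisher("camera/image_raw", Image, queue_size=1)
    bridge = CvBridge()            # converts OpenCV images to ROS messages
    cap = cv2.VideoCapture(device)
    rate = rospy.Rate(rate_hz)
    while not rospy.is_shutdown():
        ok, frame = cap.read()
        if ok:
            msg = bridge.cv2_to_imgmsg(frame, encoding="bgr8")
            msg.header.stamp = rospy.Time.now()  # timestamp for consumers
            pub.publish(msg)
        rate.sleep()

if __name__ == "__main__":
    camera_node()
```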
2.2. Collision Aware Planner
2.3. Plan Handler
2.4. Command Multiplexer
2.5. Dynamic Collision Avoidance
2.6. Communication Handler
3. Collision Avoidance
- Static collision—Represents collisions between the UAV and any obstacle that moves considerably slower than the UAV. Taking the world as the reference frame, an object is considered to produce a static collision if it is moving slower than 5% of the UAV's maximum speed;
- Dynamic collision—Represents collisions between the UAV and any obstacle that moves too fast for the point-cloud computation to allow the path planner to plan a safe avoidance path. Taking the world as the reference frame, an object is considered to produce a dynamic collision if it is moving faster than 5% of the UAV's maximum speed. A simple encoding of this threshold rule is sketched after this list.
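The 5% rule above translates directly into code. This is a trivial sketch, assuming obstacle speed is already estimated in the world reference frame; the function name is illustrative.

```python
def collision_class(obstacle_speed, uav_max_speed, threshold=0.05):
    """Classify an observed obstacle as producing a 'static' or
    'dynamic' collision, per the 5% of maximum-speed rule."""
    return "static" if obstacle_speed < threshold * uav_max_speed else "dynamic"
```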
3.1. Static Collision Avoidance
3.2. Dynamic Collision Avoidance
3.2.1. Feature Extraction
3.2.2. Temporal Correlation and Decision
Algorithm 1: Dynamic Collision Avoidance—processing the latest video frame.
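As a hedged illustration of the pipeline described in Sections 3.2.1 and 3.2.2, the sketch below pairs a MobileNetV2 feature extractor [115] with an RNN that correlates a sequence of SEQ per-frame feature vectors into a single collision decision. The GRU choice, layer sizes, and input resolution are assumptions for the sketch, not the authors' exact architecture.

```python
import tensorflow as tf

SEQ = 25             # sequence length used in the paper (Section 3.2.3)
IMG = (224, 224, 3)  # assumed input resolution

# Per-frame feature extraction: MobileNetV2 pretrained on ImageNet,
# with global average pooling so each frame becomes one feature vector.
backbone = tf.keras.applications.MobileNetV2(
    input_shape=IMG, include_top=False, weights="imagenet", pooling="avg")
backbone.trainable = False  # transfer learning: freeze first, fine-tune later

model = tf.keras.Sequential([
    tf.keras.layers.InputLayer(input_shape=(SEQ,) + IMG),
    tf.keras.layers.TimeDistributed(backbone),       # (SEQ, 1280) features
    tf.keras.layers.GRU(64),                         # temporal correlation
    tf.keras.layers.Dense(1, activation="sigmoid"),  # collision decision
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
              loss="binary_crossentropy", metrics=["accuracy"])
```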
3.2.3. Training and Results
- The input data must be an array of length SEQ. A value of 25 was used in this paper; however, any value between 20 and 50 yielded similar results;
- The generated sequences must only contain frames from a single video. Working with video data on GPUs is not a trivial task, and generating video sequences adds overhead. The dataset is seen by the model as a continuous stream of data, and this constraint must be enforced to avoid having the model learn jumps between videos (false knowledge);
- The target label of the last frame is the target for the entire sequence. A generator enforcing these constraints is sketched after this list.
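The three constraints above can be enforced with a simple per-video generator. This is a sketch under the assumption that frames and labels are provided as per-video arrays; the helper name is illustrative.

```python
import numpy as np

SEQ = 25  # sequence length; values between 20 and 50 gave similar results

def make_sequences(video_frames, video_labels):
    """Yield (sequence, target) pairs from a single video, so that no
    sequence ever spans a jump between two different videos."""
    for start in range(len(video_frames) - SEQ + 1):
        seq = video_frames[start:start + SEQ]
        # the label of the last frame is the target for the whole sequence
        yield np.asarray(seq), video_labels[start + SEQ - 1]
```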
4. Beyond Skyline
- Communication in real-time with the UAVs, receiving telemetry, video data, and transmitting commands;
- Actuate over these UAVs by uploading missions, with designated waypoints;
- Orchestrate swarm missions with multiple UAVs;
- Collect and analyze data produced by several UAVs during each flight, with the possibility of replaying a given flight/mission;
- Run computationally heavy Artificial Intelligence (AI) algorithms in parallel, facilitating user decisions.
5. Field Test Results
1. The client, from the WA, sends a counter number and the name of the destination UAV to the RTT topic. The timestamp associated with this message is saved;
2. The BE receives the message, validates its structure and permissions, and sends the number to the UAV. Furthermore, a timestamp is added to the message;
3. The UAV receives the message;
4. The UAV re-sends a copy of the received message;
5. The BE receives this message and sends it to all connected clients that have permission to receive information from this UAV. Additionally, it calculates the RTT between the server and the UAV using the timestamp registered in step (2);
6. The client receives this message, verifies its value, creates a new timestamp from which it subtracts the one saved in step (1), and calculates the RTT. It then increments the counter and repeats from step (1). A client-side sketch of this loop follows the list.
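A minimal sketch of the client side of this loop, assuming a WebSocket connection to the BE; the URL, topic name, and message fields are illustrative assumptions, not the platform's actual protocol.

```python
import json
import time
from websocket import create_connection  # pip install websocket-client

def measure_rtt(ws_url, uav_name, samples=100):
    """Client side of the RTT loop: send a counter on the RTT topic,
    wait for the echoed copy, and time the round trip."""
    ws = create_connection(ws_url)
    rtts = []
    for counter in range(samples):
        sent_at = time.monotonic()  # timestamp saved in step (1)
        ws.send(json.dumps({"topic": "rtt", "uav": uav_name,
                            "counter": counter}))
        reply = json.loads(ws.recv())            # copy echoed by the UAV
        if reply.get("counter") == counter:      # verify its value
            rtts.append((time.monotonic() - sent_at) * 1000.0)  # RTT in ms
    ws.close()
    return rtts
```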
Discussion of Results
6. Conclusions and Future Work
- Optimized DCA. The DCA module can be explored in greater depth, as this area still has many unsolved problems. The dataset must be enlarged, and the proposed algorithm needs to be optimized to run faster. The concept can be improved by exploring different feature extractors, variations of the sequence size with which the RNN is run, and different types of RNN;
- Testing the DCA algorithm on real UAVs in autonomous missions;
- Edge multi-tenant computing. Whenever a UAV is flying in a different country, the BE and WebRTC server should be instantiated in its proximity, minimizing the RTT and providing better control of the UAVs;
- Framework modules variants. Different implementations of the proposed framework should be developed, allowing a performance evaluation and comparison.
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
- Patias, P. Introduction to Unmanned Aircraft Systems. Photogramm. Eng. Remote Sens. 2016, 82, 89–92.
- Koubaa, A.; Qureshi, B.; Sriti, M.F.; Javed, Y.; Tovar, E. A service-oriented Cloud-based management system for the Internet-of-Drones. In Proceedings of the 2017 IEEE International Conference on Autonomous Robot Systems and Competitions, ICARSC 2017, Coimbra, Portugal, 26–28 April 2017.
- Alvarez-Vanhard, E.; Houet, T.; Mony, C.; Lecoq, L.; Corpetti, T. Can UAVs fill the gap between in situ surveys and satellites for habitat mapping? Remote Sens. Environ. 2020, 243, 111780.
- Navarro, A.; Young, M.; Allan, B.; Carnell, P.; Macreadie, P.; Ierodiaconou, D. The application of Unmanned Aerial Vehicles (UAVs) to estimate above-ground biomass of mangrove ecosystems. Remote Sens. Environ. 2020, 242, 111747.
- Shakhatreh, H.; Sawalmeh, A.H.; Al-Fuqaha, A.; Dou, Z.; Almaita, E.; Khalil, I.; Othman, N.S.; Khreishah, A.; Guizani, M. Unmanned Aerial Vehicles (UAVs): A Survey on Civil Applications and Key Research Challenges. IEEE Access 2019, 7, 48572–48634.
- Weibel, R.E.; Hansman, R.J. Safety Considerations for Operation of Unmanned Aerial Vehicles in the National Airspace System; MIT International Center for Air Transportation: Cambridge, MA, USA, 2005.
- Zhong, Y.; Hu, X.; Luo, C.; Wang, X.; Zhao, J.; Zhang, L. WHU-Hi: UAV-borne hyperspectral with high spatial resolution (H2) benchmark datasets and classifier for precise crop identification based on deep convolutional neural network with CRF. Remote Sens. Environ. 2020, 250, 112012.
- Meinen, B.U.; Robinson, D.T. Mapping erosion and deposition in an agricultural landscape: Optimization of UAV image acquisition schemes for SfM-MVS. Remote Sens. Environ. 2020, 239, 111666.
- Bhardwaj, A.; Sam, L.; Akanksha; Martín-Torres, F.J.; Kumar, R. UAVs as remote sensing platform in glaciology: Present applications and future prospects. Remote Sens. Environ. 2016, 175, 196–204.
- Yao, H.; Qin, R.; Chen, X. Unmanned Aerial Vehicle for Remote Sensing Applications—A Review. Remote Sens. 2019, 11, 1443.
- Gerhards, M.; Schlerf, M.; Mallick, K.; Udelhoven, T. Challenges and Future Perspectives of Multi-/Hyperspectral Thermal Infrared Remote Sensing for Crop Water-Stress Detection: A Review. Remote Sens. 2019, 11, 1240.
- Messina, G.; Modica, G. Applications of UAV thermal imagery in precision agriculture: State of the art and future research outlook. Remote Sens. 2020, 12, 1491.
- Gaffey, C.; Bhardwaj, A. Applications of unmanned aerial vehicles in cryosphere: Latest advances and prospects. Remote Sens. 2020, 12, 948.
- Rödel, C.; Stadler, S.; Meschtscherjakov, A.; Tscheligi, M. Towards autonomous cars: The effect of autonomy levels on Acceptance and User Experience. In Proceedings of the Automotive UI 2014—6th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, in Cooperation with ACM SIGCHI, Seattle, WA, USA, 17–19 September 2014.
- Caron, C. After Drone Hits Plane in Canada, New Fears About Air Safety. 2017. Available online: https://www.nytimes.com/2017/10/17/world/canada/canada-drone-plane.html (accessed on 19 May 2019).
- BBC. ‘Drone’ Hits British Airways Plane Approaching Heathrow Airport. 2016. Available online: https://www.bbc.com/news/uk-36067591 (accessed on 19 May 2019).
- CBC News. Drone That Struck Plane Near Quebec City Airport Was Breaking the Rules. 2017. Available online: http://www.cbc.ca/news/canada/montreal/garneau-airport-drone-quebec-1.4355792 (accessed on 19 May 2019).
- BBC. Drone Collides with Commercial Aeroplane in Canada. 2017. Available online: https://www.bbc.com/news/technology-41635518 (accessed on 19 May 2019).
- Goglia, J. NTSB Finds Drone Pilot at Fault for Midair Collision with Army Helicopter. 2017. Available online: https://www.forbes.com/sites/johngoglia/2017/12/14/ntsb-finds-drone-pilot-at-fault-for-midair-collision-with-army-helicopter/ (accessed on 19 May 2019).
- Rawlinson, K. Drone Hits Plane at Heathrow Airport, Says Pilot. 2016. Available online: https://www.theguardian.com/uk-news/2016/apr/17/drone-plane-heathrow-airport-british-airways (accessed on 19 May 2019).
- Tellman, J. First-Ever Recorded Drone-Hot Air Balloon Collision Prompts Safety Conversation. 2018. Available online: https://www.postregister.com/news/local/first-ever-recorded-drone-hot-air-balloon-collision-prompts-safety/article_7cc41c24-6025-5aa6-b6dd-6d1ea5e85961.html (accessed on 19 May 2019).
- Pedro, D.; Mora, A.; Carvalho, J.; Azevedo, F.; Fonseca, J. ColANet: A UAV Collision Avoidance Dataset. In Technological Innovation for Life Improvement; Springer: Cham, Switzerland, 2020.
- Gharibi, M.; Boutaba, R.; Waslander, S.L. Internet of Drones. IEEE Access 2016, 4, 1148–1162.
- Apvrille, L.; Tanzi, T.; Dugelay, J.L. Autonomous drones for assisting rescue services within the context of natural disasters. In Proceedings of the 2014 31st URSI General Assembly and Scientific Symposium, URSI GASS 2014, Beijing, China, 16–23 August 2014.
- Mahmoud, S.; Mohamed, N. Collaborative UAVs cloud. In Proceedings of the 2014 International Conference on Unmanned Aircraft Systems, ICUAS 2014, Orlando, FL, USA, 27–30 May 2014.
- Mahmoud, S.; Mohamed, N. Broker architecture for collaborative UAVs cloud computing. In Proceedings of the 2015 International Conference on Collaboration Technologies and Systems, CTS 2015, Atlanta, GA, USA, 1–5 June 2015.
- Mahmoud, S.; Mohamed, N.; Al-Jaroodi, J. Integrating UAVs into the Cloud Using the Concept of the Web of Things. J. Robot. 2015, 2015, 631420.
- MAVLink. Introduction. 2005. Available online: https://mavlink.io/en/ (accessed on 19 May 2019).
- ROS. Powering the World’s Robots. 2007. Available online: https://www.ros.org/ (accessed on 19 May 2019).
- La, H.J.; Kim, S.D. A service-based approach to designing cyber physical systems. In Proceedings of the 9th IEEE/ACIS International Conference on Computer and Information Science, ICIS 2010, Yamagata, Japan, 18–20 August 2010.
- Combe, T.; Martin, A.; Di Pietro, R. To Docker or Not to Docker: A Security Perspective. IEEE Cloud Comput. 2016, 3, 54–62.
- Cloud Native Computing Foundation. What Is Kubernetes—Kubernetes. 2019. Available online: https://kubernetes.io/docs/concepts/overview/what-is-kubernetes/ (accessed on 27 October 2020).
- Acuña, P. Kubernetes. In Deploying Rails with Docker, Kubernetes and ECS; Apress: New York, NY, USA, 2016.
- Bowman, J.; Mihelich, P. Camera Calibration—ROS Wiki. 2014. Available online: http://wiki.ros.org/camera_calibration (accessed on 27 October 2020).
- Kalman, R.E. A New Approach to Linear Filtering and Prediction Problems. Trans. ASME-Basic Eng. 1960, 82, 35–45.
- Turner, D.; Lucieer, A.; Watson, C. An automated technique for generating georectified mosaics from ultra-high resolution Unmanned Aerial Vehicle (UAV) imagery, based on Structure from Motion (SFM) point clouds. Remote Sens. 2012, 4, 1392.
- Keselman, L.; Woodfill, J.I.; Grunnet-Jepsen, A.; Bhowmik, A. Intel(R) RealSense(TM) Stereoscopic Depth Cameras. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, Honolulu, HI, USA, 21–26 July 2017.
- Galceran, E.; Carreras, M. A survey on coverage path planning for robotics. Robot. Auton. Syst. 2013, 61, 1258–1276.
- Hert, S.; Tiwari, S.; Lumelsky, V. A terrain-covering algorithm for an AUV. Auton. Robot. 1996, 3, 91–119.
- Azevedo, F.; Oliveira, A.; Dias, A.; Almeida, J.; Moreira, M.; Santos, T.; Ferreira, A.; Martins, A.; Silva, E. Collision avoidance for safe structure inspection with multirotor UAV. In Proceedings of the 2017 European Conference on Mobile Robots, ECMR 2017, Paris, France, 6–8 September 2017.
- Paul, S. Real-Time Transport Protocol (RTP). In Multicasting on the Internet and Its Applications; Springer: Boston, MA, USA, 1998.
- Marr, D. Visual information processing: The structure and creation of visual representations. Philos. Trans. R. Soc. Lond. Ser. B Biol. Sci. 1980, 290, 199–218.
- Kuchar, J.K.; Yang, L.C. A Review of Conflict Detection and Resolution Modeling Methods. IEEE Trans. Intell. Transp. Syst. 2000, 1, 179–189.
- Kovacs, L. Visual Monocular Obstacle Avoidance for Small Unmanned Vehicles. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, Las Vegas, NV, USA, 26 June–1 July 2016.
- Hrabar, S. An evaluation of stereo and laser-based range sensing for rotorcraft unmanned aerial vehicle obstacle avoidance. J. Field Robot. 2012, 29, 215–239.
- Merz, T.; Kendoul, F. Beyond visual range obstacle avoidance and infrastructure inspection by an autonomous helicopter. In Proceedings of the IEEE International Conference on Intelligent Robots and Systems, San Francisco, CA, USA, 25–30 September 2011.
- Hrabar, S. 3D path planning and stereo-based obstacle avoidance for rotorcraft UAVs. In Proceedings of the 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS, Nice, France, 22–26 September 2008.
- Magree, D.; Mooney, J.G.; Johnson, E.N. Monocular visual mapping for obstacle avoidance on UAVs. J. Intell. Robot. Syst. Theory Appl. 2014, 74, 17–26.
- Yang, Z.; Gao, F.; Shen, S. Real-time monocular dense mapping on aerial robots using visual-inertial fusion. In Proceedings of the IEEE International Conference on Robotics and Automation, Singapore, 29 May–3 June 2017.
- Li, C.J.; Ling, H. Synthetic aperture radar imaging using a small consumer drone. In Proceedings of the IEEE Antennas and Propagation Society, AP-S International Symposium (Digest), Vancouver, BC, Canada, 19 July 2015.
- Hornung, A.; Wurm, K.M.; Bennewitz, M.; Stachniss, C.; Burgard, W. OctoMap: An efficient probabilistic 3D mapping framework based on octrees. Auton. Robot. 2013, 34, 189–206.
- Hermann, A.; Drews, F.; Bauer, J.; Klemm, S.; Roennau, A.; Dillmann, R. Unified GPU voxel collision detection for mobile manipulation planning. In Proceedings of the 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems, Chicago, IL, USA, 14–18 September 2014; pp. 4154–4160.
- Burgard, W.; Bennewitz, M.; Tipaldi, D.; Spinello, L. Introduction to Mobile Robotics: Techniques for 3D Mapping. 2019. Available online: http://ais.informatik.uni-freiburg.de/teaching/ss14/robotics/slides/17-3dmapping.pdf (accessed on 19 May 2019).
- Koenig, N.; Howard, A. Design and use paradigms for Gazebo, an open-source multi-robot simulator. In Proceedings of the 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Sendai, Japan, 28 September–2 October 2004.
- Kavraki, L.E.; Svestka, P.; Latombe, J.; Overmars, M.H. Probabilistic roadmaps for path planning in high-dimensional configuration spaces. IEEE Trans. Robot. Autom. 1996, 12, 566–580.
- LaValle, S.M. Rapidly-Exploring Random Trees: A New Tool for Path Planning; Technical Report; Computer Science Department, Iowa State University: Ames, IA, USA, 1998.
- Hrabar, S. Reactive obstacle avoidance for rotorcraft UAVs. In Proceedings of the IEEE International Conference on Intelligent Robots and Systems, San Francisco, CA, USA, 25–30 September 2011.
- Sabatini, R.; Gardi, A.; Richardson, M.A. LIDAR Obstacle Warning and Avoidance System for Unmanned Aircraft. Int. J. Mech. Aerosp. Ind. Mechatronics Eng. 2014, 8, 718–729.
- Gallup, D.; Frahm, J.M.; Mordohai, P.; Pollefeys, M. Variable baseline/resolution stereo. In Proceedings of the 26th IEEE Conference on Computer Vision and Pattern Recognition, CVPR, Anchorage, AK, USA, 23–28 June 2008.
- Mueggler, E.; Forster, C.; Baumli, N.; Gallego, G.; Scaramuzza, D. Lifetime estimation of events from Dynamic Vision Sensors. In Proceedings of the IEEE International Conference on Robotics and Automation, Seattle, WA, USA, 26–30 May 2015.
- Andrew, A.M. Multiple View Geometry in Computer Vision; Cambridge University Press: Cambridge, UK, 2001.
- Poiesi, F.; Cavallaro, A. Detection of fast incoming objects with a moving camera. In Proceedings of the British Machine Vision Conference, London, UK, 4–7 September 2017.
- Falanga, D.; Kim, S.; Scaramuzza, D. How Fast Is Too Fast? The Role of Perception Latency in High-Speed Sense and Avoid. IEEE Robot. Autom. Lett. 2019, 4, 1884–1891.
- Sandler, M.; Howard, A.; Zhu, M.; Zhmoginov, A.; Chen, L.C. MobileNetV2: Inverted Residuals and Linear Bottlenecks. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–22 June 2018.
- Ma, N.; Zhang, X.; Zheng, H.T.; Sun, J. ShuffleNet V2: Practical Guidelines for Efficient CNN Architecture Design. In Proceedings of the European Conference on Computer Vision (ECCV), Munich, Germany, 8–14 September 2018.
- Howard, A.G.; Zhu, M.; Chen, B.; Kalenichenko, D.; Wang, W.; Weyand, T.; Andreetto, M.; Adam, H. MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications. arXiv 2017, arXiv:1704.04861.
- Boureau, Y.L.; Ponce, J.; Lecun, Y. A theoretical analysis of feature pooling in visual recognition. In Proceedings of the ICML 2010—27th International Conference on Machine Learning, Haifa, Israel, 21–24 June 2010.
- Tran, D.; Bourdev, L.; Fergus, R.; Torresani, L.; Paluri, M. Learning spatiotemporal features with 3D convolutional networks. In Proceedings of the IEEE International Conference on Computer Vision, Santiago, Chile, 7–13 December 2015.
- Shanmugamani, R. Deep Learning for Computer Vision: Expert Techniques to Train Advanced Neural Networks Using TensorFlow and Keras; Packt Publishing Ltd.: Birmingham, UK, 2018.
- Pan, S.J.; Yang, Q. A survey on transfer learning. IEEE Trans. Knowl. Data Eng. 2010, 22, 1345–1359.
- Russakovsky, O.; Deng, J.; Su, H.; Krause, J.; Satheesh, S.; Ma, S.; Huang, Z.; Karpathy, A.; Khosla, A.; Bernstein, M.; et al. ImageNet Large Scale Visual Recognition Challenge. Int. J. Comput. Vis. 2015, 115, 211–252.
- Kingma, D.P.; Ba, J.L. Adam: A method for stochastic optimization. In Proceedings of the 3rd International Conference on Learning Representations, ICLR 2015—Conference Track Proceedings, San Diego, CA, USA, 7–9 May 2015.
- de Vries, S.C. UAVs and Control Delays; TNO Rep.; TNO: The Hague, The Netherlands, 2005.
- Wang, F.; Qi, S.; Li, J. An Analysis of Time-delay for Remote Piloted Vehicle. MATEC Web Conf. 2017, 114, 04012.
- Guo, W.; Devine, C.; Wang, S. Performance analysis of micro unmanned airborne communication relays for cellular networks. In Proceedings of the 2014 9th International Symposium on Communication Systems, Networks and Digital Signal Processing, CSNDSP 2014, Manchester, UK, 23–25 July 2014.
- Popovski, P.; Stefanovic, C.; Nielsen, J.J.; de Carvalho, E.; Angjelichinoski, M.; Trillingsgaard, K.F.; Bana, A.S. Wireless Access in Ultra-Reliable Low-Latency Communication (URLLC). IEEE Trans. Commun. 2019, 67, 5783–5801.
- Burke, P.J. 4G Antipode: Remote Control of a Ground Vehicle From Around the World. IEEE J. Miniaturization Air Space Syst. 2020, early access.
- Itkin, M.; Kim, M.; Park, Y. Development of cloud-based UAV monitoring and management system. Sensors 2016, 16, 1913.
- Dusza, B.; Wietfeld, C. Performance evaluation of IEEE 802.16e mobile WiMAX for long distance control of UAV swarms. In Proceedings of the 2010 IEEE International Conference on Wireless Information Technology and Systems, ICWITS 2010, Honolulu, HI, USA, 28 August–3 September 2010.
| Metrics | MobileNetV2 | MobileNetV2 Fine-Tuned | DCA Model Using | DCA Model Using |
|---|---|---|---|---|
| Training Accuracy | 66.17% | 84.76% | 85.94% | 85.76% |
| Validation Accuracy | 62.02% | 74.18% | 86.43% | 87.14% |
| Country | Location | Day Time | RTT (ms) | UAV-BE (ms) | BE-WA (ms) | σ RTT (ms) | σ UAV-BE (ms) | σ BE-WA (ms) | 99th Perc. (ms) |
|---|---|---|---|---|---|---|---|---|---|
| PT | Lisbon | 09h34 | 86.31 | 72.69 | 13.61 | 18.10 | 17.10 | 5.41 | 124.99 |
| PT | Caparica | 15h20 | 89.69 | 62.75 | 26.95 | 24.46 | 19.67 | 12.93 | 150.00 |
| PT | Porto | 12h57 | 91.63 | 65.39 | 26.24 | 19.81 | 17.76 | 6.36 | 140.20 |
| ES | Segovia | 12h00 | 116.69 | 66.08 | 50.61 | 41.63 | 29.36 | 29.95 | 320.04 |
| UK | London | 12h11 | 134.49 | 65.11 | 69.38 | 19.83 | 16.89 | 10.46 | 184.89 |
| PT | Beja | 12h48 | 135.24 | 67.39 | 67.86 | 32.70 | 21.69 | 23.64 | 204.95 |
| BE | Brussels | 15h50 | 147.08 | 60.82 | 86.26 | 29.32 | 15.96 | 24.90 | 270.31 |
| NO | Oslo | 13h44 | 162.67 | 67.70 | 94.97 | 21.63 | 17.96 | 14.05 | 229.61 |
| DE | Bestwig | 16h58 | 162.90 | 72.01 | 89.65 | 24.63 | 17.54 | 17.40 | 220.35 |
| FR | Nice | 11h21 | 176.92 | 67.24 | 109.53 | 19.92 | 16.78 | 11.64 | 229.81 |
| GR | Corfu | 11h43 | 211.67 | 68.49 | 143.18 | 46.27 | 16.69 | 40.30 | 329.75 |
| AE | Abu Dhabi | 12h39 | 247.00 | 67.60 | 179.41 | 29.10 | 16.78 | 23.23 | 330.54 |
| IT | Pisa | 15h39 | 249.64 | 64.70 | 184.94 | 84.00 | 16.00 | 82.08 | 594.03 |
| US | Los Angeles | 17h20 | 259.86 | 70.92 | 188.94 | 23.21 | 16.39 | 18.01 | 310.21 |
| AU | Sydney | 09h17 | 431.58 | 80.06 | 351.52 | 41.54 | 18.44 | 36.72 | 560.57 |
| Altitude Interval (m) | <20 | 20–40 | 40–60 | 60–80 | 80–100 | >100 |
|---|---|---|---|---|---|---|
| RTT (ms) | 85.75 | 84.73 | 82.27 | 88.80 | 86.92 | 89.32 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).