Situational Awareness and Problems of Its Formation in the Tasks of UAV Behavior Control
Abstract
1. Introduction
2. UAV as a Controllable Dynamic System Operating under Uncertainty Conditions
2.1. Kinds of Controllable Dynamic Systems
2.2. System Behavior and Activity
2.3. Requirements for UAV Behavior Control Algorithms
- planning a flight operation, controlling its execution, promptly adjusting the plan when the situation changes;
- control of UAV motion, both its trajectory motion (including guidance and navigation) and its angular motion;
- control of target tasks (surveillance and reconnaissance equipment operation control, weapons application control, assembly operations control, etc.);
- control of interaction with other aircraft, both unmanned and manned, when performing the task of a group of aircraft;
- control of structural dynamics (for “smart” structures);
- monitoring the “health state” of the UAV and implementation of actions to restore it if necessary (monitoring the state of the UAV structure and its onboard systems, coordination of the UAV onboard systems, reconfiguration of algorithms of the behavior control system in case of equipment failures and damage to the UAV structure).
- able to assess the current situation based on a multifaceted perception of the external and internal environment, be able to form a forecast of the situation evolution;
- able to achieve the established goals in a highly dynamic environment with a significant number of heterogeneous uncertainties in it, taking into account possible counteractions;
- able to adjust the established goals, and to form new goals and sets of goals, based on the values and normative regulations (motivation) incorporated into the UAV behavior control system;
- able to acquire new knowledge, accumulate experience in solving various tasks, learn from this experience, modify their behavior based on the obtained knowledge and accumulated experience;
- able to adapt to the type of tasks that need to be solved, including learning how to solve problems that were not in the original design of the system;
- able to form teams, intended for the interaction of their members in solving some common problem.
- providing situational awareness with an assessment of the current and/or predicted situation to obtain the information required for behavior control actions on the UAV;
- generating control actions that determine the behavior of the UAV.
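The two behavior-control functions listed above form a sense-assess-act loop. A minimal sketch follows; all names, sensor fields, and thresholds are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch of the two behavior-control functions: (1) forming
# situational awareness from sensor data, (2) generating control actions
# from the assessed situation. Names and thresholds are illustrative only.
from dataclasses import dataclass

@dataclass
class Situation:
    altitude_m: float
    obstacle_ahead: bool

def assess_situation(sensor_readings: dict) -> Situation:
    """Function (1): build a situation estimate from raw sensor data."""
    return Situation(
        altitude_m=sensor_readings["baro_altitude_m"],
        obstacle_ahead=sensor_readings["lidar_range_m"] < 50.0,
    )

def generate_control(situation: Situation) -> dict:
    """Function (2): map the assessed situation to a control action."""
    if situation.obstacle_ahead:
        return {"action": "climb",
                "target_altitude_m": situation.altitude_m + 30.0}
    return {"action": "hold", "target_altitude_m": situation.altitude_m}

sit = assess_situation({"baro_altitude_m": 120.0, "lidar_range_m": 35.0})
cmd = generate_control(sit)
```

The split keeps perception (function 1) independent of the action policy (function 2), so either can be replaced without touching the other.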
3. The Situation and Situational Awareness in Behavior Control Problem for UAV
- UAV flying in airspace over the Earth’s surface;
- UAV flying in airspace over the ground as part of a team on a prescribed flight operation;
- UAV flying in airspace over the water surface;
- UAV flying in urban areas.
- tools for working with data in the visual and infrared range (video cameras, OELS, IR cameras);
- acoustic data handling equipment (helicopter-lowered hydroacoustic station (HSS) and hydroacoustic buoys (HBS));
- tools to work with radar data (radar stations of various kinds);
- external data sources concerning the UAV (GPS/GLONASS, radio navigation tools, motion capture tools).
- tools for sensing UAV state variables (gyroscopes, accelerometers, ROVs, angle-of-attack and angle-of-slip sensors, etc.);
- devices and sensors that characterize the state of the atmosphere (air density, turbulence, wind, etc.), for example, meteoradar;
- inertial navigation instruments;
- sensors providing monitoring of the “health state” of the UAV systems and structure.
4. Levels and Kinds of Situation Awareness in Behavior Control Problem for UAV
- (1) What is the goal of the problem, i.e., what is to be obtained as a result of its solution (the specific composition of the situational awareness components)?
- (2) Which source data are needed to solve the problem under consideration, and how can they be obtained?
- (3) What should the results of the problem being solved look like?
- (4) Into which subtasks is the original problem broken down, and what problems arise in solving these subtasks and the problem as a whole?
- list of objects detected in the scene (the result of solving the object localization problem);
- classes of the objects detected and localized in the scene (the result of solving the object classification problem);
- list and localization of objects of given (prescribed) classes in the scene;
- results of tracking the behavior of the prescribed objects in the scene, and the prediction of this behavior.
- (1) scene formation based on one or more sources of input data (for example, video streams, data from infrared cameras, and/or radar);
- (2) identification and classification of objects in the considered fragment of objective reality: object localization, semantic segmentation of the scene, object classification, and detection of spatial and other relationships between scene objects;
- (3) object tracking in the considered fragment of objective reality, including objects of specified (prescribed) classes (detecting the position of detected objects relative to the UAV and tracking the change of this position over time), and trajectory prediction for these objects, all or selectively;
- (4) revealing behavioral patterns of objects (pattern extraction) in the fragment of objective reality and, on this basis, revealing groupings of objects, patterns in the behavior of the revealed groupings, and predicting the behavior of these groupings;
- (5) revealing the behavior goals of objects (all or selectively), and of groupings of objects, in the considered fragment of objective reality.
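The staged scene-analysis process listed above can be illustrated as a chain of functions that successively enrich a scene description. The sketch below is a toy abstraction under assumed names and a deliberately trivial classification rule; it only shows how the stages compose.

```python
# Toy sketch (all names and rules are hypothetical) of the five scene-analysis
# stages: each stage consumes and enriches a "scene" dictionary.
def form_scene(*sources):                     # (1) fuse input data sources
    return {"objects": [o for src in sources for o in src]}

def classify(scene):                          # (2) localization / classification
    for o in scene["objects"]:
        o["cls"] = "vehicle" if o["speed"] > 0 else "static"
    return scene

def predict(scene, dt=1.0):                   # (3) tracking / trajectory prediction
    for o in scene["objects"]:
        o["pred_x"] = o["x"] + o["speed"] * dt
    return scene

def group(scene):                             # (4) reveal groupings (here: by class)
    groups = {}
    for o in scene["objects"]:
        groups.setdefault(o["cls"], []).append(o)
    scene["groups"] = groups
    return scene

def infer_goals(scene):                       # (5) behavior-goal hypothesis per group
    scene["goals"] = {c: ("moving" if c == "vehicle" else "holding")
                      for c in scene["groups"]}
    return scene

scene = infer_goals(group(predict(classify(form_scene(
    [{"x": 0.0, "speed": 10.0}],              # e.g., objects from a video stream
    [{"x": 5.0, "speed": 0.0}])))))           # e.g., objects from radar
```

In a real system each stage would be a substantial subsystem (a detector, a tracker, a pattern miner), but the data-flow structure stays the same.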
5. Forming Elements of Situational Awareness for UAVs
5.1. The Components of Situational Awareness That Ensure the Tracking of Objects in a Fragment of Objective Reality and Predicting Their Behavior
5.1.1. Reference Systems That Provide Object Tracking in a Fragment of Objective Reality
- in a coordinate system associated with some point on the Earth’s surface;
- in a coordinate system associated with some moving object (aircraft, ship, ground moving object) outside the considered UAV;
- in a coordinate system associated with the considered UAV.
- observer is at some point immobile relative to the Earth’s surface (for example, this could be the flight control point of a UAV);
- observer is on a movable platform (e.g., it could be a ship or some other movable carrier for the UAV flight control point);
- observer is on a platform that vigorously changes its phase state, including both its trajectory and angular components (in particular, it could be an aircraft acting as a leader in a constellation of aircraft);
- observer is onboard the UAV, which can vigorously change the components of its trajectory and/or angular motion.
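Switching between the reference frames listed above amounts to a translation by the observer's position plus a rotation by its attitude. A 2-D sketch with a yaw-only rotation follows; the frame convention (x = east, y = north, body x-axis forward) is an assumption chosen for the example.

```python
# Hedged sketch: expressing a tracked object's Earth-fixed position in the
# UAV body frame, reduced to 2-D and a yaw-only rotation for brevity.
import numpy as np

def to_body_frame(p_earth, uav_pos, uav_yaw_rad):
    """Translate and rotate an Earth-fixed point into the UAV body frame."""
    c, s = np.cos(uav_yaw_rad), np.sin(uav_yaw_rad)
    R = np.array([[c, s],      # Earth -> body rotation (yaw only)
                  [-s, c]])
    return R @ (np.asarray(p_earth) - np.asarray(uav_pos))

# An object 100 m north of a UAV heading due north (yaw = pi/2 from the east
# axis) appears 100 m straight ahead on the body x-axis:
p_body = to_body_frame([0.0, 100.0], [0.0, 0.0], np.pi / 2)
```

The full 3-D case replaces the yaw matrix with a direction cosine matrix (roll, pitch, yaw), but the translate-then-rotate structure is unchanged.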
5.1.2. An Example of Solving the Problem of Analyzing the Behavior of a Dynamic Object in a Fragment of Objective Reality
5.1.3. The Problem of Team Behavior in Object Tracking in a Fragment of Objective Reality
- What set of quantities is required to describe the behavior of tracked objects, based on the specifics of the particular UAV behavior control problem being solved?
- Are scene objects tracked by onboard means of the UAV, or by means external to the UAV, with the results transferred to the UAV?
- In relation to which coordinate system are the measurement data presented: internal, with the origin at the center of mass of the UAV, or external, with the origin lying outside the UAV (for example, tied to the ground command and measurement complex)?
- By which means are the internal components of the situation measured, in particular the components describing the trajectory and/or angular motion of the UAV: by measuring instruments onboard the UAV, or by external means (for example, by motion capture technology)?
5.2. Semantic Image Segmentation as a Tool for Forming Situational Awareness Elements in UAV Control Tasks
5.2.1. A Formation Experiment for Visual Components of Situational Awareness
5.2.2. Description of the Dataset Used in the Experiment
5.2.3. Preprocessing of the Data Used
- the training set contains 300 images;
- the validation set contains 50 images;
- the test set contains 50 images.
- horizontal flip;
- vertical flip;
- grid distortion;
- random changes in brightness and contrast;
- adding Gaussian noise.
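The augmentations listed above (grid distortion omitted for brevity) can be sketched with plain numpy; in practice a dedicated augmentation library applies such transforms jointly to each image and its segmentation mask. Parameter values below are illustrative, not the ones used in the experiment.

```python
# Dependency-free numpy sketch of the listed augmentations; parameter values
# (alpha, beta, sigma) are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def hflip(img):                       # horizontal flip
    return img[:, ::-1]

def vflip(img):                       # vertical flip
    return img[::-1, :]

def brightness_contrast(img, alpha=1.2, beta=10.0):
    # brightness/contrast change: out = alpha * img + beta, clipped to [0, 255]
    return np.clip(alpha * img + beta, 0, 255)

def gaussian_noise(img, sigma=5.0):
    # additive Gaussian noise, clipped back into the valid intensity range
    return np.clip(img + rng.normal(0.0, sigma, img.shape), 0, 255)

img = rng.integers(0, 256, size=(4, 4)).astype(float)
aug = gaussian_noise(brightness_contrast(hflip(img)))
```

Note that geometric transforms (flips, grid distortion) must be applied identically to the image and its ground-truth mask, while photometric transforms (brightness, noise) apply to the image only.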
5.2.4. Description of Approaches to Solving the Semantic Segmentation Problem
5.2.5. Results of the Semantic Segmentation Problem
5.2.6. Ways to Increase Data Processing Speed When Solving the Semantic Segmentation Problem
- use more advanced neural network architectures;
- perform low-level optimization of the computation graph for the particular computing architecture used onboard (either manually or with third-party tools);
- reduce the bit width of neural network weights, using simple quantization or quantization-aware training;
- apply low-rank factorization;
- optimize convolutional filters;
- build new, more efficient blocks;
- use knowledge distillation to train a much simpler student model on a mixture of the larger teacher model's responses and the original training targets;
- use automated neural architecture search (NAS).
- distillation loss: this part of the loss function compares the soft predictions of the teacher and student models; the better the student model reproduces the teacher's results, the lower the value of this part of the loss function;
- student loss: this part of the loss function, as in ordinary training, compares the results of the student model (hard labels) with the markup (ground truth).
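The two loss components above are conventionally combined as a weighted sum, with the distillation term computed at a softening temperature T. A numerical sketch follows; the weight and temperature values are illustrative choices, not the paper's settings.

```python
# Numerical sketch of a combined distillation objective:
#   total = w * distillation_loss (soft teacher vs soft student, temperature T)
#         + (1 - w) * student_loss (student prediction vs ground truth).
# w and T below are illustrative assumptions.
import numpy as np

def softmax(z, T=1.0):
    z = np.asarray(z, float) / T
    e = np.exp(z - z.max())
    return e / e.sum()

def distillation_loss(teacher_logits, student_logits, T=4.0):
    # soft cross-entropy between temperature-softened distributions,
    # scaled by T^2 to keep gradient magnitudes comparable
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    return float(-(p_t * np.log(p_s + 1e-12)).sum() * T * T)

def student_loss(student_logits, true_class):
    # ordinary hard-label cross-entropy against the ground truth
    p_s = softmax(student_logits)
    return float(-np.log(p_s[true_class] + 1e-12))

def total_loss(teacher_logits, student_logits, true_class, w=0.7, T=4.0):
    return (w * distillation_loss(teacher_logits, student_logits, T)
            + (1 - w) * student_loss(student_logits, true_class))

# A student matching the teacher scores lower than one that diverges:
l_match   = total_loss([5.0, 1.0, 0.0], [5.0, 1.0, 0.0], true_class=0)
l_diverge = total_loss([5.0, 1.0, 0.0], [0.0, 1.0, 5.0], true_class=0)
```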
- permanently reserve the required amount of memory on the GPU for incoming images, so that the resource-intensive operations of allocating and freeing memory are not needed when a new batch arrives;
- move data preprocessing and postprocessing to the GPU;
- organize efficient work with video streams by using specialized libraries;
- optimize the model for the specific GPU used;
- rewrite the program code of the model in C++, using high-performance libraries.
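The first point, pre-allocating a fixed input buffer and copying each incoming batch into it instead of allocating fresh memory per batch, can be illustrated in plain numpy (standing in for GPU memory); buffer dimensions are arbitrary example values.

```python
# Toy illustration of buffer reuse: one allocation up front, then in-place
# copies per batch. numpy stands in for GPU memory; sizes are arbitrary.
import numpy as np

BATCH, H, W, C = 8, 256, 256, 3
input_buffer = np.empty((BATCH, H, W, C), dtype=np.uint8)   # allocated once

def submit_batch(images):
    n = len(images)
    input_buffer[:n] = images      # copy into the reserved region; no realloc
    return input_buffer[:n]        # a view into the persistent buffer

batch = submit_batch(np.zeros((4, H, W, C), dtype=np.uint8))
```

On an actual GPU the same pattern uses pinned host memory and a persistent device buffer, which also enables asynchronous host-to-device copies.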
6. Areas of Further Research on Situational Awareness for Highly Autonomous UAVs
7. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Segmentation Model | Backbone | Mean IoU | Pixel Accuracy | Prediction Time, s |
---|---|---|---|---|
Unet | MobileNet v2 | 0.49 | 0.89 | 0.067 |
PSPNet | ResNet34 | 0.45 | 0.88 | 0.045 |
DeepLab V3 | ResNet34 | 0.51 | 0.90 | 0.14 |
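The Mean IoU and Pixel Accuracy columns above are conventionally computed from per-pixel class predictions and ground-truth masks as follows (a generic sketch of the standard metrics, not code from the experiment):

```python
# Standard segmentation metrics: pixel accuracy and mean intersection-over-union,
# computed from integer class-ID masks.
import numpy as np

def pixel_accuracy(pred, gt):
    # fraction of pixels whose predicted class matches the ground truth
    return float((pred == gt).mean())

def mean_iou(pred, gt, num_classes):
    # average of per-class IoU = |pred ∩ gt| / |pred ∪ gt|
    ious = []
    for c in range(num_classes):
        inter = np.logical_and(pred == c, gt == c).sum()
        union = np.logical_or(pred == c, gt == c).sum()
        if union > 0:              # skip classes absent from both masks
            ious.append(inter / union)
    return float(np.mean(ious))

gt   = np.array([[0, 0], [1, 1]])
pred = np.array([[0, 1], [1, 1]])
acc  = pixel_accuracy(pred, gt)    # 3 of 4 pixels correct -> 0.75
miou = mean_iou(pred, gt, num_classes=2)
```

Mean IoU is the stricter of the two, since large background classes cannot mask errors on small classes the way they can in pixel accuracy.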
Model Type | Model | Backbone | Model Size | Mean IoU | Pixel Accuracy | Prediction Time, s |
---|---|---|---|---|---|---|
teacher | DeepLabV3 | SE-Resnet50 | 162 MB | 0.644 | 0.93 | 0.2 |
student | PSPNet | MobileNetV2 | 9.4 MB | 0.516 (0.612) | 0.90 (0.91) | 0.041 |
Igonin, D.M.; Kolganov, P.A.; Tiumentsev, Y.V. Situational Awareness and Problems of Its Formation in the Tasks of UAV Behavior Control. Appl. Sci. 2021, 11, 11611. https://doi.org/10.3390/app112411611