Development and Implementation of a Pilot Intent Recognition Model Based on Operational Sequences
Abstract
1. Introduction
2. Methodology
2.1. Overview
2.2. Standard Operational Sequences Analysis
2.3. Operation Sequences Collection
2.3.1. Experiment and Task Design
2.3.2. Participants
2.3.3. Experimental Procedure
2.4. Intent Recognition Model
2.4.1. Model Process and Methods
2.4.2. Operation Sequence and Task Definition
2.4.3. Feature Scoring Metrics
2.4.4. Comprehensive Evaluation of Metrics
3. Results
3.1. Performance of Pilot Intent Model Based on Feature Linear Weighting
3.2. Pilot Intent Recognition Model Based on Feature Hierarchy
3.2.1. Research on Feature Hierarchy of Indicator Contribution Rate
3.2.2. Model Performance
3.3. Comparative Study
4. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Appendix A
Subtask | Execution Actions |
---|---|
Initiate Radar Monitoring | Radar sub-mode shortcut button group, radar frequency/band selection switch, scan angle adjustment knob, beam width adjustment knob, radar gain/sensitivity adjustment knob, radar emergency reset button |
Observe HUD Alert Information | HUD alert prompt area, HUD weapon aiming crosshair, HUD attitude guidance line |
Use HMD for Assisted Identification | HMD attitude sensor, HMD navigation guidance line, HMD night vision enhancement module |
Target Interception and Lock | Target interception/forced tracking button, target handover data link control key, target classification and filtering button, joystick radar linkage button, joystick weapon selection and interception button |
System Status Control | System reset master switch, autopilot mode switch, autopilot altitude hold knob, autopilot heading lock knob |
Subtask | Execution Actions |
---|---|
Radar Weather Monitoring | Radar sub-mode shortcut button group, radar frequency/band selection switch, radar gain/sensitivity adjustment knob, radar scan center offset button |
Adjust Autopilot Route Parameters | Autopilot mode switch, autopilot altitude hold knob, autopilot heading lock knob, autopilot speed control slider |
MFD Interface Operation and Information Display | MFD touchscreen, MFD mode switch knob, MFD system status indicator lights |
Route Adjustment and Fuel Management | FMC keyboard input module, fuel sensors (capacitive/temperature/density) |
System Reset and Alarm Response | System reset master switch, HUD alert prompt area, frequency conflict alert light |
Subtask | Execution Actions |
---|---|
IRS System Calibration | IRS calibration indicator light, system reset master switch |
FMC Data Input and Verification | FMC keyboard input module, FMC control panel knobs, physical keyboard parameter setting keys |
MFD Mode Switching and Display | MFD touchscreen, MFD mode switch knob |
Fuel Sensing and Management | Fuel sensors (capacitive/temperature/density), automatic throttle switching switch, throttle control knob |
Autopilot Altitude and Heading Control | Autopilot altitude hold knob, autopilot heading lock knob, autopilot speed control slider |
Blind Landing Instrument and ILS Control | ILS control panel knob, system monitoring of ILS signal stability, HUD alert prompt area |
Subtask | Execution Actions |
---|---|
System Monitoring ILS Signal Stability | System monitoring of ILS signal stability, ILS control panel knob |
Autopilot Mode Configuration | Autopilot mode switch, autopilot heading lock knob, autopilot altitude hold knob |
FMC Input and Route Confirmation | FMC keyboard input module, FMC control panel knobs |
Radar and Target Recognition Assistance | Radar sub-mode shortcut button group, radar frequency/band selection switch, radar gain/sensitivity adjustment knob, radar emergency reset button |
HUD and HMD Navigation Display | HUD alert prompt area, HUD weapon aiming crosshair, HUD attitude guidance line, HMD navigation guidance line |
Throttle and Flight Control Adjustment | Throttle control knob, joystick flight control knob, joystick radar linkage button, joystick weapon selection and interception button |
System Alarm Handling | System reset master switch, frequency conflict alert light |
Subtask | Execution Actions |
---|---|
Communication Frequency Selection and Control | Communication control panel knob, frequency conflict alert light, emergency communication button |
FMC and MFD Coordinated Operation | FMC keyboard input module, MFD touchscreen, MFD mode switch knob |
Radar and Equipment Status Monitoring | Radar sub-mode shortcut button group, radar frequency/band selection switch, radar gain/sensitivity adjustment knob |
Joystick Control Related Buttons | Joystick radar linkage button, joystick weapon selection and interception button, joystick emergency operation button |
System Alarm and Reset | HUD alert prompt area, system reset master switch |
Task | Execution Number Sequence |
---|---|
PIPT | 7, 22, 28, 10, 12, 11, 13, 7, 23, 8, 18, 9, 14, 32, 16, 24, 15, 7, 2, 8, 33, 21, 29, 37 |
AWADRA | 10, 12, 11, 13, 7, 23, 8, 18, 9, 14, 32, 7, 34, 8, 24, 15, 7, 2, 8, 33, 21, 29, 37 |
FMS-RCFO | 7, 38, 7, 41, 8, 39, 7, 33, 10, 8, 7, 40, 21, 37, 7, 2, 9, 14, 32 |
LVILS-AP | 7, 42, 10, 11, 33, 34, 8, 7, 21, 22, 43, 14, 33, 34, 9, 2, 9, 1, 35, 37 |
MCCCR | 44, 7, 10, 11, 45, 37, 8, 7, 21, 22, 34, 33, 9, 2, 32, 46, 9, 37 |
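The table above gives each task's template as a numbered operation sequence. The paper's exact definitions of the three recognition features (Section 2.4.3) and the weighting scheme (Section 2.4.4) are outside this excerpt; the sketch below shows one plausible reading of coverage rate, operation matching degree, and sequence matching degree (the last via a normalized Levenshtein edit distance) combined by linear weighting. All formulas and the equal weights are illustrative assumptions, not the paper's parameters.

```python
from collections import Counter

def coverage_rate(observed, template):
    """Fraction of the template's distinct operations seen so far (illustrative definition)."""
    t = set(template)
    return len(t & set(observed)) / len(t)

def operation_matching(observed, template):
    """Multiset overlap of operations, ignoring order (illustrative definition)."""
    overlap = sum((Counter(observed) & Counter(template)).values())
    return overlap / max(len(observed), len(template))

def sequence_matching(observed, template):
    """Order similarity as 1 - normalized Levenshtein distance (assumed metric)."""
    m, n = len(observed), len(template)
    prev = list(range(n + 1))
    for i in range(1, m + 1):
        cur = [i] + [0] * n
        for j in range(1, n + 1):
            cost = 0 if observed[i - 1] == template[j - 1] else 1
            cur[j] = min(prev[j] + 1, cur[j - 1] + 1, prev[j - 1] + cost)
        prev = cur
    return 1 - prev[n] / max(m, n)

def linear_weighted_score(observed, template, w=(1 / 3, 1 / 3, 1 / 3)):
    """Linearly weighted combination of the three features (weights are placeholders)."""
    feats = (coverage_rate(observed, template),
             operation_matching(observed, template),
             sequence_matching(observed, template))
    return sum(wi * fi for wi, fi in zip(w, feats))

def recognize(observed, templates):
    """Intent = the task whose template sequence maximizes the weighted score."""
    return max(templates, key=lambda k: linear_weighted_score(observed, templates[k]))
```

With the table's sequences loaded into a dict such as `{"PIPT": [7, 22, 28, ...], ...}`, `recognize` would score a partial observed sequence against every template and return the best-matching task.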
Recognition accuracy (%) by sequence type:

Recognition Features | Overall | Segment Sequence | Abnormal Sequence | Special Sequence |
---|---|---|---|---|
Coverage rate | 87.78 ± 1.42 | 78.00 ± 3.15 | 92.00 ± 1.94 | 93.33 ± 3.14 |
Operation matching degree | 87.78 ± 1.42 | 78.00 ± 3.15 | 92.00 ± 1.94 | 93.33 ± 3.14 |
Sequence matching degree | 79.33 ± 1.82 | 76.00 ± 2.27 | 87.33 ± 2.52 | 74.67 ± 4.07 |
Feature linear weighting model | 88.89 ± 1.37 | 78.67 ± 1.37 | 93.33 ± 1.72 | 94.67 ± 3.41 |
Recognition Features | Average Score | Score Contribution Rate (%) |
---|---|---|
Coverage rate | 0.0823 | 34.61 |
Operation matching degree | 0.0882 | 36.89 |
Sequence matching degree | 0.0681 | 28.50 |
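The contribution rates above are close to each feature's average score divided by the sum of all three averages, though not identical (the paper likely averages per-trial contributions, and its exact definition in Section 3.2.1 is outside this excerpt). A minimal normalization sketch, under that simplifying assumption:

```python
def contribution_rates(avg_scores):
    """Normalize average feature scores into percentage contribution rates.
    Illustrative definition: rate_i = 100 * score_i / sum(scores). The published
    table may differ slightly if contributions were averaged per trial."""
    total = sum(avg_scores.values())
    return {name: 100 * score / total for name, score in avg_scores.items()}
```

Feeding in the table's averages (0.0823, 0.0882, 0.0681) yields roughly 34.5%, 37.0%, and 28.5%, matching the reported ordering of the three features.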
Score contribution rate (%) by sequence type:

Recognition Features | Segment Sequence | Abnormal Sequence | Special Sequence |
---|---|---|---|
Coverage rate | 37.12 | 34.23 | 35.30 |
Operation matching degree | 34.53 | 36.72 | 35.90 |
Sequence matching degree | 28.35 | 29.05 | 28.80 |
Recognition accuracy (%) by sequence type:

Recognition Model | Overall | Segment Sequence | Abnormal Sequence | Special Sequence |
---|---|---|---|---|
Architecture A | 86.67 ± 1.37 | 77.33 ± 3.33 | 92.00 ± 1.94 | 90.67 ± 3.61 |
Architecture B | 85.33 ± 1.49 | 74.67 ± 2.18 | 91.33 ± 1.47 | 90.00 ± 3.62 |
Architecture C | 84.89 ± 1.27 | 78.00 ± 3.30 | 92.67 ± 1.56 | 84.00 ± 3.74 |
Recognition accuracy (%) by sequence type:

Recognition Model | Overall | Segment Sequence | Abnormal Sequence | Special Sequence |
---|---|---|---|---|
Feature hierarchical model | 86.67 ± 1.37 | 77.33 ± 3.33 | 92.00 ± 1.94 | 90.67 ± 3.61 |
Feature linear weighting model | 88.89 ± 1.37 | 78.67 ± 1.37 | 93.33 ± 1.72 | 94.67 ± 3.41 |
KNN | 22.44 ± 0.70 | 21.33 ± 0.89 | 22.00 ± 1.02 | 24.00 ± 1.78 |
SVM | 76.89 ± 1.00 | 69.33 ± 1.78 | 82.67 ± 2.47 | 78.67 ± 1.66 |
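The comparison above pits the feature-based models against KNN and SVM baselines. Generic classifiers need fixed-length inputs, so variable-length operation sequences are typically encoded first; the paper's actual KNN/SVM configuration (Section 3.3) is outside this excerpt. A hedged sketch of one common encoding, a bag-of-operations count vector fed to a plain nearest-neighbour vote:

```python
from collections import Counter

def bag_of_ops(seq, vocab):
    """Encode a variable-length operation sequence as a fixed-length count vector
    over the operation vocabulary (assumed encoding, not necessarily the paper's)."""
    counts = Counter(seq)
    return [counts.get(op, 0) for op in vocab]

def knn_predict(x, train, k=1):
    """k-nearest-neighbour vote with squared Euclidean distance over encoded
    sequences; an illustrative baseline, not the paper's configuration."""
    ranked = sorted(train, key=lambda item: sum((a - b) ** 2 for a, b in zip(x, item[0])))
    votes = Counter(label for _, label in ranked[:k])
    return votes.most_common(1)[0][0]
```

Because the count vector discards operation order entirely, a baseline like this cannot exploit sequence structure, which is one plausible reason the order-aware feature models outperform it.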
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Mao, X.; Ding, L.; Sun, X.; Pang, L.; Deng, Y.; Wang, X. Development and Implementation of a Pilot Intent Recognition Model Based on Operational Sequences. Aerospace 2025, 12, 780. https://doi.org/10.3390/aerospace12090780