This publication was co-funded by the European Union under the Grant Agreement 101103592. Its contents are the sole responsibility of the EPIIC (Enhanced Pilot Interfaces & Interactions for fighter Cockpit) Consortium and do not necessarily reflect the views of the European Union or the European Commission. Neither the European Union nor the granting authority can be held responsible for them.
1. Introduction
This paper provides a literature review on the diverse eye-tracking technologies employed to enhance multimodal interactions within aviation environments. The primary objective of this review is to examine and compare eye-tracking approaches, offering insights into their practical applications, strengths, and limitations. The integration of eye-tracking technology in cockpit operations is driven by its potential to address several critical challenges in aviation, such as reducing the pilot’s cognitive load, minimizing human error, and improving situation awareness. By evaluating current methodologies and technologies, this review aims to identify solutions that could be utilized to improve human–machine interactions in aviation with a particular focus on optimizing cockpit operations. This exploration is motivated by the need to find more effective interaction modalities that can support the increasingly complex environments pilots face, ensuring that critical information is accessed quickly and accurately. Additionally, we discuss the use of eye-tracking technologies as input mechanisms in aviation environments and their potential synergy with various multimodal human–machine interfaces towards optimizing cockpit operations, bolstering safety, and elevating user experience in aviation contexts.
To achieve a comprehensive overview, we started with a search strategy targeting three key databases (ACM Digital Library, IEEE Xplore, and Google Scholar) to cover technology and aviation topics. The search method included a generic search string: (“eye-tracking” OR “eye tracking technologies” OR “gaze tracking” OR “eye movement tracking”) AND (“multimodal interaction” OR “multimodal interfaces” OR “human–computer interaction”) AND (“aviation” OR “cockpit interaction” OR “pilot interface”). This search string was developed to specifically find papers analyzing the use of eye-tracking technologies in the context of multimodal interaction within aviation settings. Specific search strings were used for the following categories: (1) Eye-Tracking Technologies, (2) the Exploration of Gaze Model Estimations, (3) the Exploration of Fixations for Interaction Means, (4) the Exploration of Eyelids for Interaction Means, (5) the Exploration of Eyebrows for Interaction Means, (6) Requirements for the Calculation of Ocular Features, (7) Eye Tracking and Voice Multimodality, (8) Eye Tracking and Gesture Multimodality, (9) Eye Tracking and Cursor Control Devices Multimodality, and (10) Eye Tracking and BCI Multimodality.
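For illustration, the sketch below (in Python) shows how such a Boolean query can be assembled from the three concept groups; the list names and helper function are ours and not part of the published search protocol.

```python
# Sketch: assembling the Boolean search string from the three concept groups.
# The list names and helper function are illustrative, not part of the review protocol.

eye_tracking_terms = ["eye-tracking", "eye tracking technologies",
                      "gaze tracking", "eye movement tracking"]
interaction_terms = ["multimodal interaction", "multimodal interfaces",
                     "human-computer interaction"]
aviation_terms = ["aviation", "cockpit interaction", "pilot interface"]

def or_group(terms):
    """Join quoted terms with OR and wrap them in parentheses."""
    return "(" + " OR ".join(f'"{t}"' for t in terms) + ")"

query = " AND ".join(or_group(g) for g in
                     (eye_tracking_terms, interaction_terms, aviation_terms))
print(query)
# ("eye-tracking" OR ...) AND ("multimodal interaction" OR ...) AND ("aviation" OR ...)
```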
These searches resulted in 6475 studies, and after removing duplicates, 5708 titles remained. Through title and abstract screening by three researchers, we selected 49 papers for this literature review. Apart from relevance, the selection criteria were as follows: relevant papers published in 2024 were included regardless of how often they had been cited, to capture cutting-edge and emerging research; relevant papers published between 2020 and 2023 required at least one citation, to ensure they had received some acknowledgment or validation within the academic community; and studies published in 2019 or earlier required a minimum of three citations, reflecting the need for established contributions and demonstrated academic credibility. Studies failing to meet these thresholds were excluded.
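The year-dependent citation thresholds amount to a simple inclusion filter; the following minimal sketch illustrates them, assuming each candidate record carries a publication year and a citation count (field names are illustrative).

```python
# Sketch: year-dependent citation thresholds used for study inclusion.
# Record fields (year, citations) are assumed for illustration.

def meets_inclusion_criteria(year: int, citations: int) -> bool:
    """Apply the citation thresholds described in the review protocol."""
    if year >= 2024:
        return True              # emerging work: no citation requirement
    if 2020 <= year <= 2023:
        return citations >= 1    # some acknowledgment within the community
    return citations >= 3        # 2019 or earlier: established contributions

candidates = [
    {"title": "A", "year": 2024, "citations": 0},
    {"title": "B", "year": 2021, "citations": 0},
    {"title": "C", "year": 2018, "citations": 5},
]
included = [c for c in candidates if meets_inclusion_criteria(c["year"], c["citations"])]
print([c["title"] for c in included])  # ['A', 'C']
```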
Given these constraints, the final selection was narrowed to 49 studies, which we categorized to ensure a broad representation of current technologies and their applications. Specifically, we grouped the studies into four main categories: (a) foundational methods and technologies in eye tracking (14 papers), highlighting key advancements like pupil–corneal reflection and video-based systems; (b) operational applications of eye tracking (15 papers), focusing on real-time gaze monitoring, adaptive automation, and gaze-based interface interactions; (c) the integration of eye tracking with other multimodal systems (10 papers), such as combining gaze with speech or gesture for enhanced cockpit usability; and (d) training and simulation use cases (10 papers), emphasizing applications in visual attention tracking and pilot performance improvement.
Eye-tracking technologies in aviation hold substantial promise in enhancing cockpit interactions and training and simulation environments, and they provide detailed insights into pilots’ visual attention and decision-making processes. By capturing real-time data on gaze patterns and visual focus, eye-tracking systems can identify potential areas of improvement in pilot training programs, ultimately contributing to more effective and targeted instruction [1]. This data-driven approach ensures that training can be tailored to address specific weaknesses, enhancing overall pilot competency and safety. Moreover, combining eye-tracking data with other physiological and performance metrics provides a more comprehensive view of pilot behavior, enhancing the understanding of human factors in aviation and driving advancements in interface design and operational procedures. In motivating the exploration of eye-tracking technologies in aviation, this work seeks to underscore their potential to revolutionize both operational and training paradigms, ensuring that the flight operator remains at the forefront of safety and efficiency. This work aims to provide a broad overview of the current diversity of eye-tracking systems in aviation and to identify the interactions in which they are utilized as practical usability tools.
2. Eye-Tracking Technologies
Eye-tracking technologies have been employed in aviation to assess how the pilot’s cognitive load influences visual behavior and performance [2,3]. They have also been used in unmanned aerial vehicles to enhance adaptive human–machine interfaces [4]. Since the inception of eye tracking, various methods have been widely utilized to measure eye movements. Among these, four main methods stand out: Electro-OculoGraphy (EOG), scleral contact lens/search coil, Photo-OculoGraphy (POG) or Video-OculoGraphy (VOG), and video-based combined pupil–corneal reflection [5]. Modern eye-tracking systems utilize video-based technology, incorporating infrared illumination and a camera to capture users’ eye movements, and are categorized as remote, mobile, or tower-mounted [6]. The development and refinement of eye gaze estimation systems and algorithms in consumer platforms have advanced the creation of standardized methodologies for performance evaluation. A key aspect of this development is addressing the diverse performance metrics and system configurations across platforms, which necessitates methodological standardization to support practical evaluation frameworks for gaze-tracking systems [1]. These advancements in VOG and the development of less invasive, more comfortable systems have broadened the usability of eye-tracking technology in diverse fields [7]. This includes marketing research, usability studies, and cognitive science, where understanding eye movement patterns can provide profound insights into human behavior and decision-making processes.
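To make the video-based pupil–corneal reflection principle concrete, the sketch below maps the pupil-to-glint vector to a point of regard through a polynomial calibration, a common textbook formulation rather than any specific system's algorithm; the calibration data and coefficients are illustrative.

```python
# Sketch: pupil-corneal reflection gaze mapping (illustrative, not a specific product's algorithm).
# A polynomial mapping from the pupil-minus-glint vector (dx, dy) to normalized screen
# coordinates is fitted on calibration points and then applied to new samples.
import numpy as np

def design_matrix(dx, dy):
    """Second-order polynomial features of the pupil-glint difference vector."""
    return np.column_stack([np.ones_like(dx), dx, dy, dx * dy, dx**2, dy**2])

# Calibration: known screen targets and the measured pupil-glint vectors (assumed data).
targets = np.array([[0.1, 0.1], [0.9, 0.1], [0.5, 0.5], [0.1, 0.9], [0.9, 0.9],
                    [0.5, 0.1], [0.5, 0.9], [0.1, 0.5], [0.9, 0.5]])
dx = np.array([-0.8, 0.8, 0.0, -0.8, 0.8, 0.0, 0.0, -0.8, 0.8])
dy = np.array([-0.6, -0.6, 0.0, 0.6, 0.6, -0.6, 0.6, 0.0, 0.0])

X = design_matrix(dx, dy)
coeffs, *_ = np.linalg.lstsq(X, targets, rcond=None)  # one coefficient column per screen axis

# Runtime: map a new pupil-glint vector to an estimated point of regard.
new_sample = design_matrix(np.array([0.4]), np.array([-0.3]))
gaze_xy = new_sample @ coeffs
print(gaze_xy)  # estimated normalized screen coordinates
```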
A significant advancement in eye-tracking technology is the development of head-mounted eye-trackers [8]. These eye-tracking systems enable non-contact measurement of eye movements, and they are commonly employed in screen-based interaction experiments [9]. Head-mounted eye-trackers offer the advantage of enabling more natural interactions in dynamic environments, but they can be cumbersome to wear for extended periods and may suffer from occlusion issues, especially in high-speed or physically demanding scenarios [10]. Conversely, eye trackers with head stabilization, which keep the head in a fixed position and prevent movement, offer heightened saccade resolution. However, their limited mobility restricts their suitability for dynamic environments. These systems are often applied in studies that require precise measurements with participants viewing fixed screens. The flexibility and adaptability of modern eye-tracking technologies have been extensively documented, with researchers emphasizing the importance of tailoring systems to specific use cases and environments [11].
Additionally, appearance-based gaze estimation methods, which rely on visual information captured by standard off-the-shelf cameras to estimate gaze direction, have shown significant improvements. These methods offer a promising alternative to traditional eye-tracking systems, like the ones that use infrared illumination, by providing flexibility in various interaction scenarios and enabling new applications with more accessible technology [12]. However, appearance-based gaze estimation methods, which rely on feature extraction from images (such as the shape and movement of the eyes, face, or pupil), are not yet applicable in cockpit environments due to technical challenges such as high illumination levels and a limited field of view [13].
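As a rough illustration of the appearance-based approach, the sketch below defines a small convolutional network that regresses gaze yaw and pitch from a grayscale eye patch; the architecture, input size, and use of PyTorch are our assumptions, not a model from the cited work.

```python
# Sketch: appearance-based gaze estimation from an off-the-shelf camera image.
# A small CNN regresses gaze yaw/pitch (radians) from a grayscale eye patch.
# Architecture and input size are illustrative choices, not a published model.
import torch
import torch.nn as nn

class EyePatchGazeNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 7 * 13, 128), nn.ReLU(),
            nn.Linear(128, 2),  # (yaw, pitch) in radians
        )

    def forward(self, x):
        return self.head(self.features(x))

model = EyePatchGazeNet()
eye_patch = torch.randn(1, 1, 36, 60)   # batch of one 36x60 grayscale eye image
yaw_pitch = model(eye_patch)
print(yaw_pitch.shape)  # torch.Size([1, 2])
```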
Addressing these technical challenges is crucial to enhance the applicability and reliability of gaze estimation systems across different platforms [1]. Moreover, gaze control technology is advancing towards greater integration into multimodal user interfaces, which combine eye movements with other inputs such as gestures, voice commands, and physical controls. The integration of eye gaze with thumb-to-finger micro gestures, as demonstrated in the “M[eye]cro interaction technique” [10], presents a promising approach to maintain focus on high-priority tasks while managing various control systems in cockpit environments. For instance, the combination of gaze and voice control can significantly reduce the cognitive load on users by allowing seamless transitions between visual focus and verbal commands, as demonstrated in various experimental setups in both military and civilian aviation contexts [14].
3. Use of Eye Tracking for Interaction in Cockpit Environments
The aviation sector has experienced notable technological advancements, fundamentally altering the design and operation of cockpit interfaces. Among these advancements, eye-tracking technologies have emerged as a promising avenue for enhancing pilot interactions within the cockpit. These technologies provide real-time insights into pilots’ visual attention and cognitive load, which is crucial for optimizing cockpit design and functionality.
3.1. Enhancing Pilot Interaction and Efficiency with Eye-Tracking Technologies
Eye-tracking systems can monitor gaze direction, fixations, and saccades, providing valuable data on how pilots interact with various cockpit elements [15]. Through the precise detection and interpretation of eye movements and gaze patterns, these systems have the potential to augment efficiency, safety, and overall user experience in cockpit operations. For instance, the implementation of a prolonged dwell time (i.e., fixations exceeding natural durations) as an input mechanism has shown promising results [16,17]. Additionally, blinks and winks [18] have been explored for object selection, validation, or cancelation and for transitioning between modes or alternatives, while the duration of eyelid closure can be used to control the duration or intensity of an action. Moreover, eye tracking can be used to facilitate adaptive automation, where systems adjust based on the pilot’s gaze behavior, enhancing situation awareness and reducing the risk of errors [19].
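A minimal sketch of how a prolonged dwell time can be turned into a selection event is shown below; the dwell threshold, the area-of-interest layout, and the class interface are illustrative assumptions.

```python
# Sketch: dwell-time selection over cockpit display elements (values are illustrative).
# A selection fires when consecutive gaze samples stay inside one element's
# area of interest for longer than the dwell threshold.
import time

DWELL_THRESHOLD_S = 0.8   # longer than natural fixations, to avoid accidental selection

class DwellSelector:
    def __init__(self, elements):
        self.elements = elements          # {name: (x, y, width, height)} areas of interest
        self.current = None
        self.dwell_start = None

    def _hit(self, x, y):
        for name, (ex, ey, w, h) in self.elements.items():
            if ex <= x <= ex + w and ey <= y <= ey + h:
                return name
        return None

    def update(self, x, y, t=None):
        """Feed one gaze sample; return an element name when a selection fires."""
        t = time.monotonic() if t is None else t
        name = self._hit(x, y)
        if name != self.current:
            self.current, self.dwell_start = name, t
            return None
        if name is not None and t - self.dwell_start >= DWELL_THRESHOLD_S:
            self.dwell_start = t          # re-arm so the element is not re-selected immediately
            return name
        return None

selector = DwellSelector({"ALT": (100, 50, 80, 40), "HDG": (200, 50, 80, 40)})
for i in range(10):
    selected = selector.update(120, 60, t=i * 0.1)   # simulated samples inside the ALT element
    if selected:
        print("selected", selected, "at t =", round(i * 0.1, 1))
```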
The broader context of using eye tracking for interaction in aviation environments highlights several key benefits and challenges. Eye tracking offers a fast and natural way for pilots to interact with cockpit systems, potentially reducing cognitive load and physical demands [20]. For example, studies have demonstrated how gaze-based interactions can seamlessly integrate with traditional input mechanisms, overcoming issues like the “Midas Touch” problem by employing calibrated dwell times and intelligent fixation recognition algorithms [21]. This technology can help in identifying critical areas where pilots may experience a high task load, allowing for better ergonomic cockpit design and system improvements [22]. Findings from high-speed navigation studies underline the importance of gaze behavior in identifying critical visual cues, as novice users tend to focus more on proximal objects and electronic aids, while experienced ones balance attention across environmental cues and system data [23]. Moreover, gaze-based interaction can enable the hands-free operation of various controls, allowing pilots to maintain situation awareness and focus on flying tasks [24]. Additionally, eye tracking can improve the monitoring of pilot attention and fatigue: by continuously analyzing eye movements, systems can alert pilots to emerging fatigue or distraction, mitigating the associated risks and enhancing overall flight safety [25].
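One simple way such monitoring can be realized is a PERCLOS-style measure of eyelid closure over a sliding window, sketched below with illustrative thresholds; it is not the specific method used in the cited studies.

```python
# Sketch: fatigue monitoring from eyelid-openness samples (PERCLOS-style, illustrative thresholds).
# PERCLOS approximates the proportion of time the eyes are mostly closed over a sliding
# window; a sustained high value can trigger an attention alert.
from collections import deque

class FatigueMonitor:
    def __init__(self, window_size=1800, closed_threshold=0.2, alert_perclos=0.15):
        self.window = deque(maxlen=window_size)   # e.g., 60 s of samples at 30 Hz
        self.closed_threshold = closed_threshold  # openness below this counts as "closed"
        self.alert_perclos = alert_perclos

    def update(self, eye_openness: float) -> bool:
        """Feed one openness sample in [0, 1]; return True if an alert should be raised."""
        self.window.append(eye_openness < self.closed_threshold)
        if len(self.window) < self.window.maxlen:
            return False                          # wait until the window is full
        perclos = sum(self.window) / len(self.window)
        return perclos > self.alert_perclos

monitor = FatigueMonitor()
# Simulated stream: mostly open eyes followed by long closures near the end.
samples = [1.0] * 1500 + [0.05] * 300
alerts = [monitor.update(s) for s in samples]
print("alert raised:", any(alerts))
```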
Maintaining and balancing an optimal level of cognitive load is essential for completing tasks productively. Long flights are one such example, where the pilot is loaded heavily both physically and cognitively (handling multiple sensors, perceiving, processing, and multi-tasking, including communications and handling equipment) to fulfill flight requirements [26]. Research has shown that eye-tracking systems can significantly improve interaction efficiency and accuracy in specific settings, providing immediate feedback and allowing for intuitive control [16,24]. This capability is crucial in dynamic and high-pressure environments like aircraft cockpits.
3.2. Application in Military Aviation
In military aviation, eye-tracking systems have been effectively utilized for operating Multi-Functional Displays (MFDs) and Head-Mounted Display Systems (HMDSs), significantly improving pilots’ visual scan patterns and situation awareness. These applications are crucial in environments where traditional manual control may be hindered by physical constraints or high-stress conditions [27,28]. Moreover, eye-tracking technologies have been integrated into advanced training simulators to provide real-time feedback to trainees, aiding in the development of more effective scanning techniques and skills for maintaining high situation awareness.
The ability to track and interpret eye movements in real time opens new possibilities for designing more responsive and adaptive cockpit interfaces. VOG has become a crucial tool for the precise monitoring of ocular movements through live video recordings. Research in military aviation environments has shown that wearable eye-tracking technology must address challenges such as high levels of illumination and a variable field of view [13]. Advanced VOG systems are increasingly combined with machine learning algorithms that enhance robustness and improve the accuracy and reliability of gaze estimation, ensuring consistent performance even during complex flight maneuvers [29]. These algorithms are designed to filter out noise and adapt to individual differences in eye movements, providing more precise and actionable data.
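As a basic illustration of such noise filtering, the sketch below applies exponential smoothing to raw gaze coordinates; the cited systems rely on more sophisticated, learned filters, so this is only a minimal stand-in with an assumed smoothing factor.

```python
# Sketch: simple noise filtering of raw gaze samples (illustrative; real systems use
# more advanced, often learned, filters). Exponential smoothing damps sensor noise
# while keeping latency low enough for interactive use.
def smooth_gaze(samples, alpha=0.3):
    """Exponentially smooth a sequence of (x, y) gaze samples."""
    smoothed = []
    sx, sy = samples[0]
    for x, y in samples:
        sx = alpha * x + (1 - alpha) * sx
        sy = alpha * y + (1 - alpha) * sy
        smoothed.append((sx, sy))
    return smoothed

raw = [(0.50, 0.52), (0.51, 0.49), (0.95, 0.50), (0.52, 0.51)]  # one noisy outlier
print(smooth_gaze(raw))
```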
The use of VOG in aviation is not limited to real-time tracking but extends to post-flight analysis and training. By examining eye-tracking data, researchers can gain insights into pilots’ decision-making processes and identify cockpit areas that produce a high cognitive load. This information can influence the design of more intuitive cockpit interfaces and training programs. Furthermore, the analysis of this eye-tracking data supports the development of predictive models for pilot performance, which are instrumental in selecting the most capable pilots for high-stress scenarios [30].
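A post-flight analysis of this kind often starts from simple aggregates, such as total fixation time per cockpit area of interest; the sketch below illustrates the idea with assumed fixation records and AOI names.

```python
# Sketch: post-flight aggregation of fixation time per cockpit area of interest (AOI).
# Fixation records and AOI names are assumed for illustration; relative dwell shares
# can point to displays that attract a disproportionate amount of attention.
from collections import defaultdict

fixations = [                      # (AOI, fixation duration in seconds), e.g., exported from a tracker
    ("PFD", 0.42), ("NAV", 0.31), ("PFD", 0.55),
    ("ENGINE", 0.20), ("NAV", 0.27), ("PFD", 0.60),
]

totals = defaultdict(float)
for aoi, duration in fixations:
    totals[aoi] += duration

flight_total = sum(totals.values())
for aoi, t in sorted(totals.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{aoi:7s} {t:5.2f} s  ({t / flight_total:5.1%} of fixation time)")
```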
4. Multimodal Interactions in Aviation Environments Enhanced with Eye-Tracking Input
Eye-tracking technologies could be used for integration within aviation settings, capitalizing on the inherent multimodal nature of human interaction. By harnessing this principle, pilots can seamlessly transition between various input modalities, thereby mitigating the risk of physical overexertion. This approach not only aligns with the complexity of human cognition and behavior but also holds potential for enhancing operational efficiency and safety within aviation environments.
Through the incorporation of multimodal interactions facilitated by eye-tracking systems, pilots gain the flexibility to engage with cockpit interfaces using a combination of visual cues, facial expressions, hand gestures, voice commands, cursor control devices, and brain–computer interfaces (BCIs). For instance, enhancing communication between pilots and systems using multimodal interaction, including human–autonomy teaming, has been shown to be effective [31]. Additionally, systems that integrate eye tracking, speech recognition, and multitouch gestures have demonstrated improvements in operational efficiency and reduced cognitive load, further underlining the viability of multimodal systems in aviation [22,32].
Following this approach, the integration of voice commands and speech synthesis with eye-tracking technology has been proposed to alleviate the cognitive and manual task load of pilots [33]. This approach facilitates the hands-free operation of cockpit displays in aircraft by allowing pilots to direct their gaze toward user interface elements of interest while issuing verbal commands to the system. Additionally, the incorporation of facial expression recognition in conjunction with voice and eye-tracking technology has been explored [34], and dynamic hand gesture technology [35,36] has demonstrated efficacy in executing spatial tasks such as resizing, moving, and manipulating objects on a display [37]. Furthermore, advancements in eye-movement recognition, such as blink and wink detection, have shown significant potential for reliable and hands-free interaction under varied conditions, emphasizing their relevance for cockpit environments [18,38,39]. Moreover, the combination of cursor control devices with eye-tracking technology has been shown to mitigate users’ cognitive load [40]. Research in virtual and augmented reality environments demonstrates that eye-tracking-based systems significantly outperform head gaze in terms of speed, task load, and user comfort, particularly in high-resolution scenarios, suggesting their applicability in aviation settings where efficiency and reduced physical strain are critical [41,42,43,44]. Finally, the integration of BCI technology with eye tracking has been demonstrated to enable the effective remote piloting of a quadcopter (drone) [45].
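A common pattern behind such gaze-and-voice combinations is to apply a recognized command to whatever element the pilot is currently looking at; the sketch below illustrates this fusion step with an assumed command set and placeholder inputs.

```python
# Sketch: fusing gaze with a voice command ("look to select, speak to act").
# The command set and the gazed-element provider are illustrative placeholders;
# a real system would obtain them from the speech recognizer and the eye tracker.
from typing import Optional

COMMANDS = {"select", "zoom", "dismiss"}

def fuse(gazed_element: Optional[str], spoken_command: str) -> str:
    """Apply a spoken command to the interface element currently under the pilot's gaze."""
    command = spoken_command.strip().lower()
    if command not in COMMANDS:
        return f"ignored unknown command: {spoken_command!r}"
    if gazed_element is None:
        return f"'{command}' heard, but no element is under the pilot's gaze"
    return f"{command} -> {gazed_element}"

print(fuse("waypoint_3", "Select"))   # select -> waypoint_3
print(fuse(None, "zoom"))             # zoom heard, but no element under gaze
```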
Recent studies have significantly advanced our understanding of eye tracking and multimodal interactions in aviation. For example, the EyePointing technique [46,47], which combines MAGIC pointing and mid-air gestures, has been shown to enhance interaction efficiency [34,35]. Integrating eye tracking with other modalities, such as gestures, can likewise improve interaction, although more efficient algorithms are needed for practical applications in aviation environments [17,36]. The combination of eye tracking with cursor control devices can improve task performance by addressing pointing inaccuracy and occlusion problems, leading to more precise control and reduced cognitive load [48]. Additionally, the integration of BCIs with eye tracking has shown promising results in enhancing human–computer interaction, particularly in military aviation settings, by improving operational performance and reducing pilot task load [49].
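To illustrate the MAGIC-pointing idea of combining gaze with a cursor control device, the sketch below warps the cursor to the gaze point only when it is far away and leaves fine positioning to manual input; the distance threshold and function interface are illustrative assumptions.

```python
# Sketch: MAGIC-pointing-style cursor behavior (illustrative). The cursor jumps to the
# current gaze point when manual input resumes far from it, and fine positioning is then
# done with the cursor control device, avoiding long manual travel and gaze-only inaccuracy.
WARP_DISTANCE_PX = 120   # only warp if the cursor is far from the gaze point

def next_cursor_position(cursor, gaze, manual_delta):
    """Combine gaze warping with manual fine adjustment for one input event."""
    cx, cy = cursor
    gx, gy = gaze
    if ((gx - cx) ** 2 + (gy - cy) ** 2) ** 0.5 > WARP_DISTANCE_PX:
        cx, cy = gx, gy                      # coarse jump to where the pilot is looking
    dx, dy = manual_delta
    return (cx + dx, cy + dy)                # fine adjustment from the cursor device

pos = (100, 100)
pos = next_cursor_position(pos, gaze=(800, 450), manual_delta=(-4, 2))
print(pos)  # (796, 452): warped to the gaze point, then nudged manually
```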
5. Conclusions
Through a review of 49 papers, this study explored the literature on eye-tracking technologies for facilitating multimodal interaction in aviation environments. By examining various eye-tracking methods and their integration with multimodal human–machine interfaces, this study underscores the potential for enhancing pilot interaction, operational efficiency, and safety within aviation contexts. Furthermore, the adoption of eye-tracking technologies extends beyond cockpit interactions to training and simulation environments, offering detailed insights into pilots’ decision-making processes and visual focus. These insights can significantly enhance pilot training programs by identifying areas for improvement and tailoring instruction to address specific weaknesses. The integration of eye-tracking data with other biometric and performance metrics provides a comprehensive view of pilot behavior, fostering innovations in interface design and operational protocols. As research continues to advance, the application of eye-tracking technologies in aviation has the potential to revolutionize cockpit operations, training, and overall flight safety, making them valuable tools for the future of aviation.
Author Contributions
Conceptualization, D.M. and M.X.; methodology, D.M., L.T., A.F. and M.X.; validation, D.M., L.T. and M.X.; investigation, D.M., L.T., A.F., M.X., A.C.-C. and M.R.-V.; resources, D.M., L.T., A.F., M.X., A.C.-C. and M.R.-V.; writing–original draft preparation, D.M.; writing–review and editing, D.M., L.T., A.F., M.X., A.C.-C. and M.R.-V.; supervision, D.M. and M.X.; project administration, D.M.; funding acquisition, D.M., L.T., A.F., M.X., A.C.-C. and M.R.-V. All authors have read and agreed to the published version of the manuscript.
Funding
This research was co-funded by the European Union under the Grant Agreement 101103592.
Institutional Review Board Statement
Not applicable.
Informed Consent Statement
Not applicable.
Data Availability Statement
No new data were created or analyzed in this study. Data sharing is not applicable to this article.
Conflicts of Interest
The authors declare no conflict of interest.
References
- Kar, A.; Corcoran, P. A Review and Analysis of Eye-Gaze Estimation Systems, Algorithms and Performance Evaluation Methods in Consumer Platforms. IEEE Access 2017, 5, 16495–16519. [Google Scholar] [CrossRef]
- Haslbeck, A.; Schubert, E.; Gontar, P.; Bengler, K. The relationship between pilots’ manual flying skills and their visual behavior: A flight simulator study using eye tracking. In Advances in Human Aspects of Aviation; CRC Press: Boca Raton, FL, USA, 2012; pp. 561–568. [Google Scholar]
- Li, W.-C.; Chiu, F.-C.; Wu, K.-J. The evaluation of pilots performance and mental workload by eye movement. In Proceedings of the 30th European Association for Aviation Psychology Conference, Sardinia, Italy, 24–28 September 2012. [Google Scholar]
- Xenos, M.; Mallas, A.; Minas, D. Using Eye-Tracking for Adaptive Human-Machine Interfaces for Pilots: A Literature Review and Sample Cases. J. Phys. Conf. Ser. 2024, 2716, 012072. [Google Scholar] [CrossRef]
- Duchowski, A.T. Eye Tracking Methodology; Springer International Publishing: Cham, Switzerland, 2017; ISBN 978-3-319-57881-1. [Google Scholar]
- Kovesdi, C.; Spielman, Z.; LeBlanc, K.; Rice, B. Application of Eye Tracking for Measurement and Evaluation in Human Factors Studies in Control Room Modernization. Nucl. Technol. 2018, 202, 220–229. [Google Scholar] [CrossRef]
- Singh, H. Human Eye Tracking and Related Issues: A Review. Int. J. Sci. Res. Publ. 2012, 2, 1–9. [Google Scholar]
- Cognolato, M.; Atzori, M.; Müller, H. Head-mounted eye gaze tracking devices: An overview of modern devices and recent advances. J. Rehabil. Assist. Technol. Eng. 2018, 5, 2055668318773991. [Google Scholar] [CrossRef]
- Martinez-Marquez, D.; Pingali, S.; Panuwatwanich, K.; Stewart, R.A.; Mohamed, S. Application of Eye Tracking Technology in Aviation, Maritime, and Construction Industries: A Systematic Review. Sensors 2021, 21, 4289. [Google Scholar] [CrossRef]
- Wambecke, J.; Goguey, A.; Nigay, L.; Dargent, L.; Hauret, D.; Lafon, S.; de Visme, J.-S.L. M[eye]cro: Eye-gaze+Microgestures for Multitasking and Interruptions. Proc. ACM Hum.-Comput. Interact. 2021, 5, 210.1–210.22. [Google Scholar] [CrossRef]
- Holmqvist, K.; Andersson, R. Eye Tracking: A Comprehensive Guide to Methods, Paradigms and Measures. 2017. Available online: https://www.researchgate.net/profile/Kenneth-Holmqvist/publication/254913339_Eye_Tracking_A_Comprehensive_Guide_To_Methods_And_Measures/links/5459eb690cf2bccc4912e21a/Eye-Tracking-A-Comprehensive-Guide-To-Methods-And-Measures.pdf (accessed on 21 November 2024).
- Zhang, X.; Sugano, Y.; Bulling, A. Evaluation of Appearance-Based Methods and Implications for Gaze-Based Applications. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Scotland, UK, 4–9 May 2019; ACM: Glasgow, Scotland, UK, 2019; pp. 1–13. [Google Scholar]
- Murthy, L.R.D.; Biswas, P. Deep Learning-based Eye Gaze Estimation for Military Aviation. In Proceedings of the 2022 IEEE Aerospace Conference (AERO), Big Sky, MT, USA, 5–12 March 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 1–8. [Google Scholar]
- Newton, D.; Gildea, K.M.; Knecht, W.; Hollomon, M.J.; Kratchounova, D. Current Status of Gaze Control Research and Technology Literature Review. 2017. Available online: https://rosap.ntl.bts.gov/view/dot/37289 (accessed on 25 June 2024).
- Lim, Y.; Gardi, A.; Pongsakornsathien, N.; Sabatini, R.; Ezer, N.; Kistan, T. Experimental characterisation of eye-tracking sensors for adaptive human-machine systems. Measurement 2019, 140, 151–160. [Google Scholar] [CrossRef]
- Sarcar, S.; Panwar, P.; Chakraborty, T. EyeK: An efficient dwell-free eye gaze-based text entry system. In Proceedings of the 11th Asia Pacific Conference on Computer Human Interaction, New York, NY, USA, 24–27 September 2013; Association for Computing Machinery: New York, NY, USA; pp. 215–220. [Google Scholar]
- Jungwirth, F.; Murauer, M.; Haslgrübler, M.; Ferscha, A. Eyes are different than Hands: An Analysis of Gaze as Input Modality for Industrial Man-Machine Interactions. In Proceedings of the 11th PErvasive Technologies Related to Assistive Environments Conference, New York, NY, USA, 26–29 June 2018; Association for Computing Machinery: New York, NY, USA; pp. 303–310. [Google Scholar]
- Singh, H.; Singh, J. Real-time eye blink and wink detection for object selection in HCI systems. J. Multimodal User Interfaces 2018, 12, 55–65. [Google Scholar] [CrossRef]
- Prabhakar, G.; Biswas, P. Eye Gaze Controlled Projected Display in Automotive and Military Aviation Environments. Multimodal Technol. Interact. 2018, 2, 1. [Google Scholar] [CrossRef]
- Tezza, D.; Andujar, M. The State-of-the-Art of Human–Drone Interaction: A Survey. IEEE Access 2019, 7, 167438–167454. [Google Scholar] [CrossRef]
- Jacob, R.J.K. What you look at is what you get: Eye movement-based interaction techniques. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, New York, NY, USA, 1–5 April 1990; Association for Computing Machinery: New York, NY, USA; pp. 11–18. [Google Scholar]
- Ohneiser, O.; Biella, M.; Schmugler, A.; Wallace, M. Operational Feasibility Analysis of the Multimodal Controller Working Position “TriControl”. Aerospace 2020, 7, 15. [Google Scholar] [CrossRef]
- Forsman, F.; Sjörs-Dahlman, A.; Dahlman, J.; Falkmer, T.; Lee, H.C. Eye tracking during high speed navigation at sea: Field trial in search of navigational gaze behaviour. J. Transp. Technol. 2012, 2, 277–283. [Google Scholar]
- Murthy, L.R.D.; Mukhopadhyay, A.; Arjun, S.; Yelleti, V.; Thomas, P.; Mohan, D.B.; Biswas, P. Eye-gaze-controlled HMDS and MFD for military aircraft. J. Aviat. Technol. Eng. (JATE) 2022, 10, 34–50. [Google Scholar] [CrossRef]
- Rajesh, J.; Biswas, P. Eye-gaze tracker as input modality for military aviation environment. In Proceedings of the 2017 International Conference on Intelligent Computing, Instrumentation and Control Technologies (ICICICT), Kerala State, Kannur, India, 6–7 July 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 558–564. [Google Scholar]
- Mohanavelu, K.; Poonguzhali, S.; Ravi, D.; Singh, P.K.; Mahajabin, M.; Ramachandran, K.; Singh, U.K.; Jayaraman, S. Cognitive Workload Analysis of Fighter Aircraft Pilots in Flight Simulator Environment. Def. Sci. J. 2020, 70, 131–139. [Google Scholar] [CrossRef]
- Murthy, L.; Mukhopadhyay, A.; Yellheti, V.; Arjun, S.; Thomas, P.; Dilli, M.; Saluja, K.P.S.; Dv, J.; Biswas, P. Eye Gaze Controlled Interfaces for Head Mounted and Multi-Functional Displays in Military Aviation Environment. arXiv 2020, arXiv:2005.13600. [Google Scholar]
- Li, L.; Lin, J.; Luo, Z.; Liu, Z. Research on the Design of In-Cockpit Display Interface for Fighter Aircraft Based on Visual Attention Mechanism. In Human Interface and the Management of Information: Visual and Information Design; Yamamoto, S., Mori, H., Eds.; Springer International Publishing: Cham, Switzerland, 2022; pp. 475–488. [Google Scholar]
- Zhou, J.; Li, G.; Shi, F.; Guo, X.; Wan, P.; Wang, M. EM-Gaze: Eye context correlation and metric learning for gaze estimation. Vis. Comput. Ind. Biomed. Art 2023, 6, 8. [Google Scholar] [CrossRef]
- Vlačić, S.; Knežević, A.; Rođenkov, S.; Mandal, S.; Vitsas, P.A. Improving the pilot selection process by using eye-tracking tools. J. Eye Mov. Res. 2020, 12, 10–16910. [Google Scholar] [CrossRef]
- Hourlier, S.; Diaz-Pineda, J.; Gatti, M.; Thiriet, A.; Hauret, D. Enhanced dialog for Human-Autonomy Teaming—A breakthrough approach. In Proceedings of the 2022 IEEE/AIAA 41st Digital Avionics Systems Conference (DASC), Portsmouth, VA, USA, 18–22 September 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 1–5. [Google Scholar]
- Neßelrath, R.; Moniri, M.M.; Feld, M. Combining speech, gaze, and micro-gestures for the multimodal control of in-car functions. In Proceedings of the 2016 12th International Conference on Intelligent Environments (IE), London, UK, 14–16 September 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 190–193. [Google Scholar]
- Hatfield, F.; Jenkins, E.A.; Jennings, M.W.; Calhoun, G. Principles and guidelines for the design of eye/voice interaction dialogs. In Proceedings of the Third Annual Symposium on Human Interaction with Complex Systems(HICS’96), Dayton, OH, USA, 6–9 August 1996; IEEE Computer Society Press: Dayton, OH, USA, 1996; pp. 10–19. [Google Scholar]
- Wang, K.-J.; You, K.; Chen, F.; Huang, Z.; Mao, Z.-H. Human-machine interface using eye saccade and facial expression physiological signals to improve the maneuverability of wearable robots. In Proceedings of the 2017 International Symposium on Wearable Robotics and Rehabilitation (WeRob), Houston, TX, USA, 5–7 November 2017; IEEE: Houston, TX, USA, 2017; pp. 1–2. [Google Scholar]
- Hu, B.; Wang, J. Deep Learning Based Hand Gesture Recognition and UAV Flight Controls. Int. J. Autom. Comput. 2020, 17, 17–29. [Google Scholar] [CrossRef]
- Qianzheng, Z.; Xiaodong, L.; Jie, R.; Yuanyuan, Q. Real Time Hand Gesture Recognition Applied for Flight Simulator Controls. In Proceedings of the 2021 IEEE 7th International Conference on Virtual Reality (ICVR), Foshan, China, 20–22 May 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 407–411. [Google Scholar]
- Levulis, S.J.; DeLucia, P.R.; Kim, S.Y. Effects of Touch, Voice, and Multimodal Input, and Task Load on Multiple-UAV Monitoring Performance During Simulated Manned-Unmanned Teaming in a Military Helicopter. Hum. Factors 2018, 60, 1117–1129. [Google Scholar] [CrossRef]
- Kowalczyk, P.; Sawicki, D. Blink and wink detection as a control tool in multimodal interaction. Multimed. Tools Appl. 2019, 78, 13749–13765. [Google Scholar] [CrossRef]
- Valeriani, D.; Matran-Fernandez, A. Towards a wearable device for controlling a smartphone with eye winks. In Proceedings of the 2015 7th Computer Science and Electronic Engineering Conference (CEEC), Colchester, UK, 24–25 September 2015; IEEE: Piscataway, NJ, USA, 2015; pp. 41–46. [Google Scholar]
- Biswas, P.; Langdon, P. Multimodal Intelligent Eye-Gaze Tracking System. Int. J. Hum.-Comput. Interact. 2015, 31, 277–294. [Google Scholar] [CrossRef]
- Blattgerste, J.; Renner, P.; Pfeiffer, T. Advantages of eye-gaze over head-gaze-based selection in virtual and augmented reality under varying field of views. In Proceedings of the Workshop on Communication by Gaze Interaction, Warsaw, Poland, 15 June 2018; ACM: Warsaw, Poland, 2018; pp. 1–9. [Google Scholar]
- Narkar, A.S.; Michalak, J.J.; Peacock, C.E.; David-John, B. GazeIntent: Adapting Dwell-time Selection in VR Interaction with Real-time Intent Modeling. Proc. ACM Hum.-Comput. Interact. 2024, 8, 1–18. [Google Scholar] [CrossRef]
- Piening, R.; Pfeuffer, K.; Esteves, A.; Mittermeier, T.; Prange, S.; Schröder, P.; Alt, F. Looking for Info: Evaluation of Gaze Based Information Retrieval in Augmented Reality. In Human-Computer Interaction—INTERACT 2021; Ardito, C., Lanzilotti, R., Malizia, A., Petrie, H., Piccinno, A., Desolda, G., Inkpen, K., Eds.; Lecture Notes in Computer Science; Springer International Publishing: Cham, Switzerland, 2021; Volume 12932, pp. 544–565. ISBN 978-3-030-85622-9. [Google Scholar]
- Wang, Z.; Wang, H.; Yu, H.; Lu, F. Interaction with gaze, gesture, and speech in a flexibly configurable augmented reality system. IEEE Trans. Hum.-Mach. Syst. 2021, 51, 524–534. [Google Scholar] [CrossRef]
- Kim, B.H.; Kim, M.; Jo, S. Quadcopter flight control using a low-cost hybrid interface with EEG-based classification and eye tracking. Comput. Biol. Med. 2014, 51, 82–92. [Google Scholar] [CrossRef]
- Schweigert, R.; Schwind, V.; Mayer, S. EyePointing: A Gaze-Based Selection Technique. In Proceedings of the Mensch und Computer 2019, Hamburg, Germany, 8–11 September 2019; Association for Computing Machinery: New York, NY, USA, 2019; pp. 719–723. [Google Scholar]
- Zhu, L.; Zhu, Z.; Zhang, C.; Xu, Y.; Kong, X. Multimodal sentiment analysis based on fusion methods: A survey. Inf. Fusion. 2023, 95, 306–325. [Google Scholar] [CrossRef]
- Deng, S.; Chang, J.; Kirkby, J.; Zhang, J. Gaze–mouse coordinated movements and dependency with coordination demands in tracing. Behav. Inf. Technol. 2016, 35, 1–15. [Google Scholar] [CrossRef]
- Vortmann, L.-M.; Ceh, S.; Putze, F. Multimodal EEG and Eye Tracking Feature Fusion Approaches for Attention Classification in Hybrid BCIs. Front. Comput. Sci. 2022, 4, 780580. [Google Scholar] [CrossRef]
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).