Review

Application of Eye Tracking Technology in Aviation, Maritime, and Construction Industries: A Systematic Review

1 School of Engineering and Built Environment, Griffith University, Gold Coast, QLD 4222, Australia
2 School of Civil Engineering and Technology, Sirindhorn International Institute of Technology, Thammasat University, Pathum Thani 12120, Thailand
* Author to whom correspondence should be addressed.
Sensors 2021, 21(13), 4289; https://doi.org/10.3390/s21134289
Submission received: 10 May 2021 / Revised: 15 June 2021 / Accepted: 17 June 2021 / Published: 23 June 2021
(This article belongs to the Section Electronic Sensors)

Abstract

Most accidents in the aviation, maritime, and construction industries are caused by human error, which can be traced back to impaired mental performance and attention failure. In 1596, Du Laurens, a French anatomist and medical scientist, said that the eyes are the windows of the mind. Eye tracking research dates back almost 150 years and it has been widely used in different fields for several purposes. Overall, eye tracking technologies provide the means to capture in real time a variety of eye movements that reflect different human cognitive, emotional, and physiological states, which can be used to gain a wider understanding of the human mind in different scenarios. This systematic literature review explored the different applications of eye tracking research in three high-risk industries, namely aviation, maritime, and construction. The results of this research uncovered the demographic distribution and applications of eye tracking research, as well as the different technologies that have been integrated to study the visual, cognitive, and attentional aspects of human mental performance. Moreover, different research gaps and potential future research directions were highlighted in relation to the usage of additional technologies to support, validate, and enhance eye tracking research to better understand human mental performance.

1. Introduction

1.1. Eye Tracking in High-Risk Industries

Industries with work environments and processes that pose a considerable risk of harm to people and nature are defined as high-risk industries [1]. Some of these high-risk industries include aviation, maritime, and construction [2,3]. Most accidents in the aviation, maritime, and construction industries are caused by human error [4,5,6,7,8]. Indeed, a National Transportation Safety Board survey found that 88% of aviation accidents between 1989 and 1992 were caused by human error [7]. According to the International Maritime Organisation (IMO), human error is also the major cause of incidents in the maritime industry, accounting for 85% of all industry accidents [8]. The construction industry is one of the most hazardous industries globally [9]. For example, though it comprises only 5% of the workforce in the US, the construction industry accounted for almost 20% of workplace deaths among all industries between 2003 and 2012. Similar trends are shown in Australia and the UK [10]. These three industries are some of the most critical sectors in the global economy. Furthermore, they are exposed to numerous occupational risks that are very challenging to control and mitigate. These risks entail the potential for large losses but also for massive profits [11].
Most accidents in the aviation, maritime, and construction industries can be traced back to impaired mental performance as a result of attention failure [5]. Human mental performance can be affected by a variety of external and internal factors [5], such as emotional state, risk perception, training, and human-machine interfaces (HMIs). Almost half of the brain’s neural pathways are used for visual processing [12], making vision a key element in mental performance [13]. As such, an effective method for understanding the factors that affect human performance is the analysis of visual information processing [14]. Several psychology and neuroscience studies have concluded that eye movement can help in understanding the visual, cognitive, and attentional aspects of human performance [15]. An effective tool for measuring eye movement and eye position is an eye tracking device. This unique tool allows eye movement information to be recorded, which can help to assess an individual’s mental state, to understand cognitive processing and behaviour, and to interpret individuals’ responses to different visual stimuli [16].
Eye tracking research dates back almost 150 years [17] and has been widely used in different fields for numerous purposes [18,19]. Eye tracking technologies have been used in different industries for the evaluation of present and future working environments, especially when they involve high degrees of risk. For example, eye tracking has been used in medicine and surgery as well as the aviation and maritime industries to improve teaching strategies and trainee performance [20,21,22]. To enhance safety, eye tracking has also been used to understand construction workers’ risk perception and hazard identification [23].
Eye tracking technologies can also be integrated with various other technologies, such as simulators, motion capture devices, and augmented reality, to better understand individuals’ gaze patterns during different scenarios. For example, in the automobile, aviation, and maritime industries, eye tracking technologies have been used in conjunction with sophisticated simulators to understand where pilots and navigators are looking under different physical and mental states, such as fatigue, anxiety, and stress [24,25]. Furthermore, the impacts of HMIs such as cockpits (for airplanes) and bridges (for ships) on human visual attention and performance have been studied using eye trackers [26,27].

1.2. Purpose and Objectives

Overall, eye tracking technologies provide the means to capture a variety of eye movement information that reflects different human cognitive, emotional, and physiological states in real time, which can be used to gain a wider understanding of the human mind in different scenarios. The purpose of this systematic literature review is to explore how eye tracking technologies have been used in high-risk industries such as the aviation, maritime, and construction industries and to uncover current eye tracking research gaps in these industries. Thus, the objectives of this study are as follows:
  • To perform a demographic analysis to identify the main countries that are using eye tracking research with applications in aviation, maritime, and construction fields;
  • To identify the main applications of eye tracking research in the aviation, maritime, and construction industries;
  • To identify the different human aspects that are studied in eye tracking research in aviation, maritime, and construction scenarios;
  • To identify the technologies that are integrated with eye tracking devices to study the different human aspects in aviation, maritime, and construction scenarios; and
  • To determine research gaps in the development and application of eye tracking technologies within the aviation, maritime, and construction industries.

2. Materials and Methods

A systematic search was conducted on 17 August 2020 using the Science Direct and Google Scholar databases following the preferred reporting items for systematic reviews and meta-analyses (PRISMA) statement [28]. In this systematic literature search, the keywords ‘eye move*’, ‘eye track*’, ‘gaze move*’, and ‘gaze track*’ were connected with the Boolean operator ‘OR’ and accompanied by the following keywords using the ‘AND’ Boolean operator: ‘maritime’, ‘aviation’, ‘construction’.
Three full phrases were used:
(‘eye move*’ OR ‘eye track*’ OR ‘gaze move*’ OR ‘gaze track*’) AND maritime;
(‘eye move*’ OR ‘eye track*’ OR ‘gaze move*’ OR ‘gaze track*’) AND aviation;
(‘eye move*’ OR ‘eye track*’ OR ‘gaze move*’ OR ‘gaze track*’) AND construction.

2.1. Selection Criteria

The initial literature search returned a total of 50,777 articles, from which a total of 3617 duplicate articles were removed, as presented in Table 1. We collected the first 10 pages of search results (sorted by relevance) for each search phrase, yielding a total of 832 articles. These 832 articles were screened according to the following inclusion criteria: (1) conference and peer-reviewed papers with the full text published within the last 20 years (2000 to 2020), (2) empirical studies reporting the use of eye tracking technologies, and (3) published in the English language. This resulted in 119 articles eligible for full-text assessment. In the full-text checking process, review articles, as well as articles that were not related to the aviation, construction, and maritime industries, were excluded. The full-text checking process determined that 80 articles met the inclusion criteria for this review, as shown in Figure 1.

2.2. Data Extraction and Analysis

The systematic literature search aimed to identify the different applications and methodologies used in eye tracking research in three high-risk industries. The classification topics were as follows: maritime, aviation, construction, year of publication, location of publication, industry application, type of scenario (e.g., real environment or simulation), cognitive (Cog), emotional (Emo), physiological (Phys), integration with other technologies, and types of eye tracking devices. Full-text screening was performed by D.M. and S.P. to avoid potential bias. A consensus meeting resolved any discrepancies between reviewers. Once all applicable literature was identified, the extracted data were analysed by normalising the number of studies according to the classification topics previously described.

3. Results

The systematic literature search identified a total of 80 studies that fulfilled the inclusion criteria. An initial data classification was performed as described previously (see Table 2). After this, a more detailed analysis was performed in which a total of 2240 qualitative data items were extracted and synthesised to uncover industry distribution and demographics as well as the different applications of eye tracking studies in the maritime, aviation, and construction industries.
This section first defines eye tracking studies regarding the aviation, maritime, and construction industries with respect to industry distribution and demographics. These studies were synthesised to uncover the countries that apply eye tracking in the three industries. Moreover, we also sought to uncover and explain how eye tracking technologies are used in these three industries. Particular attention has been devoted to the following aspects: the different applications of eye tracking studies; the different cognitive, emotional, and physiological processes studied in eye tracking research; the preferred types of eye tracking devices used; and the different technologies that are integrated to enhance the capabilities of eye tracking research.

3.1. Demographics

According to Figure 2, the highest number of studies identified in the systematic literature search relate to the aviation industry (36 studies), followed by the maritime and construction industries (25 and 19 studies, respectively). In general, the use of eye tracking technologies in these three industries increased from 2003 to 2020, as shown in Figure 3. Almost 70% of the 80 reviewed articles were published in the last five years (Figure 3). According to Figure 4a,c, research interest in eye tracking in the maritime and aviation industries dates back more than 13 years. In contrast, the application of eye tracking technologies only recently started to attract interest in the construction industry (Figure 4b), where the oldest article identified was published in 2015. This limited interest is also reflected by the low number of eye tracking studies in the construction industry identified in this study.
In terms of demographic distribution, researchers conducting eye tracking studies in the three high-risk industries were located in Europe (55%), North America (28%), Asia (14%), Australia (1%), the Middle East (1%), and South America (1%; Table 3). Although the studies were distributed among different locations, overall, the USA was the leader in eye tracking studies (20 articles), followed by Germany and Norway (both with 10 articles), and then China and the UK (both with seven articles).
Regarding the distribution of eye tracking studies per location in the aviation industry, the USA and Germany were in first place, followed by the UK (see Figure 5). These results are unsurprising since Boeing, the world’s foremost aircraft manufacturer from 2012 to 2020, is an American company [96]. Germany holds the major production sites of Airbus, which surpassed Boeing in 2020 to become the largest aircraft manufacturer in the world [97,98]. The UK has the second largest aircraft manufacturing industry worldwide and is known as a global centre of excellence for the design and production of turbines, helicopters, and aircraft components and systems [99].
In the maritime industry, eye tracking research is dominated by Norway (nine articles), followed by Sweden (three articles) and Canada, Italy, and Singapore (two articles each). These results correlate with the importance of the maritime sector in these countries. For example, Norway’s economy heavily relies on its strong maritime activities, and the country currently has several leading shipping companies and the world’s largest shipping stock exchange [11,100]. However, despite the substantial dependence of Canada, Italy, and Singapore on marine transportation, there is a low number of eye tracking research studies in maritime applications. This trend can also be observed in Australia, Japan, and Taiwan, which are regions surrounded by water that nonetheless have one or no eye tracking studies in the maritime field.
Regarding the construction industry, the USA is also the leading region in eye tracking research (11 articles), followed by China (four articles). It is important to note that the USA is one of the largest construction markets worldwide [101], and China’s urbanisation infrastructure has been experiencing exceptional growth in recent decades [102,103]. The USA, Germany, and China are the only regions that have published eye tracking studies in the aviation, maritime, and construction industries.
Figure 5. Distribution of identified studies according to industry and number of articles per region (created with RAWGraphs [104]).

3.2. Eye Tracking Metrics

The visual system is considered the most important system of the human body after the brain and the foremost of the sensory systems, as 85% of the information that the organism obtains from the environment is processed through it [105]. In humans, the eyes play a very important role in communication. For example, eye contact and gaze direction are used to establish socio-emotional connection, to indicate the target of our visual interest, or to regulate interaction [106]. Moreover, different cognitive processes and intentions can be reflected in our gaze behaviour (e.g., we often look at things before acting on them) [107]. In eye tracking research, different eye metrics have been identified and related to different cognitive, emotional, and physiological states, which can be used to gain a wider understanding of the human mind. These eye metrics are fixation, saccadic movements, pupillary response, and eye blink rate.

3.2.1. Fixation

Fixation is generally associated with visual processing and information acquisition [108]. Fixation occurs when the eye remains still and the pupil is stationary for approximately 180–300 ms [33,62,108]. During a fixation, people obtain new information from an object, stimulus, or location [31]. Statistical analysis of fixation metrics such as fixation frequency, fixation duration, maximum fixation duration, and the standard deviation of fixation duration relates to human performance as well as various cognitive attentional processes [34]. For example, a pilot’s situation awareness (SA) performance and expertise level can be inferred from the distribution of fixations and fixation duration on relevant areas of interest (AOIs) [26]. In aviation research, a study found that experienced pilots spend more time fixating on multiple flight instruments, without dwelling for too long on any single one [109]. However, experts of different backgrounds can have different gaze behaviours. For example, in the maritime industry, expert operators spend more time fixating on the outside environment, whereas novices focus more on the deck [34]. Independent of experience, this difference can also be observed in the aviation industry: commercial pilots spend more time looking at the instrument panel, whereas military pilots spend more time looking through the window [109]. During a visual search activity, fixation duration can also be used to predict a subject’s hazard recognition [93]. In the construction industry, a study found that workers with a higher perception of risk fixated for a longer time on objects that pose a hazard when they are identified for the first time [23]. Similarly, it was found that air traffic controllers with a higher fixation count on relevant AOIs have a higher failure detection frequency [59]. In contrast, a short fixation duration is generally associated with individuals experiencing anxiety and a threat state [68].
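The fixation definition above (pupil essentially stationary for roughly 180–300 ms within a small spatial area) can be operationalised with a dispersion-threshold (I-DT) style detector. The sketch below is purely illustrative and is not taken from any of the reviewed studies; the 1° dispersion threshold and the sampling rate are assumptions:

```python
def detect_fixations(samples, rate_hz, max_dispersion=1.0, min_duration_ms=180):
    """Dispersion-threshold (I-DT) fixation detection sketch.

    samples: gaze points (x, y) in degrees at a fixed rate_hz.
    A fixation is a run of samples whose bounding-box dispersion
    (width + height) stays under max_dispersion (deg) and that lasts
    at least min_duration_ms (cf. the ~180-300 ms window above).
    Returns (start_index, end_index, centroid) tuples.
    """
    def dispersion(window):
        xs, ys = zip(*window)
        return (max(xs) - min(xs)) + (max(ys) - min(ys))

    min_samples = int(min_duration_ms * rate_hz / 1000)
    fixations, start = [], 0
    while start + min_samples <= len(samples):
        end = start + min_samples
        if dispersion(samples[start:end]) <= max_dispersion:
            # Grow the window while dispersion stays under the threshold.
            while end < len(samples) and dispersion(samples[start:end + 1]) <= max_dispersion:
                end += 1
            n = end - start
            centroid = (sum(x for x, _ in samples[start:end]) / n,
                        sum(y for _, y in samples[start:end]) / n)
            fixations.append((start, end, centroid))
            start = end
        else:
            start += 1
    return fixations
```

Metrics cited above, such as fixation frequency and mean fixation duration, then follow directly from the returned list.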

3.2.2. Saccades

Saccades are rapid eye movements that occur when a person shifts between fixations [43]. Saccades last around 10–100 ms, during which time visual information transfer is suppressed [110]; therefore, saccades are not directly related to cognitive processing [33]. However, saccade velocity can be related to lethargy, stress, and fatigue [64,93]. For example, saccade rate decreases with fatigue and difficult tasks [27]. Saccadic length has also been used to measure mental workload [33] and has been shown to increase with increased mental workload (MWL) [73], with very short saccades related to the presence of conflict [56]. In a maritime setting, the number of saccades can reveal improvements in the scanning technique of a navigator [111].
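Because saccades are fast while fixations are nearly stationary, a simple velocity threshold can separate the two; this is the classic I-VT scheme, shown here as a minimal sketch (the 30°/s threshold is an illustrative assumption, not a value drawn from the reviewed studies):

```python
def classify_samples(samples, rate_hz, velocity_threshold=30.0):
    """Velocity-threshold (I-VT) classification sketch.

    samples: gaze points (x, y) in degrees at a fixed rate_hz.
    Labels each sample 'saccade' when the point-to-point angular
    velocity exceeds velocity_threshold (deg/s), else 'fixation'.
    """
    dt = 1.0 / rate_hz
    labels = ['fixation']  # the first sample has no velocity estimate
    for (x0, y0), (x1, y1) in zip(samples, samples[1:]):
        velocity = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / dt
        labels.append('saccade' if velocity > velocity_threshold else 'fixation')
    return labels
```

Saccade rate and saccadic length, the metrics discussed above, can then be derived by grouping consecutive 'saccade' labels into individual saccades.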

3.2.3. Pupil Size

One of the most distinguishing features of the human eye is the pupil [112], and extracting information about the size and location of its centre is relatively easy using video recording [113]. Pupil size is influenced by illumination and regulates the amount of light that enters the retina. Pupil size is also affected by emotions, muscular fatigue, cognitive processes, and MWL [26]. For example, an increased pupillary dilation response has been related to increased processing load in different maritime settings [34,45]. Moreover, pupil size has been used to understand the cognitive effects of stress in aviation emergencies and conflict [56,64]. In quay crane operators, pupil dilation was used to measure alertness and fatigue evolution [24]. Visual fatigue caused by aircraft instrument displays has been studied by analysing the effects of electronic displays on pupil diameter [78]. In a simulated maritime environment, pupil dilation was shown to reveal the upcoming judgment of the human operator [38].

3.2.4. Blink Rate

Blink rate refers to the number of blinks per second or minute and is related to physiological factors such as mood state, task demands, attention, and tension [34]. Blink rate is generally used to measure MWL and fatigue [48]. For example, high blink rates are correlated with fatigue and low MWL [31,113]. A further metric, the scanpath, results from the combination of fixations and saccades. According to Bhoir et al. [85], ‘an optimal scanpath is a straight line eye movement to desired targets and a short fixation on targets’. A scanpath can provide information about a subject’s cognitive process for encoding information [70,114]. For example, a study found that with an increasing workload, subjects tend to use a less random scanpath [34]. In the aviation and maritime industries, subjects’ scanpaths are used to understand how they interact with and absorb information from their environment, such as computer interfaces and instrument panels [70]. A summary of the described eye metric characteristics can be found in Table 4.
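As a rough illustration of how blink rate can be derived in practice: video-based trackers typically lose the pupil during a blink, so runs of invalid samples can be counted. This sketch assumes a generic per-sample validity signal and is not a method reported in the reviewed studies; the 70 ms minimum blink length is likewise an illustrative assumption:

```python
def blink_rate(pupil_valid, rate_hz, min_blink_ms=70):
    """Estimate blinks per minute from a pupil-validity signal.

    pupil_valid: one boolean per sample (False = pupil lost, as
    during a blink). Runs of invalid samples at least min_blink_ms
    long count as blinks; shorter dropouts are treated as noise.
    """
    min_samples = max(1, int(min_blink_ms * rate_hz / 1000))
    blinks, run = 0, 0
    for valid in pupil_valid:
        if not valid:
            run += 1
        else:
            if run >= min_samples:
                blinks += 1
            run = 0
    if run >= min_samples:  # signal may end mid-blink
        blinks += 1
    duration_min = len(pupil_valid) / rate_hz / 60
    return blinks / duration_min
```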

3.3. Eye Tracking Data Visualisation Tools

Due to the large amount of information provided by eye tracking devices, eye tracking visualisation techniques provide additional insight when paired with comprehensive statistical analysis [50]. Visualisation techniques commonly used for representing eye tracking data are heatmaps and scanpaths, as presented in Figure 6 [115].
Heatmaps are the main data visualisation tool used to analyse the accumulated positions of gaze and fixation distributions among AOIs [58,116]. A heatmap is a 2D visualisation of the analysed eye tracking data represented in colour scales [117]. In a heatmap, hot zones correspond to higher gaze and fixation densities, whereas cool zones correspond to lower densities [118]. Heatmaps can be used with data from an individual or from a group of people [89]. The limitation of heatmaps is that they only show density-based data and lack information about the sequential order of eye movements [117]. Another type of eye tracking visualisation tool is the scanpath. Scanpaths compile eye movements and are defined as the spatial arrangement of a saccade–fixation–saccade sequence. Scanpaths are used to reveal the sequential order of observed areas [119] and provide information related to individuals’ search efficiency [120].
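At its core, a heatmap of the kind described above is an accumulation of gaze or fixation density over a spatial grid. The following minimal sketch (illustrative only; the 10 px cell size is an assumption) sums fixation durations per grid cell, so 'hot' cells are those with the longest accumulated dwell time:

```python
def gaze_heatmap(fixations, width, height, cell=10):
    """Accumulate fixation durations into a coarse 2D density grid.

    fixations: (x_px, y_px, duration_ms) tuples in screen pixels.
    Returns a rows x cols grid of accumulated dwell times; rendering
    it in a colour scale yields the hot/cool zones described above.
    """
    cols, rows = width // cell, height // cell
    grid = [[0.0] * cols for _ in range(rows)]
    for x, y, duration in fixations:
        c = min(int(x) // cell, cols - 1)
        r = min(int(y) // cell, rows - 1)
        grid[r][c] += duration
    return grid
```

Note that, exactly as the text observes, such a grid discards the sequential order of the fixations; a scanpath rendering would instead draw the fixation sequence connected by saccade lines.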

3.4. Overview of Types of Eye Tracking Methods

Currently, there are four main methods used to measure eye movements: electro-oculography (EOG), scleral contact lens/search coil, photo-oculography (POG) or video-oculography (VOG), and video-based combined pupil–corneal reflection [49]. EOG was the most widely used eye tracking method 50 years ago. The EOG method uses a series of electrodes placed around the eyes to measure the electric field in the tissue surrounding the eye. This electric field is caused by the electric potential difference between the cornea (the transparent front part of the eye) and the ocular fundus (the interior surface of the eye) [121]. One of the main advantages of EOG is that light conditions have no impact on the quality of the eye recording. Moreover, the signal processing of EOG does not require any complex video and image processing [122].
Scleral contact lens/search coil is one of the most accurate eye movement measurement methods and is considered the gold standard in oculomotor research [123]. It consists of a pair of contact lenses mounted with reflecting phosphors, line diagrams, or wire coils [121]. The working principle involves a coil of metallic wire that can be tracked as it moves through an electromagnetic field [124]. The limitations of the scleral contact lens/search coil method include eye irritation and potential corneal damage; therefore, this method is limited to experimental sessions of around 30 min [123]. POG and VOG comprise several eye movement tracking techniques based on measuring the relative position of the reflected image of an infrared source on the cornea and the pupil centre [49]. The disadvantage of POG and VOG is that the quality of measurements can be affected by head movements and blinking artifacts. Moreover, the measurement of ocular features provided by POG and VOG techniques can be extremely tedious and prone to error [125].
Although all the mentioned eye tracking techniques are, in general, suitable for eye movement measurements, they often do not provide point of regard measurement. To counteract this disadvantage, video-based trackers are equipped with high-resolution cameras and image processing hardware to measure the pupil centre and corneal reflection (of a light source, usually infrared). The combined pupil–corneal reflection method allows computation of the point of regard in real time [126]. The pupil is easily detected using the bright pupil phenomenon occurring when the eye is exposed to near-infrared light [127]. Advanced image processing algorithms and a physiological 3D model of the eye are then used to estimate the position of the eye in space and the point of gaze with high accuracy [128].
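To make the pupil–corneal reflection principle concrete, the calibration step can be caricatured as fitting a mapping from the pupil–CR offset vector to known on-screen calibration points. Real systems use the physiological 3D eye model mentioned above; the per-axis linear least-squares fit below is only a toy illustration of the mapping idea, and all names are hypothetical:

```python
def _fit_axis(vs, ss):
    """Least-squares line s = a*v + b for one axis."""
    n = len(vs)
    mv, ms = sum(vs) / n, sum(ss) / n
    a = (sum((v - mv) * (s - ms) for v, s in zip(vs, ss))
         / sum((v - mv) ** 2 for v in vs))
    return a, ms - a * mv

def calibrate(vectors, points):
    """Fit per-axis mappings from pupil-CR offset vectors to known
    calibration points; returns a point-of-regard estimator."""
    ax, bx = _fit_axis([v[0] for v in vectors], [p[0] for p in points])
    ay, by = _fit_axis([v[1] for v in vectors], [p[1] for p in points])
    return lambda v: (ax * v[0] + bx, ay * v[1] + by)
```

After calibrating on a handful of points, the returned function estimates the point of regard for any new pupil–CR vector, which is the real-time computation the combined method provides.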

3.5. Types of Modern Video-Based Eye Tracking Devices

Today, most eye tracking systems are video-based, with infrared illumination and an eye video camera [108]. Depending on the type of activity to be studied, the experiment, and the environment, different video-based eye tracking systems are required. Eye tracking systems can be categorised as remote, mobile, or tower-mounted based on how they interface with the user and environment, as presented in Figure 7 [129].

3.5.1. Mobile Eye Tracking Devices

Mobile eye tracking devices are also referred to as head-mounted or wearable devices. Such devices usually have an additional scene camera that records the scene or field of view [135]. Mobile eye tracking devices are worn by the participant in the form of a headband, glasses, or a helmet-mounted system, allowing the participant to move freely in the experimental environment, as shown in Figure 7a–c [136]. Mobile devices are mainly binocular and are usually more accurate than remote devices [135]. On mobile devices, gaze tracking is performed relative to the entire field of view, which makes it ideal for real-world experiments [137]. Head- or helmet-mounted devices are less invasive and more comfortable, and they can be worn with other technologies such as electroencephalography (EEG) [138]. Nonetheless, mobile tracking devices have several limitations, such as difficulty in tracking eye movements in sunlight and being inappropriate for environments with heavy winds and water spray [32]. Moreover, eye movements to the periphery can be difficult to track and will often show less accuracy. Mobile tracking devices do not have an absolute coordinate system, instead requiring gaze data to be recorded in a coordinate system defined by the scene camera [138]. Finally, data inaccuracy can be introduced if the mobile eye tracking device does not properly fit the individual’s face [136].

3.5.2. Remote Eye Tracking Systems

Remote eye tracking systems do not touch the person at all and measure the eye from a distance, as presented in Figure 7d [121]. This type of eye tracking device is mainly used for screen-based interaction experiments. The advantage of remote eye tracking systems is that the participant can use a computer completely naturally while the eye tracking system records data [137]. Moreover, data processing can be less complex and significantly more efficient compared with wearable systems because the visual space is already integrated [136]. Furthermore, remote eye tracking devices work very well with other research technologies because they do not touch the participant [138]. However, remote eye tracking devices have several limitations: they can only be used within fixed working areas; gaps in data accuracy and artefacts can occur when the participant moves his/her head excessively [138]; and they are intolerant of infrared (IR) sources such as sunlight, especially if the sun is reflected in the participant’s eyes.
To further improve the accuracy of remote eye tracking systems, head-supporting towers are usually employed, as shown in Figure 7e. Head-supporting towers are in close contact with the participant via a bite bar or chin rest and therefore restrain head movements [137]. These devices allow the capture of the highest-quality data by restricting the participant’s head movement, although they are less realistic and natural. The saccade resolution of a remote eye tracker fitted with a head-supporting tower is two to five times greater than that of a remote/head-free eye tracker [16]. However, the restrictive setting of head-supporting towers limits their use for dynamic environments. As a result, they are mainly used in studies that require high precision and where the subject is usually looking at a fixed screen [139].

3.5.3. Eye Tracker Performance and Data Quality

The performance of eye tracking systems is primarily characterised by their sampling frequency, precision, accuracy, and the resolution of the system, as shown in Figure 8. Sampling frequency is the number of measurements taken by the system in one second (in Hz). Thus, a 100 Hz eye tracker records a particular eye movement (or sample) 100 times per second [135]. Precision refers to the spread of the measured gaze points (in °). Accuracy refers to the difference between the measured and true eye position (in °) [140]. Saccade resolution is the ability of a system to detect saccade movement. For example, an eye tracker with 0.1° saccade resolution can detect movements as small as 0.1° [19]. The required performance of an eye tracking system depends on the type of movement to be detected; high-precision systems can accurately identify what a subject is looking at and where, while systems with a high sample frequency and high precision are more reliable in identifying the type of eye movement being observed [141]. For example, current high-tier systems offer sample frequency rates of 500 Hz to 2000 Hz, tracking accuracies below 0.3°, and precision better than 0.05° [108]. Mid-tier systems typically provide sampling rates between 120 Hz and 200 Hz, tracking accuracies of around 0.3° to 1°, and precision of around 0.1° [142]. Low-end eye tracking systems typically have sampling frequencies between 30 Hz and 120 Hz, tracking accuracies of 1° to 2°, and precision of more than 1° [141].
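The accuracy and precision definitions above can be computed directly from a validation recording in which the participant fixates a known target. The sketch below uses the common RMS sample-to-sample definition of precision; this choice is an assumption, since manufacturers also report standard-deviation-based precision:

```python
import math

def accuracy_and_precision(measured, target):
    """Accuracy and RMS precision for one validation target.

    measured: gaze samples (x, y) in degrees recorded while the
    participant fixates the known target (x, y); needs >= 2 samples.
    Accuracy = distance between the mean gaze point and the target;
    precision = RMS of successive sample-to-sample distances
    (both in degrees, matching the definitions above).
    """
    n = len(measured)
    mean_x = sum(x for x, _ in measured) / n
    mean_y = sum(y for _, y in measured) / n
    accuracy = math.hypot(mean_x - target[0], mean_y - target[1])
    sq_steps = [math.hypot(x1 - x0, y1 - y0) ** 2
                for (x0, y0), (x1, y1) in zip(measured, measured[1:])]
    precision = math.sqrt(sum(sq_steps) / len(sq_steps))
    return accuracy, precision
```

Averaging these values over several validation targets spread across the tracked area gives figures comparable to the tier specifications quoted above.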
The performance specifications of commercial eye tracking systems are usually provided by their manufacturers/distributors. However, these values are obtained under ideal conditions and may be affected by the operating environment [19]. For example, several studies have reported large variations in the eye tracker’s accuracy compared to the values provided by the manufacturer [143,144,145]. Therefore, several studies have been performed to evaluate the performance of commercial eye tracking systems in different operating environments. For a detailed review about the methods for performance evaluation of eye tracking and technical characteristics of commercially available eye trackers, refer to the studies performed by Zhang et al. [146], Cognolato et al. [135], Lim et al. [142], and Stein et al. [147].
Data quality refers to the reliability, validity, and availability of the eye tracking data. In eye tracking, there is a large variety of factors that may influence the quality of data. One of these factors is the user’s neurology and psychology. For example, dry eye syndrome and the use of glasses or contact lenses can lead to large data variation [148]. Environmental and light conditions can also affect the data collection processes. A very dry recording environment can change the rate at which an individual blinks, whereas changes in light conditions may lead to changes in pupil size [140]. Other factors that can significantly impact the precision and accuracy of the measures produced in an eye tracking experiment are the type of eye tracker used, the eye camera’s resolution and field of view, and the calibration procedure [149]. Therefore, to keep variation as small as possible, eye tracking studies try to carefully control eye tracking conditions and user positioning, frequently recalibrate the eye tracker, and exclude participants that do not track well [140]. While this is reasonable for research, it is not feasible in practice, limiting the use of eye tracking in everyday industry tasks. Thus, there is a great need to expand vision science beyond the controlled laboratory setting and into the natural world.
Equally important are the gaze estimation algorithm used for data processing and the use of any filtering or pre-/post-processing algorithms. Current eye tracking algorithms still have limitations and challenges to overcome. For example, the reliable classification of eye movements from raw data remains one of the main challenges for processing algorithms. Moreover, a variety of different algorithms is available today, and these are frequently used without a systematic evaluation of their performance [150]. This makes it difficult to generalise research results when researchers use different hardware, algorithms, thresholds, and stimuli [150]. Another challenge is that most algorithms are bound to specific eye tracker settings and fail even when the signal is only moderately noisy [151,152]. To overcome these limitations, several machine learning approaches have been developed that allow the algorithm to be retrained for any type of eye tracking system [140,153,154,155].
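As an illustration of the classification problem discussed above, the following is a minimal sketch of a velocity-threshold (I-VT) classifier, one of the simplest and most widely used baselines for separating fixations from saccades in raw gaze data. The 30°/s threshold and 60 Hz sampling rate are illustrative assumptions rather than recommended values; real implementations typically add noise filtering and minimum-duration checks.

```python
# Minimal velocity-threshold (I-VT) sketch: label each gaze sample as a
# fixation or a saccade based on point-to-point angular velocity.
# Threshold and sampling rate are illustrative assumptions only.

def classify_ivt(gaze, sampling_rate_hz=60.0, velocity_threshold=30.0):
    """Label each gaze sample as 'fixation' or 'saccade'.

    gaze: list of (x, y) positions in degrees of visual angle.
    """
    dt = 1.0 / sampling_rate_hz
    labels = ["fixation"]  # first sample has no velocity estimate
    for (x0, y0), (x1, y1) in zip(gaze, gaze[1:]):
        # Angular velocity between consecutive samples, in deg/s.
        velocity = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / dt
        labels.append("saccade" if velocity > velocity_threshold else "fixation")
    return labels

# Example: a stable fixation followed by a rapid jump to a new location.
samples = [(10.0, 10.0), (10.1, 10.0), (10.1, 10.1), (15.0, 12.0), (15.1, 12.0)]
print(classify_ivt(samples))
```

This also makes the sensitivity noted above concrete: lowering the velocity threshold or raising the sampling rate changes which samples are labelled saccades, which is why results are hard to generalise across hardware and settings.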

3.5.4. Types of Eye Tracking Devices Used in Aviation, Maritime, and Construction

According to our results, all of the selected studies used some form of video-based eye tracking system. Based on Figure 9, mobile eye tracking devices are the preferred choice in aviation (84%), maritime (67%), and construction (68%) applications. We believe that the main reason is that mobile eye trackers are a flexible alternative that provides the user with freedom of movement, making them suitable for complex, dynamic, and real scenarios [8,16] such as those presented in these three industries. However, in simplified simulated environments in which the study environment is presented through a computer monitor, the preferred choice is a remote eye tracker [36]. We found that the majority of studies that used remote eye tracking systems in aviation (16%) and maritime (33%) scenarios were mainly in the context of air traffic control monitoring [7,18,38,48,56,65,69,76] and the evaluation of computer systems and interfaces [36,37,80]. In the case of eye tracking research for construction applications, remote eye tracking systems were mainly used in studies that employed images presented on computer monitors to study visual search patterns [16,19,82,93] and for visual support systems [88]. Moreover, among all three industry sectors analysed, only one study in construction applications used tower-mounted eye tracking [86]. A summary of the advantages and disadvantages of video-based eye tracking system types for aviation, maritime, and construction applications is provided in Table 5.

3.6. Application of Eye Tracking Technology in Aviation, Maritime, and Construction Scenarios

Video-based eye tracking technologies have been extensively used in various industry applications. In this study, we found that there are 13 main applications of eye tracking technologies for the aviation, maritime, and construction industries, as presented in Figure 10. In descending order, these applications were (1) visual attention, (2) MWL, (3) HMI, (4) SA, (5) training improvement, (6) hazard identification, (7) novice and expert comparison, (8) fatigue, (9) stress, (10) foretelling, (11) anxiety, (12) trust, and (13) working memory load. The following 12 sections of the manuscript discuss the 13 main applications of eye tracking technologies; ‘stress and anxiety’ are combined under a single title.

3.6.1. Visual Attention and Gaze Pattern

According to our results, visual attention is the most studied aspect in eye tracking research for maritime, construction, and aviation applications. Visual attention guides human perception, ensuring that an individual perceives and processes information selectively. Distraction occurs when attention shifts away from the original task [5]. Abundant evidence indicates that visual attention is essential for many cognitive tasks, such as attention distribution, hazard identification, decision-making, and SA [6,81,89]. Visual attention can also be used to compare tacit knowledge, such as scan patterns between novices and experts [32]. Moreover, working memory and its capacity are related to an individual’s ability to control their attention [86]. Based on our results, 56% of the eye tracking studies on visual attention and gaze pattern belong to the aviation industry, while the remaining 44% are almost equally distributed between the maritime (20%) and construction (24%) industries.
In aviation, Li et al. [74] studied air traffic controllers’ visual scan patterns to investigate the effectiveness of multiple remote tower operations. Their results showed that the visual scan patterns of air traffic controllers presented significant task-related variation while performing different tasks and interacting with various interfaces on the controller’s working position. According to Li et al. [74], air traffic controllers’ visual attention was influenced by the characteristics of the operating environment, how the information was presented, and the complexity of this information. For maritime applications, Li et al. [6] proposed a novel approach to assess trainees’ visual attention in a maritime operation simulator. For this purpose, expert knowledge was used to divide the task, identify critical operations, and define AOIs. An operation-dependent weighted attention map of the expert’s visual attention was then generated using their spatial and temporal perspectives. To test the effectiveness of the resulting attention map for training purposes, ten trainees were separated into two groups to assess their performance in a heavy lifting operation. The first group received detailed information about critical AOIs, the risks in the operation, and the visual focus required to ensure safety, whereas the second group only received information about the potential risks. According to their results, the second group had inferior visual focus, demonstrating the effectiveness of the debriefing provided to the first group based on the expert’s attention map. Li et al. [6] concluded from their results that their proposed method is a valid aid for training programmes in maritime operations.
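Duration-weighted attention maps of the kind described above can be approximated by accumulating a Gaussian contribution from each fixation, weighted by its duration. The sketch below illustrates this idea only; the grid resolution, fixation format, and Gaussian spread are assumptions made for the example and do not reproduce Li et al.’s actual method.

```python
# Illustrative sketch of a duration-weighted attention map built from
# fixation data. Each fixation (x, y, duration) adds a Gaussian bump,
# scaled by its duration, to a coarse grid. All parameters are assumed
# values for demonstration, not those of any published method.
import math

def attention_map(fixations, width=8, height=6, sigma=1.0):
    """fixations: list of (x, y, duration_s) in grid coordinates."""
    grid = [[0.0] * width for _ in range(height)]
    for fx, fy, duration in fixations:
        for row in range(height):
            for col in range(width):
                d2 = (col - fx) ** 2 + (row - fy) ** 2
                grid[row][col] += duration * math.exp(-d2 / (2 * sigma ** 2))
    return grid

# Two fixations: a long one on the left, a short one on the right.
heat = attention_map([(2, 3, 0.8), (6, 1, 0.2)])
peak = max((v, (r, c)) for r, row in enumerate(heat) for c, v in enumerate(row))
print(peak[1])  # grid cell attracting the most weighted attention
```

Normalising such a map, or comparing a trainee’s map against an expert’s cell by cell, gives a simple quantitative basis for the kind of debriefing described above.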
In another study, Pinheiro et al. [16] studied workers’ gaze patterns during a hazard recognition task to understand the difference in visual patterns when a 2D sketch representation and real images of construction sites are used. Their results showed that when 2D sketch images are used, workers’ attention is considerably more dispersed, whereas, with the realistic image, attention is more directed towards the AOI. The subjects spent 18% less time observing the 2D sketch images, and the identification of hazards was faster. According to Pinheiro et al. [16], the use of 2D sketch images may be useful to introduce students and workers to the different types of hazards on a construction site. However, due to the complexity of a real construction site, 2D sketch images cannot fully prepare students for a real scenario. Therefore, the use of mobile eye tracking systems during and after training sessions on real construction sites can help to analyse group patterns and develop new prevention measures.

3.6.2. Mental Workload

Mental workload refers to the mental effort required of an individual to perform a specific task [157]. MWL can be affected by cognitive, physiological, and emotional factors such as short-term memory capacity, fatigue, and motivation. Therefore, mental workload is vital for the assessment of human performance. Excessively high or low MWL has a significant impact on an individual’s performance [158]. For example, in the aviation industry, it has been found that a pilot’s situation awareness is strongly correlated with MWL. Consequently, excessive MWL may result in poor situation awareness [159]. The advantages of using eye tracking instead of self-assessment methods to measure mental workload are the elimination of various human factors such as bias, the likelihood of falsified or random responses, mistakes, and compliant attitudes [27]. Moreover, self-assessment methods cannot be used in real time, whereas eye tracking technologies offer the possibility of continuously monitoring an individual’s cognitive state without interfering with their performance in real-life situations [77].
Approximately 29% of all the studies measured mental workload for different purposes. Mental workload was the second most studied aspect for the maritime (48%) and aviation (48%) industries, whereas eye tracking studies in the construction industry showed little interest (4%) in studying this factor (Figure 10). The studies that applied eye tracking in the maritime and aviation sectors studied MWL in situations involving different emotional and physiological states and to understand the impact of HMIs on individuals’ MWL. For example, Yan et al. [27] studied the relationship between operators’ MWL and eye responses in the task of operating a marine engine interface. Moreover, they developed an artificial neural network (ANN) model to predict the operators’ MWL based on integrating eye response data. According to Yan et al.’s [27] results, eye response is sensitive to MWL when using the interface control. Furthermore, their ANN model presented high levels of prediction accuracy for the prediction of operators’ MWL based on eye response indices, with an R2 (determination coefficient) of 0.971, 0.912, and 0.918 for training, validation, and testing, respectively. For aviation applications, Martin et al. [56] studied MWL experienced by air traffic controllers during their work activity. Their results confirm previous studies’ results, showing that MWL increases when task requirements increase. Moreover, their results particularly highlight the crucial status of conflict in MWL and attention during air traffic control task execution.
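The ANN models used in such studies are beyond the scope of a short example, but the underlying idea of relating eye response indices to a workload score can be illustrated with a one-variable least-squares fit and its R² value. The pupil diameter and workload figures below are fabricated for illustration only and are not taken from Yan et al.’s data.

```python
# A simplified stand-in for ANN-based workload prediction: fit a
# workload rating against a single hypothetical eye response index
# (mean pupil diameter) by least squares, then report R^2 as the
# goodness-of-fit measure. All data are fabricated for illustration.

def fit_line(x, y):
    """Ordinary least-squares slope and intercept for one predictor."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum(
        (xi - mx) ** 2 for xi in x
    )
    return slope, my - slope * mx

def r_squared(x, y, slope, intercept):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    my = sum(y) / len(y)
    ss_res = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

pupil_mm = [3.1, 3.4, 3.8, 4.0, 4.5]       # mean pupil diameter per trial
workload = [20.0, 28.0, 41.0, 45.0, 60.0]  # subjective workload rating
slope, intercept = fit_line(pupil_mm, workload)
print(round(r_squared(pupil_mm, workload, slope, intercept), 3))
```

An ANN replaces the linear model with a nonlinear one and combines several eye response indices at once, but the evaluation logic (fit on training data, report R² on validation and test sets) is the same.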
By contrast, the construction industry had only one study that involved MWL and eye tracking devices. In this study, Li et al. [92] used eye tracking glasses to evaluate the impact of mental fatigue on construction equipment operators’ ability to detect hazards. For this purpose, they used the NASA Task Load Index in conjunction with eye tracking to measure operators’ perceived workload in six dimensions: mental demand, physical demand, effort, own performance, temporal demand, and frustration.

3.6.3. Human–Machine Interfaces

With the rapid development of technology, sophisticated HMIs have evolved to facilitate complex operating procedures [160]. However, the design of HMIs can also negatively affect human cognitive performance. Therefore, to properly design an HMI with a human-centred design concept, it is first vital to understand the effects and interaction between humans and technology [43]. It has been found that the eye scanning pattern is one of the most robust methods for evaluating human cognitive processes when interacting with computers and machines [73]. According to our results, 27% of the studies used eye tracking to understand the impact of HMIs on individuals’ performance (Figure 10). However, it was found that only one study used eye tracking to study HMIs in construction applications. The lack of eye tracking studies applied to HMIs in construction can be explained by the lack of complex machine and computer interfaces in this industry. Construction equipment operation is relatively simple compared to the complex equipment in airplane cockpits and ship bridges. Most of the studies on HMIs were from the aviation and maritime industries, where eye tracking was mainly used to assess the impact of HMIs such as the cockpit (for airplanes) and the bridge (for ships) on individuals’ visual attention, gaze pattern, and MWL, as well as for training improvement [26,27,39,52]. For example, Li et al. [26] investigated pilots’ visual parameters to compare a traditional crew alerting system with a new, integrated system designed to assist pilots during urgent situations. They found that pilots’ visual parameters had significant differences while interacting with different types of displays showing numeric, symbolic, and textual messages. Li et al. [26] concluded that it is important to adopt a holistic approach for the design of flight decks to allow pilots to gain situation awareness rather than focusing on only one display. Hareide et al. 
[39] used eye tracking on board the Skjold-class Corvette ship, which is the world’s fastest littoral combat ship, and the exact replica of the Skjold-class corvette bridge to determine how to design better ship bridges and navigator interfaces. By analysing scanpaths and sequence charts, Hareide et al. [39] identified several bridge design factors that divert time and attention from the primary focus area of the navigator. Moreover, they found that the Route Monitor window in the graphical user interface is time-consuming, reducing the time spent looking outside. These findings provided valuable information for future work to facilitate the design of bridges and graphical user interfaces for combat ships.

3.6.4. Situation Awareness

Situation awareness is one of the most dominant human factors to efficiently perform tasks in dynamic time-sensitive and safety-critical situations [89]. Numerous studies have confirmed that 88% of human-related causes of accidents in high-risk environments can be traced back to SA [5]. Situation awareness refers to an individual’s perception and comprehension of their surrounding environment within a defined volume of space–time and projecting their status in the near future [161]. An individual can only develop SA by paying attention to, perceiving, and processing the environment. However, SA can be affected by distractive factors and attentional demands that exceed an individual’s attentional resources [5,89]. Thus, SA is considered critical in scenarios that require sequences of multiple interdependent decisions in real time in a continuously changing environment such as hazard identification, error detection, and activity monitoring [38,162].
Our results indicated that SA is the fourth most studied aspect with eye tracking technology in aviation, maritime, and construction (see Figure 10). According to our results, SA in eye tracking research has mainly been studied in aviation, representing 39% of aviation studies. In the case of maritime and construction applications, eye tracking research has not focused much on SA, representing only 16% and 10% of studies, respectively.
In construction, Hasanzadeh et al. used eye tracking in two different studies to measure workers’ real-time SA in a real scenario [4], as well as to understand the relationship between SA and visual attention under fall and tripping hazard conditions [89]. Using eye tracking, SA has been studied for a variety of maritime applications. For example, Sanfilippo [41] integrated a multi-layer and multi-sensor fusion framework with one of the world’s most advanced simulators of demanding offshore operations to improve SA as an integrated component of simulation training. Hareide et al. [44] used eye tracking to improve graphical user interfaces in the bridge displays of high-speed crafts; this was done to understand the impact of the user interface on navigators’ SA and workload [44].
In the aviation monitoring context, SA—also called mode awareness—is defined as ‘the ability of a supervisor to track and to anticipate the behaviour of automated systems’ [53]. Situation awareness is also closely related to an individual’s error recognition capacity, which depends on their ability to notice changes [81]. In aviation, 36% of the studies used SA mainly for monitoring and error recognition applications. For example, Moacdieh et al. [60] studied loss of SA to examine pilots’ automation monitoring strategies and performance, as well as to understand human–automation interaction. In another study, Björklund et al. [53] studied the effect of verbal callouts on SA, automation errors, and flight performance during simulated commercial flights.

3.6.5. Training Improvement

Expert knowledge is usually externalised through courses, training programmes, and written material. However, tacit knowledge-sharing practices are rare and atypical in many industries [163]. Tacit knowledge such as know-how, know-what, and experience is mainly unconscious and unique to each individual [10]. As a result, extracting expert knowledge is challenging [164]. With the use of eye tracking technology, it is possible to extract and transfer expert knowledge to be used for the development of enhanced training programmes [20,22]. Several studies have demonstrated that experts converge on similar visual patterns and problem-solving strategies over time, with little variance between individuals compared with novices [62,165]. This is important because it demonstrates that extracting and sharing expert knowledge such as visual problem-solving strategies is worthwhile for improving trainees’ performance in complex visual domains [62].
According to our findings, training improvement is the fifth most studied application of eye tracking in research in the maritime, construction, and aviation industries. Eye tracking for training improvement has mainly been used in the maritime industry and represents 32% of the maritime research articles collected, followed by the aviation and construction industries (16% and 10% of the articles for each industry, respectively). In the maritime industry, eye tracking technologies have been used for the evaluation of present and future human working environments. For example, Hareide et al. [40] used eye tracking to compare the visual focus of the navigator in onboard navigation and bridge simulators. Their results suggest that, despite the higher MWL required in simulator navigation training, simulators provide similar training outcomes to onboard navigation. In aviation, eye tracking was used by Robinski et al. [61] to identify differences in scanning techniques between helicopter pilots with different experience levels during landing training. They observed that, during take-off and landing, experienced pilots tended to use more helicopter instruments to retrieve information than inexperienced pilots, who assess conditions by looking through the window. Moreover, Robinski et al. [61] revealed that eye tracking feedback can enhance simulator training transfer and can be highly useful for real flights.
Traditional safety training programmes usually do not properly determine why construction workers fail to identify safety hazards [166]. However, through eye tracking, experienced workers can assist in training novice workers to maximise hazard recognition performance and safety awareness. Taking this into consideration, Jeelani et al. [93] used eye tracking to provide construction workers with personalised training on visual search patterns and hazards. For this purpose, construction site images with visual attention maps were used to trigger self-reflection and improvement in novices. Their findings demonstrated that this eye tracking-assisted personalised training improved construction workers’ hazard recognition performance by 35%.

3.6.6. Hazard Identification

According to our results, hazard identification is the sixth most studied aspect using eye tracking technologies (Figure 10). However, hazard identification has only been studied in construction scenarios and is the most studied aspect of eye tracking research in construction, accounting for 74% of the total studies. Since the construction sector is one of the most hazardous industries, hazard identification is fundamental to construction safety management [167]; as such, these results are not surprising.
A hazard is defined as something that can cause detrimental effects. When hazards are unidentified, individuals are more likely to be exposed to unanticipated hazards, indulge in unsafe behaviour, and suffer disastrous injuries [93]. Because hazard identification is largely a visual search task [10,92], eye tracking technologies have been extensively used to understand individuals’ search patterns and visual attention. For example, a study found that workers who expend more time inspecting the worksite and devote higher levels of attention demonstrate superior hazard recognition [93]. Moreover, subjects with superior hazard recognition performance tend to focus less on noncritical distractors [91]. Additionally, Dzeng et al. [10] demonstrated that experienced workers spend more time searching for inconspicuous hazards than they do for obvious hazards.
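Findings of this kind typically rest on summing fixation durations inside predefined areas of interest (AOIs). The following is a hypothetical sketch of such an AOI dwell-time computation; the AOI names, coordinates, and fixation data are invented for illustration.

```python
# Hypothetical AOI dwell-time analysis: sum fixation durations that fall
# inside rectangular areas of interest to quantify how much visual
# attention each (hazardous) region attracts. All names and coordinates
# below are invented for this example.

def dwell_times(fixations, aois):
    """fixations: (x, y, duration_s); aois: name -> (x0, y0, x1, y1)."""
    totals = {name: 0.0 for name in aois}
    for x, y, duration in fixations:
        for name, (x0, y0, x1, y1) in aois.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                totals[name] += duration
    return totals

aois = {
    "unguarded_edge": (0, 0, 100, 50),  # hypothetical hazard AOI
    "scaffold": (150, 0, 250, 80),
}
# Four fixations; the last one lands outside every AOI and is ignored.
fixations = [(20, 30, 0.40), (60, 10, 0.25), (200, 40, 0.30), (300, 90, 0.50)]
print(dwell_times(fixations, aois))
```

Dividing each total by the overall viewing time yields the relative attention shares that studies such as those above use to compare critical hazards against noncritical distractors.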
Hazard identification and risk perception are closely related factors. Risk can be defined as the probability that a hazard will occur. However, risk can be perceived differently between individuals [168] in two fundamental ways. Risk can be perceived using logic to anticipate a hazard’s effects and facilitate risk assessment and decision-making [169]. Conversely, in daily life, risk is mostly handled through instinctive and intuitive reactions deriving from an emotional perception of danger [169]. It has been found that risk perception is one of the factors that most affects individuals’ safety on construction sites. For example, Habibnezhad et al. [23] found that, in contrast to construction workers with higher risk perception, workers with lower risk perception generally spend more time analysing a hazardous situation. A similar trend was also found by Hasanzadeh et al. [83], who studied the impact of safety knowledge on construction workers’ hazard detection. They found that experienced workers spend less time analysing hazards because they can identify them more quickly [23].

3.6.7. Comparison between Novices and Experts

Through experience, experts learn to organise their thoughts in a more structured and conceptually richer way than novices. With less experience, novices usually reason, solve problems, and perform tasks in a different manner to experts. Novices’ decisions are based on rigid rules, whereas experts rely on experience [76]. Because of their experience, experts have a deeper understanding that enables them to process more information and apply it automatically [76]. With eye tracking, it is easier to understand the differences between novices and experts in a variety of scenarios and situations that would be difficult to measure through other means.
Based on our results (Figure 10), comparisons between novices and experts are the seventh most studied application of eye tracking research for the maritime, construction, and aviation industries. Approximately 17.5% of the selected studies used eye tracking to compare novices and experts. This application of eye tracking seems more attractive for maritime applications, followed by construction and aviation applications, representing 24%, 16%, and 14% of the studies in each industry, respectively. For example, in a maritime application, eye tracking was used to investigate novice and expert maritime operators’ foci of attention during safety-critical maritime operations [34]. The results showed that novice ship operators tend to focus for shorter times and less frequently on the outside environment than expert operators. Moreover, maritime expert operators fixate more, which reflects their level of experience as they possess knowledge of what to look for and possible dangers to be aware of [34].
For construction industry applications, Hasanzadeh et al. [83] used eye tracking to identify the impact of safety knowledge, training, work experience, and injury exposure on construction workers’ attentional allocation and hazard detection. For comparison purposes, participants were divided into less experienced and experienced workers. Their results revealed that experienced workers tracked back more frequently to hazardous areas and spent less time exploring the hazardous areas. This behaviour shows that experienced workers have a better balance between processing and searching the scene than those with less experience [83]. In terms of safety training, their findings demonstrated no significant difference between the search strategies of workers with or without safety training. Workers with previous injury exposure behaved more cautiously and conservatively. Hasanzadeh et al.’s [83] results demonstrated that past injury exposure significantly impacts the cognitive processes of individuals and increases their risk awareness.
In aviation, air traffic controllers of different expertise levels were subjected to a series of radar screen tests to determine their visual problem-solving strategies. The results showed that individuals with higher levels of expertise more efficiently retrieved relevant information and used more efficient scanpaths than novices [62]. In another study, Skvarekova et al. [75] employed eye tracking to determine the differences in scanning techniques and attention distribution of experienced and inexperienced pilots during a precision instrument landing system approach and a non-precision non-directional beacon instrument approach. Their results revealed that experienced and inexperienced pilots’ scanning techniques differed considerably. Experienced pilots were able to scan each instrument faster and retrieve more information in less time, which gave them more time to detect errors. On the other hand, novice pilots made more mistakes and ignored some of the cockpit instruments.

3.6.8. Fatigue

According to Van Cutsem et al. [170], ‘Mental fatigue is a psychobiological state caused by prolonged periods of demanding cognitive activity’. Mental fatigue, stress, and strong emotions can hinder individuals’ SA when attentional demands exceed their attentional resources [5,89]. It is estimated that 20% of traffic accidents are caused by fatigue, making it one of the main contributors to transportation accidents [171]. Fatigue not only leads to the risk of falling asleep but also to decreased performance and attention, slower reaction times, memory lapses, and an increased risk of error [172,173,174]. Observable fatigue can be either task-related or sleep-related. Task-related fatigue can be active or passive. Active fatigue is caused by cognitively difficult tasks that require high mental effort. In contrast, passive fatigue is caused by the underload of cognitive processes, which is typical in monotonous work situations that require low mental effort [175]. Sleep-related fatigue is usually caused by the disruption of the individual’s circadian rhythm as well as environmental factors that reduce sleep quantity and quality [176]. It is therefore unsurprising that shift workers are especially susceptible to this type of fatigue [177]. However, according to Hopstaken et al. [178], increased motivation can counteract mental fatigue.
Fatigue detectors have recently received great attention in eye tracking research [48]. In this study, fatigue was identified as the eighth most studied application of eye tracking technology in the aviation, maritime, and construction industries, as presented in Figure 10. With eye tracking technologies, researchers have attempted to detect fatigue in different scenarios in real time. For example, Gupta et al. [176] developed a framework for the online monitoring of submarine teams. The aim of the proposed framework was to determine the fatigue level of individuals 24/7 in maritime environments. From a total of 58 metrics, they identified three that can be used to identify individual fatigue states, team fatigue states, and social cohesion. According to Gupta et al. [176], the only feasible technology that they identified for measuring the different fatigue factors in a submariner environment was eye tracking. Nonetheless, the proposed framework for fatigue monitoring faces challenges regarding individual and team fatigue assessment, as well as in the design and consideration of the effect of a fatigue management system and countermeasures.
In the construction industry, wearable eye tracking devices were applied by Li et al. [92] to evaluate the impact of mental fatigue on construction equipment operators’ ability to detect hazards. Their results demonstrated that operators’ ability to detect hazards and reaction time were significantly affected by mental fatigue. According to Li et al. [92], mental fatigue made it difficult for excavator operators to maintain adequate hazard monitoring performance for their surroundings and related details. In a more recent study, Li et al. [95] developed a novel methodology to identify and classify multi-level mental fatigue in construction equipment operators. The identification and classification of mental fatigue levels were achieved using a combination of the Toeplitz Inverse Covariance-Based Clustering method and the support vector machine (SVM) algorithm. Overall, these two studies demonstrate the feasibility and effectiveness of wearable eye tracking technology for construction equipment operators.
In the aviation industry, fatigue is seen as a safety threat for pilots and air traffic controllers, who often experience disruptions to their circadian rhythms due to night and shift work [57]. Nevertheless, the displays in smart technologies and the interfaces of modern flight instruments may also affect pilots’ circadian rhythms and cause eye fatigue and stress. To study the effect of artificial light from electronic displays on commercial pilots’ visual fatigue, Brezonakova et al. [78] used a wearable eye tracking device in a flight simulator with a modern glass cockpit. Their results verified that the backlight of digital displays in the cockpit can cause visual fatigue in pilots, as a result of constant eye adaptation and long exposure to the artificially illuminated environment [78]. Moreover, their results showed that pilots’ visual fatigue depends on the instrument display’s backlight intensity levels. This demonstrates the importance of setting the correct backlighting intensity during flights. In another study, Wang and Sun [57] proposed a framework for real-time fatigue measurement combining face recognition and eye tracking technologies. The proposed framework was based on the percentage closure of eyes (PERCLOS) value as the fatigue judgment index, which proved to be a suitable index for detecting fatigue in aviation practice.
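The PERCLOS index mentioned above has a simple operational definition: the proportion of time, over a given window, that the eyes are at least (typically) 80% closed. A minimal sketch follows, with illustrative eyelid-closure samples and an assumed alert/fatigued cut-off; production systems derive the closure signal from face and eye detection and tune the thresholds empirically.

```python
# Minimal PERCLOS sketch: the fraction of frames in a window whose
# eyelid closure meets or exceeds a closure threshold. The samples and
# the 0.15 alert/fatigued cut-off below are illustrative assumptions.

def perclos(closure_samples, closed_threshold=0.8):
    """closure_samples: eyelid closure per frame, 0.0 (open) to 1.0 (closed)."""
    closed = sum(1 for c in closure_samples if c >= closed_threshold)
    return closed / len(closure_samples)

# One second of hypothetical 10 Hz eyelid data: three near-closed frames.
window = [0.1, 0.2, 0.9, 1.0, 0.85, 0.3, 0.1, 0.2, 0.1, 0.1]
score = perclos(window)
print(score, "fatigued" if score > 0.15 else "alert")
```

Because the index needs only a per-frame closure estimate, it lends itself to the kind of real-time, camera-based fatigue monitoring described by Wang and Sun.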

3.6.9. Stress and Anxiety

Stress is a type of mental tension caused by uncontrollable situations [179]. Stress affects cognitive and emotional processes, thereby impeding individuals’ decision-making processes [180]. Individuals under stress tend to expend less time analysing information, relying instead on automatised intuitive reactions [181,182,183]. Individuals’ personal coping resources and situation demands determine their response to a stressful situation [184]. A situation is perceived as challenging by individuals when adequate resources to meet the situation are available. On the contrary, a situation is judged as a threat when the available resources are insufficient [185].
One of the negative emotions triggered by stress is anxiety [186]. Anxiety is composed of heightened autonomic nervous system activity and feelings of unease and tension [187]. Anxiety is also known to cause detrimental effects on psychomotor and attentional skills [188] by disrupting the balance between goal-directed and stimulus-driven attentional systems; thus, it causes a diversion of available processing resources from task-relevant to task-irrelevant stimuli [189].
Our findings suggested that stress and anxiety are the 9th and 11th most studied applications of eye tracking research, respectively. Stress has been studied with eye tracking in the maritime and aviation industries, whereas anxiety was only studied in aviation scenarios, as presented in Figure 10. For example, in aviation, eye tracking was used by Stankovic et al. [64] to study information sampling and decision-making under acute stress. Participants were required to perform a modified version of the Matching Familiar Figures Test (MFFT), an established measure of cognitive impulsivity. Their results showed that, under stress, the participants made decisions before fully sampling all available information, thereby demonstrating more impulsive decision-making behaviour. According to Stankovic et al. [64], the implications of their study could help to improve the design of visual displays, information consoles, and warning systems in cockpits to reduce accidents in aviation emergencies. In another study, Vine et al. [68] examined the influence of stress on the performance of highly skilled commercial pilots. Their study involved an engine failure on take-off scenario performed in a high-fidelity flight simulator. Vine et al. performed a series of hierarchical regression analyses to examine the extent to which demand and resource evaluations predicted pilots’ performance. Their findings suggested that pilots who adopted a threat response to stress displayed increased randomness in scanning behaviour (entropy), higher search rates, and a reduced ability to inhibit distraction, indicating increased disruptions to attentional control and poor performance.
In a different study, Allsop and Gray [63] used eye tracking and a heart rate monitor to study the effects of anxiety on attention and gaze behaviour while interacting with complex, dynamic systems. Their study comprised an aircraft landing simulation in low-visibility conditions. Anxiety was multidimensionally manipulated by combining ego-threatening and evaluative instructions, monetary incentives, and immediate consequences for performance failures. The study found that anxious participants presented an increase in the randomness of scanning behaviour and in the percentage of dwelling time toward the outside world. According to Allsop and Gray [63], their results can help in implementing eye tracking technologies in aircraft warning systems to identify pilot anxiety during operational activity via visual scanning behaviour.
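Both of the above studies quantify the “randomness of scanning behaviour” as entropy. As a minimal illustration (not the exact formulation used by Vine et al. [68] or Allsop and Gray [63]), the Shannon entropy of first-order gaze transitions between AOIs can be computed as follows; the AOI labels (PFD, ND, OTW, ECAM) are hypothetical:

```python
import math
from collections import Counter

def gaze_transition_entropy(aoi_sequence):
    """Shannon entropy (bits) of first-order transitions between AOIs.

    aoi_sequence: ordered list of AOI labels, one per fixation.
    Higher values indicate more random (less structured) scanning.
    """
    transitions = Counter(zip(aoi_sequence, aoi_sequence[1:]))
    total = sum(transitions.values())
    if total == 0:
        return 0.0
    return -sum((n / total) * math.log2(n / total) for n in transitions.values())

# A rigid back-and-forth instrument scan has low entropy; erratic scanning scores higher.
structured = ["PFD", "ND", "PFD", "ND", "PFD", "ND", "PFD"]
erratic = ["PFD", "OTW", "ND", "PFD", "ECAM", "OTW", "ND"]
print(gaze_transition_entropy(structured))  # 1.0 (two transition types, equally likely)
print(gaze_transition_entropy(erratic))     # higher: transitions are more dispersed
```

Variants of this measure normalise by the number of AOIs or model transitions as a Markov chain; the unnormalised form above is only meant to convey the underlying idea.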

3.6.10. Foretelling

The use of eye tracking for foretelling individuals’ choices and human performance was identified as the 10th main application of this technology and is mainly applied in dynamic decision-making (DDM) environments. Continuously evolving DDM environments such as hazard monitoring, air traffic control, and emergency responses require a series of multiple interdependent critical decisions to be made in real time [190]. To make good decisions in high-risk environments, intensive training in highly procedural scenarios is required [81]. According to Figure 10, eye tracking has been employed for foretelling purposes in maritime, aviation, and construction scenarios.
Regarding applications in the maritime industry, Peysakhovich et al. [38] explored the applicability of oculometry to enable an abstract decision support system to foresee future decisions made by maritime operators. The participants had to monitor a radar screen to assess the level of threat posed by an aircraft by classifying it as hostile, uncertain, or nonhostile. Their results revealed that when participants identified a sign of threat, they exhibited a higher task-evoked pupillary response and increased pupil diameter compared to nonhostile classifications. Peysakhovich et al. [38] concluded that fixation transitions and pupil dilation can help to predict the upcoming decision of the human operator approximately half a second before the decision is made.
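The pupillary measure described above can be illustrated as the percent change in pupil diameter relative to a pre-stimulus baseline. This is a hedged sketch, not Peysakhovich et al.’s [38] actual analysis pipeline; the sample values are invented:

```python
def task_evoked_pupil_response(samples, baseline_n):
    """Percent change in pupil diameter relative to a pre-stimulus baseline.

    samples: pupil diameters (mm) at a fixed sampling rate; the first
    baseline_n samples precede stimulus onset.
    """
    baseline = sum(samples[:baseline_n]) / baseline_n
    peak = max(samples[baseline_n:])
    return 100.0 * (peak - baseline) / baseline

# Simulated trial: 3.0 mm baseline, dilating to 3.3 mm after a threat cue.
trial = [3.0, 3.0, 3.0, 3.0, 3.1, 3.2, 3.3, 3.25, 3.1]
print(task_evoked_pupil_response(trial, baseline_n=4))  # approximately 10 % dilation
```

In practice, pupil data are first cleaned of blinks and corrected for luminance changes before a response like this is computed; a larger response on threat trials is what would feed a decision-prediction classifier.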
In aviation, Hasse et al. [59] employed eye tracking to improve the selection of future monitoring aviation personnel. For this purpose, eye tracking data were used to assess participants’ monitoring and detection performance for automation failures. Hasse et al. [59] used defined AOIs and participants’ visual fixations to determine whether automation failures were promptly identified. Their results indicated that, during the anticipation and detection phases, low performers demonstrated significantly lower fixation counts on all potentially relevant AOIs than high performers. Furthermore, low performers demonstrated significantly shorter gaze durations on all potentially relevant AOIs during the anticipation phase than high performers. Hasse et al.’s [59] results demonstrated that eye tracking can be used to predict individuals’ failure detection performance and thereby improve the selection of future aviation personnel.
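Hasse et al.’s [59] comparison rests on two per-AOI measures: fixation count and total dwell (gaze) duration. A minimal sketch of this aggregation, with hypothetical AOI names and fixation durations, might look like:

```python
from collections import defaultdict

def aoi_statistics(fixations):
    """Aggregate fixation count and total dwell time (ms) per AOI.

    fixations: iterable of (aoi_label, duration_ms) pairs, e.g. produced by
    an eye tracker's fixation filter after mapping fixations onto AOIs.
    """
    stats = defaultdict(lambda: {"count": 0, "dwell_ms": 0})
    for aoi, duration in fixations:
        stats[aoi]["count"] += 1
        stats[aoi]["dwell_ms"] += duration
    return dict(stats)

# Hypothetical detection-phase data for one participant.
fixations = [("radar", 240), ("alert_panel", 410), ("radar", 180),
             ("strip_bay", 300), ("alert_panel", 520)]
print(aoi_statistics(fixations))
# {'radar': {'count': 2, 'dwell_ms': 420}, 'alert_panel': {'count': 2, 'dwell_ms': 930}, 'strip_bay': {'count': 1, 'dwell_ms': 300}}
```

Comparing these per-AOI counts and dwell times between groups (low vs. high performers) is the kind of analysis the cited study reports, although their AOI definitions and statistics are naturally more involved.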
Crane operation on construction sites is not an easy task. Crane operators face challenges such as load oscillation, control input lag, and depth perception in the radial direction [88]. To facilitate crane operation, in-vehicle visual support systems (IVVSs) equipped with external cameras can be used, informing the operator about the operation state [88]. Nonetheless, the impact of the new information provided to the crane operator is unknown. Taking this into consideration, Chew et al. [88] used eye tracking to improve the design of construction cranes’ visual support systems. For this purpose, they employed a crane simulator with various IVVS designs and developed a gaze analysis solution for dynamic AOIs. To estimate the subjective responses of users, the researchers used six different gaze metrics: sparseness of attention, maximal fixation duration, randomness, uniformity, the summation of these metrics, and the proportion of gaze on the IVVS. Finally, they demonstrated that, using the selected gaze metrics, it is possible to employ gaze behavioural analysis as an everyday IVVS design tool for nonexperts.
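Of the six gaze metrics used by Chew et al. [88], the proportion of gaze on the IVVS is the most direct; with a dynamic AOI, it must be evaluated frame by frame against the AOI’s current bounding box. The sketch below is a simplified illustration (the coordinates and AOI geometry are invented, and the cited study’s dynamic-AOI solution is more sophisticated):

```python
def gaze_on_aoi_proportion(gaze_points, aoi_rects):
    """Fraction of gaze samples falling inside a per-frame (dynamic) AOI.

    gaze_points: list of (x, y) gaze coordinates, one per video frame.
    aoi_rects:   list of (x_min, y_min, x_max, y_max) giving the AOI's
                 bounding box in the same frame, e.g. from a vision tracker.
    """
    hits = sum(
        x0 <= gx <= x1 and y0 <= gy <= y1
        for (gx, gy), (x0, y0, x1, y1) in zip(gaze_points, aoi_rects)
    )
    return hits / len(gaze_points)

gaze = [(100, 80), (420, 300), (110, 90), (115, 95)]
ivvs = [(90, 70, 130, 110)] * 4  # static here; the rects may move per frame
print(gaze_on_aoi_proportion(gaze, ivvs))  # 0.75
```

A high proportion would indicate the operator relying heavily on the IVVS display rather than the outside scene, which is exactly the kind of trade-off such metrics are meant to expose.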

3.6.11. Trust

Trust and trustworthiness are considered difficult-to-measure characteristics in individuals [191]. Trust implicitly refers to one’s positive expectations towards others’ actions and the belief that there is a high probability that others will act or behave as expected [192,193]. Conversely, distrust is ‘negative and implies fear of the other’ [193]. The results of our study showed that trust is one of the least studied human aspects of eye tracking technologies, ranking 12th (see Figure 10). Only one study used eye tracking to study trust for applications in aviation. Gontar et al. [73] used eye tracking to understand the behaviour of commercial pilots confronted with a cyberattack. According to Gontar et al. [73], when an aircraft system reports technical problems, pilots usually anticipate the aircraft’s behaviour and the most adequate course of action. However, when an aircraft is under a cyberattack, pilots’ trust in the system may be compromised as attackers can use pilots’ standard procedures to manipulate their behaviour [73]. Their results showed that the presence of a cyberattack led to more incorrect decisions, increased pilots’ workload, and weakened trust in the system, although responses to alarms were not delayed. They concluded that a need exists for training programmes that increase pilots’ awareness of potential cyber threats, how an aircraft can be infected with malicious software, and how to resolve this type of situation. Moreover, Gontar et al. [73] indicated that cyberattack warning systems similar to virus scanners or firewalls must be developed and installed to inform pilots when parts of the system have been infected with malicious software.

3.6.12. Working Memory Load

Working memory is understood as a system used by the brain to temporarily store and manipulate information for short periods of time [194]. The amount of information held at any given time within the working memory system is referred to as memory load [195]. According to Bouchacourt et al. [196], working memory “acts as a workspace on which information can be held, manipulated, and then used to guide behavior”. Working memory is vital for most cognitive processes and is closely related to attention [197]. Working memory abilities may be impaired due to mental and physical fatigue [198,199]. Moreover, multitasking is an important characteristic of working memory and can be inhibited when working memory capacity is reached [195]. For example, an impaired or highly loaded working memory increases the likelihood of human error by affecting an individual’s attentional allocation and hazard detection capacity [86].
Despite the high importance of working memory for most cognitive processes, this study identified that this is the least studied aspect in eye tracking research (13th place) in aviation, maritime, and construction applications. The only eye tracking study identified related to working memory load was by Hasanzadeh et al. [86], who investigated the impact of working memory load on the safety performance of construction workers. Several participants were subjected to a series of visual tests using images of a construction scenario to monitor their ability to detect hazards under low and high working memory loads. According to Hasanzadeh et al.’s [86] results, workers under high working memory load paid less attention to hazards compared with workers under low working memory load. Moreover, it was noted that workers’ ability to search for and identify hazards was inhibited when they experienced high working memory loads. Thus, high cognitive loads and working memory loads can influence individuals’ hazard detection skills. Several studies have suggested that attention and working memory load are the same entity [200,201,202]; this is important because a clear understanding of the relationship between attention and working memory load can help to design better strategies and training in order to reduce the rate of accidents in the construction industry.

3.7. Integrating Eye Tracking and Other Technologies for Evaluating Human Factors

Eye tracking technologies are mainly used to study different aspects of individuals’ cognitive processes. However, eye tracking technologies can also be integrated with different technologies to further study different physiological and emotional human aspects. This technology integration can be used to better understand individuals’ minds during different scenarios and behaviours as well as the relationship between the different human cognitive, emotional, and physiological factors. Taking all this into consideration, we identified which human aspects are the most studied in the aviation, maritime, and construction industries with eye tracking technologies, including the different technologies that were integrated for their study.
Of the two types of eye tracking devices available, we identified that 72.5% of all the reviewed eye tracking studies used mobile eye tracking devices, while the remaining 27.5% used remote eye tracking devices (Figure 11). The preference for mobile eye tracking devices could be due to the complex scenarios encountered in the maritime, aviation, and construction industries. To accurately simulate these complex scenarios, subjects need to move freely, and mobile devices allow the eye tracker to capture the entire field of view.
Our results also showed that human cognition was the most studied aspect for aviation, maritime, and construction applications, representing a total of 87.5% of studies (Figure 11). The second most studied human aspect was a cognitive–physiological combination (7.5% of studies), followed by the cognitive–emotional and physiological aspects, each representing 2.5% of the studies. The study of the cognitive–physiological human aspects was mainly applied in the aviation industry, followed by the construction and maritime industries. However, the study of cognitive–emotional aspects was only applied in the aviation industry, whereas the physiological aspect was studied for applications in the maritime and aviation industries (Figure 11).
As mentioned, several studies have investigated the relationships among these aspects, such as cognitive–physiological and cognitive–emotional relationships. To capture these dynamics, these studies were required to integrate eye tracking with several technologies. We identified the use of the following 13 technologies: training simulators (74%), video recording (12.5%), audio recording (8.75%), head trackers (7.5%), electrocardiography (ECG; 6.25%), EEG (2.5%), computer vision (2.5%), augmented reality (AR; 2.5%), virtual reality (VR; 1.25%), pressure interface (1.25%), electromyography (EMG; 1.25%), motion capture (1.25%), and facial recognition (1.25%; see Figure 11).
According to our results, 74% of eye tracking studies preferred to use simulators to replicate real-world scenarios. Compared with real-world scenarios, simulators facilitate training, eliminate risks, and provide economic advantages. It was found that simulator training provides the same training outcomes as real-world scenarios [40]. However, to achieve this, simulators need to be as close to reality as possible [203]. Training personnel in simulators facilitates the overall understanding of the different operations to be performed [41] and improves cognitive and psychomotor skills, thereby increasing self-confidence. Moreover, simulation training can enhance communication and teamwork [5] and help to improve the design of graphical user interfaces [39].

3.7.1. Simulators

The majority of eye tracking research applied in the aviation industry (92%) employed cockpit simulators. For maritime applications, 88% of eye tracking studies preferred the use of simulators to replicate ship bridges; in fact, international regulations for maritime training make the use of simulators mandatory [49]. In the case of eye tracking studies for construction industry applications, only 21% used simulators to replicate construction sites or equipment operation cabins. Of the remaining eye tracking studies in construction, 26% used eye tracking in real scenarios, and 53% used pictures to represent construction sites. These findings are surprising as it has been demonstrated that two-dimensional images of scenes do not completely reflect the stimulus conditions of natural environments [87,89]. For example, it has been found that construction workers are not able to see all hazards in static images [23,91].

3.7.2. Video and Audio Recording

Video recording is a technology used to assist eye tracking research by controlling the quality of the study or monitoring participants without causing disruptions [27,49]. Audio recording serves a similar purpose [27], but it can also be a beneficial source of information for debriefing [79]. Video and audio recording are simple but powerful technologies that assist eye tracking studies in further understanding and recreating the environment in which the studied subjects are tested. Moreover, these two technologies allow researchers to monitor and interact with test subjects without physically intruding on the task environment.

3.7.3. Head Tracking Systems

Head tracking systems can easily be integrated with mobile eye tracking devices. Head trackers are used in eye tracking studies to counteract some of the limitations of mobile eye tracking devices. For example, head trackers allow the orientation of the subject’s head to be tracked in space to calculate gaze as a 3D vector relative to the environment [53]. These data combined with eye tracking can be used to improve the accuracy of calculating the subject’s gaze areas, especially in environments where the subject moves freely. Moreover, knowing the position of the subject’s head facilitates calibration, thus improving the accuracy of the eye tracking process [19].
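The head-plus-eye computation described in [53] can be sketched by rotating the eye-in-head gaze direction through the head’s orientation to obtain a gaze vector in world coordinates. Axis conventions and rotation order vary between trackers; the yaw-then-pitch convention below is an assumption made purely for illustration:

```python
import math

def rotation_yaw_pitch(yaw, pitch):
    """3x3 rotation matrix for head yaw (about z) then pitch (about y), radians."""
    cy, sy = math.cos(yaw), math.sin(yaw)
    cp, sp = math.cos(pitch), math.sin(pitch)
    # R = Rz(yaw) @ Ry(pitch), row-major
    return [[cy * cp, -sy, cy * sp],
            [sy * cp,  cy, sy * sp],
            [-sp,     0.0,      cp]]

def gaze_in_world(head_yaw, head_pitch, eye_vector):
    """Rotate an eye-in-head unit gaze vector into world coordinates."""
    R = rotation_yaw_pitch(head_yaw, head_pitch)
    return tuple(sum(R[i][j] * eye_vector[j] for j in range(3)) for i in range(3))

# Head turned 90 degrees, eyes straight ahead (x-forward):
# the world gaze vector ends up approximately along the world y-axis.
gx, gy, gz = gaze_in_world(math.pi / 2, 0.0, (1.0, 0.0, 0.0))
```

In a real system the head pose comes from the head tracker at each timestamp, the eye vector from the mobile eye tracker, and the two streams must be time-synchronised before this rotation is applied.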

3.7.4. Electroencephalography and Electrocardiography Technologies

It has been demonstrated that numerous mental states, such as vigilance, fatigue, alertness, stress, and performance, have physiological roots [57,204,205]. As a result, several mental states can be measured not only with eye tracking technologies but also with other techniques, such as EEG and ECG. Electroencephalography is a non-invasive method that uses electrodes placed on the subject’s scalp to sense brain cell activity. Brain cells communicate via electrical pulses. The number of neurons that discharge electrical pulses at the same time is workload- and task-dependent [77]; this allows EEGs to be used to study and monitor individuals’ mental and emotional states in real time during various activities [206,207]. Electroencephalography is so effective that it is considered one of the most powerful methods for monitoring task loads in real time. As a result, EEG has been used to evaluate the early onset of fatigue and drowsiness [24] as well as to validate the results of task load variations in eye tracking research [77]. Electrocardiography devices are used to visualise the electrical activity of the heart. Such devices allow the identification of different emotional and physiological states thanks to the relationship of the heart with the nervous system [208,209]. Since ECG can easily identify slight changes in normal ECG patterns, this technique is considered the most critical source of fatigue indicators [210].

3.7.5. Body Pressure Mapping and EMG Systems

In human subjects, physical changes such as body posture and muscle activation can be used to identify certain mental states. For example, incorrect sitting posture can affect internal physiological conditions [211], leading to mental fatigue, impaired performance, and human error more generally [212]. The technologies used to measure body posture and muscle activation are body pressure mapping and EMG. Body pressure mapping, also known as body pressure imaging, is a technology that measures pressure distribution in the human body and support surfaces in real time [213]. Body pressure measurement systems are composed of a pressure mat fitted with an array of sensors that dynamically measure the interface pressure, indicating where pressure is concentrated [211]. This type of technology is usually used to help to assess comfort, design, and ergonomics in the automotive industry [214,215,216]. Electromyography is an electrodiagnostic medicine technique for evaluating and recording the electrical activity produced by skeletal muscles [217]. Electromyography measures the electric potential generated by contracting muscle cells when they are electrically or neurologically activated [218]. The information obtained using EMG can be used to inform eye tracking systems or to validate eye tracking data. For example, Fadda [24] used these two technologies in combination with eye tracking to understand fatigue evolution in quay crane operators.

3.7.6. Computer Vision

In eye tracking research, gaze behaviour is usually evaluated using qualitative methods such as heatmaps and gaze paths. However, the quantitative evaluation of gaze behaviour is one of the major challenges in eye tracking research [88] and is especially important in dynamic scenarios where the AOI is constantly moving or in computer displays with dynamically changing content. This challenge has been addressed by integrating computer vision tools with eye tracking software [58]. For example, the use of computer vision with eye tracking allows for the easy identification of exactly where a subject is looking in dynamic scenarios in real time. Chew et al. [88] used a computer vision tool to track the moving load of cranes to improve cranes’ IVVS. In another study, Weibel et al. [58] used computer vision to achieve eye gaze-to-object registration to collect, analyse, and visualise eye tracking data from pilots in commercial airline flight deck scenarios.
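Once a computer-vision tracker supplies per-frame object bounding boxes, gaze-to-object registration reduces to labelling each gaze sample with the object it lands on. The sketch below is a simplified illustration, not the pipeline of Weibel et al. [58]; the object names and coordinates are invented:

```python
def register_gaze_to_objects(gaze_samples, tracked_boxes):
    """Label each gaze sample with the tracked object it falls on, else None.

    gaze_samples:  list of (x, y) gaze coordinates, one per frame.
    tracked_boxes: list of per-frame dicts {object_name: (x0, y0, x1, y1)},
                   e.g. produced by a computer-vision object tracker.
    """
    labels = []
    for (gx, gy), boxes in zip(gaze_samples, tracked_boxes):
        hit = None
        for name, (x0, y0, x1, y1) in boxes.items():
            if x0 <= gx <= x1 and y0 <= gy <= y1:
                hit = name
                break
        labels.append(hit)
    return labels

frames = [{"crane_load": (200, 150, 260, 210), "hook": (230, 100, 250, 140)},
          {"crane_load": (210, 160, 270, 220), "hook": (240, 110, 260, 150)}]
gaze = [(220, 180), (500, 400)]
print(register_gaze_to_objects(gaze, frames))  # ['crane_load', None]
```

Overlapping boxes require a tie-breaking rule (here simply the first match); real systems typically use depth ordering or object masks instead of rectangles.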

3.7.7. Augmented Reality and Virtual Reality

Augmented reality has been extensively used for educational purposes [219]. Augmented reality technology adds virtual information to a real environment that is viewed through a device. Not only are users able to see and touch their natural surroundings, but they can also add virtual features such as images, videos, and sound [220]. Augmented reality has been applied in the construction industry to facilitate the comprehension of complex dynamic and spatial–temporal constraints, bring remote job sites indoors, and improve learning processes [90]. Augmented reality technologies have also been used in combination with eye tracking devices to improve HMIs by providing users with a more information-rich environment [37]. Virtual reality technology uses a head-mounted display helmet fitted with motion tracking sensors that allows users to engage in a fully immersive sensory experience in a designed space [221]. Virtual reality technology provides the user with an engaging experience by determining their spatial position in the visual environment, which is presented using 3D stereoscopic images and videos [222]. In recent years, VR has been rapidly recognised and implemented in engineering and medical education and training [222,223,224]. A combination of eye tracking with VR can be used to enhance the study of the cognitive processes involved and to improve human–computer interaction [94]. For example, Ye et al. [94] used eye tracking and VR to demonstrate how integrating these technologies can enable the study of hazard identification in a realistic complex construction site by improving cognitive data collection and human–computer interaction.

3.7.8. Motion Capture

Motion capture is the process of recording human movement [225]. There are several types of motion capture systems, which are categorised into five groups based on their physical working principles. These principles include electromagnetic systems, image processing systems, optoelectronic systems, ultrasonic localisation systems, and inertia sensory unit systems [225]. In eye tracking research, there are situations where it is not possible to use traditional eye tracking systems to study gaze behaviour. These situations include when subjects use sunglasses or binoculars or when they need to move freely without using any equipment [226]. In such situations, non-invasive marker-less motion capture systems such as image processing systems can be used to determine an individual’s point of gaze [227]. Moreover, in relevant or special cases, extra information can be obtained by determining the position of the subject and their point of gaze [18,228].

3.7.9. Face Recognition

The human face is used as a biometric trait in various areas, such as human–computer interaction, health, education, security and law enforcement, entertainment, marketing, and finance [229]. Face recognition technologies can discreetly extract information from an individual’s face, such as emotions, mental states, race, identity, age, and gender [230,231]. Thanks to these capabilities, face recognition technologies have been used in conjunction with eye tracking devices to study consumer brand awareness [232] as well as fatigue detection in car drivers [233] and commercial pilots [57]. Within the scope of this review, we identified only one study that combined face recognition and eye tracking techniques. Wang and Sun [57] proposed a framework that combines both technologies to measure fatigue in real time. According to Wang and Sun [57], the combination of these two technologies with the appropriate algorithms can result in a more effective tool to manage fatigue risks in aviation.

4. Discussion

Human error is one of the main reasons for accidents in the aviation, maritime, and construction industries. While technological advancements assist in enhancing safety, the human operator still plays a key role in actively maintaining safety. It is the human operator’s responsibility to visually attend to and obtain information from available sources and to use it to maintain SA, identify hazards, and make appropriate decisions. This responsibility is particularly important as most accidents in these three industries can be linked to a lack of attention. Hence, eye trackers are a good way to measure an individual’s visual attention. The objective data provided by eye trackers make them a valuable technological tool for a variety of purposes, including research and training. The results of this research show that industries such as the aviation industry have been using eye tracking technology for longer and therefore have more pertinent articles than other industries, such as construction. Regardless, there has been an increase in the usage of eye tracking technology in all of the mentioned industries. The countries leading the use of such technology also appear to depend on their location and industry resources.

4.1. Gaps in Application

All three studied industries applied eye trackers to most of the topics identified in this study but at varying percentages. Topics such as fatigue were addressed within all three industries. However, other topics, such as MWL, were primarily studied by one or two industries. Finally, there were a few topics, such as working memory load, that were only studied within one industry. Such a trend reveals gaps in the literature.
Topics such as HMIs and MWL have not been studied in depth in the construction industry. While the human operator might not have a complex workspace in the construction industry compared to the workspaces of the maritime or aviation industries, it is nonetheless important to conduct construction HMI studies utilising eye trackers. Such studies can aid in understanding how operators in the construction industry interact with their work interface and pay attention to vital information. Depending on the technologies used, the data could even reveal how workload is managed. Similar topics have been studied in the other two industries [26,27,39,52]. Such research might be particularly beneficial in construction as it is considered the most hazardous industry.
At the same time, topics such as hazard detection have been widely covered in the construction industry but not in the aviation or maritime industries [10,91,93]. Given that hazards are present in any high-risk industry, similar research might be beneficial to these industries. The dynamic operating environment of aircraft and ships means that there might be even more hazards that the operator has to bear in mind. For example, moving vehicles (such as aircraft and/or ships) depart one location and travel to another; hence, operators must monitor not only the hazards at the departure location but also hazards at the destination. Conducting studies on hazard detection in the aviation and maritime industries will shed light on how operators visually identify hazards in these dynamic environments. Such studies will also show whether operators are drawn to non-essential distractors and whether they notice even obvious hazards. Similarly, there are applications of eye tracking research that have been studied less in one or two industries, revealing a gap in the literature. Eye trackers provide data about where a person is looking; hence, it is no surprise that visual attention is one of the most studied topics in all three industries.

4.2. Gaps in Human Aspects

The most studied human aspect in all three industries is cognition. However, it is possible to apply eye trackers to other topics as well, particularly in combination with other technologies, which will increase the literature on cognitive–physiological, cognitive–emotional, and physiological aspects.
Topics such as trust, working memory load, stress, anxiety, and so on could be investigated by integrating multiple technologies. However, some studies have only been conducted in one of the industries, revealing a gap in the literature regarding such studies on other industries. For example, anxiety is a relevant topic in most high-risk industries. Hence, a study similar to that of Allsop and Gray [63] could be conducted in the construction and maritime industries with the assistance of eye trackers to understand operator anxiety. Similarly, Hasanzadeh et al.’s [83] study could be replicated in aviation and maritime scenarios to understand the operator’s ability to pay attention to hazards under various working memory loads. This research is particularly relevant as the aviation and maritime industries do not have any eye tracking studies regarding hazard detection, as discussed in the previous paragraph. Moreover, applying multiple technologies will contribute to understanding the cognitive as well as other aspects of hazard detection.

4.3. Gaps in Technology Integration

In relation to technology, the technology consistently integrated with eye trackers for research in all three industries was simulators. Using simulators has obvious benefits, but the failure to integrate other technologies reveals a gap in the literature. Using additional technology could be done independently (one additional technology with eye trackers) or through integration with even more technologies (i.e., eye trackers and multiple other technologies). For example, a head tracker could be integrated with an eye tracker to obtain eye movement data along with head movement data. Additionally, EEG could be integrated, which could provide extra data on the operator’s mental state. These two examples are discussed further in the next paragraph.
To be able to make appropriate decisions, an operator must have a good scanpath. Studying an individual’s scanpath helps in understanding the overall process of gathering information from available sources to inform decision-making. As such, in a dynamic operating environment, integrating head trackers along with eye trackers can be beneficial because several workspaces in the three industries require the operator to move their head to scan relevant AOIs. Head trackers can further assist in mapping an individual’s gaze pattern and creating heatmaps. In a similar way, it could be valuable to integrate EEG technology with head trackers and eye trackers to provide an additional layer of objective data related to brain activity. Such data not only indicate where an individual is looking but also help with understanding the wider context of how the individual is using the obtained information to make decisions. This understanding is valuable because the acquisition and processing of information have close physiological links. We identified that EEG is not widely integrated with eye trackers, revealing a gap in the literature.
Likewise, pressure interfaces used to measure body pressure are not often used in eye tracking research. Integrating such technology with eye trackers might be beneficial to understanding other human aspects such as physiological conditions. Body pressure data are particularly relevant to the three industries in this paper as some roles might require the operator to spend a great deal of time sitting down. Depending on the role, operators might not have the flexibility to stand up and move around (e.g., a crane operator sitting in a small cabin for an extended period of time or a pilot sitting in a smaller aircraft during a flight lasting several hours).
The lack of literature using certain technologies in conjunction with eye trackers does not necessarily indicate a reluctance to use them. It could be that some technologies are new and have only recently become available, such as AR and VR technologies. Augmented reality and VR could present a suitable alternative to simulators. As discussed, simulators are the most frequently integrated technology with eye trackers because of the advantages that they offer, such as low risk. Augmented reality and VR technology have several similar benefits and are also more portable, cheaper, easier for one individual to operate (i.e., the person wanting to use them), and require less maintenance. Moreover, AR/VR technology can even have inbuilt eye tracking technology, eliminating the need for additional hardware. While more studies are needed to determine whether VR can provide similar training outcomes to simulators [40,61], there is potential for using VR for learning and training purposes. Outside the aviation, maritime, and construction industries, these technologies are being used for learning purposes [222,223,224], and it might be possible for experts to teach novices proper information gathering techniques [20,22,34,62,165] using the eye trackers integrated within AR/VR technologies. In this way, novices can learn to identify relevant information at the appropriate time and thus make correct decisions. Novice training could also become more efficient. Additionally, refresher training could be offered to experienced operators. Augmented reality and VR technologies can benefit such training syllabi, but the gap in AR/VR research means that further studies are needed to understand whether these benefits exist.
The application of non-invasive brain stimulation (NIBS) in conjunction with eye trackers is a technology integration that was not identified in this study. Non-invasive brain stimulation can induce persisting modifications of cortical excitability in humans [234,235], with beneficial effects on cognitive and physiological performance [236,237,238]. For example, Waters-Metenier et al. [239] showed in a double-blind experiment that transcranial direct current stimulation (tDCS, a type of NIBS) can be used to augment synergy learning, leading subsequently to faster and more synchronised execution of difficult muscular activation patterns when playing the piano. Their results demonstrated that NIBS could be used to facilitate and speed up the learning process of individuals in complex multitasking activities. In another study, Ciechanski et al. [240] used tDCS to enhance the motor skill learning of medicine students for surgical procedural training. According to their results, tDCS may enhance skill acquisition in a simulation-based environment. Given the reported beneficial effects of NIBS, its integration with eye tracking could represent a suitable alternative to enhance individuals’ working memory and knowledge acquisition during training programs in high-risk industries. For example, this technology integration can facilitate the skill acquisition process of future pilots for complex operating procedures in aviation. In the construction industry, eye tracking and NIBS can be used to improve construction workers’ hazard recognition capability. Since modulation of neuroplasticity can be achieved with NIBS [241], this technology can be used to improve individuals’ working memory function [237] to shorten the learning curve in training programs.

4.4. Gaps in Types of Eye Trackers Used

As mentioned, simulators are the technology most used for eye tracking research in all three industries. There are many reasons for this, including regulatory requirements in the maritime industry [49]. The construction industry uses simulators less than the other two, but several real-world construction studies have used eye trackers; this reveals a gap in the literature regarding the use of eye trackers in real-world situations in the aviation and maritime industries. Both mobile and remote eye tracking devices have been used in studies across the three high-risk industries. There are obvious reasons for using a mobile device in an operating environment where the individual needs to move their head freely: in all three industries, the human operator has to monitor a broad view of the outside world from their workspace (e.g., the bridge of a ship) in addition to various screens. The number of displays, the type of information they present, and the frequency at which the operator monitors them vary considerably between the three industries and may also depend on the operator’s role. The gap in the literature regarding real-world situations, together with the low usage of remote eye tracking devices, indicates that more research is needed in this area. It might be possible to permanently install remote eye tracking devices on the screens in certain workplaces in order to collect data discreetly in real-world situations. Despite their limitations, such as the inability to collect eye movement data beyond the screens (e.g., when the operator is looking at the outside world), remote eye tracking devices could be a suitable option for real-world situations [121,138]. Such devices could provide real-time eye tracking data describing fixations, saccades, and blink rate, which could then be used to produce a collective heatmap [89].
Remote eye tracking devices also offer other benefits, such as collecting objective real-time data whilst the operator is completing a task [77]; such data could be used to identify anxiety, stress, or fatigue, as in Allsop and Gray’s study [63].
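As an illustration of how such real-time metrics could be derived, the sketch below (not drawn from any of the reviewed studies) reduces raw gaze samples to fixations and a blink rate, using a simple dispersion-threshold (I-DT) pass for fixation detection and runs of missing samples as a blink proxy. The thresholds, sampling rate, and missing-sample convention are illustrative assumptions, not vendor specifications.

```python
def detect_fixations(samples, max_dispersion=30.0, min_samples=5):
    """Dispersion-threshold (I-DT) fixation detection.
    samples: list of (x, y) gaze points in pixels, or None where data is lost.
    Returns a list of (start_index, end_index, (cx, cy)) fixations."""
    fixations, i, n = [], 0, len(samples)
    while i < n:
        if samples[i] is None:          # skip lost samples
            i += 1
            continue
        window, j = [], i
        while j < n and samples[j] is not None:
            window.append(samples[j])
            xs = [p[0] for p in window]
            ys = [p[1] for p in window]
            # dispersion = horizontal extent + vertical extent of the window
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                window.pop()            # last point broke the threshold
                break
            j += 1
        if len(window) >= min_samples:
            cx = sum(p[0] for p in window) / len(window)
            cy = sum(p[1] for p in window) / len(window)
            fixations.append((i, i + len(window) - 1, (cx, cy)))
            i += len(window)
        else:
            i += 1
    return fixations

def blink_rate(samples, sample_rate_hz, min_gap=3):
    """Treat each run of >= min_gap consecutive missing samples as one blink;
    return blinks per minute of recording."""
    blinks, run = 0, 0
    for s in samples:
        if s is None:
            run += 1
        else:
            blinks += run >= min_gap
            run = 0
    blinks += run >= min_gap
    duration_min = len(samples) / sample_rate_hz / 60.0
    return blinks / duration_min if duration_min else 0.0
```

In this sketch, saccades would simply be the gaps between consecutive detected fixations; production systems typically use velocity-based detection and device-reported validity codes instead of this simplified convention.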
One additional benefit of permanently installing remote eye tracking devices in a workspace is the ability to collect longitudinal data, which could assist developers in monitoring eye tracking data over a long period to determine whether enhancements to the HMI are needed. These data can be used to produce a collective heatmap and to monitor gaze patterns over several years [53,115], revealing any limitations of the screen displays over the course of executing a given task. As discussed, understanding how humans interact with systems is vital to a suitable HMI [43]. Remote eye trackers thus have the potential to provide real-time data that can assist in long-term system development.
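A collective, longitudinal heatmap of the kind described above could be built by binning fixation centroids into a coarse screen grid and summing the grids across sessions or operators. The sketch below is a minimal illustration of that aggregation step; the screen size and grid resolution are assumed parameters, not values taken from the reviewed studies.

```python
def fixation_heatmap(centroids, width, height, bins=(8, 6)):
    """Bin fixation centroids ((x, y) in pixels) into a bins[0]-by-bins[1]
    grid of counts; points outside the screen bounds are ignored."""
    bx, by = bins
    grid = [[0] * bx for _ in range(by)]
    for x, y in centroids:
        if 0 <= x < width and 0 <= y < height:
            grid[int(y * by / height)][int(x * bx / width)] += 1
    return grid

def merge_heatmaps(grids):
    """Sum same-shaped per-session grids cell by cell to obtain a
    collective heatmap across sessions or operators."""
    return [[sum(cells) for cells in zip(*rows)] for rows in zip(*grids)]
```

Because each session reduces to a small grid of counts, grids recorded months apart can be stored cheaply and compared directly, which is what makes the long-term HMI monitoring use case practical.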

5. Conclusions

In 1596, Du Laurens, a French anatomist and medical scientist, said that the eyes are the windows of the mind [242]. In high-risk industries, it is vital for humans to use their sight to obtain the right information at the right time, from the right sources. Hence, eye tracking technology is a valuable tool for collecting relevant objective data. While eye tracking technology has a long history, its uses are constantly evolving.
This study systematically identified the demographic distribution and applications of eye tracking research in the aviation, maritime, and construction industries, as well as the different technologies that have been integrated to study human aspects of various high-risk environments. We also uncovered various gaps regarding the usage of additional technologies to support and validate eye tracking research for applications in the aviation, maritime, and construction industries.
These gaps provide insight into some of the future eye tracking topics that could be explored in these three industries. Moreover, it may be beneficial for future research to apply eye tracking in other high-risk industries, such as space exploration, mining, and oil and gas. Further research could also reveal additional similarities and differences between the applications of eye tracking across the three industries.
However, eye tracking does not always translate smoothly into the real world. To achieve an adequate translation, several challenges need to be overcome in terms of data quality and algorithmic variability. Therefore, future work is necessary to create appropriate experimental and industry standards for eye tracking technologies.
Overall, this study highlighted that eye tracking can be used in relation to different human aspects for a variety of applications and industries. Eye tracking research has an exciting future, particularly in light of potential technology integrations such as the integration of NIBS, VR, and AR.

Author Contributions

Conceptualisation, D.M.-M. and S.M.; methodology, D.M.-M. and S.M.; validation, D.M.-M.; formal analysis, D.M.-M. and S.P.; investigation, D.M.-M.; resources, D.M.-M.; data curation, D.M.-M.; writing—original draft preparation, D.M.-M. and S.P.; writing—review and editing, D.M.-M., S.P., S.M., K.P. and R.A.S.; visualisation, D.M.-M.; supervision, S.M. and R.A.S.; project administration, D.M.-M., S.M. and R.A.S.; funding acquisition, S.M. and R.A.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data sharing is not applicable; all new data created in this study are presented within the article.

Acknowledgments

This work was supported and funded by the Griffith School of Engineering and Built Environment.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Grote, G. Safety management in different high-risk domains—All the same? Saf. Sci. 2012, 50, 1983–1992. [Google Scholar] [CrossRef]
  2. Hudson, P. Applying the lessons of high risk industries to health care. Qual. Saf. Health Care 2003, 12, i7–i12. [Google Scholar] [CrossRef] [Green Version]
  3. Amiri, M.; Ardeshir, A.; Fazel Zarandi, M.H.; Soltanaghaei, E. Pattern extraction for high-risk accidents in the construction industry: A data-mining approach. Int. J. Inj. Contr. Saf. Promot. 2016, 23, 264–276. [Google Scholar] [CrossRef]
  4. Hasanzadeh, S.; Esmaeili, B.; Dodd, M.D. Measuring construction workers’ real-time situation awareness using mobile eye-tracking. In Proceedings of the Construction Research Congress, San Juan, Puerto Rico, 31 May–2 June 2016; pp. 2894–2904. [Google Scholar] [CrossRef]
  5. Muehlethaler, C.M.; Knecht, C.P. Situation Awareness Training for General Aviation Pilots using Eye Tracking. IFAC PapersOnLine 2016, 49, 66–71. [Google Scholar] [CrossRef]
  6. Li, G.; Mao, R.; Hildre, H.P.; Zhang, H. Visual Attention Assessment for Expert-in-the-loop Training in a Maritime Operation Simulator. IEEE Trans. Ind. Informat. 2019, 16, 522–531. [Google Scholar] [CrossRef]
  7. Xinyao, G.; Yawei, L.; Qingmin, S.; Linqing, N.; Huibin, J. Measuring the situation awareness of tower controllers by using eye movement analysis. J. Eng. Sci. Technol. Rev. 2020, 13, 133–140. [Google Scholar] [CrossRef]
  8. Di Nocera, F.; Mastrangelo, S.; Colonna, S.P.; Steinhage, A.; Baldauf, M.; Kataria, A. Mental workload assessment using eye-tracking glasses in a simulated maritime scenario. In Proceedings of the Human Factors Ergonomics Society Europe, Groningen, The Netherlands, 14–16 October 2015; pp. 14–16. [Google Scholar]
  9. Cheng, C.-W.; Leu, S.-S.; Lin, C.-C.; Fan, C. Characteristic analysis of occupational accidents at small construction enterprises. Saf. Sci. 2010, 48, 698–707. [Google Scholar] [CrossRef]
  10. Dzeng, R.-J.; Lin, C.-T.; Fang, Y.-C. Using eye-tracker to compare search patterns between experienced and novice workers for site hazard identification. Saf. Sci. 2016, 82, 56–67. [Google Scholar] [CrossRef]
  11. Benito, G.R.G.; Berger, E.; de la Forest, M.; Shum, J. A cluster analysis of the maritime sector in Norway. Int. J. Transp. Manag. 2003, 1, 203–215. [Google Scholar] [CrossRef]
  12. Woo, M. Eyes hint at hidden mental-health conditions. Nature 2019. [Google Scholar] [CrossRef]
  13. Miranda, S.B.; Hack, M.; Fantz, R.L.; Fanaroff, A.A.; Klaus, M.H. Neonatal pattern vision: A predictor of future mental performance? J. Pediatr. 1977, 91, 642–647. [Google Scholar] [CrossRef]
  14. Peißl, S.; Wickens, C.D.; Baruah, R. Eye-tracking measures in aviation: A selective literature review. Int. J. Aerosp. Psychol. 2018, 28, 98–112. [Google Scholar] [CrossRef]
  15. Duchowski, A.T. A breadth-first survey of eye-tracking applications. BMR 2002, 34, 455–470. [Google Scholar] [CrossRef]
  16. Pinheiro, R.; Pradhananga, N.; Jianu, R.; Orabi, W. Eye-tracking technology for construction safety: A feasibility study. In Proceedings of the ISARC—International Symposium on Automation and Robotics in Construction, Auburn, AL, USA, 18–21 July 2016; pp. 282–290. [Google Scholar]
  17. Rayner, K. Eye movements in reading and information processing: 20 years of research. Psychol. Bull. 1998, 124, 372. [Google Scholar] [CrossRef]
  18. Jankovics, I.; Kale, U.; Rohacs, J.; Rohacs, D. Developing ATCOs ‘Support System: Load Management, Integrated Sensors and Eye Tracking. Int. J. Mech. Aerosp. Eng. 2017, 3, 25–42. [Google Scholar]
  19. Yousefi, M.V.; Karan, E.; Mohammadpour, A.; Asadi, S. Implementing eye tracking technology in the construction process. In Proceedings of the 51st ASC Annual International Conference, College Station, TX, USA, 22–25 April 2015; pp. 752–759. [Google Scholar]
  20. Chetwood, A.S.; Kwok, K.-W.; Sun, L.-W.; Mylonas, G.P.; Clark, J.; Darzi, A.; Yang, G.-Z. Collaborative eye tracking: A potential training tool in laparoscopic surgery. Surg. Endosc. 2012, 26, 2003–2009. [Google Scholar] [CrossRef]
  21. Van Der Gijp, A.; Ravesloot, C.; Jarodzka, H.; Van der Schaaf, M.; Van der Schaaf, I.; van Schaik, J.P.; Ten Cate, T.J. How visual search relates to visual diagnostic performance: A narrative systematic review of eye-tracking research in radiology. Adv. Health Sci. Educ. Theory Pract. 2017, 22, 765–787. [Google Scholar] [CrossRef] [Green Version]
  22. Menezes, P.; Francisco, J.; Patrão, B. The Importance of Eye-Tracking Analysis in Immersive Learning—A Low Cost Solution. In Online Engineering & Internet of Things, Proceedings of the 14th International Conference on Remote Engineering and Virtual Instrumentation, New York, NY, USA, 15–17 March 2017; Columbia University: New York, NY, USA, 2018; pp. 689–697. [Google Scholar]
  23. Habibnezhad, M.; Fardhosseini, S.; Vahed, A.M.; Esmaeili, B.; Dodd, M.D. The relationship between construction workers’ risk perception and eye movement in hazard identification. In Proceedings of the Construction Research Congress 2016, San Juan, Puerto Rico, 31 May–2 June 2016; pp. 2984–2994. [Google Scholar]
  24. Fadda, P.; Meloni, M.; Fancello, G.; Pau, M.; Medda, A.; Pinna, C.; Del Rio, A.; Lecca, L.I.; Setzu, D.; Leban, B. Multidisciplinary Study of Biological Parameters and Fatigue Evolution in Quay Crane Operators. Procedia Manuf. 2015, 3, 3301–3308. [Google Scholar] [CrossRef] [Green Version]
  25. Heikoop, D.D.; de Winter, J.C.; van Arem, B.; Stanton, N.A. Effects of platooning on signal-detection performance, workload, and stress: A driving simulator study. Appl. Ergon. 2017, 60, 116–127. [Google Scholar] [CrossRef] [Green Version]
  26. Li, W.-C.; Zhang, J.; Le Minh, T.; Cao, J.; Wang, L. Visual scan patterns reflect to human-computer interactions on processing different types of messages in the flight deck. Int. J. Ind. Ergon. 2019, 72, 54–60. [Google Scholar] [CrossRef]
  27. Yan, S.; Wei, Y.; Tran, C.C. Evaluation and prediction mental workload in user interface of maritime operations using eye response. Int. J. Ind. Ergon. 2019, 71, 117–127. [Google Scholar] [CrossRef]
  28. Liberati, A.; Altman, D.G.; Tetzlaff, J.; Mulrow, C.; Gøtzsche, P.C.; Ioannidis, J.P.; Clarke, M.; Devereaux, P.J.; Kleijnen, J.; Moher, D. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: Explanation and elaboration. Ann. Intern. Med. 2009, 151, W-65–W-94. [Google Scholar] [CrossRef] [Green Version]
  29. Lützhöft, M.; Dukic, T. Show me where you look and I’ll tell you if you’re safe: Eye tracking of maritime watchkeepers. In Proceedings of the 39th Nordic Ergonomics Society Conference, Lysekil, Sweden, 1–3 October 2007; pp. 75–78. [Google Scholar]
  30. Arenius, M.; Athanassiou, G.; Sträter, O. Systemic assessment of the effect of mental stress and strain on performance in a maritime ship-handling simulator. IFAC Proc. Vol. 2010, 43, 43–46. [Google Scholar] [CrossRef]
  31. Bjørneseth, F.B.; Renganayagalu, S.K.; Dunlop, M.D.; Hornecker, E.; Komandur, S. Towards an experimental design framework for evaluation of dynamic workload and situational awareness in safety critical maritime settings. In Proceedings of the 26th BCS Conference on Human Computer Interaction, Birmingham, UK, 12–16 September 2016; pp. 309–314. [Google Scholar] [CrossRef]
  32. Forsman, F.; Sjörs-Dahlman, A.; Dahlman, J.; Falkmer, T.; Lee, H.C. Eye tracking during high speed navigation at sea: Field trial in search of navigational gaze behaviour. J. Transp. Technol. 2012, 2, 277–283. [Google Scholar]
  33. Muczyński, B.; Gucma, M.; Bilewski, M.; Zalewski, P. Using eye tracking data for evaluation and improvement of training process on ship’s navigational bridge simulator. Sci. J. Marit. Univ. Szczec. 2013, 33, 75–78. [Google Scholar]
  34. Bjørneseth, F.B.; Clarke, L.; Dunlop, M.; Komandur, S. Towards an understanding of operator focus using eye-tracking in safety-critical maritime settings. In Proceedings of the International Conference on Human Factors in Ship Design & Operation, London, UK, 26–27 February 2014. [Google Scholar]
  35. Moore, L.J.; Vine, S.J.; Smith, A.N.; Smith, S.J.; Wilson, M.R.J.M.P. Quiet eye training improves small arms maritime marksmanship. Mil. Psychol. 2014, 26, 355–365. [Google Scholar] [CrossRef]
  36. Hodgetts, H.M.; Tremblay, S.; Vallières, B.R.; Vachon, F. Decision support and vulnerability to interruption in a dynamic multitasking environment. Int. J. Hum. Comput. Stud. 2015, 79, 106–117. [Google Scholar] [CrossRef]
  37. Hong, T.C.; Andrew, H.S.Y.; Kenny, C.W.L. Assessing the situation awareness of operators using maritime augmented reality system (MARS). In Proceedings of the Human Factors and Ergonomics Society Annual Meeting, Los Angeles, CA, USA, 26–30 October 2015; pp. 1722–1726. [Google Scholar] [CrossRef]
  38. Peysakhovich, V.; Vachon, F.; Vallières, B.R.; Dehais, F.; Tremblay, S. Pupil dilation and eye movements can reveal upcoming choice in dynamic decision-making. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting, Los Angeles, CA, USA, 26–30 October 2015; pp. 210–214. [Google Scholar] [CrossRef]
  39. Hareide, O.S.; Ostnes, R.; Mjelde, F.V. Understanding the eye of the navigator. In Proceedings of the European Navigation Conference, Helsinki, Finland, 30 May–2 June 2016. [Google Scholar]
  40. Hareide, O.S.; Ostnes, R. Comparative study of the Skjold-class bridge- and simulator navigation training. Eur. J. Navig. 2016, 14, 11. [Google Scholar]
  41. Sanfilippo, F. A multi-sensor fusion framework for improving situational awareness in demanding maritime training. Reliab. Eng. Syst. Saf. 2017, 161, 12–24. [Google Scholar] [CrossRef]
  42. Keller, M.D.; Ziriax, J.M.; Barns, W.; Sheffield, B.; Brungart, D.; Thomas, T.; Jaeger, B.; Yankaskas, K. Performance in noise: Impact of reduced speech intelligibility on Sailor performance in a Navy command and control environment. Hear. Res. 2017, 349, 55–66. [Google Scholar] [CrossRef] [PubMed]
  43. Hareide, O.S.; Ostnes, R. Maritime usability study by analysing eye tracking data. J. Navig. 2016, 70, 1–17. [Google Scholar] [CrossRef] [Green Version]
  44. Hareide, O.S.; Mjelde, F.V.; Glomsvoll, O.; Ostnes, R. Developing a high-speed craft route monitor window. In Augmented Cognition. Enhancing Cognition and Behavior in Complex Human Environments. Proceedings of the International Conference on Augmented Cognition, Vancouver, BC, Canada, 9–14 July 2017; Schmorrow, D., Fidopiastis, C., Eds.; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2017; Volume 10285. [Google Scholar] [CrossRef]
  45. Orlandi, L.; Brooks, B. Measuring mental workload and physiological reactions in marine pilots: Building bridges towards redlines of performance. Appl. Ergon. 2018, 69, 74–92. [Google Scholar] [CrossRef]
  46. Hareide, O.S.; Ostnes, R. Validation of a maritime usability study with eye tracking data. In Proceedings of the International Conference on Augmented Cognition, Las Vegas, NV, USA, 15–20 July 2018; pp. 273–292. [Google Scholar] [CrossRef]
  47. Costa, N.A.; Jakobsen, J.J.; Weber, R.; Lundh, M.; MacKinnon, S.N. Assessing a maritime service website prototype in a ship bridge simulator: Navigators’ experiences and perceptions of novel e-Navigation solutions. J. Marit. Aff. 2018, 17, 521–542. [Google Scholar] [CrossRef] [Green Version]
  48. Li, F.; Chen, C.-H.; Xu, G.; Khoo, L.P.; Liu, Y. Proactive mental fatigue detection of traffic control operators using bagged trees and gaze-bin analysis. Adv. Eng. Inform. 2019, 42, 100987. [Google Scholar] [CrossRef]
  49. Atik, O.; Arslan, O. Use of eye tracking for assessment of electronic navigation competency in maritime training. J. Eye Mov. Res. 2019, 12. [Google Scholar] [CrossRef]
  50. Colvin, K.; Dodhia, R.M.; Belcher, S.; Dismukes, R. Scanning for visual traffic: An eye tracking study. In Proceedings of the 12th International Symposium on Aviation Psychology, Dayton, OH, USA, 14–17 April 2003. [Google Scholar]
  51. Thomas, L.C.; Wickens, C.D. Eye-tracking and individual differences in off-normal event detection when flying with a synthetic vision system display. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting, New Orleans, LA, USA, 20–24 September 2004; pp. 223–227. [Google Scholar] [CrossRef]
  52. Wilson, J.; Hooey, B.L.; Foyle, D.C. Head-up display symbology for surface operations: Eye tracking analysis of command-guidance vs. situation-guidance formats. In Proceedings of the 13th International Symposium on Aviation Psychology, Oklahoma City, OK, USA, 18–21 April 2005; p. 835. [Google Scholar]
  53. Björklund, C.M.; Alfredson, J.; Dekker, S.W.A. Mode monitoring and call-outs: An eye-tracking study of two-crew automated flight deck operations. Int. J. Aviat. Psychol. 2006, 16, 263–275. [Google Scholar] [CrossRef] [Green Version]
  54. Sarter, N.B.; Mumaw, R.J.; Wickens, C.D. Pilots’ monitoring strategies and performance on automated flight decks: An empirical study combining behavioral and eye-tracking data. Hum. Factors 2007, 49, 347–357. [Google Scholar] [CrossRef]
  55. Hasse, C.; Bruder, C.; Grasshoff, D.; Eißfeldt, H. Future ability requirements for human operators in aviation. In Proceedings of the International Conference on Engineering Psychology and Cognitive Ergonomics, San Diego, CA, USA, 19–24 July 2009; pp. 537–546. [Google Scholar] [CrossRef]
  56. Martin, C.; Cegarra, J.; Averty, P. Analysis of mental workload during en-route air traffic control task execution based on eye-tracking technique. In Proceedings of the 9th International Conference on Engineering Psychology and Cognitive Ergonomics, Orlando, FL, USA, 9–14 July 2011; pp. 592–597. [Google Scholar]
  57. Wang, L.; Sun, R. Study of fatigue measurement based on eye tracking technique. In Proceedings of the ICTIS 2011: Multimodal Approach to Sustained Transportation System Development: Information, Technology, Implementation, Wuhan, China, 30 June–2 July 2011; pp. 1723–1729. [Google Scholar]
  58. Weibel, N.; Fouse, A.; Emmenegger, C.; Kimmich, S.; Hutchins, E. Let’s look at the cockpit: Exploring mobile eye-tracking for observational research on the flight deck. In Proceedings of the Symposium on Eye Tracking Research and Applications, Santa Barbara, CA, USA, 28–30 March 2012; pp. 107–114. [Google Scholar] [CrossRef]
  59. Hasse, C.; Grasshoff, D.; Bruder, C. Eye-tracking parameters as a predictor of human performance in the detection of automation failures. In Proceedings of the Human Factors: A View from an Integrative Perspective, Toulouse, France, 10–11 October 2012. [Google Scholar]
  60. Moacdieh, N.M.; Prinet, J.C.; Sarter, N.B. Effects of modern primary flight display clutter: Evidence from performance and eye tracking data. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting, Perth, Australia, 2–4 December 2013. [Google Scholar] [CrossRef]
  61. Robinski, M.; Stein, M. Tracking visual scanning techniques in training simulation for helicopter landing. J. Eye Mov. Res. 2013, 6. [Google Scholar] [CrossRef]
  62. van Meeuwen, L.W.; Jarodzka, H.; Brand-Gruwel, S.; Kirschner, P.A.; de Bock, J.J.P.R.; van Merriënboer, J.J.G. Identification of effective visual problem solving strategies in a complex visual domain. Learn Instr. 2014, 32, 10–21. [Google Scholar] [CrossRef]
  63. Allsop, J.; Gray, R. Flying under pressure: Effects of anxiety on attention and gaze behavior in aviation. J. Appl. Res. Mem. Cogn. 2014, 3, 63–71. [Google Scholar] [CrossRef]
  64. Stankovic, A.; Aitken, M.R.F.; Clark, L. An eye-tracking study of information sampling and decision-making under stress: Implications for alarms in aviation emergencies. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting, Chicago, IL, USA, 27–31 October 2014; Volume 58, pp. 125–129. [Google Scholar] [CrossRef]
  65. Bruder, C.; Eißfeldt, H.; Maschke, P.; Hasse, C. A model for future aviation: Operators monitoring appropriately. APAHF 2014, 4, 13–22. [Google Scholar] [CrossRef]
  66. Dill, E.T.; Young, S.D. Analysis of Eye-Tracking Data with Regards to the Complexity of Flight Deck Information Automation and Management—Inattentional Blindness, System State Awareness, and EFB Usage. In Proceedings of the 15th AIAA Aviation Technology, Integration, and Operations Conference, Dallas, TX, USA, 22–26 June 2015; p. 2901. [Google Scholar]
  67. Dehais, F.; Peysakhovich, V.; Scannella, S.; Fongue, J.; Gateau, T. “Automation surprise” in aviation: Real-time solutions. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, Seoul, Korea, 18–23 April 2015; pp. 2525–2534. [Google Scholar] [CrossRef] [Green Version]
  68. Vine, S.J.; Uiga, L.; Lavric, A.; Moore, L.J.; Tsaneva-Atanasova, K.; Wilson, M.R. Coping. Individual reactions to stress predict performance during a critical aviation incident. Anxiety Stress Coping 2015, 28, 467–477. [Google Scholar] [CrossRef]
  69. Hasse, C.; Bruder, C. Eye-tracking measurements and their link to a normative model of monitoring behaviour. Ergonomics 2015, 58, 355–367. [Google Scholar] [CrossRef]
  70. Li, W.-C.; Lin, J.J.; Braithwaite, G.; Greaves, M. The development of eye tracking in aviation (ETA) technique to investigate pilot’s cognitive processes of attention and decision-making. In Proceedings of the 32nd Conference of the European Association for Aviation Psychology, Cascais, Portugal, 26–30 September 2016. [Google Scholar]
  71. Dehais, F.; Behrend, J.; Peysakhovich, V.; Causse, M.; Wickens, C.D. Pilot flying and pilot monitoring’s aircraft state awareness during go-around execution in aviation: A behavioral and eye tracking study. Int. J. Aerosp. Psychol. 2017, 27, 15–28. [Google Scholar] [CrossRef] [Green Version]
  72. Kim, J.H.; Zhao, X.; Du, W. Assessing the performance of visual identification tasks using time window-based eye inter-fixation duration. Int. J. Ind. Ergon. 2018, 64, 15–22. [Google Scholar] [CrossRef]
  73. Gontar, P.; Homans, H.; Rostalski, M.; Behrend, J.; Dehais, F.; Bengler, K. Are pilots prepared for a cyber-attack? A human factors approach to the experimental evaluation of pilots’ behavior. J. Air Transp. Manag. 2018, 69, 26–37. [Google Scholar] [CrossRef] [Green Version]
  74. Li, W.-C.; Kearney, P.; Braithwaite, G.; Lin, J.J.H. How much is too much on monitoring tasks? Visual scan patterns of single air traffic controller performing multiple remote tower operations. Int. J. Ind. Ergon. 2018, 67, 135–144. [Google Scholar] [CrossRef]
  75. Skvarekova, I.; Skultety, F. Objective Measurement of Pilot’s Attention Using Eye Track Technology during IFR Flights. TRPRO 2019, 40, 1555–1562. [Google Scholar] [CrossRef]
  76. Bruder, C.; Hasse, C. Differences between experts and novices in the monitoring of automated systems. Int. J. Ind. Ergon. 2019, 72, 1–11. [Google Scholar] [CrossRef]
  77. Diaz-Piedra, C.; Rieiro, H.; Cherino, A.; Fuentes, L.J.; Catena, A.; Di Stasi, L.L. The effects of flight complexity on gaze entropy: An experimental study with fighter pilots. Appl. Ergon. 2019, 77, 92–99. [Google Scholar] [CrossRef] [PubMed]
  78. Brezonakova, A.; Skvarekova, I.; Pecho, P.; Davies, R.; Bugaj, M.; Kandera, B. The effects of back lit aircraft instrument displays on pilots fatigue and performance. TRPRO 2019, 40, 1273–1280. [Google Scholar] [CrossRef]
  79. Ryffel, C.P.; Muehlethaler, C.M.; Huber, S.M.; Elfering, A. Eye tracking as a debriefing tool in upset prevention and recovery training (UPRT) for general aviation pilots. Ergonomics 2019, 62, 319–329. [Google Scholar] [CrossRef]
  80. Rudi, D.; Kiefer, P.; Raubal, M. The instructor assistant system (iASSYST)-utilizing eye tracking for commercial aviation training purposes. Ergonomics 2020, 63, 61–79. [Google Scholar] [CrossRef]
  81. Behrend, J.; Dehais, F. How role assignment impacts decision-making in high-risk environments: Evidence from eye-tracking in aviation. Saf. Sci. 2020, 127, 104738. [Google Scholar] [CrossRef]
  82. Mohammadpour, A.; Asadi, S.; Karan, E.; Rothrock, L. Measuring end-user satisfaction in the design of building projects using eye-tracking technology. J. Comput. Civ. Eng. 2015, 564–571. [Google Scholar] [CrossRef]
  83. Hasanzadeh, S.; Esmaeili, B.; Dodd, M.D. Measuring the Impacts of Safety Knowledge on Construction Workers’ Attentional Allocation and Hazard Detection Using Remote Eye-Tracking Technology. J. Constr. Eng. Manag. 2017, 33, 4017024. [Google Scholar] [CrossRef] [Green Version]
  84. Hasanzadeh, S.; Esmaeili, B.; Dodd, M.D. Impact of construction workers’ hazard identification skills on their visual attention. J. Constr. Eng. Manag. 2017, 143, 04017070. [Google Scholar] [CrossRef] [Green Version]
  85. Bhoir, S.A.; Hasanzadeh, S.; Esmaeili, B.; Dodd, M.D.; Fardhosseini, M.A. Measuring construction workers’ attention using eye-tracking technology. In Proceedings of the Canadian Society for Civil Engineering 5th International/11th Construction Specialty Conference, Vancouver, BC, Canada, 8–10 June 2015. [Google Scholar]
  86. Hasanzadeh, S.; Dao, B.; Esmaeili, B.; Dodd, M.D. Measuring the Impact of Working Memory Load on the Safety Performance of Construction Workers. J. Comput. Civ. Eng. 2017, 158–166. [Google Scholar] [CrossRef]
  87. Jeelani, I.; Han, K.; Albert, A. Automating and scaling personalized safety training using eye-tracking data. Autom. Constr. 2018, 93, 63–77. [Google Scholar] [CrossRef]
  88. Chew, J.Y.; Ohtomi, K.; Suzuki, H. Glance behavior as design indices of in-vehicle visual support system: A study using crane simulators. Appl. Ergon. 2018, 73, 183–193. [Google Scholar] [CrossRef]
  89. Hasanzadeh, S.; Esmaeili, B.; Dodd, M.D. Examining the relationship between construction workers’ visual attention and situation awareness under fall and tripping hazard conditions: Using mobile eye tracking. J. Constr. Eng. Manag. 2018, 144, 04018060. [Google Scholar] [CrossRef] [Green Version]
  90. Wang, T.-K.; Huang, J.; Liao, P.-C.; Piao, Y. Does augmented reality effectively foster visual learning process in construction? An eye-tracking study in steel installation. Adv. Civ. Eng. 2018. [Google Scholar] [CrossRef]
  91. Xu, Q.; Chong, H.-Y.; Liao, P.-C. Exploring eye-tracking searching strategies for construction hazard recognition in a laboratory scene. Saf. Sci. 2019, 120, 824–832. [Google Scholar] [CrossRef]
  92. Li, J.; Li, H.; Wang, H.; Umer, W.; Fu, H.; Xing, X. Evaluating the impact of mental fatigue on construction equipment operators’ ability to detect hazards using wearable eye-tracking technology. Autom. Constr. 2019, 105, 102835. [Google Scholar] [CrossRef]
  93. Jeelani, I.; Albert, A.; Han, K.; Azevedo, R. Are Visual Search Patterns Predictive of Hazard Recognition Performance? Empirical Investigation Using Eye-Tracking Technology. J. Constr. Eng. Manag. 2019, 145, 4018115. [Google Scholar] [CrossRef]
  94. Ye, X.; König, M. Applying eye tracking in virtual construction environments to improve cognitive data collection and human-computer interaction of site hazard identification. In Proceedings of the ISARC—International Symposium on Automation and Robotics in Construction, Banff, AB, Canada, 21–24 May 2019; pp. 1073–1080. [Google Scholar] [CrossRef]
  95. Li, J.; Li, H.; Umer, W.; Wang, H.; Xing, X.; Zhao, S.; Hou, J. Identification and classification of construction equipment operators’ mental fatigue using wearable eye-tracking technology. Autom. Constr. 2020, 109, 103000. [Google Scholar] [CrossRef]
  96. Petrescu, R.V.; Aversa, R.; Akash, B.; Corchado, J.; Berto, F.; Apicella, A.; Petrescu, F.I. When boeing is dreaming—A review. J. Aircr. Spacecr. Tech. 2017, 1, 13. [Google Scholar]
  97. U.S. Department of Commerce. Germany—Aerospace/Defense/Security. Available online: https://www.privacyshield.gov/article?id=Germany-Aerospace-Defense-Security#:~:text=Leading%20Sub%2DSectors,France%20at%20USD%2077.2%20billion.&text=Aerospace%20is%20a%20German%20Government%20priority (accessed on 7 July 2020).
  98. Pandey, A. Airbus topples Boeing as biggest plane maker. Available online: https://www.dw.com/en/airbus-topples-boeing-as-biggest-plane-maker/a-49536539 (accessed on 3 February 2020).
  99. U.S. Department of Commerce. United Kingdom—Aerospace. Available online: https://www.export.gov/apex/article2?id=United-Kingdom-Aerospace#:~:text=The%20UK%20aerospace%20industry%20is,a%2017%25%20global%20market%20share.&text=The%20aerospace%20industry%20is%20a,exports%20in%20the%20United%20Kingdom (accessed on 7 July 2020).
  100. Benito, G.R.; Berger, E.; de la Forest, M.; Sum, J. Industrial clusters and foreign companies’ centres of excellence in Norway. In The Emergence and Impact of MNC Centres of Excellence: A Subsidiary Perspective; St. Martin’s Press: New York, NY, USA, 2000; pp. 97–112. [Google Scholar]
  101. Wang, T. US Construction Industry—Statistics & Facts. Available online: https://www.statista.com/topics/974/construction/ (accessed on 7 July 2020).
  102. Ahmad, M.; Zhao, Z.-Y.; Li, H. Revealing stylized empirical interactions among construction sector, urbanization, energy consumption, economic growth and CO2 emissions in China. Sci. Total Environ. 2019, 657, 1085–1098. [Google Scholar] [CrossRef]
  103. Huang, L.; Krigsvoll, G.; Johansen, F.; Liu, Y.; Zhang, X. Carbon emission of global construction sector. Renew. Sust. Energ. Rev. 2018, 81, 1906–1916. [Google Scholar] [CrossRef] [Green Version]
  104. Mauri, M.; Elli, T.; Caviglia, G.; Uboldi, G.; Azzi, M. RAWGraphs: A visualisation platform to create open outputs. In Proceedings of the 12th Biannual Conference on Italian SIGCHI Chapter, Cagliari, Italy, 18–20 September 2017; pp. 1–5. [Google Scholar]
  105. Baritz, M.I.; Lazar, A.M. Methodology for monitoring the behavior of the visual system. In Proceedings of the E-Health and Bioengineering Conference (EHB), Iasi, Romania, 17–18 September 2009; pp. 1–4. [Google Scholar] [CrossRef]
  106. Kleinke, C.L. Gaze and Eye Contact: A Research Review. Psychol. Bull. 1986, 100, 78–100. [Google Scholar] [CrossRef]
  107. Land, M.F.; Furneaux, S. The knowledge base of the oculomotor system. Philos. Trans. R. Soc. Lond. B Biol. Sci. 1997, 352, 1231–1239. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  108. Holmqvist, K.; Nyström, M.; Andersson, R.; Dewhurst, R.; Jarodzka, H.; Van de Weijer, J. Eye Tracking: A Comprehensive Guide to Methods and Measures; OUP: Oxford, UK, 2011. [Google Scholar]
  109. Ziv, G. Gaze behavior and visual attention: A review of eye tracking studies in aviation. Int. J. Aviat. Psychol. 2016, 26, 75–104. [Google Scholar] [CrossRef]
  110. Häggström, C.; Englund, M.; Lindroos, O. Examining the gaze behaviors of harvester operators: An eye-tracking study. Int. J. For. Eng. 2015, 26, 96–113. [Google Scholar] [CrossRef]
  111. Hareide, O.S.; Ostnes, R. Scan Pattern for the Maritime Navigator. TransNav. 2017, 11, 39–47. [Google Scholar] [CrossRef] [Green Version]
  112. Snowden, R.J.; Thompson, P.; Troscianko, T. Basic Vision: An Introduction to Visual Perception; Oxford University Press: Oxford, UK, 2012. [Google Scholar]
  113. Muczyński, B.; Gucma, M. Application of eye-tracking techniques in human factor research in marine operations. Challenges and methodology. Zesz. Nauk. Akad. Morska Szczec. 2013, 36, 116–120. [Google Scholar]
  114. Kar, A.; Corcoran, P. A Review and Analysis of Eye-Gaze Estimation Systems, Algorithms and Performance Evaluation Methods in Consumer Platforms. IEEE Access 2017, 5, 16495–16519. [Google Scholar] [CrossRef]
  115. Raschke, M.; Blascheck, T.; Richter, M.; Agapkin, T.; Ertl, T. Visual analysis of perceptual and cognitive processes. In Proceedings of the International Conference on Information Visualization Theory and Applications (IVAPP), Lisbon, Portugal, 5–8 January 2014; pp. 284–291. [Google Scholar]
  116. Takahashi, R.; Suzuki, H.; Chew, J.Y.; Ohtake, Y.; Nagai, Y.; Ohtomi, K. A system for three-dimensional gaze fixation analysis using eye tracking glasses. J. Comput. Des. Eng. 2018, 5, 449–457. [Google Scholar] [CrossRef]
  117. Bojko, A.A. Informative or misleading? Heatmaps deconstructed. In Proceedings of the International Conference on Human-Computer Interaction, Berlin/Heidelberg, Germany, 19–24 July 2009; pp. 30–39. [Google Scholar] [CrossRef]
  118. Pfeiffer, T.; Memili, C. Model-based real-time visualization of realistic three-dimensional heat maps for mobile eye tracking and eye tracking in virtual reality. In Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications, Charleston, SC, USA, 14–17 March 2016; pp. 95–102. [Google Scholar]
  119. Laeng, B.; Teodorescu, D.-S. Eye scanpaths during visual imagery reenact those of perception of the same visual scene. Cogn. Sci. 2002, 26, 207–231. [Google Scholar] [CrossRef]
  120. Goldberg, J.; Stimson, M.; Lewenstein, M.; Scott, N.; Wichansky, A. Eye tracking in web search tasks: Design implications. In Proceedings of the Symposium on Eye Tracking Research & Applications, New Orleans, LA, USA, 25–27 March 2002; pp. 51–58. [Google Scholar]
  121. Duchowski, A.T. Eye tracking techniques. In Eye Tracking Methodology: Theory and Practice; Springer International Publishing: Cham, Switzerland, 2017; pp. 49–57. [Google Scholar] [CrossRef]
  122. Majaranta, P.; Bulling, A. Eye tracking and eye-based human–computer interaction. In Advances in Physiological Computing; Springer: London, UK, 2014; pp. 39–65. [Google Scholar] [CrossRef]
  123. Sprenger, A.; Neppert, B.; Köster, S.; Gais, S.; Kömpf, D.; Helmchen, C.; Kimmig, H. Long-term eye movement recordings with a scleral search coil-eyelid protection device allows new applications. J. Neurosci. Methods 2008, 170, 305–309. [Google Scholar] [CrossRef]
  124. Van der Geest, J.N.; Frens, M.A. Recording eye movements with video-oculography and scleral search coils: A direct comparison of two methods. J. Neurosci. Methods 2002, 114, 185–195. [Google Scholar] [CrossRef]
  125. Ruetsche, A.; Baumann, A.; Jiang, X.; Mojon, D.S. Automated analysis of eye tracking movements. Ophthalmologica 2003, 217, 320–324. [Google Scholar] [CrossRef] [Green Version]
  126. Zhu, Z.; Fujimura, K.; Ji, Q. Real-time eye detection and tracking under various light conditions. In Proceedings of the 2002 Symposium on Eye Tracking Research & Applications, New Orleans, LA, USA, 25–27 March 2002; pp. 139–144. [Google Scholar]
  127. Galdi, C.; Nappi, M.; Riccio, D.; Wechsler, H. Eye movement analysis for human authentication: A critical survey. Pattern Recognit. Lett. 2016, 84, 272–283. [Google Scholar] [CrossRef]
  128. Tobii. How do Tobii Eye Trackers Work? Available online: https://www.tobiipro.com/learn-and-support/learn/eye-tracking-essentials/how-do-tobii-eye-trackers-work/ (accessed on 31 May 2021).
  129. Morimoto, C.H.; Mimica, M.R.M. Eye gaze tracking techniques for interactive applications. Comput. Vis. Image Underst. 2005, 98, 4–24. [Google Scholar] [CrossRef]
  130. Goldberg, J.H.; Wichansky, A.M. Eye tracking in usability evaluation. A practitioner’s guide. In The Mind’s Eye Cognitive and Applied Aspects of Eye Movement Research; Elsevier: Amsterdam, The Netherlands, 2003; pp. 493–516. [Google Scholar] [CrossRef]
  131. Larrazabal, A.J.; García Cena, C.E.; Martínez, C.E. Video-oculography eye tracking towards clinical applications: A review. Comput. Biol. Med. 2019, 108, 57–66. [Google Scholar] [CrossRef]
  132. Sharafi, Z.; Soh, Z.; Guéhéneuc, Y.-G. A systematic literature review on the usage of eye-tracking in software engineering. Inf. Softw. Technol. 2015, 67, 79–107. [Google Scholar] [CrossRef]
  133. Klaib, A.F.; Alsrehin, N.O.; Melhem, W.Y.; Bashtawi, H.O.; Magableh, A.A. Eye tracking algorithms, techniques, tools, and applications with an emphasis on machine learning and Internet of Things technologies. Expert Syst. Appl. 2021, 166. [Google Scholar] [CrossRef]
  134. Kar, A. Machine learning-based analysis of gaze error patterns in consumer eye tracking systems. Vision 2020, 4, 25. [Google Scholar] [CrossRef]
  135. Cognolato, M.; Atzori, M.; Müller, H. Head-mounted eye gaze tracking devices: An overview of modern devices and recent advances. RATE 2018, 5. [Google Scholar] [CrossRef]
  136. Kovesdi, C.; Spielman, Z.; LeBlanc, K.; Rice, B. Application of eye tracking for measurement and evaluation in human factors studies in control room modernization. Nucl. Technol. 2018, 202, 220–229. [Google Scholar] [CrossRef]
  137. Andrychowicz-Trojanowska, A. Basic terminology of eye-tracking research. Appl. Linguist. Pap. 2018, 123–132. [Google Scholar] [CrossRef]
  138. Mento, M.A. Different Kinds of Eye Tracking Devices. Available online: https://www.bitbrain.com/blog/eye-tracking-devices (accessed on 17 May 2021).
  139. Wang, D.; Mulvey, F.B.; Pelz, J.B.; Holmqvist, K. A study of artificial eyes for the measurement of precision in eye-trackers. Behav. Res. Methods 2017, 49, 947–959. [Google Scholar] [CrossRef] [Green Version]
  140. Feit, A.M.; Williams, S.; Toledo, A.; Paradiso, A.; Kulkarni, H.; Kane, S.; Morris, M.R. Toward everyday gaze input: Accuracy and precision of eye tracking and implications for design. In Proceedings of the CHI Conference on Human Factors in Computing Systems, Denver, CO, USA, 6–11 May 2017; pp. 1118–1130. [Google Scholar]
  141. Lim, Y.; Gardi, A.; Ezer, N.; Kistan, T.; Sabatini, R. Eye-Tracking Sensors for Adaptive Aerospace Human-Machine Interfaces and Interactions. In Proceedings of the 2018 5th IEEE International Workshop on Metrology for AeroSpace (MetroAeroSpace), Rome, Italy, 20–22 June 2018; pp. 311–316. [Google Scholar]
  142. Lim, Y.; Gardi, A.; Pongsakornsathien, N.; Sabatini, R.; Ezer, N.; Kistan, T. Experimental characterisation of eye-tracking sensors for adaptive human-machine systems. Measurement 2019, 140, 151–160. [Google Scholar] [CrossRef]
  143. Komogortsev, O.; Khan, J. Eye movement prediction by Kalman filter with integrated linear horizontal oculomotor plant mechanical model. In Proceedings of the Symposium on Eye Tracking Research & Applications, Savannah, GA, USA, 26–28 March 2008; pp. 229–236. [Google Scholar]
  144. Hansen, D.W.; Ji, Q. In the Eye of the Beholder: A Survey of Models for Eyes and Gaze. IEEE PAMI 2010, 32, 478–500. [Google Scholar] [CrossRef]
  145. Zhang, Y.; Hornof, A.J. Mode-of-disparities error correction of eye-tracking data. Behav. Res. Methods 2011, 43, 834–842. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  146. Zhang, X.; MacKenzie, I.S. Evaluating Eye Tracking with ISO 9241—Part 9. In Proceedings of the International Conference on Human-Computer Interaction, Berlin/Heidelberg, Germany, 22–27 July 2007; Volume 4552, pp. 779–788. [Google Scholar] [CrossRef] [Green Version]
  147. Stein, N.; Niehorster, D.C.; Watson, T.; Steinicke, F.; Rifai, K.; Wahl, S.; Lappe, M. A Comparison of Eye Tracking Latencies among Several Commercial Head-Mounted Displays. i-Perception 2021, 12. [Google Scholar] [CrossRef]
  148. Nyström, M.; Andersson, R.; Holmqvist, K.; van de Weijer, J. The influence of calibration method and eye physiology on eyetracking data quality. Behav. Res. Methods 2013, 45, 272–288. [Google Scholar] [CrossRef]
  149. Holmqvist, K.; Nyström, M.; Mulvey, F. Eye tracker data quality: What it is and how to measure it. In Proceedings of the ETRA ’12: Symposium on Eye Tracking Research and Applications, Santa Barbara, CA, USA, 28–30 March 2012; pp. 45–52. [Google Scholar] [CrossRef]
  150. Andersson, R.; Larsson, L.; Holmqvist, K.; Stridh, M.; Nyström, M. One algorithm to rule them all? An evaluation and discussion of ten eye movement event-detection algorithms. Behav. Res. Methods 2017, 49, 616–637. [Google Scholar] [CrossRef] [Green Version]
  151. Fuhl, W.; Eivazi, S.; Hosp, B.; Eivazi, A.; Rosenstiel, W.; Kasneci, E. BORE: Boosted-oriented edge optimization for robust, real time remote pupil center detection. In Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications, Warsaw, Poland, 14–17 June 2018; pp. 1–5. [Google Scholar]
  152. Fuhl, W.; Gao, H.; Kasneci, E. Tiny convolution, decision tree, and binary neuronal networks for robust and real time pupil outline estimation. In Proceedings of the ACM Symposium on Eye Tracking Research & Applications, Stuttgart, Germany, 2–5 June 2020; pp. 1–5. [Google Scholar]
  153. Fuhl, W.; Gao, H.; Kasneci, E. Neural networks for optical vector and eye ball parameter estimation. In Proceedings of the ACM Symposium on Eye Tracking Research & Applications, New York, NY, USA, 2–5 June 2020; pp. 1–5. [Google Scholar]
  154. Fuhl, W.; Rosenstiel, W.; Kasneci, E. 500,000 Images Closer to Eyelid and Pupil Segmentation. In Computer Analysis of Images and Patterns; Springer: Cham, Switzerland, 2019; pp. 336–347. [Google Scholar] [CrossRef]
  155. Fuhl, W.; Rong, Y.; Kasneci, E. Fully Convolutional Neural Networks for Raw Eye Tracking Data Segmentation, Generation, and Reconstruction. In Proceedings of the 25th International Conference on Pattern Recognition, Milan, Italy, 10–15 January 2021; pp. 142–149. [Google Scholar]
  156. Valtakari, N.V.; Hooge, I.T.C.; Viktorsson, C.; Nyström, P.; Falck-Ytter, T.; Hessels, R.S. Eye tracking in human interaction: Possibilities and limitations. Behav. Res. Methods 2021. [Google Scholar] [CrossRef] [PubMed]
  157. Gao, Q.; Wang, Y.; Song, F.; Li, Z.; Dong, X. Mental workload measurement for emergency operating procedures in digital nuclear power plants. Ergonomics 2013, 56, 1070–1085. [Google Scholar] [CrossRef]
  158. Nachreiner, F. Standards for ergonomics principles relating to the design of work systems and to mental workload. Appl. Ergon. 1995, 26, 259–263. [Google Scholar] [CrossRef]
  159. Ahlstrom, U.; Friedman-Berg, F.J. Using eye movement activity as a correlate of cognitive workload. Int. J. Ind. Ergon. 2006, 36, 623–636. [Google Scholar] [CrossRef]
  160. Lim, Y.; Gardi, A.; Sabatini, R.; Ramasamy, S.; Kistan, T.; Ezer, N.; Vince, J.; Bolia, R. Avionics human-machine interfaces and interactions for manned and unmanned aircraft. Prog. Aerosp. Sci. 2018, 102, 1–46. [Google Scholar] [CrossRef]
  161. Endsley, M.R. Design and evaluation for situation awareness enhancement. In Proceedings of the Human Factors Society 32nd Annual Meeting, Los Angeles, CA, USA, 1 October 1988; pp. 97–101. [Google Scholar] [CrossRef]
  162. Raza, M.A.; Salehi, S.; Ghazal, S.; Ybarra, V.T.; Mehdi Naqvi, S.A.; Cokely, E.T.; Teodoriu, C. Situational awareness measurement in a simulation-based training framework for offshore well control operations. J. Loss. Prev. Process Ind. 2019, 62, 103921. [Google Scholar] [CrossRef]
  163. Sik-wah Fong, P.; Chu, L. Exploratory study of knowledge sharing in contracting companies: A sociotechnical perspective. J. Constr. Eng. Manag. 2006, 132, 928–939. [Google Scholar] [CrossRef]
  164. Mahroeian, H.; Forozia, A. Challenges in managing tacit knowledge: A study on difficulties in diffusion of tacit knowledge in organizations. IJBSS 2012, 3, 303–308. [Google Scholar]
  165. Lohmeyer, Q.; Meboldt, M.; Matthiesen, S. Analysing visual strategies of novice and experienced designers by eye tracking application. In Proceedings of the DS 76: E&PDE 2013, 15th International Conference on Engineering and Product Design Education, Dublin, Ireland, 5–6 September 2013; pp. 202–207. [Google Scholar]
  166. Jeelani, I.; Albert, A.; Gambatese, J.A. Why do construction hazards remain unrecognized at the work interface? J. Constr. Eng. Manag. 2017, 143, 04016128. [Google Scholar] [CrossRef]
  167. Carter, G.; Smith, S.D. Safety hazard identification on construction projects. J. Constr. Eng. Manag. 2006, 132, 197–205. [Google Scholar] [CrossRef]
  168. Slovic, P.; Fischhoff, B.; Lichtenstein, S. Why Study Risk Perception? Risk Anal. 1982, 2, 83–93. [Google Scholar] [CrossRef]
  169. Slovic, P.; Peters, E. Risk Perception and Affect. Curr. Dir. Psychol. Sci. 2006, 15, 322–325. [Google Scholar] [CrossRef]
  170. Van Cutsem, J.; Marcora, S.; De Pauw, K.; Bailey, S.; Meeusen, R.; Roelands, B. The effects of mental fatigue on physical performance: A systematic review. Sports Med. 2017, 47, 1569–1588. [Google Scholar] [CrossRef] [Green Version]
  171. Roets, B.; Christiaens, J. Shift work, fatigue, and human error: An empirical analysis of railway traffic control. J. Transp. Saf. Secur. 2019, 11, 207–224. [Google Scholar] [CrossRef]
  172. Wascher, E.; Rasch, B.; Sänger, J.; Hoffmann, S.; Schneider, D.; Rinkenauer, G.; Heuer, H.; Gutberlet, I. Frontal theta activity reflects distinct aspects of mental fatigue. Biol. Psychol. 2014, 96, 57–65. [Google Scholar] [CrossRef]
  173. Möckel, T.; Beste, C.; Wascher, E. The Effects of Time on Task in Response Selection—An ERP Study of Mental Fatigue. Sci. Rep. 2015, 5, 10113. [Google Scholar] [CrossRef] [Green Version]
  174. Marcora, S.M.; Staiano, W.; Manning, V. Mental fatigue impairs physical performance in humans. J. Appl. Physiol. 2009, 106, 857–864. [Google Scholar] [CrossRef]
  175. May, J.F.; Baldwin, C.L. Driver fatigue: The importance of identifying causal factors of fatigue when considering detection and countermeasure technologies. Transp. Res. Part F Traffic Psychol. Behav. 2009, 12, 218–224. [Google Scholar] [CrossRef]
  176. Gupta, C.C.; Centofanti, S.; Rauffet, P.; Banks, S.; Coppin, G.; Chauvin, C. Framework and metrics for online fatigue monitoring within submarine teams working in 24/7 environments. IFAC PapersOnLine 2019, 52, 259–264. [Google Scholar] [CrossRef]
  177. Anund, A.; Fors, C.; Kecklund, G.; Leeuwen, W.V.; Åkerstedt, T. Countermeasures for Fatigue in Transportation: A Review of Existing Methods for Drivers on Road, Rail, Sea and in Aviation; Swedish Transport Agency: Norrköping, Sweden, 2015. [Google Scholar]
  178. Hopstaken, J.; Linden, D.V.D.; Bakker, A.B.; Kompier, M.A.J. A multifaceted investigation of the link between mental fatigue and task disengagement. Psychophysiology 2015, 52, 305–315. [Google Scholar] [CrossRef]
  179. Simonovic, B.; Stupple, E.J.; Gale, M.; Sheffield, D. Performance under stress: An eye-tracking investigation of the Iowa Gambling Task (IGT). Front. Behav. Neurosci. 2018, 12, 217. [Google Scholar] [CrossRef]
  180. Novak, A.; Mrazova, M. Research of physiological factors affecting pilot performance in flight simulation training device. Commun. Sci. Lett. Univ. Zilina 2015, 17, 103–107. [Google Scholar]
  181. Keinan, G. Decision making under stress: Scanning of alternatives under controllable and uncontrollable threats. J. Pers. Soc. Psychol. 1987, 52, 639. [Google Scholar] [CrossRef]
  182. Janis, I.L.; Mann, L. Emergency decision making: A theoretical analysis of responses to disaster warnings. J. Hum. Stress. 1977, 3, 35–48. [Google Scholar] [CrossRef]
  183. Frederick, S. Cognitive reflection and decision making. J. Econ. Perspect. 2005, 19, 25–42. [Google Scholar] [CrossRef] [Green Version]
  184. Elliot, A.J. Challenge and threat. In Handbook of Approach and Avoidance Motivation; Psychology Press: New York, NY, USA, 2008; pp. 431–445. [Google Scholar]
  185. Seery, M.D. Challenge or threat? Cardiovascular indexes of resilience and vulnerability to potential stress in humans. Neurosci. Biobehav. Rev. 2011, 35, 1603–1610. [Google Scholar] [CrossRef]
  186. Staal, M.A. Stress, Cognition, and Human Performance: A Literature Review and Conceptual Framework; National Aeronautics and Space Administration (NASA): Washington, DC, USA, 2004. [Google Scholar]
  187. Spielberger, C.D.; Gonzalez-Reigosa, F.; Martinez-Urrutia, A.; Natalicio, L.; Natalicio, D.S. Development of the Spanish edition of the state-trait anxiety inventory. Rev. Interam. J. Psychol. 1971, 5, 145–158. [Google Scholar]
  188. Stokes, A.; Kite, K. Flight stress: Stress, Fatigue, and Performance in Aviation; Avebury Aviation: Brookfield, VT, USA; Aldershot: Hants, UK, 1994. [Google Scholar]
  189. Eysenck, M.W.; Derakshan, N.; Santos, R.; Calvo, M.G. Anxiety and cognitive performance: Attentional control theory. Emotion 2007, 7, 336–353. [Google Scholar] [CrossRef] [Green Version]
  190. Gonzalez, C. Decision support for real-time, dynamic decision-making tasks. Organ. Behav. Hum. Decis. Process. 2005, 96, 142–154. [Google Scholar] [CrossRef]
  191. Glaeser, E.L.; Laibson, D.I.; Scheinkman, J.A.; Soutter, C.L. Measuring trust. Q. J. Econ. 2000, 115, 811–846. [Google Scholar] [CrossRef] [Green Version]
  192. Gambetta, D. Can we trust? In Trust: Making Breaking Cooperative Relations; B. Blackwell: Oxford, UK, 2000; Volume 13, pp. 213–237. [Google Scholar]
  193. Lewicki, R.J.; Wiethoff, C. Trust, trust development, and trust repair. In The Handbook of Conflict Resolution: Theory Practice; Jossey-Bass: Hoboken, NJ, USA, 2006; pp. 92–119. [Google Scholar]
  194. Baddeley, A. Working memory: The interface between memory and cognition. J. Cogn. Neurosci. 1992, 4, 281–288. [Google Scholar] [CrossRef]
  195. Howard, M.W.; Rizzuto, D.S.; Caplan, J.B.; Madsen, J.R.; Lisman, J.; Aschenbrenner-Scheibe, R.; Schulze-Bonhage, A.; Kahana, M. Gamma oscillations correlate with working memory load in humans. Cereb. Cortex. 2003, 13, 1369–1374. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  196. Bouchacourt, F.; Buschman, T.J. A flexible model of working memory. Neuron 2019, 103, 147–160. [Google Scholar] [CrossRef]
  197. Kim, S.-Y.; Kim, M.-S.; Chun, M.M. Concurrent working memory load can reduce distraction. Proc. Natl. Acad. Sci. USA 2005, 102, 16524–16529. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  198. Jain, S.; Nataraja, N.P. The Effect of Fatigue on Working Memory and Auditory Perceptual Abilities in Trained Musicians. Am. J. Audiol. 2019, 28, 483–494. [Google Scholar] [CrossRef] [PubMed]
  199. Herlambang, M.B.; Cnossen, F.; Taatgen, N.A. The effects of intrinsic motivation on mental fatigue. PLoS ONE 2021, 16, e0243754. [Google Scholar] [CrossRef] [PubMed]
  200. De Fockert, J.W.; Leiser, J. Better target detection in the presence of collinear flankers under high working memory load. Front. Hum. Neurosci. 2014, 8, 821. [Google Scholar] [CrossRef] [Green Version]
  201. Judah, M.R.; Grant, D.M.; Lechner, W.V.; Mills, A.C. Working memory load moderates late attentional bias in social anxiety. Cogn. Emot. 2013, 27, 502–511. [Google Scholar] [CrossRef]
  202. Redick, T.S.; Engle, R.W. Working memory capacity and attention network test performance. Appl. Cogn. Psychol. 2006, 20, 713–721. [Google Scholar] [CrossRef]
  203. Kopp, W.; Hanson, M.A. High-fidelity and gaming simulations enhance nursing education in end-of-life care. Clin. Simul. Nurs. 2012, 8, e97–e102. [Google Scholar] [CrossRef]
  204. Van der Linden, D. The Urge to Stop: The Cognitive and Biological Nature of Acute Mental Fatigue; Ackerman, P.L., Ed.; American Psychological Association: Washington, DC, USA, 2011; pp. 149–164. [Google Scholar] [CrossRef]
  205. Boksem, M.A.S.; Meijman, T.F.; Lorist, M.M. Mental fatigue, motivation and action monitoring. Biol. Psychol. 2006, 72, 123–132. [Google Scholar] [CrossRef]
  206. Liu, Y.; Sourina, O.; Liew, H.P.; Salem, H.S.; Ang, E. Human factors evaluation in maritime virtual simulators using mobile EEG-based neuroimaging. In Transdisciplinary Engineering: A Paradigm Shift; IOS Press: Amsterdam, The Netherlands, 2017; Volume 5, pp. 261–268. [Google Scholar]
  207. Tracey, I.; Flower, R. The warrior in the machine: Neuroscience goes to war. Nat. Rev. Neurosci. 2014, 15, 825–834. [Google Scholar] [CrossRef]
  208. Selvaraj, J.; Murugappan, M.; Wan, K.; Yaacob, S. Classification of emotional states from electrocardiogram signals: A non-linear approach based on Hurst. Biomed. Eng. Online 2013, 12, 44. [Google Scholar] [CrossRef] [Green Version]
  209. Brás, S.; Ferreira, J.H.T.; Soares, S.C.; Pinho, A.J. Biometric and Emotion Identification: An ECG Compression Based Method. Front. Psychol. 2018, 9. [Google Scholar] [CrossRef] [Green Version]
  210. Hu, X.; Lodewijks, G. Detecting fatigue in car drivers and aircraft pilots by using non-invasive measures: The value of differentiation of sleepiness and mental fatigue. J. Safety Res. 2020, 72, 173–187. [Google Scholar] [CrossRef]
  211. Li, W.; Mo, R.; Yu, S.; Chu, J.; Hu, Y.; Wang, L. The effects of the seat cushion contour and the sitting posture on surface pressure distribution and comfort during seated work. Int. J. Occup. Med. Environ. Health 2020, 33, 675–689. [Google Scholar] [CrossRef]
  212. Leban, B.; Arippa, F.; Fancello, G.; Fadda, P.; Pau, M. Analysis of discomfort during a 4-hour shift in quay crane operators objectively assessed through in-chair movements. In Proceedings of the Congress of the International Ergonomics Association, Florence, Italy, 31 August–1 September 2018; pp. 90–100. [Google Scholar] [CrossRef]
  213. Tekscan. Seated & Body Pressure Measurement. Available online: https://biosensemedical.com/seated-body-pressure-measurement/ (accessed on 4 January 2021).
  214. Wang, Y.; Xing, L.-F.; Huang, Y.-Q. Polyurethane Foam Performances’ Influence on Body Pressure Distribution on an Automotive Seat. In Proceedings of the New Energy & Intelligent Connected Vehicle Technology Conference, Shanghai, China, 21–22 May 2019. [Google Scholar] [CrossRef]
  215. Jones, M.L.H.; Park, J.; Ebert-Hamilton, S.; Kim, K.H.; Reed, M.P. Effects of Seat and Sitter Dimensions on Pressure Distribution in Automotive Seats. SAE Tech. Pap. 2017. [Google Scholar] [CrossRef]
  216. Naddeo, A.; Di Brigida, L.; Fontana, C.; Montese, J.; Quartuccia, M.; Nasti, M.; Pisani, M.M.; Turco, V.; De Stefano, M.; Fiorillo, I. A body-shaped lumbar-sacral support for improving car-seat comfort. Work 2019, 68, 1–10. [Google Scholar]
  217. Robertson, D.G.E. Research Methods in Biomechanics; Human Kinetics: Champaign, IL, USA, 2004. [Google Scholar]
  218. De Luca, C. Electromyography. In Encyclopedia of Medical Devices and Instrumentation; Webster, J.G., Ed.; John Wiley & Sons: Hoboken, NJ, USA, 2006. [Google Scholar] [CrossRef]
  219. Lee, K. Augmented Reality in Education and Training. TechTrends 2012, 56, 13–21. [Google Scholar] [CrossRef]
  220. Çolak, O.; Yünlü, L. A review on augmented reality and virtual reality in engineering education. J. Educ. Instr. Stud. World 2018, 8, 1–8. [Google Scholar]
  221. Parong, J.; Mayer, R.E. Learning science in immersive virtual reality. J. Educ. Psychol. 2018, 110, 785. [Google Scholar] [CrossRef]
  222. Li, L.; Yu, F.; Shi, D.; Shi, J.; Tian, Z.; Yang, J.; Wang, X.; Jiang, Q. Application of virtual reality technology in clinical medicine. Am. J. Transl. Res. 2017, 9, 3867. [Google Scholar]
  223. Wang, P.; Wu, P.; Wang, J.; Chi, H.-L.; Wang, X. A critical review of the use of virtual reality in construction engineering education and training. Int. J. Environ. Res. Public Health 2018, 15, 1204. [Google Scholar] [CrossRef] [Green Version]
  224. Salah, B.; Abidi, M.H.; Mian, S.H.; Krid, M.; Alkhalefah, H.; Abdo, A. Virtual reality-based engineering education to enhance manufacturing sustainability in industry 4.0. Sustainability 2019, 11, 1477. [Google Scholar] [CrossRef] [Green Version]
  225. Van der Kruk, E.; Reijne, M.M. Accuracy of human motion capture systems for sport applications; state-of-the-art review. Eur. J. Sport Sci. 2018, 18, 806–819. [Google Scholar] [CrossRef] [PubMed]
  226. Burger, B.; Puupponen, A.; Jantunen, T. Synchronizing eye tracking and optical motion capture: How to bring them together. J. Eye Mov. Res. 2018, 11, 5. [Google Scholar]
  227. Fischer, T.; Chang, H.; Demiris, Y. Rt-gene: Real-time eye gaze estimation in natural environments. In Proceedings of the European Conference on Computer Vision (ECCV), Munich, Germany, 8–14 September 2018; pp. 334–352. [Google Scholar]
  228. Merel, J.; Tassa, Y.; Tb, D.; Srinivasan, S.; Lemmon, J.; Wang, Z.; Wayne, G.; Heess, N. Learning human behaviors from motion capture by adversarial imitation. arXiv 2017, arXiv:1707.02201. [Google Scholar]
  229. Taskiran, M.; Kahraman, N.; Erdem, C.E. Face recognition: Past, present and future (a review). Digit. Signal Process. 2020, 106, 102809. [Google Scholar] [CrossRef]
  230. Zhao, W.; Chellappa, R.; Phillips, P.; Rosenfeld, A. Face recognition: A literature survey. ACM Comput. Surv. 2003, 35, 399–458. [Google Scholar] [CrossRef]
  231. Mehta, D.; Siddiqui, M.F.H.; Javaid, A.Y. Facial emotion recognition: A survey and real-world user experiences in mixed reality. Sensors 2018, 18, 416. [Google Scholar] [CrossRef] [Green Version]
  232. Lin, H.W.; Lin, Y.-H. Face Detection Based on the Use of Eyes Tracking. In Proceedings of the International Computer Symposium, Chiayi, Taiwan, 15–17 December 2016; pp. 402–405. [Google Scholar]
  233. Ryan, C.; O’Sullivan, B.; Elrasad, A.; Cahill, A.; Lemley, J.; Kielty, P.; Posch, C.; Perot, E. Real-time face & eye tracking and blink detection using event cameras. Neural Netw. 2021. [Google Scholar] [CrossRef]
  234. Makowiecki, K.; Garrett, A.; Harvey, A.R.; Rodger, J. Low-intensity repetitive transcranial magnetic stimulation requires concurrent visual system activity to modulate visual evoked potentials in adult mice. Sci. Rep. 2018, 8, 5792. [Google Scholar] [CrossRef]
  235. Yavari, F.; Jamil, A.; Samani, M.M.; Vidor, L.P.; Nitsche, M.A. Basic and functional effects of transcranial electrical stimulation (tES)—An introduction. Neurosci. Biobehav. Rev. 2018, 85, 81–92. [Google Scholar] [CrossRef]
  236. Moscatelli, F.; Valenzano, A.; Monda, V.; Ruberto, M.; Monda, G.; Triggiani, A.I.; Monda, E.; Chieffi, S.; Villano, I.; Parisi, L. Transcranial magnetic stimulation (TMS) application in sport medicine: A brief review. Acta Méd. 2017, 33, 423. [Google Scholar]
  237. Hsu, W.-Y.; Zanto, T.P.; Anguera, J.A.; Lin, Y.-Y.; Gazzaley, A. Delayed enhancement of multitasking performance: Effects of anodal transcranial direct current stimulation on the prefrontal cortex. Cortex 2015, 69, 175–185. [Google Scholar] [CrossRef] [Green Version]
  238. Cerruti, C.; Schlaug, G. Anodal transcranial direct current stimulation of the prefrontal cortex enhances complex verbal associative thought. J. Cogn. Neurosci. 2009, 21, 1980–1987. [Google Scholar] [CrossRef] [Green Version]
  239. Waters-Metenier, S.; Husain, M.; Wiestler, T.; Diedrichsen, J. Bihemispheric transcranial direct current stimulation enhances effector-independent representations of motor synergy and sequence learning. J. Neurosci. 2014, 34, 1037–1050. [Google Scholar] [CrossRef]
  240. Ciechanski, P.; Cheng, A.; Lopushinsky, S.; Hecker, K.; Gan, L.S.; Lang, S.; Zareinia, K.; Kirton, A. Effects of transcranial direct-current stimulation on neurosurgical skill acquisition: A randomized controlled trial. World Neurosurg. 2017, 108, 876–884. [Google Scholar] [CrossRef]
  241. Nitsche, M.A.; Kuo, M.-F.; Paulus, W.; Antal, A. Transcranial direct current stimulation: Protocols and physiological mechanisms of action. In Textbook of Neuromodulation; Springer: New York, NY, USA, 2015; pp. 101–111. [Google Scholar]
  242. Van Gompel, R.P.G.; Fischer, M.H.; Murray, W.S.; Hill, R.L. Eye-movement research. An overview of current and past developments. In Eye Movements a Window on Mind and Brain; Elsevier: Amsterdam, The Netherlands, 2007; pp. 1–28. [Google Scholar] [CrossRef]
Figure 1. Search strategy and study selection.
Figure 2. The distribution of eye tracking research articles by industry type.
Figure 3. Temporal trends of eye tracking research articles in the aviation, maritime, and construction industries published between 2000 and 2020 (N = 80).
Figure 4. Temporal trends of the number of eye tracking research articles per year in the (a) maritime industry (N = 25), (b) construction industry (N = 19), and (c) aviation industry (N = 36).
Figure 6. (a) Pilot scanpath of cockpit flight instruments during a landing approach [75]; (b) Excavator operator gaze point distribution heatmap [92].
Figure 7. Example of the different types of eye tracking devices: (a) eye tracking glasses [75]; (b) headband [92]; (c) helmet-mounted [130,131]; (d) remote or table-mounted [132]; (e) tower-mounted [133,134].
Figure 8. Performance specifications of current eye tracking systems.
Figure 9. Preferred types of video-based eye tracking devices used in research for aviation, maritime, and construction applications.
Figure 10. Different applications of eye tracking studies in the aviation, maritime, and construction industries (created with RAWGraphs [104]).
Figure 11. Different applications of mobile and remote eye tracking devices in relation to the study of human aspects, industry, and integration with various technologies.
Table 1. Search strategy, custom range 2000–2020.
Database          Records Identified    Total
Google Scholar    48,760                50,777
Science Direct    2017
Duplicates        3617                  47,160
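The record counts in Table 1 follow simple PRISMA-style bookkeeping: database counts are summed, then duplicates are subtracted. A minimal sketch, using only the figures reported in the table, checks that the totals are internally consistent:

```python
# Figures taken directly from Table 1; the code only verifies
# that the reported totals are arithmetically consistent.
records = {"Google Scholar": 48760, "Science Direct": 2017}
duplicates = 3617

total_identified = sum(records.values())          # 50,777 records identified
after_deduplication = total_identified - duplicates  # 47,160 after duplicate removal

print(total_identified, after_deduplication)
```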
Table 2. Classification of the selected eye tracking studies.
Ref    Code   Year   Location          Maritime   Aviation   Construction   Cognitive   Emotional   Physiological   Tech Integration
[29]   S1     2007   Sweden
[30]   S2     2010   Germany
[31]   S3     2012   Norway
[32]   S4     2012   Sweden
[33]   S5     2013   Poland
[34]   S6     2014   Norway
[35]   S7     2014   UK
[24]   S8     2015   Italy
[36]   S9     2015   Canada
[37]   S10    2015   Singapore
[38]   S11    2015   Canada
[39]   S12    2016   Norway
[8]    S13    2016   Italy
[40]   S14    2016   Norway
[41]   S15    2017   Norway
[42]   S16    2017   USA
[43]   S17    2017   Norway
[44]   S18    2017   Norway
[45]   S19    2018   Australia
[46]   S20    2018   Norway
[47]   S21    2018   Sweden
[48]   S22    2019   Singapore
[27]   S23    2019   China
[6]    S24    2019   Norway
[49]   S25    2019   Turkey
[50]   S26    2003   USA
[51]   S27    2004   USA
[52]   S28    2005   USA
[53]   S29    2006   Sweden
[54]   S30    2007   USA
[55]   S31    2009   Germany
[56]   S32    2011   France
[57]   S33    2011   China
[58]   S34    2012   USA
[59]   S35    2012   Germany
[60]   S36    2013   USA
[61]   S37    2013   Germany
[62]   S38    2014   The Netherlands
[63]   S39    2014   UK
[64]   S40    2014   UK
[65]   S41    2014   Germany
[66]   S42    2015   USA
[67]   S43    2015   Germany
[68]   S44    2015   UK
[69]   S45    2015   Germany
[5]    S46    2016   Switzerland
[70]   S47    2016   UK
[71]   S48    2017   France
[18]   S49    2017   Hungary
[72]   S50    2018   USA
[73]   S51    2018   Germany
[74]   S52    2018   UK
[75]   S53    2019   Slovakia
[76]   S54    2019   Germany
[77]   S55    2019   Spain
[78]   S56    2019   Slovakia
[26]   S57    2019   UK
[79]   S58    2019   Switzerland
[80]   S59    2020   Switzerland
[7]    S60    2020   China
[81]   S61    2020   France
[82]   S62    2015   USA
[19]   S63    2015   USA
[83]   S64    2017   USA
[10]   S65    2016   Taiwan
[16]   S66    2016   Brazil
[4]    S67    2016   USA
[23]   S68    2016   USA
[84]   S69    2017   USA
[85]   S70    2015   USA
[86]   S71    2017   USA
[87]   S72    2018   USA
[88]   S73    2018   Japan
[89]   S74    2018   USA
[90]   S75    2018   China
[91]   S76    2019   China
[92]   S77    2019   China
[93]   S78    2019   USA
[94]   S79    2019   Germany
[95]   S80    2020   China
Table 3. Locations where eye tracking research was conducted on the aviation, maritime, and construction industries.
Continent        Location/Region    Number of Articles
South America    Brazil             1
North America    USA                20
                 Canada             2
Europe           Germany            10
                 Norway             9
                 United Kingdom     7
                 Sweden             4
                 France             3
                 Switzerland        3
                 Italy              2
                 Slovakia           2
                 Hungary            1
                 Poland             1
                 Spain              1
                 The Netherlands    1
Asia             China              7
                 Singapore          2
                 Japan              1
                 Taiwan             1
Australia        Australia          1
Middle East      Turkey             1
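The continent and country counts in Table 3 are straightforward aggregations of the per-study locations listed in Table 2. A minimal sketch of that aggregation, using only an illustrative subset of countries (the continent mapping below is an assumption for demonstration, not part of the review's method):

```python
from collections import Counter

# Illustrative country-to-continent mapping (subset only).
CONTINENT = {"USA": "North America", "Canada": "North America",
             "Germany": "Europe", "Norway": "Europe", "China": "Asia"}

# Hypothetical per-study locations, as in the Location column of Table 2.
study_locations = ["USA", "Germany", "Norway", "USA", "China", "Canada"]

by_country = Counter(study_locations)
by_continent = Counter(CONTINENT[c] for c in study_locations)

print(by_country.most_common())
print(by_continent.most_common())
```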
Table 4. Summary of eye metric characteristics.
Eye Measure                 Movement Rate    Latency       Relation to Individuals' Functional State
Fixation                    <15–100°/ms      180–300 ms    Attention, acquisition of information
Saccade                     30–700°/s        20–200 ms     Attention and visual search
Change in pupil diameter    4–7 mm/s         140 ms        Cognitive workload, information processing, fatigue
Blink                       12–15 per min    300 ms        Attention, stress, fatigue
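Eye tracking software derives the fixation and saccade metrics of Table 4 by segmenting the raw gaze stream. A minimal sketch of one common approach, velocity-threshold identification (I-VT), is shown below; the 30°/s threshold (the lower bound of the saccade velocity range in Table 4), the sample rate, and the one-dimensional gaze trace are illustrative assumptions, not values taken from the reviewed studies:

```python
# Minimal I-VT sketch: label each gaze sample as belonging to a
# fixation or a saccade based on its point-to-point angular velocity.
SACCADE_VELOCITY_THRESHOLD = 30.0  # deg/s; saccades span 30-700 deg/s (Table 4)

def classify_samples(angles_deg, sample_rate_hz):
    """Return a 'fixation'/'saccade' label for each 1-D gaze sample."""
    dt = 1.0 / sample_rate_hz
    labels = ["fixation"]  # first sample has no preceding velocity estimate
    for prev, curr in zip(angles_deg, angles_deg[1:]):
        velocity = abs(curr - prev) / dt  # angular velocity in deg/s
        labels.append("saccade" if velocity >= SACCADE_VELOCITY_THRESHOLD
                      else "fixation")
    return labels

# Illustrative trace sampled at 60 Hz: a steady fixation, a rapid
# gaze shift (saccade), then a second fixation.
trace = [0.0, 0.05, 0.1, 2.0, 4.0, 4.05, 4.1]
labels = classify_samples(trace, sample_rate_hz=60)
print(labels)
```

Production systems add further steps (noise filtering, minimum-duration checks, merging of nearby fixations), but the velocity threshold is the core of the classification.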
Table 5. Advantages, disadvantages, and ideal applications of video-based eye tracking system types [122,156].
Eye Tracking Device Type: Mobile
Ideal applications:
Aviation: real-world applications and realistic cockpit simulators.
Maritime: real-world applications and realistic bridge simulators.
Construction: real-world applications such as construction sites.
Advantages:
• Lightweight.
• Can be fitted with a head tracker.
• Provides freedom of movement, ideal for real-world environments or realistic simulators.
• Includes a scene camera that records the surrounding environment.
Disadvantages:
• Sunlight may affect the quality of the data collection.
• Gaze mapping is more challenging.
• Gaze estimates are typically less accurate than those from remote systems.
• Prone to slippage during movement, causing drift in the recorded data.
• Requires more recalibrations than remote systems.

Eye Tracking Device Type: Remote
Ideal applications:
Aviation: simplified computer-based simulators.
Maritime: simplified computer-based simulators.
Construction: computer-based simulators of excavators or cranes.
Advantages:
• Ideal for on-screen studies on PCs, laptop monitors, and simulators.
• Provides a good level of experimental control.
• High accuracy and data quality.
Disadvantages:
• Sunlight may affect the quality of the data collection.
• Experimental results cannot reflect the realistic and natural movements present in complex scenarios.

Eye Tracking Device Type: Remote with head-supporting tower
Ideal applications:
Aviation, maritime, and construction: when accuracy and saccade resolution matter most, for example, the detection of microsaccades.
Advantages:
• Minimises artifacts caused by head movements.
• Provides the highest data quality and level of experimental control.
• Ideal when accuracy and saccade resolution are the most important requirements.
• Offers a saccade resolution two to five times higher than that of remote eye trackers.
• Facilitates the calculation of the positions of stimuli on the monitor.
Disadvantages:
• Sunlight may affect the quality of the data collection.
• Further constrains the participant, preventing realistic and natural head movements.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Martinez-Marquez, D.; Pingali, S.; Panuwatwanich, K.; Stewart, R.A.; Mohamed, S. Application of Eye Tracking Technology in Aviation, Maritime, and Construction Industries: A Systematic Review. Sensors 2021, 21, 4289. https://doi.org/10.3390/s21134289
