
Wearable Technologies and Applications for Eye Tracking

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Biomedical Sensors".

Deadline for manuscript submissions: closed (15 December 2022) | Viewed by 32133

Special Issue Editors


Guest Editor
Goldsmiths, University of London, UK.
Interests: wearable computing; body-worn sensing; eye tracking; human activity recognition; social neuroscience

Guest Editor
Keio Graduate School of Media Design (KMD), Keio University, Japan.
Interests: wearable computing; eyewear computing; amplifying human sensing; human enhancement

Special Issue Information

Of all the human sensing mechanisms, the eyes are unique in that they both perceive and emit signals—they allow us to see the world around us, but also signal to that world through distinctive movements, gaze, and subtle changes such as pupil dilation. By harnessing these signals, sensor-based eye tracking can be a useful tool in such fields as psychology, health, and user-interface design, revealing valuable information on hitherto difficult-to-study human behaviours and cognitive processes. Wearable and head-worn eye-trackers have advanced the potential of this work even further, offering the opportunity to move outside of the laboratory setting and into the real world. This has opened up new avenues for eye-tracker technology as a tool to study real-world social and cognitive behaviours, and as the basis of new commercial applications in virtual reality (VR), wearable health monitoring, and driver-assistive technology, among others.

There remain significant challenges to be solved in developing eye-tracking technology, particularly when striking the balance between trackers that can be worn unobtrusively yet are also sufficiently robust at capturing data for use in real-world applications. Equally, there remain challenges in our understanding of how we might interpret captured eye signals, from analysis of gaze patterns to uncovering of social behaviours. 

In this Special Issue, we invite submissions from a wide range of fields related to the development and application of wearable eye-tracking technology: from development of body-worn sensors, new signal processing algorithms, machine learning analysis of captured signals, and novel application design to basic research into the cognitive and social processes that might be studied using wearable eye trackers. 

Dr. Jamie A Ward
Prof. Kai Kunze
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • eye tracking
  • sensors
  • gaze
  • vision
  • EOG
  • pupil
  • wearable
  • body-worn
  • cognitive
  • social signals
  • psychology
  • HCI

Published Papers (8 papers)


Research

9 pages, 1209 KiB  
Article
Improving Performance of the Human Pupil Orbit Model (HPOM) Estimation Method for Eye-Gaze Tracking
by Seungbong Lee, Jaehoon Jeong, Nahyun Kim, Manjae Shin and Sungmin Kim
Sensors 2022, 22(23), 9398; https://doi.org/10.3390/s22239398 - 02 Dec 2022
Viewed by 1414
Abstract
Eye-gaze direction-tracking technology is used in fields such as medicine, education, engineering, and gaming. Stability, accuracy, and precision of eye-gaze direction tracking are demanded alongside improvements in response speed. In this study, a method is proposed to improve speed while reducing the system load and preserving precision in the human pupil orbit model (HPOM) estimation method. The proposed method builds on the phenomenon, reported in various eye-gaze direction detection studies and HPOM estimation methods, that the minor axis of the elliptically deformed pupil always points toward the rotational center. Simulation experiments confirmed that speed was improved by at least 74 times, consuming less than 7 ms, compared with HPOM estimation. The accuracy of the eye's ocular rotational center point showed a maximum error of approximately 0.2 pixels on the x-axis and approximately 8 pixels on the y-axis. The precision of the proposed method was 0.0 pixels when the number of estimation samples (ES) was 7 or fewer, consistent with the results of HPOM estimation studies. However, the proposed method was judged to work conservatively with respect to the allowable angle error (AAE), considering that the experiment was conducted under worst-case conditions and the cost used to estimate the final model. Therefore, the proposed method can estimate the HPOM with high accuracy and precision through AAE adjustment according to system performance and the usage environment.
(This article belongs to the Special Issue Wearable Technologies and Applications for Eye Tracking)
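The minor-axis property described in the abstract lends itself to a simple geometric sketch: if each elliptical pupil fit contributes a line through its center along its minor axis, the rotational center can be recovered as the least-squares intersection of those lines. The following is a minimal 2D illustration of that idea, not the authors' implementation; the function name and the simplified setup are assumptions.

```python
import numpy as np

def estimate_rotation_center(centers, minor_axis_dirs):
    """Least-squares intersection of the pupil-ellipse minor-axis lines.

    centers: (N, 2) array of ellipse centers in image coordinates.
    minor_axis_dirs: (N, 2) array of vectors along each ellipse's minor axis.
    Returns the 2D point minimizing the summed squared distance to all N lines.
    """
    A = np.zeros((2, 2))
    b = np.zeros(2)
    for p, d in zip(centers, minor_axis_dirs):
        d = d / np.linalg.norm(d)
        # Projector onto the line's normal: distance from point c to the line
        # through p with direction d is ||P (c - p)|| with P = I - d d^T.
        P = np.eye(2) - np.outer(d, d)
        A += P
        b += P @ p
    return np.linalg.solve(A, b)
```

For example, two lines through (5, 4) along the x-axis and through (3, 7) along the y-axis intersect at (3, 4), which the function recovers exactly.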

18 pages, 5118 KiB  
Article
An Investigation on the Influence of Operation Experience on Virtual Hazard Perception Using Wearable Eye Tracking Technology
by Siyu Li, Yongqing Jiang, Chao Sun, Kangkang Guo and Xin Wang
Sensors 2022, 22(14), 5115; https://doi.org/10.3390/s22145115 - 07 Jul 2022
Cited by 2 | Viewed by 1495
Abstract
Poor electrical hazard recognition is a widespread issue in the production industry. Hazard perception affects workers' hazard recognition, causing them to experience unanticipated hazard exposure and suffer catastrophic injuries. To investigate the factors affecting hazard perception, the current study treated hazard recognition as an everyday visual search task. A comparative test combining the advantages and disadvantages of the two test methods confirmed that virtual image test data can substitute for real image test data while offering more flexible settings, so the virtual image test method was adopted. A hazard perception test method based on wearable eye-tracking technology was proposed, and eye-tracking data (i.e., fixation count, search duration, mean fixation duration, gaze track, and hazard recognition performance feedback) were compared between experts in the field of electrical safety (skilled workers with at least five years of work experience) and workers who had been on the job for less than a year. Experts showed better hazard recognition accuracy and a lower missed detection rate than other workers; their search tracks were more concise and required less attention time, an advantage that was most obvious in complex risk environments. The findings also suggest that workers with different years of experience showed no obvious differences in visual search patterns other than search duration, so work experience is not an absolute factor in improving hazard perception. The present research will be useful for understanding the influence of working years on hazard perception and provides a theoretical basis for corresponding training.
(This article belongs to the Special Issue Wearable Technologies and Applications for Eye Tracking)

22 pages, 4564 KiB  
Article
Exploring Eye Movement Biometrics in Real-World Activities: A Case Study of Wayfinding
by Hua Liao, Wendi Zhao, Changbo Zhang and Weihua Dong
Sensors 2022, 22(8), 2949; https://doi.org/10.3390/s22082949 - 12 Apr 2022
Cited by 3 | Viewed by 2509
Abstract
Eye movement biometrics can enable continuous verification for highly secure environments such as financial transactions and defense establishments, as well as a more personalized and tailored experience in gaze-based human–computer interactions. However, there are numerous challenges to recognizing people in real environments using eye movements, such as implicitness and stimulus independence. Taking wayfinding as a case study, this research investigates implicit and stimulus-independent eye movement biometrics in real-world situations. We collected 39 subjects' eye movement data from real-world wayfinding experiments and derived five sets of eye movement features (basic statistical, pupillary response, fixation density, fixation semantic, and saccade encoding features). We adopted a random forest and performed biometric recognition for both identification and verification scenarios. The best accuracy we obtained in the identification scenario was 78% (equal error rate, EER = 6.3%) with the 10-fold classification and 64% (EER = 12.1%) with the leave-one-route-out classification. The best accuracy we achieved in the verification scenario was 89% (EER = 9.1%). Additionally, we tested performance across the 5 feature sets and 20 time window sizes. The results showed that verification accuracy was insensitive to increases in the time window size. These findings are the first indication of the viability of performing implicit and stimulus-independent biometric recognition in real-world settings using wearable eye tracking.
(This article belongs to the Special Issue Wearable Technologies and Applications for Eye Tracking)
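The identification pipeline the abstract describes, per-window eye movement feature vectors classified with a random forest under k-fold cross-validation, can be sketched as follows. The data here are synthetic stand-ins for the study's five gaze-derived feature sets, and scikit-learn is an assumed tool choice, not necessarily the authors'.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in data: each "subject" gets a characteristic offset in
# feature space (the real study derived gaze features per time window).
rng = np.random.default_rng(0)
n_subjects, windows_per_subject, n_features = 5, 40, 12
X = np.vstack([
    rng.normal(loc=2.0 * s, scale=1.0, size=(windows_per_subject, n_features))
    for s in range(n_subjects)
])
y = np.repeat(np.arange(n_subjects), windows_per_subject)

# Identification scenario: 10-fold cross-validated subject classification.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=10)
print(f"mean 10-fold identification accuracy: {scores.mean():.2f}")
```

The paper's leave-one-route-out evaluation would replace `cv=10` with a grouped splitter (e.g. `GroupKFold` keyed on route), which tests generalization across stimuli rather than across random windows.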

16 pages, 806 KiB  
Article
Eye Movement Alterations in Post-COVID-19 Condition: A Proof-of-Concept Study
by Cecilia García Cena, Mariana Campos Costa, Roque Saltarén Pazmiño, Cristina Peixoto Santos, David Gómez-Andrés and Julián Benito-León
Sensors 2022, 22(4), 1481; https://doi.org/10.3390/s22041481 - 14 Feb 2022
Cited by 9 | Viewed by 2590
Abstract
There is much evidence pointing to eye movement alterations in several neurological diseases. To the best of our knowledge, this is the first video-oculography study describing potential alterations of eye movements in the post-COVID-19 condition. Visually guided saccades, memory-guided saccades, and antisaccades in the horizontal axis were measured. In all visual tests, the stimulus was deployed with a gap condition. The duration of the test was between 5 and 7 min per participant. A group of n = 9 patients with the post-COVID-19 condition was included in this study. Values were compared with those of a group (n = 9) of healthy volunteers who had not been infected by the SARS-CoV-2 virus. Features such as centripetal and centrifugal latencies, success rates in memory saccades and antisaccades, and blinks were computed. We found that patients with the post-COVID-19 condition had eye movement alterations mainly in centripetal latency in visually guided saccades, the success rate in the memory-guided saccade test, and latency in antisaccades and its standard deviation, which suggests the involvement of frontoparietal networks. Further work is required to understand these eye movement alterations and their functional consequences.
(This article belongs to the Special Issue Wearable Technologies and Applications for Eye Tracking)

15 pages, 3259 KiB  
Article
Assessing Perceptual Load and Cognitive Load by Fixation-Related Information of Eye Movements
by Jung-Chun Liu, Kuei-An Li, Su-Ling Yeh and Shao-Yi Chien
Sensors 2022, 22(3), 1187; https://doi.org/10.3390/s22031187 - 04 Feb 2022
Cited by 18 | Viewed by 2963
Abstract
Assessing mental workload is imperative for avoiding unintended negative consequences in critical situations such as driving and piloting. To evaluate mental workload, measures of eye movements have been adopted, but unequivocal results remain elusive, especially those related to fixation-related parameters. We aimed to resolve the discrepancy among previous results by differentiating two kinds of mental workload (perceptual load and cognitive load) and manipulating them independently using a modified video game. We found opposite effects of the two kinds of mental workload on fixation-related parameters: shorter fixation durations and more fixations when participants played an episode with high (vs. low) perceptual load, and longer fixation durations and fewer fixations when they played an episode with high (vs. low) cognitive load. Such opposite effects were in line with the load theory and demonstrated that fixation-related parameters can be used to index mental workload at different (perceptual and cognitive) stages of mental processing.
(This article belongs to the Special Issue Wearable Technologies and Applications for Eye Tracking)
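The two fixation-related parameters this study compares across load conditions, fixation count and mean fixation duration, are straightforward to compute once fixations have been detected. A minimal sketch (the `(start, end)` tuple format is an assumption, not the paper's data layout):

```python
def fixation_metrics(fixations):
    """Summarize one episode's fixations.

    fixations: list of (start_s, end_s) tuples, one per detected fixation.
    Returns fixation count and mean fixation duration in seconds.
    """
    durations = [end - start for start, end in fixations]
    return {
        "count": len(durations),
        "mean_duration_s": sum(durations) / len(durations) if durations else 0.0,
    }
```

Under the paper's finding, a high-perceptual-load episode would yield a higher `count` and lower `mean_duration_s` than its low-load counterpart, and a high-cognitive-load episode the reverse.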

21 pages, 1481 KiB  
Article
Automatic Visual Attention Detection for Mobile Eye Tracking Using Pre-Trained Computer Vision Models and Human Gaze
by Michael Barz and Daniel Sonntag
Sensors 2021, 21(12), 4143; https://doi.org/10.3390/s21124143 - 16 Jun 2021
Cited by 12 | Viewed by 5521
Abstract
Processing visual stimuli in a scene is essential for the human brain to make situation-aware decisions. These stimuli, which are prevalent subjects of diagnostic eye tracking studies, are commonly encoded as rectangular areas of interest (AOIs) per frame. Because it is a tedious manual annotation task, the automatic detection and annotation of visual attention to AOIs can accelerate and objectify eye tracking research, in particular for mobile eye tracking with egocentric video feeds. In this work, we implement two methods to automatically detect visual attention to AOIs using pre-trained deep learning models for image classification and object detection. Furthermore, we develop an evaluation framework based on the VISUS dataset and well-known performance metrics from the field of activity recognition. We systematically evaluate our methods within this framework, discuss potentials and limitations, and propose ways to improve the performance of future automatic visual attention detection methods.
(This article belongs to the Special Issue Wearable Technologies and Applications for Eye Tracking)

18 pages, 2185 KiB  
Article
ARETT: Augmented Reality Eye Tracking Toolkit for Head Mounted Displays
by Sebastian Kapp, Michael Barz, Sergey Mukhametov, Daniel Sonntag and Jochen Kuhn
Sensors 2021, 21(6), 2234; https://doi.org/10.3390/s21062234 - 23 Mar 2021
Cited by 51 | Viewed by 8335
Abstract
Currently, an increasing number of head-mounted displays (HMDs) for virtual and augmented reality (VR/AR) are equipped with integrated eye trackers. Use cases of these integrated eye trackers include rendering optimization and gaze-based user interaction. In addition, visual attention in VR and AR is interesting for applied eye-tracking research in the cognitive or educational sciences, for example. While some research toolkits for VR already exist, only a few target AR scenarios. In this work, we present an open-source eye tracking toolkit for reliable gaze data acquisition in AR based on Unity 3D and the Microsoft HoloLens 2, as well as an R package for seamless data analysis. Furthermore, we evaluate the spatial accuracy and precision of the integrated eye tracker for fixation targets at different distances and angles to the user (n = 21). On average, we found that gaze estimates are reported with an angular accuracy of 0.83 degrees and a precision of 0.27 degrees while the user is resting, which is on par with state-of-the-art mobile eye trackers.
(This article belongs to the Special Issue Wearable Technologies and Applications for Eye Tracking)
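Angular accuracy and precision figures of the kind reported above can be computed from unit gaze vectors and a known target direction. The sketch below uses common definitions (mean angular offset from the target for accuracy, RMS of sample-to-sample angular change for precision); ARETT's exact formulas may differ, and the function names are assumptions.

```python
import numpy as np

def angular_error_deg(gaze_dirs, ref_dir):
    """Angle in degrees between each row of gaze_dirs (N, 3) and ref_dir (3,)."""
    g = gaze_dirs / np.linalg.norm(gaze_dirs, axis=1, keepdims=True)
    t = ref_dir / np.linalg.norm(ref_dir)
    cosines = np.clip(g @ t, -1.0, 1.0)  # clip guards against rounding
    return np.degrees(np.arccos(cosines))

def accuracy_precision(gaze_dirs, target_dir):
    """Accuracy: mean offset from the target.
    Precision: RMS of the angular change between successive samples."""
    accuracy = angular_error_deg(gaze_dirs, target_dir).mean()
    inter = np.array([
        angular_error_deg(gaze_dirs[i + 1:i + 2], gaze_dirs[i])[0]
        for i in range(len(gaze_dirs) - 1)
    ])
    precision = np.sqrt((inter ** 2).mean())
    return accuracy, precision
```

For instance, a perfectly stable gaze offset 1 degree from the target yields an accuracy of 1 degree and a precision of 0.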

15 pages, 1885 KiB  
Article
Development and Calibration of an Eye-Tracking Fixation Identification Algorithm for Immersive Virtual Reality
by Jose Llanes-Jurado, Javier Marín-Morales, Jaime Guixeres and Mariano Alcañiz
Sensors 2020, 20(17), 4956; https://doi.org/10.3390/s20174956 - 01 Sep 2020
Cited by 31 | Viewed by 5626
Abstract
Fixation identification is an essential task in the extraction of relevant information from gaze patterns; various algorithms are used in the identification process. However, the thresholds used in the algorithms greatly affect their sensitivity. Moreover, the application of these algorithms to eye-tracking technologies integrated into head-mounted displays, where the subject's head position is unrestricted, is still an open issue. Therefore, the adaptation of eye-tracking algorithms and their thresholds to immersive virtual reality frameworks needs to be validated. This study presents the development of a dispersion-threshold identification algorithm applied to data obtained from an eye-tracking system integrated into a head-mounted display. Rule-based criteria are proposed to calibrate the thresholds of the algorithm through different features, such as the number of fixations and the percentage of points which belong to a fixation. The results show that distance-dispersion thresholds between 1° and 1.6° and time windows between 0.25 and 0.4 s are the acceptable parameter ranges, with 1° and 0.25 s being the optimum. The work presents a calibrated algorithm to be applied in future experiments with eye tracking integrated into head-mounted displays, along with guidelines for calibrating fixation identification algorithms.
(This article belongs to the Special Issue Wearable Technologies and Applications for Eye Tracking)
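A dispersion-threshold (I-DT) identifier of the kind calibrated in this paper can be sketched in a few lines. Gaze positions are assumed to be in degrees of visual angle, and the dispersion measure (horizontal plus vertical extent) and window-growing logic follow the classic Salvucci and Goldberg formulation, not necessarily this paper's exact variant.

```python
def dispersion(pts):
    """Horizontal extent plus vertical extent of a set of (x, y) points."""
    xs, ys = zip(*pts)
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def idt_fixations(points, times, max_disp=1.0, min_dur=0.25):
    """I-DT fixation identification.

    points: list of (x, y) gaze positions in degrees of visual angle.
    times: matching timestamps in seconds.
    Returns a list of (start_time, end_time, centroid) fixations.
    """
    fixations, i, n = [], 0, len(points)
    while i < n:
        # Grow an initial window spanning at least the minimum duration.
        j = i
        while j < n and times[j] - times[i] < min_dur:
            j += 1
        if j >= n:
            break  # remaining samples cannot span min_dur
        if dispersion(points[i:j + 1]) <= max_disp:
            # Extend the window until the dispersion threshold is exceeded.
            while j + 1 < n and dispersion(points[i:j + 2]) <= max_disp:
                j += 1
            window = points[i:j + 1]
            cx = sum(p[0] for p in window) / len(window)
            cy = sum(p[1] for p in window) / len(window)
            fixations.append((times[i], times[j], (cx, cy)))
            i = j + 1
        else:
            i += 1  # slide past the first point and retry
    return fixations
```

The paper's calibration question then becomes a parameter sweep over `max_disp` (1° to 1.6°) and `min_dur` (0.25 to 0.4 s), scoring each setting by features such as the number of fixations and the fraction of samples assigned to one.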
