Journal of Eye Movement Research is published by MDPI from Volume 18 Issue 1 (2025). Previous articles were published by another publisher in Open Access under a CC-BY (or CC-BY-NC-ND) licence, and they are hosted by MDPI on mdpi.com as a courtesy and upon agreement with Bern Open Publishing (BOP).

J. Eye Mov. Res., Volume 7, Issue 3 (April 2014) – 5 articles

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • Papers are published in both HTML and PDF forms; the PDF is the official version. To view a paper in PDF format, click the "PDF Full-text" link and open it with the free Adobe Reader.
9 pages, 545 KiB  
Article
Image Preference Estimation with a Data-Driven Approach: A Comparative Study Between Gaze and Image Features
by Yusuke Sugano, Yasunori Ozaki, Hiroshi Kasai, Keisuke Ogaki and Yoichi Sato
J. Eye Mov. Res. 2014, 7(3), 1-9; https://doi.org/10.16910/jemr.7.3.5 - 18 Apr 2014
Cited by 10 | Viewed by 51
Abstract
Understanding how humans subjectively look at and evaluate images is an important task for various applications in multimedia interaction. While it has been repeatedly pointed out that eye movements can be used to infer the internal states of humans, few successes have been reported concerning image understanding. In this paper, we investigate the possibility of supervised image preference estimation based on a person's eye movements. A dataset of eye movements is collected while participants view pairs of natural images, and it is used to train image preference label classifiers. The input feature is defined as a combination of various fixation and saccade event statistics, and the use of the random forest algorithm allows us to quantitatively assess how each statistic contributes to the classification task. We show that the gaze-based classifier achieved higher accuracy than metadata-based baselines and a simple rule-based classifier throughout the experiments. We also present a quantitative comparison with image-based preference classifiers and discuss the potential and limitations of the gaze-based preference estimator.
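The feature construction described above (statistics over fixation and saccade events, fed to a classifier) can be sketched in a few lines of Python. This is an illustrative reconstruction, not the authors' code; the event representation and feature names are assumptions:

```python
from math import hypot
from statistics import mean

def gaze_feature_vector(fixations, saccades):
    """Build a simple feature vector from gaze event statistics.

    fixations: list of (duration_ms, x, y) tuples
    saccades:  list of (x0, y0, x1, y1) tuples
    """
    durations = [f[0] for f in fixations]
    # Saccade amplitude = Euclidean distance between start and end points
    amplitudes = [hypot(s[2] - s[0], s[3] - s[1]) for s in saccades]
    return {
        "n_fixations": len(fixations),
        "mean_fixation_duration": mean(durations) if durations else 0.0,
        "max_fixation_duration": max(durations) if durations else 0.0,
        "n_saccades": len(saccades),
        "mean_saccade_amplitude": mean(amplitudes) if amplitudes else 0.0,
    }

# Hypothetical recording: three fixations and the saccades between them
feats = gaze_feature_vector(
    fixations=[(250, 10, 20), (400, 15, 25), (180, 50, 60)],
    saccades=[(10, 20, 15, 25), (15, 25, 50, 60)],
)
```

A vector like this would then be fed to a random forest, whose per-feature importance scores indicate how much each statistic contributes to the preference classification.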

9 pages, 11476 KiB  
Article
Estimation of a Focused Object Using a Corneal Surface Image for Eye-Based Interaction
by Kentaro Takemura, Tomohisa Yamakawa, Jun Takamatsu and Tsukasa Ogasawara
J. Eye Mov. Res. 2014, 7(3), 1-9; https://doi.org/10.16910/jemr.7.3.4 - 27 Mar 2014
Cited by 11 | Viewed by 49
Abstract
Researchers are considering the use of eye tracking in head-mounted camera systems such as Google's Project Glass. Typical methods require detailed calibration in advance, but long periods of use disrupt the calibration between the eye and the scene camera. In addition, the focused object might not be estimated even if the point-of-regard is estimated using a portable eye-tracker. Therefore, we propose a novel method for estimating the object a user is focused upon, in which an eye camera captures the reflection on the corneal surface. Eye and environment information can be extracted from the corneal surface image simultaneously. We use inverse ray tracing to rectify the reflected image and a scale-invariant feature transform to estimate the object where the point-of-regard is located. Unwarped images can also be generated continuously from corneal surface images. We believe the proposed method could be applied to a guidance system, and we confirmed the feasibility of this application in experiments estimating the focused object and the point-of-regard.

8 pages, 3607 KiB  
Article
Evaluation of Accurate Eye Corner Detection Methods for Gaze Estimation
by Jose Javier Bengoechea, Juan J. Cerrolaza, Arantxa Villanueva and Rafael Cabeza
J. Eye Mov. Res. 2014, 7(3), 1-8; https://doi.org/10.16910/jemr.7.3.3 - 27 Mar 2014
Cited by 8 | Viewed by 62
Abstract
Accurate detection of the iris center and eye corners appears to be a promising approach for low-cost gaze estimation. In this paper we propose novel eye inner-corner detection methods, suggesting both appearance- and feature-based segmentation approaches. All methods are exhaustively tested on a realistic dataset containing images of subjects gazing at different points on a screen. We demonstrate that a method based on a neural network gives the best performance, even under changing lighting conditions. In addition to this method, algorithms based on AAM and the Harris corner detector achieve better accuracy than recent high-performance face-point tracking methods such as Intraface.
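The evaluation protocol described above reduces to scoring each detector's predicted corner locations against hand-labelled ground truth. A minimal sketch of that comparison, with made-up coordinates and method names rather than the paper's data:

```python
from math import hypot
from statistics import mean

def corner_errors(predicted, ground_truth):
    """Per-image Euclidean pixel error between predicted and true corners."""
    return [hypot(px - gx, py - gy)
            for (px, py), (gx, gy) in zip(predicted, ground_truth)]

# Hypothetical ground-truth inner-corner positions (pixels) for three images
truth = [(120, 80), (122, 81), (119, 79)]

# Hypothetical predictions from two detection methods
by_method = {
    "neural_net": [(121, 80), (122, 82), (118, 79)],
    "harris":     [(124, 83), (125, 84), (116, 76)],
}

# Rank the methods by mean pixel error
scores = {name: mean(corner_errors(pred, truth))
          for name, pred in by_method.items()}
best = min(scores, key=scores.get)
```

With per-image errors in hand, the same loop extends naturally to per-lighting-condition breakdowns, which is how robustness to changing illumination would be assessed.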

10 pages, 1142 KiB  
Article
A Cheap Portable Eye-Tracker Solution for Common Setups
by Onur Ferhat, Fernando Vilariño and Francisco Javier Sánchez
J. Eye Mov. Res. 2014, 7(3), 1-10; https://doi.org/10.16910/jemr.7.3.2 - 27 Mar 2014
Cited by 10 | Viewed by 64
Abstract
We analyze the feasibility of a cheap eye-tracker whose hardware consists of a single webcam and a Raspberry Pi device. Our aim is to discover the limits of such a system and to see whether it provides acceptable performance. We base our work on the open-source Opengazer (Zielinski, 2013) and propose several improvements to create a robust, real-time system that runs on a computer at a 30 Hz sampling rate. After assessing the accuracy of our eye-tracker in extensive experiments involving 12 subjects under 4 different system setups, we install it on a Raspberry Pi to create a portable stand-alone eye-tracker that achieves 1.42° horizontal accuracy at a 3 Hz refresh rate for a build cost of €70.
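Accuracy figures such as the 1.42° above are obtained by converting on-screen gaze error into degrees of visual angle using the screen geometry and viewing distance. A small sketch of that conversion; the monitor width, resolution, and distance below are assumed values, not the paper's actual setup:

```python
from math import atan2, degrees

def pixel_error_to_degrees(err_px, screen_w_px, screen_w_cm, viewing_dist_cm):
    """Convert an on-screen gaze error in pixels to degrees of visual angle."""
    # Pixels -> centimeters on the screen surface
    err_cm = err_px * (screen_w_cm / screen_w_px)
    # Angle subtended at the eye, assuming the error is centered on the gaze axis
    return degrees(atan2(err_cm, viewing_dist_cm))

# Assumed setup: 1280 px wide display, 34 cm physical width, viewed from 60 cm
err_deg = pixel_error_to_degrees(56, 1280, 34.0, 60.0)
```

Under these assumed numbers, a 56-pixel error comes out to roughly 1.4° of visual angle; with the actual experimental geometry, the same formula maps measured pixel errors to the reported accuracy unit.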

2 pages, 91 KiB  
Article
Introduction to the PETMEI Special Issue
by Andreas Bulling and Roman Bednarik
J. Eye Mov. Res. 2014, 7(3), 1-2; https://doi.org/10.16910/jemr.7.3.1 - 27 Mar 2014
Viewed by 36
Abstract
The latest developments in remote and head-mounted eye tracking and automated eye movement analysis point the way toward unobtrusive eye-based human-computer interfaces that will become pervasively usable in everyday life. We call this new paradigm pervasive eye tracking: continuous eye monitoring and analysis 24/7. Pervasive Eye Tracking and Mobile Eye-Based Interaction (PETMEI) is a workshop series that revolves around the theme of pervasive eye tracking as a trailblazer for pervasive eye-based human-computer interaction and eye-based context-awareness. This special issue is composed of extended versions of the top-scoring papers from the 3rd workshop in the PETMEI series, held in 2013.