Journal of Eye Movement Research is published by MDPI from Volume 18 Issue 1 (2025). Previous articles were published by another publisher in Open Access under a CC-BY (or CC-BY-NC-ND) licence, and they are hosted by MDPI on mdpi.com as a courtesy and upon agreement with Bern Open Publishing (BOP).

J. Eye Mov. Res., Volume 2, Issue 4 (November 2008) – 7 articles

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • Papers are published in both HTML and PDF formats; PDF is the official version. To view a paper in PDF format, click the "PDF Full-text" link and open it with the free Adobe Reader.
8 pages, 1175 KiB  
Article
Eye Tracking Analysis in Reading Online Newspapers
by Daniela Zambarbieri, Elena Carniglia and Carlo Robino
J. Eye Mov. Res. 2008, 2(4), 1-8; https://doi.org/10.16910/jemr.2.4.7 - 26 Nov 2008
Cited by 11 | Viewed by 53
Abstract
Reading newspapers online is becoming increasingly common, and thousands of newspapers are published online today. Despite this development, many questions remain unanswered concerning subjects’ behaviour when reading an online newspaper. Recording eye movements while a subject navigates within a news website can provide quantitative and objective information on the subject’s behaviour and, combined with other methodologies (usability testing, focus groups, log analysis), represents a powerful tool for improving the functionality of news websites and, consequently, their success among readers. Two Italian newspaper websites were considered for the study described in this paper, and the analysis focused on exploration behaviour within the newspaper home page and the reading of article pages. Full article

8 pages, 229 KiB  
Article
Eye Typing in Application: A Comparison of Two Systems with ALS Patients
by Sebastian Pannasch, Jens R. Helmert, Susann Malischke, Alexander Storch and Boris M. Velichkovsky
J. Eye Mov. Res. 2008, 2(4), 1-8; https://doi.org/10.16910/jemr.2.4.6 - 26 Nov 2008
Cited by 9 | Viewed by 64
Abstract
A variety of eye typing systems have been developed over the last decades. Such systems can provide support for people who have lost the ability to communicate, e.g., patients suffering from motor neuron diseases such as amyotrophic lateral sclerosis (ALS). In the current retrospective analysis, two eye typing applications (EyeGaze, GazeTalk) were tested by ALS patients (N = 4) in order to analyze objective performance measures and subjective ratings. An advantage of the EyeGaze system was found for most of the evaluated criteria. The results are discussed with respect to the special target population and in relation to the requirements of eye tracking devices. Full article

18 pages, 286 KiB  
Article
Gaze Path Stimulation in Retrospective Think-Aloud
by Aulikki Hyrskykari, Saila Ovaska, Päivi Majaranta, Kari-Jouko Räihä and Merja Lehtinen
J. Eye Mov. Res. 2008, 2(4), 1-18; https://doi.org/10.16910/jemr.2.4.5 - 26 Nov 2008
Cited by 50 | Viewed by 60
Abstract
For a long time, eye tracking has been considered a promising method for usability testing. During the last couple of years, it has finally started to live up to these expectations, at least in terms of its use in usability laboratories. We know that the user’s gaze path can reveal usability issues that would otherwise go unnoticed, but a common understanding of how best to make use of eye movement data has not been reached. Many usability practitioners seem to have intuitively started to use gaze path replays to stimulate recall in retrospective walkthroughs of the usability test. We review the research on think-aloud protocols in usability testing and the use of eye tracking in the context of usability evaluation. We also report our own experiment, in which we compared the standard, concurrent think-aloud method with the gaze-path-stimulated retrospective think-aloud method. Our results suggest that the gaze-path-stimulated retrospective think-aloud method produces more verbal data, and that the data are more informative and of better quality because the drawbacks of concurrent think-aloud are avoided. Full article
7 pages, 142 KiB  
Article
On Object Selection in Gaze Controlled Environments
by Anke Huckauf and Mario H. Urbina
J. Eye Mov. Res. 2008, 2(4), 1-7; https://doi.org/10.16910/jemr.2.4.4 - 26 Nov 2008
Cited by 18 | Viewed by 41
Abstract
Over the past twenty years, gaze control has become a reliable alternative input method, and not only for handicapped users. The selection of objects, however, which is of the highest importance and frequency in computer control, requires explicit control not inherent in eye movements. Objects have therefore usually been selected via prolonged fixations (dwell times). For many years, dwell times seemed to be the only reliable method for selection. In this paper, we review the pros and cons of classical selection methods and of novel metaphors based on pies and gestures. The focus is on the effectiveness and efficiency of selections. In order to discover the real potential of current suggestions for selection, a basic empirical comparison is recommended. Full article

8 pages, 125 KiB  
Article
Influences of Dwell Time and Cursor Control on the Performance in Gaze Driven Typing
by Jens R. Helmert, Sebastian Pannasch and Boris M. Velichkovsky
J. Eye Mov. Res. 2008, 2(4), 1-8; https://doi.org/10.16910/jemr.2.4.3 - 26 Nov 2008
Cited by 14 | Viewed by 44
Abstract
In gaze-controlled computer interfaces, dwell time is often used as the selection criterion, but this solution entails several problems, especially in the temporal domain: eye movement studies on scene perception have demonstrated that fixations of different durations serve different purposes and should therefore be differentiated. The use of dwell time for selection implies the need to distinguish intentional selections from merely perceptual processes, described as the Midas touch problem. Moreover, feedback on the actual position of one’s own eyes has not yet been systematically studied in the context of usability in gaze-based computer interaction. We present research on the usability of a simple eye typing setup. Different dwell time and eye position feedback configurations were tested. Our results indicate that smoothing the raw eye position and introducing temporal delays in visual feedback enhance the system’s functionality and usability. The best overall performance was obtained with a dwell time of 500 ms. Full article

7 pages, 213 KiB  
Article
Gaze Tracking in Semi-Autonomous Grasping
by Claudio Castellini
J. Eye Mov. Res. 2008, 2(4), 1-7; https://doi.org/10.16910/jemr.2.4.2 - 26 Nov 2008
Cited by 1 | Viewed by 36
Abstract
In critical human/robot interactions, such as teleoperation by a disabled master or over insufficient bandwidth, it is highly desirable to have semi-autonomous robotic artifacts interact with a human being. Semi-autonomous grasping, for instance, consists of having a smart slave able to guess the master’s intentions and initiate a grasping sequence whenever the master wants to grasp an object in the slave’s workspace. In this paper we investigate the possibility of building such an intelligent robotic artifact by training a machine learning system on data gathered from several human subjects while trying to grasp objects in a teleoperation setup. In particular, we investigate the usefulness of gaze tracking in such a scenario. The resulting system must be light enough to be usable online and flexible enough to adapt to different masters, e.g., elderly and/or slow ones. The outcome of the experiment is that such a system, based upon Support Vector Machines, meets all the requirements, being (a) highly accurate, (b) compact and fast, and (c) largely unaffected by the subjects’ diversity. It is also clearly shown that gaze tracking significantly improves both the accuracy and compactness of the obtained models, compared with the use of the hand position alone. The system can be trained with something like 3.5 minutes of human data in the worst case. Full article

4 pages, 109 KiB  
Editorial
Eye Tracking and Usability Research: An Introduction to the Special Issue
by Sebastian Pannasch, Jens R. Helmert and Boris M. Velichkovsky
J. Eye Mov. Res. 2008, 2(4), 1-4; https://doi.org/10.16910/jemr.2.4.1 - 26 Nov 2008
Viewed by 44
Abstract
Undoubtedly, there is great potential for the eye movement research community, with the ECEM as a face-to-face platform and the JEMR as an open access journal, to bridge the gap between basic knowledge and experimental procedures, on the one hand, and the idiosyncratic requirements for a successful application of eye tracking, on the other [...] Full article