SMOOVS: Towards Calibration-Free Text Entry by Gaze Using Smooth Pursuit Movements
Abstract
Introduction
Current Concepts of Gaze Spellers
Smooth Pursuit Movements in Gaze Interaction
SMOOVS
Layout and Interaction Design
Technical Environment
One-Point Calibration
Pre-Test
Design and Procedure
Results and Discussion
Classification Algorithm
Empirical Evaluation
Experimental Design
Task and Procedure
Participants
Results
Discussion
Effects of Object Movement Speed
Comparison with Other Gaze Spellers
Limitations
Outlook
Conclusion
References
- Arif, A. S., and W. Stuerzlinger. 2009. Analysis of text entry performance metrics. In 2009 IEEE Toronto International Conference – Science and Technology for Humanity (TIC-STH). September, pp. 100–105.
- Bahill, A. T., and J. D. McDonald. 1983. Smooth pursuit eye movements in response to predictable target motions. Vision Research 23, 12: 1573–83.
- Bee, N., and E. André. 2008. Writing with your eye: A dwell time free writing system adapted to the nature of human eye gaze. In Perception in Multimodal Dialogue Systems, pp. 111–122.
- Blankertz, B., G. Dornhege, M. Krauledat, M. Schröder, J. Williamson, and R. Murray-Smith. 2006. The Berlin Brain-Computer Interface presents the novel mental typewriter HEX-O-SPELL. In Proceedings of the 3rd International Brain-Computer Interface Workshop and Training Course. Graz, pp. 108–109.
- Burke, M. R., and G. R. Barnes. 2006. Quantitative differences in smooth pursuit and saccadic eye movements. Experimental Brain Research 175, 4: 596–608.
- Collewijn, H., and E. Tamminga. 1984. Human smooth and saccadic eye movements during voluntary pursuit of different target motions on different backgrounds. The Journal of Physiology, 217–250.
- Cymek, D., A. Venjakob, S. Ruff, O. H.-M. Lutz, S. Hofmann, and M. Rötting. 2014. Entering PIN codes by smooth pursuit eye movements. Journal of Eye Movement Research 7, 4: 1–11.
- Drewes, H., and A. Schmidt. 2007. Interacting with the computer using gaze gestures. In Human-Computer Interaction – INTERACT 2007.
- Hansen, D. W., H. H. T. Skovsgaard, J. P. Hansen, and E. Mollenbach. 2008. Noise tolerant selection by gaze-controlled pan and zoom in 3D. In Proceedings of the 2008 Symposium on Eye Tracking Research & Applications – ETRA '08. p. 205.
- Holmqvist, K., M. Nyström, R. Andersson, R. Dewhurst, J. Halszka, and J. van de Weijer. 2011. Eye Tracking: A Comprehensive Guide to Methods and Measures. New York: Oxford University Press.
- Huckauf, A., and M. Urbina. 2008. Gazing with pEYEs: towards a universal input for various applications. In Proceedings of the 2008 Symposium on Eye Tracking Research & Applications. pp. 51–54.
- Jacob, R. J. K. 1991. The use of eye movements in human-computer interaction techniques: what you look at is what you get. ACM Transactions on Information Systems 9, 2: 152–169.
- Lutz, O. H.-M. 2013. Mousey: A multi-purpose eye-tracking and gaze-interaction interface (Tech. Rep.). Technical University Berlin, Fachgebiet Mensch-Maschine-Systeme. Retrieved from http://www.mms.tu-berlin.de/fileadmin/fg268/Mitarbeiter/Mousey2_Documentation.pdf.
- Majaranta, P. 2011. Communication and text entry by gaze. In Gaze Interaction and Applications of Eye Tracking: Advances in Assistive Technologies. Edited by P. Majaranta, H. Aoki and M. Donegan. Hershey, PA: IGI Global.
- Majaranta, P., and K. Räihä. 2002. Twenty years of eye typing: systems and design issues. In Proceedings of the 2002 Symposium on Eye Tracking Research & Applications. pp. 15–22.
- Mollenbach, E., J. Hansen, and M. Lillholm. 2013. Eye movements in gaze interaction. Journal of Eye Movement Research 6: 1–15.
- Morimoto, C., and A. Amir. 2010. Context switching for fast key selection in text entry applications. In Proceedings of the 2010 Symposium on Eye Tracking Research & Applications. Vol. 1, pp. 271–274.
- Pfeuffer, K., M. Vidal, and J. Turner. 2013. Pursuit calibration: making gaze calibration less tedious and more flexible. In Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology. pp. 261–269.
- Pommerening, K. 2013. Pommerenings Pangramm-Sammlung [Pommerening's pangram collection]. Retrieved from http://www.staff.uni-mainz.de/pommeren/Miszellen/Pangramme.html.
- Rottach, K. G., A. Z. Zivotofsky, V. E. Das, L. Averbuch-Heller, A. O. Discenna, A. Poonyathalang, and R. J. Leigh. 1996. Comparison of horizontal, vertical and diagonal smooth pursuit eye movements in normal human subjects. Vision Research 36, 14: 2189–95.
- Rozado, D., J. S. Agustin, F. B. Rodriguez, and P. Varona. 2012. Gliding and saccadic gaze gesture recognition in real time. ACM Transactions on Interactive Intelligent Systems 1, 2: 1–27.
- Rozado, D., F. Rodriguez, and P. Varona. 2010. Optimizing hierarchical temporal memory for multivariable time series. In Proceedings of the 20th International Conference on Artificial Neural Networks – ICANN 2010. pp. 506–518.
- Tula, A., F. de Campos, and C. Morimoto. 2012. Dynamic context switching for gaze based interaction. In Proceedings of the 2012 Symposium on Eye Tracking Research and Applications. pp. 353–356.
- Vidal, M., A. Bulling, and H. Gellersen. 2013. Pursuits: spontaneous interaction with displays based on smooth pursuit eye movement and moving targets. In Proceedings of the 2013 ACM International Joint Conference on Pervasive and Ubiquitous Computing. pp. 439–448.
- Vidal, M., and K. Pfeuffer. 2013. Pursuits: Eye-based interaction with moving targets. In CHI '13 Extended Abstracts on Human Factors in Computing Systems. pp. 3147–3150.
- Villanueva, A., R. Cabeza, and S. Porta. 2004. Eye tracking system model with easy calibration. In Proceedings of the 2004 Symposium on Eye Tracking Research & Applications. Vol. 1, p. 58113.
- Wallace, J. M., L. S. Stone, G. S. Masson, and M. Julian. 2005. Object motion computation for the initiation of smooth pursuit eye movements in humans. Journal of Neurophysiology, 2279–2293.
- Ward, D. J., A. F. Blackwell, and D. J. C. MacKay. 2000. Dasher—a data entry interface using continuous gestures and language models. In Proceedings of the 13th Annual ACM Symposium on User Interface Software and Technology – UIST '00. Vol. 2, pp. 129–137.
- Wobbrock, J., J. Rubinstein, M. Sawyer, and A. Duchowski. 2008. Longitudinal evaluation of discrete consecutive gaze gestures for text entry. In Proceedings of the 2008 Symposium on Eye Tracking Research & Applications. pp. 11–19.
Copyright © 2015. This article is licensed under a Creative Commons Attribution 4.0 International License.
Share and Cite
Lutz, O.H.-M.; Venjakob, A.C.; Ruff, S. SMOOVS: Towards Calibration-Free Text Entry by Gaze Using Smooth Pursuit Movements. J. Eye Mov. Res. 2015, 8, 1-11. https://doi.org/10.16910/jemr.8.1.2