Using Gaze for Behavioural Biometrics
Abstract
1. Introduction
- To lay down a principled framework for exploiting eye movements in behavioural biometrics;
- To derive, in the context of such a framework, a computational model suitable to operationalize biometric assessment.
2. Related Works
3. Background and Motivation
- patch-choice: the forager has to make a decision concerning what type of patch (a clump of food) to search for (e.g., good bushes full of berries);
- patch-finding: the forager wanders through a landscape where certain resources are available and makes a choice on how to move between patches (optimal movements);
- target/prey-finding: once a patch is located, the forager should decide what prey to take and handle (optimal diet choice);
- patch-leaving: the available resources in the patch decrease with the time spent handling prey; eventually, the forager decides to leave the current patch, either to search for a more profitable one or simply to finish the task (giving-up or departure times from patches).
4. Modelling Eye Movements Dynamics
4.1. Observers as Foraging Animals
- Coarse-grained representation of saccadic movements: saccades are assumed to be ballistic motions, i.e., the path between two successive fixation points is assumed to be a straight line. This assumption is clearly oversimplified (e.g., see again Figure 3).
- Mathematical hindrances: LFs rely on the use of α-stable distributions as the innovation term; such a representation involves dealing with infinite variance and the absence of a closed-form probability density function (PDF).
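The infinite-variance hindrance can be made concrete with a small numerical sketch (an illustration, not part of the original method). Cauchy-distributed step lengths, an α-stable law with α = 1, have no finite variance, so a Lévy-flight-like walk occasionally takes enormous jumps that a Gaussian (Brownian) walk essentially never produces:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Gaussian steps (Brownian-like walk): light tails, finite variance.
gauss_steps = rng.standard_normal(n)

# Cauchy steps (alpha-stable, alpha = 1): heavy tails, infinite variance.
cauchy_steps = rng.standard_cauchy(n)

# The largest Gaussian step stays within a few units, while the Cauchy
# walk produces jumps that are orders of magnitude larger: the
# characteristic signature of a Levy flight.
print(np.abs(gauss_steps).max())
print(np.abs(cauchy_steps).max())
```

Empirical variance estimates of the Cauchy sample never converge as `n` grows, which is exactly why closed-form likelihood-based fitting of LF models is problematic.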
4.2. Observers as Random Walkers—The Ornstein-Uhlenbeck Process
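A minimal Euler-Maruyama simulation of a 2D Ornstein-Uhlenbeck process illustrates the alternative: gaze relaxes toward a fixation point acting as an attractor, with mean-reverting drift plus Gaussian diffusion (parameter values below are illustrative assumptions, not the paper's fitted estimates):

```python
import numpy as np

rng = np.random.default_rng(1)

dt = 0.01                    # time step (s)
n_steps = 1000
mu = np.array([5.0, 5.0])    # attractor: the current fixation point
B = np.diag([5.0, 5.0])      # drift matrix: strength of the attraction
sigma = 0.1 * np.eye(2)      # diffusion matrix: noise amplitude

x = np.empty((n_steps + 1, 2))
x[0] = np.array([0.0, 0.0])  # gaze starts away from the attractor

for t in range(n_steps):
    drift = B @ (mu - x[t]) * dt
    noise = sigma @ rng.standard_normal(2) * np.sqrt(dt)
    x[t + 1] = x[t] + drift + noise

# After a few relaxation times (1/5 s each here), the gaze trace
# hovers in a small noise-driven cloud around mu.
```

Unlike the Lévy-flight formulation, the OU process has Gaussian transition densities in closed form, which is what makes likelihood-based (and Bayesian) parameter estimation tractable.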
5. Proposed Approach
- Gaze event classification: parsing the raw gaze data of one trajectory to identify its sequence of fixations and saccades (usually referred to as a scan path in the literature).
- SDE parameter estimation for each identified event along the trajectory/scan path.
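The first step can be sketched with a simple velocity-threshold (I-VT) rule, used here purely as an illustrative stand-in (the paper itself relies on the segmented-linear-regression denoising and classification of [146]); the threshold value and the synthetic data are assumptions:

```python
import numpy as np

def classify_ivt(gaze, dt, vel_threshold=100.0):
    """Label each sample 'fixation' or 'saccade' by point-to-point velocity.

    gaze: (n, 2) array of gaze positions (deg); dt: sampling interval (s).
    vel_threshold: velocity cut-off in deg/s (illustrative value).
    """
    vel = np.linalg.norm(np.diff(gaze, axis=0), axis=1) / dt
    labels = np.where(vel > vel_threshold, "saccade", "fixation")
    return np.append(labels, labels[-1])  # pad to match input length

# Synthetic scan path: fixation at (0, 0), a fast jump, fixation at (10, 10).
dt = 0.004  # 250 Hz sampling
fix1 = np.zeros((50, 2))
jump = np.linspace([0.0, 0.0], [10.0, 10.0], 5)
fix2 = np.full((50, 2), 10.0)
gaze = np.vstack([fix1, jump, fix2])

labels = classify_ivt(gaze, dt)
```

Each contiguous run of identical labels then delimits one gaze event, whose samples feed the SDE parameter estimation of the second step.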
5.1. Gaze Event Classification
5.2. Bayesian Estimation of SDE Parameters
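As a simplified stand-in for the full Bayesian treatment (the paper uses variational inference via PyMC3 [154]), note that the exactly discretised OU transition is an AR(1) model, so the drift of a 1D process can be recovered by linear regression of each sample on its predecessor; all values below are synthetic and illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate a 1D OU process dx = -theta * x dt + sigma dW (exact discretization).
theta, sigma, dt, n = 2.0, 0.3, 0.01, 20_000
a = np.exp(-theta * dt)                              # AR(1) coefficient
noise_sd = sigma * np.sqrt((1 - a**2) / (2 * theta))  # transition noise std
x = np.empty(n)
x[0] = 0.0
for t in range(n - 1):
    x[t + 1] = a * x[t] + noise_sd * rng.standard_normal()

# Recover theta: regressing x[t+1] on x[t] estimates the slope e^{-theta*dt}.
slope, _ = np.polyfit(x[:-1], x[1:], 1)
theta_hat = -np.log(slope) / dt
```

A Bayesian version replaces the point estimate with a posterior over the drift and diffusion matrices, which is what supplies the uncertainty terms reported alongside each feature in the paper.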
5.3. Identity Recognition from Gaze Dynamics
6. Experimental Results and Analyses
- First, a standard evaluation in terms of identification performance is provided.
- Second, an analysis is carried out to gain deeper insight into the features made available by the method and into the identification results achieved.
6.1. Dataset
6.2. Identification Performance
6.3. Analysis of Results
- the average fixation feature vector and the average saccade feature vector associated with scan path k:
- the descriptor of scan path k, obtained by concatenating the two vectors above:
- the summary descriptor of the visual behaviour of an observer over the set of the K observed stimuli:
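The three descriptors above can be assembled straightforwardly, e.g., with NumPy (feature dimensions, event counts, and array names here are illustrative assumptions, not the paper's actual values):

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy per-event feature vectors for one scan path k: each row collects
# the estimated SDE parameters of one gaze event.
fix_feats = rng.standard_normal((12, 6))   # 12 fixations, 6 features each
sac_feats = rng.standard_normal((11, 6))   # 11 saccades, 6 features each

# Average fixation / saccade feature vectors for scan path k.
fix_mean = fix_feats.mean(axis=0)
sac_mean = sac_feats.mean(axis=0)

# Scan-path descriptor: concatenation of the two averages.
scanpath_desc = np.concatenate([fix_mean, sac_mean])

# Summary descriptor of an observer over K stimuli: the average of the
# K scan-path descriptors.
K = 5
all_descs = rng.standard_normal((K, scanpath_desc.size))
observer_desc = all_descs.mean(axis=0)
```

The observer-level summary descriptor is then the natural object to embed and compare across observers in the subsequent analysis.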
7. Discussion and Conclusions
7.1. Advantages of the Method
7.2. Limitations
7.3. Session Duration
7.4. The Stimulus Problem
7.5. The Task Problem
7.6. Handling Collected Eye-Tracking Data
Supplementary Materials
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Canosa, R. Real-world vision: Selective perception and task. ACM Trans. Appl. Percept. 2009, 6, 11.
- Cerf, M.; Frady, E.; Koch, C. Faces and text attract gaze independent of the task: Experimental data and computer model. J. Vis. 2009, 9, 10.
- Faber, M.; Bixler, R.; D’Mello, S.K. An automated behavioral measure of mind wandering during computerized reading. Behav. Res. Methods 2018, 50, 134–150.
- Zhang, H.; Anderson, N.C.; Miller, K.F. Refixation patterns of mind-wandering during real-world scene perception. J. Exp. Psychol. Hum. Percept. Perform. 2021, 47, 36.
- Lee, H.H.; Chen, Z.L.; Yeh, S.L.; Hsiao, J.H.; Wu, A.Y. When eyes wander around: Mind-wandering as revealed by eye movement analysis with hidden Markov models. Sensors 2021, 21, 7569.
- Ibaceta, M.; Madrid, H.P. Personality and Mind-Wandering Self Perception: The Role of Meta-Awareness. Front. Psychol. 2021, 12, 581129.
- Friston, K.J. The labile brain. II. Transients, complexity and selection. Philos. Trans. R. Soc. London. Ser. B Biol. Sci. 2000, 355, 237–252.
- Ainley, V.; Apps, M.A.; Fotopoulou, A.; Tsakiris, M. ‘Bodily precision’: A predictive coding account of individual differences in interoceptive accuracy. Philos. Trans. R. Soc. B Biol. Sci. 2016, 371, 20160003.
- Rudrauf, D.; Bennequin, D.; Granic, I.; Landini, G.; Friston, K.; Williford, K. A mathematical model of embodied consciousness. J. Theor. Biol. 2017, 428, 106–131.
- Solms, M.; Friston, K. How and why consciousness arises: Some considerations from physics and physiology. J. Conscious. Stud. 2018, 25, 202–238.
- Atzil, S.; Gao, W.; Fradkin, I.; Barrett, L.F. Growing a social brain. Nat. Hum. Behav. 2018, 2, 624–636.
- Hohwy, J.; Michael, J. Why should any body have a self? In The Subject’s Matter: Self-Consciousness and the Body; De Vignemont, F., Alsmith, A.J., Eds.; The MIT Press: Cambridge, MA, USA, 2017; Chapter 16; pp. 363–392.
- Apps, M.A.; Tsakiris, M. The free-energy self: A predictive coding account of self-recognition. Neurosci. Biobehav. Rev. 2014, 41, 85–97.
- Mirza, M.B.; Adams, R.A.; Mathys, C.D.; Friston, K.J. Scene Construction, Visual Foraging, and Active Inference. Front. Comput. Neurosci. 2016, 10, 56.
- Cerf, M.; Harel, J.; Einhäuser, W.; Koch, C. Predicting human gaze using low-level saliency combined with face detection. In Proceedings of the Advances in Neural Information Processing Systems, Vancouver, BC, Canada, 8–11 December 2008; Volume 20.
- Proust, M. À la Recherche du Temps Perdu; Grasset and Gallimard; Aegitas: Toronto, ON, Canada, 1913.
- Hrechak, A.K.; McHugh, J.A. Automated fingerprint recognition using structural matching. Pattern Recognit. 1990, 23, 893–904.
- Galbally, J.; Haraksim, R.; Beslay, L. A study of age and ageing in fingerprint biometrics. IEEE Trans. Inf. Forensics Secur. 2018, 14, 1351–1365.
- Wang, M.; Deng, W. Deep face recognition: A survey. Neurocomputing 2021, 429, 215–244.
- Cuculo, V.; D’Amelio, A.; Grossi, G.; Lanzarotti, R.; Lin, J. Robust single-sample face recognition by sparsity-driven sub-dictionary learning using deep features. Sensors 2019, 19, 146.
- Rattani, A.; Derakhshani, R. Ocular biometrics in the visible spectrum: A survey. Image Vis. Comput. 2017, 59, 1–16.
- Holland, C.; Komogortsev, O.V. Biometric Identification via Eye Movement Scanpaths in Reading. In Proceedings of the 2011 International Joint Conference on Biometrics (IJCB ’11), Washington, DC, USA, 11–13 October 2011; pp. 1–8.
- Bayat, A.; Pomplun, M. Biometric identification through eye-movement patterns. In Proceedings of the International Conference on Applied Human Factors and Ergonomics, Los Angeles, CA, USA, 17–21 July 2017; pp. 583–594.
- Bednarik, R.; Kinnunen, T.; Mihaila, A.; Fränti, P. Eye-Movements as a Biometric. In Image Analysis; Lecture Notes in Computer Science; Kalviainen, H., Parkkinen, J., Kaarna, A., Eds.; Springer: Berlin/Heidelberg, Germany, 2005; Volume 3540, pp. 780–789.
- Komogortsev, O.V.; Jayarathna, S.; Aragon, C.R.; Mahmoud, M. Biometric identification via an oculomotor plant mathematical model. In Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications, Austin, TX, USA, 22–24 March 2010; pp. 57–60.
- Kasprowski, P. Human Identification Using Eye Movements. Ph.D. Dissertation, Silesian University of Technology, Gliwice, Poland, 2004.
- Kasprowski, P.; Ober, J. Enhancing eye-movement-based biometric identification method by using voting classifiers. In Proceedings of the Biometric Technology for Human Identification II, SPIE, Orlando, FL, USA, 28–29 March 2005; Volume 5779, pp. 314–323.
- Lohr, D.; Berndt, S.H.; Komogortsev, O. An implementation of eye movement-driven biometrics in virtual reality. In Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications, Warsaw, Poland, 14–17 June 2018; pp. 1–3.
- Lohr, D.J.; Aziz, S.; Komogortsev, O. Eye movement biometrics using a new dataset collected in virtual reality. In Proceedings of the ACM Symposium on Eye Tracking Research and Applications, Stuttgart, Germany, 2–5 June 2020; pp. 1–3.
- Lohr, D.; Griffith, H.; Komogortsev, O.V. Eye know you: Metric learning for end-to-end biometric authentication using eye movements from a longitudinal dataset. IEEE Trans. Biom. Behav. Identity Sci. 2022.
- Deravi, F.; Guness, S.P. Gaze Trajectory as a Biometric Modality. In Proceedings of the Biosignals, Rome, Italy, 26–29 January 2011; pp. 335–341.
- Porta, M.; Barboni, A. Strengthening security in industrial settings: A study on gaze-based biometrics through free observation of static images. In Proceedings of the 2019 24th IEEE International Conference on Emerging Technologies and Factory Automation (ETFA), Zaragoza, Spain, 10–13 September 2019; pp. 1273–1277.
- Cantoni, V.; Galdi, C.; Nappi, M.; Porta, M.; Riccio, D. GANT: Gaze analysis technique for human identification. Pattern Recognit. 2015, 48, 1027–1038.
- Rigas, I.; Economou, G.; Fotopoulos, S. Biometric identification based on the eye movements and graph matching techniques. Pattern Recognit. Lett. 2012, 33, 786–792.
- Kinnunen, T.; Sedlak, F.; Bednarik, R. Towards Task-independent Person Authentication Using Eye Movement Signals. In Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications (ETRA’10), Austin, TX, USA, 22–24 March 2010; pp. 187–190.
- Kasprowski, P.; Ober, J. Eye movements in biometrics. In Proceedings of the International Workshop on Biometric Authentication, Hong Kong, China, 15–17 July 2004; pp. 248–258.
- Silver, D.L.; Biggs, A. Keystroke and Eye-Tracking Biometrics for User Identification. In Proceedings of the IC-AI, Las Vegas, NV, USA, 26–29 June 2006; pp. 344–348.
- Friedman, L.; Nixon, M.S.; Komogortsev, O.V. Method to assess the temporal persistence of potential biometric features: Application to oculomotor, gait, face and brain structure databases. PLoS ONE 2017, 12, e0178501.
- Jia, S.; Koh, D.H.; Seccia, A.; Antonenko, P.; Lamb, R.; Keil, A.; Schneps, M.; Pomplun, M. Biometric recognition through eye movements using a recurrent neural network. In Proceedings of the 2018 IEEE International Conference on Big Knowledge (ICBK), Singapore, 17–18 November 2018; pp. 57–64.
- Jäger, L.A.; Makowski, S.; Prasse, P.; Liehr, S.; Seidler, M.; Scheffer, T. Deep Eyedentification: Biometric identification using micro-movements of the eye. In Proceedings of the Joint European Conference on Machine Learning and Knowledge Discovery in Databases, Würzburg, Germany, 16 September 2019; pp. 299–314.
- Makowski, S.; Prasse, P.; Reich, D.R.; Krakowczyk, D.; Jäger, L.A.; Scheffer, T. DeepEyedentificationLive: Oculomotoric biometric identification and presentation-attack detection using deep neural networks. IEEE Trans. Biom. Behav. Identity Sci. 2021, 3, 506–518.
- Abdelwahab, A.; Landwehr, N. Deep distributional sequence embeddings based on a Wasserstein loss. Neural Process. Lett. 2022, 54, 3749–3769.
- Yin, J.; Sun, J.; Li, J.; Liu, K. An Effective Gaze-Based Authentication Method with the Spatiotemporal Feature of Eye Movement. Sensors 2022, 22, 3002.
- Trappes, R. Individual differences, uniqueness, and individuality in behavioural ecology. Stud. Hist. Philos. Sci. 2022, 96, 18–26.
- Stephens, D.W. Foraging Theory; Princeton University Press: Princeton, NJ, USA, 1986.
- Viswanathan, G.M.; Da Luz, M.G.; Raposo, E.P.; Stanley, H.E. The Physics of Foraging: An Introduction to Random Searches and Biological Encounters; Cambridge University Press: Cambridge, UK, 2011.
- Bartumeus, F.; Catalan, J. Optimal search behavior and classic foraging theory. J. Phys. A Math. Theor. 2009, 42, 434002.
- Toscano, B.J.; Gownaris, N.J.; Heerhartz, S.M.; Monaco, C.J. Personality, foraging behavior and specialization: Integrating behavioral and food web ecology at the individual level. Oecologia 2016, 182, 55–69.
- Todd, P.M.; Hills, T.T. Foraging in mind. Curr. Dir. Psychol. Sci. 2020, 29, 309–315.
- Budaev, S.; Jørgensen, C.; Mangel, M.; Eliassen, S.; Giske, J. Decision-making from the animal perspective: Bridging ecology and subjective cognition. Front. Ecol. Evol. 2019, 7, 164.
- Rosati, A.G. Foraging cognition: Reviving the ecological intelligence hypothesis. Trends Cogn. Sci. 2017, 21, 691–702.
- Pirolli, P. Information Foraging Theory: Adaptive Interaction with Information; Oxford University Press: New York, NY, USA, 2007.
- Hills, T.T. Animal Foraging and the Evolution of Goal-Directed Cognition. Cogn. Sci. 2006, 30, 3–41.
- Liversedge, S.P.; Findlay, J.M. Saccadic eye movements and cognition. Trends Cogn. Sci. 2000, 4, 6–14.
- Wolfe, J.M. When is it time to move to the next raspberry bush? Foraging rules in human visual search. J. Vis. 2013, 13.
- Ehinger, K.A.; Wolfe, J.M. When is it time to move to the next map? Optimal foraging in guided visual search. Atten. Percept. Psychophys. 2016, 78, 2135–2151.
- Cain, M.S.; Vul, E.; Clark, K.; Mitroff, S.R. A Bayesian optimal foraging model of human visual search. Psychol. Sci. 2012, 23, 1047–1054.
- Friston, K.J.; Shiner, T.; FitzGerald, T.; Galea, J.M.; Adams, R.; Brown, H.; Dolan, R.J.; Moran, R.; Stephan, K.E.; Bestmann, S. Dopamine, affordance and active inference. PLoS Comput. Biol. 2012, 8, e1002327.
- Land, M.F. Eye movements and the control of actions in everyday life. Prog. Retin. Eye Res. 2006, 25, 296–324.
- Shepherd, S.V.; Platt, M.L. Spontaneous social orienting and gaze following in ringtailed lemurs (Lemur catta). Anim. Cogn. 2007, 11, 13.
- Guy, N.; Azulay, H.; Kardosh, R.; Weiss, Y.; Hassin, R.R.; Israel, S.; Pertzov, Y. A novel perceptual trait: Gaze predilection for faces during visual exploration. Sci. Rep. 2019, 9, 10714.
- Boccignone, G.; Cuculo, V.; D’Amelio, A.; Grossi, G.; Lanzarotti, R. Give Ear to My Face: Modelling Multimodal Attention to Social Interactions. In Computer Vision—ECCV 2018 Workshops; Leal-Taixé, L., Roth, S., Eds.; Springer International Publishing: Cham, Switzerland, 2019; pp. 331–345.
- Kowler, E. Eye movements: The past 25 years. Vis. Res. 2011, 51, 1457–1483.
- Henderson, J.M. Gaze control as prediction. Trends Cogn. Sci. 2017, 21, 15–23.
- Cuculo, V.; D’Amelio, A.; Lanzarotti, R.; Boccignone, G. Personality gaze patterns unveiled via automatic relevance determination. In Proceedings of the Federation of International Conferences on Software Technologies: Applications and Foundations, Toulouse, France, 25–29 June 2018; pp. 171–184.
- Tatler, B.; Vincent, B. The prominence of behavioural biases in eye guidance. Vis. Cogn. 2009, 17, 1029–1054.
- Tatler, B.; Vincent, B. Systematic tendencies in scene viewing. J. Eye Mov. Res. 2008, 2, 1–18.
- Tatler, B.; Hayhoe, M.; Land, M.; Ballard, D. Eye guidance in natural vision: Reinterpreting salience. J. Vis. 2011, 11, 5.
- Dorr, M.; Martinetz, T.; Gegenfurtner, K.; Barth, E. Variability of eye movements when viewing dynamic natural scenes. J. Vis. 2010, 10, 28.
- Hu, Z.; Li, S.; Zhang, C.; Yi, K.; Wang, G.; Manocha, D. DGaze: CNN-Based Gaze Prediction in Dynamic Scenes. IEEE Trans. Vis. Comput. Graph. 2020, 26, 1902–1911.
- Henderson, J.M.; Luke, S.G. Stable individual differences in saccadic eye movements during reading, pseudoreading, scene viewing, and scene search. J. Exp. Psychol. Hum. Percept. Perform. 2014, 40, 1390.
- Bargary, G.; Bosten, J.M.; Goodbourn, P.T.; Lawrance-Owen, A.J.; Hogg, R.E.; Mollon, J.D. Individual differences in human eye movements: An oculomotor signature? Vis. Res. 2017, 141, 157–169.
- Engbert, R. Microsaccades: A microcosm for research on oculomotor control, attention, and visual perception. Prog. Brain Res. 2006, 154, 177–192.
- Engbert, R.; Mergenthaler, K.; Sinn, P.; Pikovsky, A. An integrated model of fixational eye movements and microsaccades. Proc. Natl. Acad. Sci. USA 2011, 108, E765–E770.
- Makarava, N.; Bettenbühl, M.; Engbert, R.; Holschneider, M. Bayesian estimation of the scaling parameter of fixational eye movements. Europhys. Lett. 2012, 100, 40003.
- Engbert, R.; Kliegl, R. Microsaccades keep the eyes’ balance during fixation. Psychol. Sci. 2004, 15, 431.
- Brockmann, D.; Geisel, T. The ecology of gaze shifts. Neurocomputing 2000, 32, 643–650.
- Gnedenko, B.; Kolmogorov, A. Limit Distributions for Sums of Independent Random Variables; Addison-Wesley Pub. Co.: Boston, MA, USA, 1954.
- Bouchaud, J.P.; Georges, A. Anomalous diffusion in disordered media: Statistical mechanisms, models and physical applications. Phys. Rep. 1990, 195, 127–293.
- Boccignone, G.; Ferraro, M. Modelling gaze shift as a constrained random walk. Phys. A Stat. Mech. Its Appl. 2004, 331, 207–218.
- Kümmerer, M.; Bethge, M.; Wallis, T.S. DeepGaze III: Modeling free-viewing human scanpaths with deep learning. J. Vis. 2022, 22, 7.
- Kümmerer, M.; Bethge, M. State-of-the-art in human scanpath prediction. arXiv 2021, arXiv:2102.12239.
- Viswanathan, G.; Afanasyev, V.; Buldyrev, S.; Murphy, E.; Prince, P.; Stanley, H. Lévy flight search patterns of wandering albatrosses. Nature 1996, 381, 413–415.
- Bella-Fernández, M.; Suero Suñé, M.; Gil-Gómez de Liaño, B. Foraging behavior in visual search: A review of theoretical and mathematical models in humans and animals. Psychol. Res. 2022, 86, 331–349.
- Benhamou, S. How many animals really do the Lévy walk? Ecology 2007, 88, 1962–1969.
- Edwards, A.M.; Phillips, R.A.; Watkins, N.W.; Freeman, M.P.; Murphy, E.J.; Afanasyev, V.; Buldyrev, S.V.; da Luz, M.G.; Raposo, E.P.; Stanley, H.E.; et al. Revisiting Lévy flight search patterns of wandering albatrosses, bumblebees and deer. Nature 2007, 449, 1044–1048.
- Codling, E.; Plank, M.; Benhamou, S. Random walk models in biology. J. R. Soc. Interface 2008, 5, 813.
- Bénichou, O.; Loverdo, C.; Moreau, M.; Voituriez, R. Two-dimensional intermittent search processes: An alternative to Lévy flight strategies. Phys. Rev. E 2006, 74, 020102.
- Otero-Millan, J.; Troncoso, X.; Macknik, S.; Serrano-Pedraza, I.; Martinez-Conde, S. Saccades and microsaccades during visual fixation, exploration, and search: Foundations for a common saccadic generator. J. Vis. 2008, 8, 21.
- Otero-Millan, J.; Macknik, S.L.; Langston, R.E.; Martinez-Conde, S. An oculomotor continuum from exploration to fixation. Proc. Natl. Acad. Sci. USA 2013, 110, 6175–6180.
- Martinez-Conde, S.; Otero-Millan, J.; Macknik, S.L. The impact of microsaccades on vision: Towards a unified theory of saccadic function. Nat. Rev. Neurosci. 2013, 14, 83–96.
- Uhlenbeck, G.E.; Ornstein, L.S. On the theory of the Brownian motion. Phys. Rev. 1930, 36, 823.
- Gardiner, C. Handbook of Stochastic Methods; Springer: Berlin/Heidelberg, Germany, 2011.
- Oud, J.H.; Singer, H. Continuous time modeling of panel data: SEM versus filter techniques. Stat. Neerl. 2008, 62, 4–28.
- Boccignone, G.; Cuculo, V.; D’Amelio, A.; Grossi, G.; Lanzarotti, R. On Gaze Deployment to Audio-Visual Cues of Social Interactions. IEEE Access 2020, 8, 161630–161654.
- D’Amelio, A.; Boccignone, G. Gazing at Social Interactions Between Foraging and Decision Theory. Front. Neurorobot. 2021, 15, 31.
- Lemons, D.S. An Introduction to Stochastic Processes in Physics; JHU Press: Baltimore, MD, USA, 2002.
- Kloeden, P.E.; Platen, E. Numerical Solution of Stochastic Differential Equations; Springer Science & Business Media: Berlin, Germany, 2013; Volume 23.
- Hessels, R.S.; Niehorster, D.C.; Nyström, M.; Andersson, R.; Hooge, I.T.C. Is the eye-movement field confused about fixations and saccades? A survey among 124 researchers. R. Soc. Open Sci. 2018, 5, 180502.
- Pekkanen, J.; Lappi, O. A new and general approach to signal denoising and eye movement classification based on segmented linear regression. Sci. Rep. 2017, 7, 17726.
- Higham, D. An algorithmic introduction to numerical simulation of stochastic differential equations. SIAM Rev. 2001, 43, 525–546.
- Wang, Z.; Wu, Y.; Chu, H. On equivalence of the LKJ distribution and the restricted Wishart distribution. arXiv 2018, arXiv:1809.04746.
- Lewandowski, D.; Kurowicka, D.; Joe, H. Generating random correlation matrices based on vines and extended onion method. J. Multivar. Anal. 2009, 100, 1989–2001.
- Kucukelbir, A.; Tran, D.; Ranganath, R.; Gelman, A.; Blei, D.M. Automatic differentiation variational inference. J. Mach. Learn. Res. 2017, 18, 430–474.
- Bettenbühl, M.; Rusconi, M.; Engbert, R.; Holschneider, M. Bayesian Selection of Markov Models for Symbol Sequences: Application to Microsaccadic Eye Movements. PLoS ONE 2012, 7, e43388.
- George, A.; Routray, A. A score level fusion method for eye movement biometrics. Pattern Recognit. Lett. 2016, 82, 207–215.
- Schröder, C.; Al Zaidawi, S.M.K.; Prinzler, M.H.; Maneth, S.; Zachmann, G. Robustness of eye movement biometrics against varying stimuli and varying trajectory length. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020; pp. 1–7.
- Salvatier, J.; Wiecki, T.V.; Fonnesbeck, C. Probabilistic programming in Python using PyMC3. PeerJ Comput. Sci. 2016, 2, e55.
- McInnes, L.; Healy, J.; Saul, N.; Großberger, L. UMAP: Uniform Manifold Approximation and Projection. J. Open Source Softw. 2018, 3, 861.
- McInnes, L.; Healy, J.; Melville, J. UMAP: Uniform manifold approximation and projection for dimension reduction. arXiv 2018, arXiv:1802.03426.
- Van der Maaten, L.; Hinton, G. Visualizing data using t-SNE. J. Mach. Learn. Res. 2008, 9, 2579–2605.
- Le Meur, O.; Liu, Z. Saccadic model of eye movements for free-viewing condition. Vis. Res. 2015, 116, 152–164.
- Foulsham, T. Scenes, saliency maps and scanpaths. In Eye Movement Research; Klein, C., Ettinger, U., Eds.; Springer: Cham, Switzerland, 2019; pp. 197–238.
- Zhang, R.; Saran, A.; Liu, B.; Zhu, Y.; Guo, S.; Niekum, S.; Ballard, D.; Hayhoe, M. Human Gaze Assisted Artificial Intelligence: A Review. IJCAI 2020, 2020, 4951.
- Van der Linde, I.; Rajashekar, U.; Bovik, A.C.; Cormack, L.K. DOVES: A database of visual eye movements. Spat. Vis. 2009, 22, 161–177.
- Fan, S.; Shen, Z.; Jiang, M.; Koenig, B.L.; Xu, J.; Kankanhalli, M.S.; Zhao, Q. Emotional attention: A study of image sentiment and visual attention. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–23 June 2018; pp. 7521–7531.
- Griffith, H.; Lohr, D.; Abdulin, E.; Komogortsev, O. GazeBase, a large-scale, multi-stimulus, longitudinal eye movement dataset. Sci. Data 2021, 8, 184.
- Lencastre, P.; Bhurtel, S.; Yazidi, A.; Denysov, S.; Lind, P.G. EyeT4Empathy: Dataset of foraging for visual information, gaze typing and empathy assessment. Sci. Data 2022, 9, 752.
- Demšar, J. Statistical Comparisons of Classifiers over Multiple Data Sets. J. Mach. Learn. Res. 2006, 7, 1–30.
- Benavoli, A.; Corani, G.; Demšar, J.; Zaffalon, M. Time for a change: A tutorial for comparing multiple classifiers through Bayesian analysis. J. Mach. Learn. Res. 2017, 18, 2653–2688.
- Torralba, A.; Efros, A.A. Unbiased look at dataset bias. In Proceedings of the CVPR 2011, Colorado Springs, CO, USA, 20–25 June 2011; pp. 1521–1528.
- Graczyk, M.; Lasota, T.; Telec, Z.; Trawiński, B. Nonparametric Statistical Analysis of Machine Learning Algorithms for Regression Problems. In Proceedings of the Knowledge-Based and Intelligent Information and Engineering Systems, Cardiff, UK, 8–10 September 2010; Setchi, R., Jordanov, I., Howlett, R.J., Jain, L.C., Eds.; Springer: Berlin/Heidelberg, Germany, 2010; pp. 111–120.
- Eisinga, R.; Heskes, T.; Pelzer, B.; Te Grotenhuis, M. Exact p-values for pairwise comparison of Friedman rank sums, with application to comparing classifiers. BMC Bioinform. 2017, 18, 68.
- Jain, A.K.; Kumar, A. Biometric Recognition: An Overview. In Second Generation Biometrics: The Ethical, Legal and Social Context; Mordini, E., Tzovaras, D., Eds.; Springer: Dordrecht, The Netherlands, 2012; pp. 49–79.
- Rothkopf, C.; Ballard, D.; Hayhoe, M. Task and context determine where you look. J. Vis. 2007, 7, 16.
- Triesch, J.; Ballard, D.H.; Hayhoe, M.M.; Sullivan, B.T. What you see is what you need. J. Vis. 2003, 3, 9.
- Schütz, A.; Braun, D.; Gegenfurtner, K. Eye movements and perception: A selective review. J. Vis. 2011, 11, 9.
- Foulsham, T.; Underwood, G. What can saliency models predict about eye movements? Spatial and sequential aspects of fixations during encoding and recognition. J. Vis. 2008, 8, 6.
- Yang, S.C.H.; Wolpert, D.M.; Lengyel, M. Theoretical perspectives on active sensing. Curr. Opin. Behav. Sci. 2016, 11, 100–108.
- Berridge, K.C.; Robinson, T.E. Parsing reward. Trends Neurosci. 2003, 26, 507–513.
- Zhang, Y.; Juhola, M. On biometrics with eye movements. IEEE J. Biomed. Health Inform. 2016, 21, 1360–1366.
- Ballard, D.; Hayhoe, M.; Li, F.; Whitehead, S.; Frisby, J.; Taylor, J.; Fisher, R. Hand-eye coordination during sequential tasks [and discussion]. Philos. Trans. R. Soc. London. Ser. B Biol. Sci. 1992, 337, 331–339.
- Coen-Cagli, R.; Coraggio, P.; Napoletano, P.; Schwartz, O.; Ferraro, M.; Boccignone, G. Visuomotor characterization of eye movements in a drawing task. Vis. Res. 2009, 49, 810–818.
- Al Zaidawi, S.M.K.; Prinzler, M.H.; Lührs, J.; Maneth, S. An extensive study of user identification via eye movements across multiple datasets. Signal Process. Image Commun. 2022, 108, 116804.
- Klein, C.; Ettinger, U. A hundred years of eye movement research in psychiatry. Brain Cogn. 2008, 68, 215–218.
- Meyhöfer, I.; Bertsch, K.; Esser, M.; Ettinger, U. Variance in saccadic eye movements reflects stable traits. Psychophysiology 2016, 53, 566–578.
- Nissens, T.; Failing, M.; Theeuwes, J. People look at the object they fear: Oculomotor capture by stimuli that signal threat. Cogn. Emot. 2017, 31, 1707–1714.
- Kulke, L.; Pasqualette, L. Emotional content influences eye-movements under natural but not under instructed conditions. Cogn. Emot. 2022, 36, 332–344.
- Lim, J.Z.; Mountstephens, J.; Teo, J. Emotion recognition using eye-tracking: Taxonomy, review and current challenges. Sensors 2020, 20, 2384.
- Rauthmann, J.F.; Seubert, C.T.; Sachse, P.; Furtner, M.R. Eyes as windows to the soul: Gazing behavior is related to personality. J. Res. Personal. 2012, 46, 147–156.
- Risko, E.F.; Anderson, N.C.; Lanthier, S.; Kingstone, A. Curious eyes: Individual differences in personality predict eye movement behavior in scene-viewing. Cognition 2012, 122, 86–90.
- Hoppe, S.; Loetscher, T.; Morey, S.A.; Bulling, A. Eye movements during everyday behavior predict personality traits. Front. Hum. Neurosci. 2018, 12, 105.
- Jost, J.; Havlisova, H.; Bilkova, Z.; Malinovska, O.; Stefankova, Z. Saccadic Eye Movements are Related to Personality Traits: A Comparison of Maltreated and Non-maltreated Young Women. J. Psychiatry Psychiatr. Disord. 2019, 3, 245–261.
| Visual Attentive Processing of a Scene | Patchy Landscape Foraging |
|---|---|
| Perceiver | Forager |
| Perceiver’s gaze shift | Forager’s relocation |
| RoI/object | Patch |
| RoI/object selection | Patch choice |
| Deploying attention to RoI/object items | Patch/prey handling |
| Disengaging from RoI/object | Patch leave or giving-up |
| Dimension | Description |
|---|---|
| () | The element in position of the posterior distribution mean of the inferred matrix. It represents the drift (strength of the attraction) towards the attractor point (arriving/current fixation) in the horizontal direction. High values denote higher-velocity saccades or less pronounced physiological drift (see [105]) in fixational eye movements (FEMs) in the i dimension. (: uncertainty of the estimation) |
| () | The element in position of the posterior distribution mean of the inferred matrix. It represents the cross-correlation between the drifts in the horizontal and vertical dimensions. High values denote more curved saccades/FEMs. (: uncertainty of the estimation) |
| () | The element in position of the posterior distribution mean of the inferred matrix. It represents the drift (strength of the attraction) towards the attractor point (arriving/current fixation) in the vertical direction. High values denote higher-velocity saccades or less pronounced physiological drift (see [105]) in FEMs in the j dimension. (: uncertainty of the estimation) |
| () | The element in position of the posterior distribution mean of the inferred matrix. It represents the amount of diffusion exhibited by a fixation/saccade in the horizontal direction. High values denote higher noise in saccades/FEMs in the i dimension. (: uncertainty of the estimation) |
| () | The element in position of the posterior distribution mean of the inferred matrix. It represents the covariance of the diffusion between the horizontal and vertical dimensions. High values denote more anisotropic saccades/FEMs. (: uncertainty of the estimation) |
| () | The element in position of the posterior distribution mean of the inferred matrix. It represents the amount of diffusion exhibited by a fixation/saccade in the vertical direction. High values denote more noise in saccades/FEMs in the j dimension. (: uncertainty of the estimation) |
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
D’Amelio, A.; Patania, S.; Bursic, S.; Cuculo, V.; Boccignone, G. Using Gaze for Behavioural Biometrics. Sensors 2023, 23, 1262. https://doi.org/10.3390/s23031262