What Is Hidden in Clear Sight and How to Find It—A Survey of the Integration of Artificial Intelligence and Eye Tracking
Abstract
1. Introduction
2. Materials and Methods
- Which eye trackers were used when collecting data for AI?
- Which sampling frequencies were used when collecting data for AI?
- What kinds of non-eye tracking parameters were used alongside AI?
- How many people participated in the experiments collecting eye tracking data to be used with AI?
- What is the gender distribution of the participants and what age range are they in?
- How were the features extracted?
- How many artificial intelligence methods were used in one eye tracking study?
- Which methods of artificial intelligence were used with eye tracking data?
- How were the results of using AI with eye tracking data verified?
- eye AND tracking AND artificial AND intelligence;
- eye AND movement AND artificial AND intelligence;
- gaze AND estimation AND artificial AND intelligence;
- smartphone AND eye AND tracking;
- webcam AND eye AND tracking.
- The research analyzed only static images.
- The research detected only the eye position.
- The eye tracking data were collected using Electroencephalography (EEG).
- Artificial intelligence was not used on eye tracking data nor to calculate eye tracking data.
- The paper is not accessible.
2.1. Applications of Artificial Intelligence-Enhanced Eye Tracking
2.2. Eye Trackers
2.3. Participants
2.4. Additional Data for Artificial Intelligence
2.5. Feature Extraction
- A. Typical eye tracking data gathered via standard eye tracker software, such as Tobii Studio or Tobii Pro Lab [104];
- B. Eye tracking data after additional processing not available in typical eye tracker software, such as more sophisticated statistics or transformations (e.g., the discrete wavelet transform, DWT);
- C. The application of basic ML algorithms, such as k-means or decision trees, to eye tracker data (see the sketch after this list);
- D. The application of neural networks or deep learning;
- N. Not specified in the article.
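To make categories B and C concrete, below is a minimal sketch in Python (our illustration, not code from any surveyed study). It derives per-trial features, including DWT sub-band energies of a pupil-diameter trace (category B), and then clusters them with k-means (category C). The synthetic data, the db4 wavelet, and the three-cluster choice are assumptions made for the example; PyWavelets and scikit-learn are assumed available.

```python
# Illustrative sketch of feature-extraction categories B and C
# (hypothetical data and feature choices; not from any surveyed study).
import numpy as np
import pywt                          # PyWavelets, for the DWT (category B)
from sklearn.cluster import KMeans   # basic ML on the features (category C)

rng = np.random.default_rng(0)

# Stand-in for category A data as exported by eye tracker software:
# per-trial fixation durations (ms) and a pupil-diameter trace (mm).
fixation_durations = rng.gamma(shape=2.0, scale=120.0, size=(50, 64))
pupil_diameter = 3.5 + 0.3 * rng.standard_normal((50, 256))

def trial_features(durations, pupil):
    """Category B: summary statistics plus DWT sub-band energies."""
    coeffs = pywt.wavedec(pupil, "db4", level=3)           # wavelet decomposition
    band_energy = [float(np.sum(c ** 2)) for c in coeffs]  # energy per sub-band
    return np.array([durations.mean(), durations.std(),
                     np.median(durations), *band_energy])

X = np.vstack([trial_features(d, p)
               for d, p in zip(fixation_durations, pupil_diameter)])

# Category C: a basic ML algorithm (k-means) on the derived feature vectors.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print(labels[:10])
```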
2.6. Artificial Intelligence Methods Used with Eye Tracking
2.7. Methods for Verification of the Results
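Verification in the surveyed studies typically means estimating metrics such as accuracy (tabulated at the end of this document) or ROC AUC [cf. Fawcett] on held-out data or via cross-validation. As a concrete illustration, below is a minimal sketch of such a verification pipeline; the synthetic data, the SVM classifier, and the 5-fold stratified split are assumptions for the example, not the protocol of any particular study.

```python
# Minimal sketch of a typical verification pipeline (synthetic stand-in data;
# the classifier and split are illustrative, not tied to any single study).
from sklearn.datasets import make_classification
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Stand-in for per-participant eye tracking feature vectors and binary labels.
X, y = make_classification(n_samples=120, n_features=12, random_state=0)

clf = make_pipeline(StandardScaler(), SVC(random_state=0))
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)

acc = cross_val_score(clf, X, y, cv=cv, scoring="accuracy")
auc = cross_val_score(clf, X, y, cv=cv, scoring="roc_auc")  # uses decision_function
print(f"accuracy: {acc.mean():.3f} ± {acc.std():.3f}")
print(f"ROC AUC:  {auc.mean():.3f} ± {auc.std():.3f}")
```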
3. Results
4. Discussion
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Holmqvist, K.; Nyström, M.; Andersson, R.; Dewhurst, R.; Jarodzka, H.; Van de Weijer, J. Eye Tracking: A Comprehensive Guide to Methods and Measures; OUP: Oxford, UK, 2011. [Google Scholar]
- Bojko, A. Eye Tracking the User Experience: A Practical Guide to Research; Rosenfeld Media: Brooklyn, NY, USA, 2013. [Google Scholar]
- Russell, S.J.; Norvig, P. Artificial Intelligence: A Modern Approach, 3rd ed.; Prentice Hall: Upper Saddle River, NJ, USA, 2009. [Google Scholar]
- Bishop, C.M. Pattern Recognition and Machine Learning; Springer: Berlin/Heidelberg, Germany, 2006. [Google Scholar]
- Kitchenham, B.; Charters, S. Guidelines for Performing Systematic Literature Reviews in Software Engineering; Version 2.3; Technical Report, EBSE Technical Report EBSE-2007-01; Keele University and Durham University Joint Report; Keele: Staffs, UK, 2007. [Google Scholar]
- Sharma, K.; Giannakos, M.; Dillenbourg, P. Eye-tracking and artificial intelligence to enhance motivation and learning. Smart Learn Environ. 2020, 7, 13. [Google Scholar] [CrossRef]
- Sharma, K.; Papamitsiou, Z.; Giannakos, M. Building pipelines for educational data using AI and multimodal analytics: A “grey-box” approach. Br. J. Educ. Technol. 2019, 50, 3004–3031. [Google Scholar] [CrossRef]
- Peterson, J.; Pardos, Z.; Rau, M.; Swigart, A.; Gerber, C.; McKinsey, J. Understanding Student Success in Chemistry Using Gaze Tracking and Pupillometry. In Artificial Intelligence in Education; Conati, C., Heffernan, N., Mitrovic, A., Verdejo, M.F., Eds.; Springer International Publishing: Cham, Switzerland, 2015; pp. 358–366. [Google Scholar]
- Zhan, Z.; Zhang, L.; Mei, H.; Fong, P.S.W. Online Learners’ Reading Ability Detection Based on Eye-Tracking Sensors. Sensors 2016, 16, 1457. [Google Scholar] [CrossRef] [PubMed]
- Yi, J.; Sheng, B.; Shen, R.; Lin, W. Real Time Learning Evaluation Based on Gaze Tracking. In Proceedings of the 2015 14th International Conference on Computer-Aided Design and Computer Graphics (CAD/Graphics), Xi’an, China, 26–28 August 2015; pp. 157–164. [Google Scholar]
- Liao, W.-H.; Chang, C.-W.; Wu, Y.-C. Classification of Reading Patterns Based on Gaze Information. In Proceedings of the 2017 IEEE International Symposium on Multimedia (ISM), Taichung, Taiwan, 11–13 December 2017; pp. 595–600. [Google Scholar]
- González-Garduño, A.; Søgaard, A. Learning to Predict Readability Using Eye-Movement Data From Natives and Learners. In Proceedings of the AAAI Conference on Artificial Intelligence, New Orleans, LA, USA, 2–7 February 2018. [Google Scholar] [CrossRef]
- Garain, U.; Pandit, O.; Augereau, O.; Okoso, A.; Kise, K. Identification of Reader Specific Difficult Words by Analyzing Eye Gaze and Document Content. In Proceedings of the 2017 14th IAPR International Conference on Document Analysis and Recognition (ICDAR), Kyoto, Japan, 9–15 November 2017; pp. 1346–1351. [Google Scholar]
- Orlosky, J.; Huynh, B.; Hollerer, T. Using Eye Tracked Virtual Reality to Classify Understanding of Vocabulary in Recall Tasks. In Proceedings of the 2019 IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR), San Diego, CA, USA, 9–11 December 2019; pp. 66–667. [Google Scholar]
- Li, J.; Ngai, G.; Leong, H.V.; Chan, S.C.F. Your Eye Tells How Well You Comprehend. In Proceedings of the 2016 IEEE 40th Annual Computer Software and Applications Conference (COMPSAC 2016), Atlanta, GA, USA, 10–14 June 2016; Volume 2, pp. 503–508. [Google Scholar]
- Howe, A.; Nguyen, P. SAT Reading Analysis Using Eye-Gaze Tracking Technology and Machine Learning. In Intelligent Tutoring Systems; Nkambou, R., Azevedo, R., Vassileva, J., Eds.; Springer International Publishing: Cham, Switzerland, 2018; pp. 332–338. [Google Scholar]
- Conati, C.; Lallé, S.; Rahman, M.A.; Toker, D. Further Results on Predicting Cognitive Abilities for Adaptive Visualizations. In Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence, Melbourne, Australia, 19–25 August 2017; pp. 1568–1574. [Google Scholar]
- Lallé, S.; Toker, D.; Conati, C.; Carenini, G. Prediction of Users’ Learning Curves for Adaptation while Using an Information Visualization. In Proceedings of the 20th International Conference on Intelligent User Interfaces, Atlanta, GA, USA, 29 March–1 April 2015; pp. 357–368. [Google Scholar]
- Lallé, S.; Conati, C.; Carenini, G. Prediction of individual learning curves across information visualizations. User Model. User-Adapt. Interact. 2016, 26, 307–345. [Google Scholar] [CrossRef]
- Król, M.; Król, M.E. Eye movement anomalies as a source of diagnostic information in decision process analysis. J. Exp. Psychol. Learn Mem. Cogn. 2021, 47, 1012–1026. [Google Scholar] [CrossRef]
- Prieto, L.; Sharma, K.; Dillenbourg, P.; Rodríguez-Triana, M.J. Teaching Analytics: Towards Automatic Extraction of Orchestration Graphs Using Wearable Sensors. In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK ’16), Edinburgh, UK, 25–29 April 2016. [Google Scholar] [CrossRef]
- Matsuda, Y.; Fedotov, D.; Takahashi, Y.; Arakawa, Y.; Yasumoto, K.; Minker, W. EmoTour: Estimating Emotion and Satisfaction of Users Based on Behavioral Cues and Audiovisual Data. Sensors 2018, 18, 3978. [Google Scholar] [CrossRef] [PubMed]
- Pappas, I.O.; Sharma, K.; Mikalef, P.; Giannakos, M.N. How Quickly Can We Predict Users’ Ratings on Aesthetic Evaluations of Websites? Employing Machine Learning on Eye-Tracking Data. Responsible Des Implement Use Inf. Commun. Technol. 2020, 12067, 429–440. [Google Scholar]
- Sun, W.; Li, Y.; Sheopuri, A.; Teixeira, T. Computational Creative Advertisements. In Companion Proceedings of The Web Conference 2018 (WWW ’18), Lyon, France, 23–27 April 2018. [Google Scholar] [CrossRef]
- Schweikert, C.; Gobin, L.; Xie, S.; Hsu, D.F. Preference Prediction Based on Eye Movement Using Multi-layer Combinatorial Fusion. In Brain Informatics; Wang, S., Yamamoto, V., Su, J., Yang, Y., Jones, E., Iasemidis, L., Mitchell, T., Eds.; Springer International Publishing: Cham, Switzerland, 2018; pp. 282–293. [Google Scholar]
- Emsawas, T.; Fukui, K.; Numao, M. Feasible Affect Recognition in Advertising Based on Physiological Responses from Wearable Sensors. In Advances in Artificial Intelligence; Ohsawa, Y., Yada, K., Ito, T., Takama, Y., Sato-Shimokawara, E., Abe, A., Mori, J., Matsumura, N., Eds.; Springer International Publishing: Cham, Switzerland, 2020; pp. 27–36. [Google Scholar]
- Felício, C.Z.; De Almeida, C.M.M.; Alves, G.; Pereira, F.S.F.; Paixão, K.V.R.; De Amo, S.; Barcelos, C.A.Z. VP-Rec: A Hybrid Image Recommender Using Visual Perception Network. In Proceedings of the 2016 IEEE 28th International Conference on Tools with Artificial Intelligence (ICTAI), San Jose, CA, USA, 6–8 November 2016; pp. 70–77. [Google Scholar]
- Abdessalem, H.B.; Chaouachi, M.; Boukadida, M.; Frasson, C. Toward Real-Time System Adaptation Using Excitement Detection from Eye Tracking. In Intelligent Tutoring Systems; Coy, A., Hayashi, Y., Chang, M., Eds.; Springer International Publishing: Cham, Switzerland, 2019; pp. 214–223. [Google Scholar]
- Gonzalez Viejo, C.; Fuentes, S.; Howell, K.; Dunshea, F.R. Robotics and computer vision techniques combined with non-invasive consumer biometrics to assess quality traits from beer foamability using machine learning: A potential for artificial intelligence applications. Food Control 2018, 92, 72–79. [Google Scholar] [CrossRef]
- Healy, G.; Smeaton, A. Eye fixation related potentials in a target search task. In Proceedings of the 2011 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Boston, MA, USA, 30 August–3 September 2011; pp. 4203–4206. [Google Scholar]
- Smith, J.; Legg, P.; Matovic, M.; Kinsey, K. Predicting User Confidence During Visual Decision Making. ACM Trans. Interact. Intell. Syst. 2018, 8, 10:1–10:30. [Google Scholar] [CrossRef]
- Lallé, S.; Conati, C.; Carenini, G. Predicting Confusion in Information Visualization from Eye Tracking and Interaction Data. In Proceedings of the IJCAI’16: Twenty-Fifth International Joint Conference on Artificial Intelligence, New York, NY, USA, 9–15 July 2016. [Google Scholar]
- Ciupe, A.; Florea, C.; Orza, B.; Vlaicu, A.; Petrovan, B. A Bag of Words Model for Improving Automatic Stress Classification. In Proceedings of the Second International Afro-European Conference for Industrial Advancement AECIA 2015, Villejuif, France, 9–11 September 2015; pp. 339–349. [Google Scholar]
- Lu, W.; Jia, Y. Inferring User Preference in Good Abandonment from Eye Movements. In Web-Age Information Management; Dong, X.L., Yu, X., Li, J., Sun, Y., Eds.; Springer International Publishing: Cham, Switzerland, 2015; pp. 457–460. [Google Scholar]
- López-Gil, J.-M.; Virgili-Gomá, J.; Gil, R.; Guilera, T.; Batalla, I.; Soler-González, J.; García, R. Method for Improving EEG Based Emotion Recognition by Combining It with Synchronized Biometric and Eye Tracking Technologies in a Non-invasive and Low Cost Way. Front. Comput. Neurosci. 2016, 10, 85. [Google Scholar]
- Lu, B.; Duan, X. Facial Expression Recognition Based on Strengthened Deep Belief Network with Eye Movements Information. In Artificial Intelligence in China; Liang, Q., Wang, W., Mu, J., Liu, X., Na, Z., Chen, B., Eds.; Springer: Singapore, 2020; pp. 645–652. [Google Scholar]
- Nag, A.; Haber, N.; Voss, C.; Tamura, S.; Daniels, J.; Ma, J.; Chiang, B.; Ramachandran, S.; Schwartz, S.; Winograd, T.; et al. Toward Continuous Social Phenotyping: Analyzing Gaze Patterns in an Emotion Recognition Task for Children With Autism Through Wearable Smart Glasses. J. Med. Internet Res. 2020, 22, e13810. [Google Scholar] [CrossRef]
- Liu, W.; Yu, X.; Raj, B.; Yi, L.; Zou, X.; Li, M. Efficient autism spectrum disorder prediction with eye movement: A machine learning framework. In Proceedings of the 2015 International Conference on Affective Computing and Intelligent Interaction (ACII), Xi’an, China, 21–24 September 2015; pp. 649–655. [Google Scholar]
- Kacur, J.; Polec, J.; Csoka, F.; Smolejova, E. GMM Based Detection of Schizophrenia Using Eye Tracking. In Proceedings of the 2019 IEEE Conference on Computational Intelligence in Bioinformatics and Computational Biology (CIBCB), Tuscany, Italy, 9–11 July 2019; pp. 1–4. [Google Scholar]
- Przybyszewski, A.W.; Szlufik, S.; Dutkiewicz, J.; Habela, P.; Koziorowski, D.M. Machine Learning on the Video Basis of Slow Pursuit Eye Movements Can Predict Symptom Development in Parkinson’s Patients. In Intelligent Information and Database Systems; Nguyen, N.T., Trawiński, B., Kosala, R., Eds.; Springer International Publishing: Cham, Switzerland, 2015; pp. 268–276. [Google Scholar]
- Rello, L.; Ballesteros, M. Detecting readers with dyslexia using machine learning with eye tracking measures. In Proceedings of the 12th International Web for All Conference, Florence, Italy, 18–20 May 2015; Association for Computing Machinery: New York, NY, USA, 2015; pp. 1–8. [Google Scholar]
- Kupas, D.; Harangi, B.; Czifra, G.; Andrassy, G. Decision support system for the diagnosis of neurological disorders based on gaze tracking. In Proceedings of the 10th International Symposium on Image and Signal Processing and Analysis, Ljubljana, Slovenia, 18–20 September 2017; pp. 37–40. [Google Scholar]
- Zhang, Y.; Wilcockson, T.; Kim, K.I.; Crawford, T.; Gellersen, H.; Sawyer, P. Monitoring dementia with automatic eye movements analysis. In Intelligent Decision Technologies 2016; Springer: Berlin/Heidelberg, Germany, 2016; pp. 299–309. [Google Scholar]
- Mao, Y.; He, Y.; Liu, L.; Chen, X. Disease Classification Based on Eye Movement Features With Decision Tree and Random Forest. Front. Neurosci. 2020, 14, 798. [Google Scholar] [CrossRef] [PubMed]
- Ahmed, M.; Noble, J.A. An eye-tracking inspired method for standardised plane extraction from fetal abdominal ultrasound volumes. In Proceedings of the 2016 IEEE 13th International Symposium on Biomedical Imaging (ISBI), Prague, Czech Republic, 13–16 April 2016; pp. 1084–1087. [Google Scholar]
- de Lope, J.; Graña, M. Comparison of Labeling Methods for Behavioral Activity Classification Based on Gaze Ethograms. In Hybrid Artificial Intelligent Systems; de la Cal, E.A., Villar Flecha, J.R., Quintián, H., Corchado, E., Eds.; Springer International Publishing: Cham, Switzerland, 2020; pp. 132–144. [Google Scholar]
- Destyanto, T.Y.R.; Lin, R.F. Detecting computer activities using eye-movement features. J. Ambient. Intell. Humaniz. Comput. 2020. [Google Scholar] [CrossRef]
- Kit, D.; Sullivan, B. Classifying mobile eye tracking data with hidden Markov models. In Proceedings of the 18th International Conference on Human-Computer Interaction with Mobile Devices and Services Adjunct, Florence, Italy, 6–9 September 2016; Association for Computing Machinery: New York, NY, USA, 2016; pp. 1037–1040. [Google Scholar]
- Frutos-Pascual, M.; Garcia-Zapirain, B. Assessing Visual Attention Using Eye Tracking Sensors in Intelligent Cognitive Therapies Based on Serious Games. Sensors 2015, 15, 11092–11117. [Google Scholar] [CrossRef] [PubMed]
- Fan, X.; Wang, F.; Song, D.; Lu, Y.; Liu, J. GazMon: Eye Gazing Enabled Driving Behavior Monitoring and Prediction. IEEE Trans. Mob. Comput. 2021, 20, 1420–1433. [Google Scholar] [CrossRef]
- Meng, C.; Zhao, X. Webcam-Based Eye Movement Analysis Using CNN. IEEE Access 2017, 5, 19581–19587. [Google Scholar] [CrossRef]
- Yin, P.-Y.; Day, R.-F.; Wang, Y.-C. Tabu search-based classification for eye-movement behavioral decisions. Neural Comput. Appl. 2018, 29, 1433–1443. [Google Scholar] [CrossRef]
- Fernandes, D.L.; Siqueira-Batista, R.; Gomes, A.P.; Souza, C.R.; da Costa, I.T.; Cardoso, F.d.S.L.; de Assis, J.V.; Caetano, G.H.L.; Cerqueira, F.L. Investigation of the visual attention role in clinical bioethics decision-making using machine learning algorithms. Procedia Comput. Sci. 2017, 108, 1165–1174. [Google Scholar] [CrossRef]
- Zhang, R.; Walshe, C.; Liu, Z.; Guan, L.; Muller, K.S.; Whritner, J.A.; Zhang, L.; Hayhoe, M.M.; Ballard, D.H. Atari-HEAD: Atari Human Eye-Tracking and Demonstration Dataset. Proc. AAAI Conf. Artif. Intell. 2020, 34, 6811–6820. [Google Scholar] [CrossRef]
- Emerson, A.; Henderson, N.; Rowe, J.; Min, W.; Lee, S.; Minogue, J.; Lester, J. Investigating Visitor Engagement in Interactive Science Museum Exhibits with Multimodal Bayesian Hierarchical Models. In Artificial Intelligence in Education; Bittencourt, I.I., Cukurova, M., Muldner, K., Luckin, R., Millán, E., Eds.; Springer International Publishing: Cham, Switzerland, 2020; pp. 165–176. [Google Scholar]
- Eivazi, S.; Slupina, M.; Fuhl, W.; Afkari, H.; Hafez, A.; Kasneci, E. Towards Automatic Skill Evaluation in Microsurgery. In Proceedings of the 22nd International Conference on Intelligent User Interfaces Companion, Limassol, Cyprus, 13–16 March 2017; Association for Computing Machinery: New York, NY, USA, 2017; pp. 73–76. [Google Scholar]
- Ye, N.; Tao, X.; Dong, L.; Li, Y.; Ge, N. Indicating eye contacts in one-to-many video teleconference with one web camera. In Proceedings of the 2015 Asia Pacific Conference on Multimedia and Broadcasting, Bali, Indonesia, 23–25 April 2015; pp. 1–5. [Google Scholar]
- Pettersson, J.; Falkman, P. Human Movement Direction Classification using Virtual Reality and Eye Tracking. Procedia Manuf. 2020, 51, 95–102. [Google Scholar] [CrossRef]
- Hu, B.; Liu, X.; Wang, W.; Cai, R.; Li, F.; Yuan, S. Prediction of interaction intention based on eye movement gaze feature. In Proceedings of the 2019 IEEE 8th Joint International Information Technology and Artificial Intelligence Conference (ITAIC), Chongqing, China, 24–26 May 2019; pp. 378–383. [Google Scholar]
- Castellanos, J.L.; Gomez, M.F.; Adams, K.D. Using machine learning based on eye gaze to predict targets: An exploratory study. In Proceedings of the 2017 IEEE Symposium Series on Computational Intelligence (SSCI), Honolulu, HI, USA, 27 November–1 December 2017; pp. 1–7. [Google Scholar]
- Jadue, J.; Slanzi, G.; Salas, L.; Velásquez, J.D. Web User Click Intention Prediction by Using Pupil Dilation Analysis. In Proceedings of the 2015 IEEE/WIC/ACM International Conference on Web Intelligence and Intelligent Agent Technology (WI-IAT), Singapore, 2–9 December 2015; pp. 433–436. [Google Scholar]
- Chen, O.T.-C.; Chen, P.-C.; Tsai, Y.-T. Attention estimation system via smart glasses. In Proceedings of the 2017 IEEE Conference on Computational Intelligence in Bioinformatics and Computational Biology (CIBCB), Manchester, UK, 23–25 August 2017; pp. 1–5. [Google Scholar]
- Delvigne, V.; Wannous, H.; Vandeborre, J.-P.; Ris, L.; Dutoit, T. Attention Estimation in Virtual Reality with EEG based Image Regression. In Proceedings of the 2020 IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR), Utrecht, The Netherlands, 14–18 December 2020; IEEE Computer Society: Washington, DC, USA, 2020; pp. 10–16. [Google Scholar]
- Yoshizawa, A.; Nishiyama, H.; Iwasaki, H.; Mizoguchi, F. Machine-learning approach to analysis of driving simulation data. In Proceedings of the 2016 IEEE 15th International Conference on Cognitive Informatics & Cognitive Computing (ICCI*CC), Palo Alto, CA, USA, 22–23 August 2016; pp. 398–402. [Google Scholar]
- Koma, H.; Harada, T.; Yoshizawa, A.; Iwasaki, H. Considering eye movement type when applying random forest to detect cognitive distraction. In Proceedings of the 2016 IEEE 15th International Conference on Cognitive Informatics & Cognitive Computing (ICCI*CC), Palo Alto, CA, USA, 22–23 August 2016; pp. 377–382. [Google Scholar]
- Liu, T.; Yang, Y.; Huang, G.-B.; Yeo, Y.K.; Lin, Z. Driver Distraction Detection Using Semi-Supervised Machine Learning. IEEE Trans. Intell. Transp. Syst. 2016, 17, 1108–1120. [Google Scholar] [CrossRef]
- Bixler, R.; D’Mello, S. Automatic Gaze-Based Detection of Mind Wandering with Metacognitive Awareness. In User Modeling, Adaptation and Personalization; Ricci, F., Bontcheva, K., Conlan, O., Lawless, S., Eds.; Springer International Publishing: Cham, Switzerland, 2015; pp. 31–43. [Google Scholar]
- Yamada, Y.; Kobayashi, M. Detecting mental fatigue from eye-tracking data gathered while watching video: Evaluation in younger and older adults. Artif. Intell. Med. 2018, 91, 39–48. [Google Scholar] [CrossRef] [PubMed]
- Shojaeizadeh, M.; Djamasbi, S.; Paffenroth, R.C.; Trapp, A.C. Detecting task demand via an eye tracking machine learning system. Decis. Support Syst. 2019, 116, 91–101. [Google Scholar] [CrossRef]
- Lotz, A.; Weissenberger, S. Predicting Take-Over Times of Truck Drivers in Conditional Autonomous Driving. Adv. Intell. Syst. Comput. 2019, 786, 329–338. [Google Scholar]
- Monfort, S.S.; Sibley, C.M.; Coyne, J.T. Using machine learning and real-time workload assessment in a high-fidelity UAV simulation environment. In Next-Generation Analyst IV; SPIE: Bellingham, WA, USA, 2016; pp. 93–102. [Google Scholar]
- Mannaru, P.; Balasingam, B.; Pattipati, K.; Sibley, C.; Coyne, J. Cognitive Context Detection for Adaptive Automation. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2016, 60, 223–227. [Google Scholar] [CrossRef]
- Larue, G.S.; Rakotonirainy, A.; Pettitt, A.N. Predicting Reduced Driver Alertness on Monotonous Highways. IEEE Pervasive Comput. 2015, 14, 78–85. [Google Scholar] [CrossRef]
- Liu, D.; Dong, B.; Gao, X.; Wang, H. Exploiting Eye Tracking for Smartphone Authentication. In Applied Cryptography and Network Security; Malkin, T., Kolesnikov, V., Lewko, A.B., Polychronakis, M., Eds.; Springer International Publishing: Cham, Switzerland, 2015; pp. 457–477. [Google Scholar]
- Tiwari, A.; Pal, R. Gaze-Based Graphical Password Using Webcam. In Information Systems Security; Ganapathy, V., Jaeger, T., Shyamasundar, R., Eds.; ICISS 2018. Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2018; Volume 11281. [Google Scholar] [CrossRef]
- Li, N.; Wu, Q.; Liu, J.; Hu, W.; Qin, B.; Wu, W. EyeSec: A Practical Shoulder-Surfing Resistant Gaze-Based Authentication System. In Information Security Practice and Experience; Liu, J.K., Samarati, P., Eds.; Springer International Publishing: Cham, Switzerland, 2017; pp. 435–453. [Google Scholar]
- Das, A.; Pal, U.; Ferrer Ballester, M.A.; Blumenstein, M. Multi-angle based lively sclera biometrics at a distance. In Proceedings of the 2014 IEEE Symposium on Computational Intelligence in Biometrics and Identity Management (CIBIM), Orlando, FL, USA, 9–12 December 2014; pp. 22–29. [Google Scholar]
- Qiao, Y.; Wang, J.; Chen, J.; Ren, J. Design and Realization of Gaze Gesture Control System for Flight Simulation. J. Phys. Conf. Ser. 2020, 1693, 012213. [Google Scholar] [CrossRef]
- Kabir, A.; Shahin, F.B.; Islam, M. Design and Implementation of an EOG-based Mouse Cursor Control for Application in Human-Computer Interaction. J. Phys. Conf. Ser. 2020, 1487, 012043. [Google Scholar] [CrossRef]
- Reda, R.; Tantawi, M.; Shedeed, H.; Tolba, M.F. Eye Movements Recognition Using Electrooculography Signals. In Proceedings of the International Conference on Artificial Intelligence and Computer Vision (AICV2020); Hassanien, A.-E., Azar, A.T., Gaber, T., Oliva, D., Tolba, F.M., Eds.; Springer International Publishing: Cham, Switzerland, 2020; pp. 490–500. [Google Scholar]
- Pai, S.; Bhardwaj, A. Eye Gesture Based Communication for People with Motor Disabilities in Developing Nations. In Proceedings of the 2019 International Joint Conference on Neural Networks (IJCNN), Budapest, Hungary, 14–19 July 2019; pp. 1–8. [Google Scholar]
- Taban, R.A.; Croock, M.S. Eye Tracking Based Directional Control System using Mobile Applications. Int. J. Comput. Digit. Syst. 2018, 7, 365–374. [Google Scholar]
- López, A.; Fernández, D.; Ferrero, F.J.; Valledor, M.; Postolache, O. EOG signal processing module for medical assistive systems. In Proceedings of the 2016 IEEE International Symposium on Medical Measurements and Applications (MeMeA), Benevento, Italy, 15–18 May 2016; pp. 1–5. [Google Scholar]
- Jigang, L.; Francis, B.S.L.; Rajan, D. Free-Head Appearance-Based Eye Gaze Estimation on Mobile Devices. In Proceedings of the 2019 International Conference on Artificial Intelligence in Information and Communication (ICAIIC), Okinawa, Japan, 11–13 February 2019; pp. 232–237. [Google Scholar]
- Semmelmann, K.; Weigelt, S. Online webcam-based eye tracking in cognitive science: A first look. Behav. Res. Methods 2018, 50, 451–465. [Google Scholar] [CrossRef]
- Papoutsaki, A.; Sangkloy, P.; Laskey, J.; Daskalova, N.; Huang, J.; Hays, J. Webgazer: Scalable webcam eye tracking using user interactions. In Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence, New York, NY, USA, 9–15 July 2016; AAAI Press: New York, NY, USA, 2016; pp. 3839–3845. [Google Scholar]
- Saikh, T.; Bangalore, S.; Carl, M.; Bandyopadhyay, S. Predicting source gaze fixation duration: A machine learning approach. In Proceedings of the 2015 International Conference on Cognitive Computing and Information Processing (CCIP), Noida, India, 3–4 March 2015. [Google Scholar] [CrossRef]
- Valliappan, N.; Dai, N.; Steinberg, E.; He, J.; Rogers, K.; Ramachandran, V.; Xu, P.; Shojaeizadeh, M.; Guo, L.; Kohlhoff, K.; et al. Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nat. Commun. 2020, 11, 4553. [Google Scholar] [CrossRef] [PubMed]
- Tősér, Z.; Rill, R.A.; Faragó, K.; Jeni, L.A.; Lőrincz, A. Personalization of Gaze Direction Estimation with Deep Learning. In Proceedings of the KI 2016: Advances in Artificial Intelligence, Klagenfurt, Austria, 26–30 September 2016; pp. 200–207. [Google Scholar]
- Dechterenko, F.; Lukavsky, J. Predicting eye movements in multiple object tracking using neural networks. In Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications; Association for Computing Machinery: New York, NY, USA, 2016; pp. 271–274. [Google Scholar]
- Lai, H.-Y.; Saavedra-Peña, G.; Sodini, C.G.; Sze, V.; Heldt, T. Measuring Saccade Latency Using Smartphone Cameras. IEEE J. Biomed Health Inform. 2020, 24, 885–897. [Google Scholar] [CrossRef]
- Brousseau, B.; Rose, J.; Eizenman, M. Hybrid Eye-Tracking on a Smartphone with CNN Feature Extraction and an Infrared 3D Model. Sensors 2020, 20, 543. [Google Scholar] [CrossRef] [PubMed]
- Rakhmatulin, I.; Duchowski, A.T. Deep Neural Networks for Low-Cost Eye Tracking. Procedia Comput. Sci. 2020, 176, 685–694. [Google Scholar] [CrossRef]
- Al-Btoush, A.I.; Abbadi, M.A.; Hassanat, A.B.; Tarawneh, A.S.; Hasanat, A.; Prasath, V.B.S. New Features for Eye-Tracking Systems: Preliminary Results. In Proceedings of the 2019 10th International Conference on Information and Communication Systems (ICICS), Irbid, Jordan, 11–13 June 2019; pp. 179–184. [Google Scholar]
- Hossain, M.S.; Ali, A.A.; Amin, M.A. Eye-Gaze to Screen Location Mapping for UI Evaluation of Webpages. In Proceedings of the 2019 3rd International Conference on Graphics and Signal Processing, Hong Kong, China, 1–3 June 2019; Association for Computing Machinery: New York, NY, USA, 2019; pp. 100–104. [Google Scholar]
- Krafka, K.; Khosla, A.; Kellnhofer, P.; Kannan, H.; Bhandarkar, S.; Matusik, W.; Torralba, A. Eye Tracking for Everyone. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; IEEE Computer Society: Washington, DC, USA, 2016; pp. 2176–2184. [Google Scholar]
- Wan, Q.; Kaszowska, A.; Samani, A.; Panetta, K.; Taylor, H.A.; Agaian, S. Aerial Border Surveillance for Search and Rescue Missions Using Eye Tracking Techniques. In Proceedings of the 2018 IEEE International Symposium on Technologies for Homeland Security (HST), Woburn, MA, USA, 23–24 October 2018; pp. 1–5. [Google Scholar]
- Xiaodong, D.; Bo, L.; Peng, L.; Chunhong, G. Study of Eye Movement Behavior Pattern Diversity between Chinese Ethnic Groups. In Proceedings of the 2015 IEEE International Conference on Computational Intelligence & Communication Technology, London, UK, 8–12 June 2015; pp. 767–770. [Google Scholar]
- Holmqvist, K.; Andersson, R. Eye-Tracking: A Comprehensive Guide to Methods, Paradigms and Measures; OUP Oxford: Oxford, UK, 2017. [Google Scholar]
- Leube, A.; Rifai, K.; Wahl, S. Sampling rate influences saccade detection in mobile eye tracking of a reading task. J. Eye Mov. Res. 2017, 10. [Google Scholar] [CrossRef] [PubMed]
- Juhola, M.; Jäntti, V.; Pyykkö, I. Effect of sampling frequencies on computation of the maximum velocity of saccadic eye movements. Biol. Cybern. 1985, 53, 67–72. [Google Scholar] [CrossRef]
- Andersson, R.; Nyström, M.; Holmqvist, K. Sampling frequency and eye-tracking measures: How speed affects durations, latencies, and more. J. Eye Mov. Res. 2010, 3, 1–12. [Google Scholar] [CrossRef]
- Rolfs, M. Microsaccades: Small steps on a long way. Vision Res. 2009, 49, 2415–2441. [Google Scholar] [CrossRef]
- Tobii Pro Lab User Manual; Tobii AB: Danderyd, Sweden, 2021.
- Mathôt, S. Pupillometry: Psychology, Physiology, and Function. J. Cogn. 2018, 1, 1–23. [Google Scholar] [CrossRef]
- Carette, R.; Elbattah, M.; Cilia, F.; Dequen, G.; Guérin, J.L.; Bosche, J. Learning to Predict Autism Spectrum Disorder based on the Visual Patterns of Eye-tracking Scanpaths. In Proceedings of the 12th International Joint Conference on Biomedical Engineering Systems and Technologies (BIOSTEC 2019), Prague, Czech Republic, 22–24 February 2019; pp. 103–112. [Google Scholar]
- Elbattah, M.; Carette, R.; Dequen, G.; Guérin, J.L.; Cilia, F. Learning clusters in autism spectrum disorder: Image-based clustering of eye-tracking scanpaths with deep autoencoder. In Proceedings of the 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Berlin, Germany, 23–27 July 2019. [Google Scholar]
- Fawcett, T. An introduction to ROC analysis. Pattern Recognit. Lett. 2006, 27, 861–874. [Google Scholar] [CrossRef]
Additional (non-eye tracking) parameters used together with AI in the surveyed studies:

Ref. | Task | Additional Parameters
---|---|---
[64] | attention estimation | EEG, head movement
[38] | identifying children with ASD | questionnaire, age, gender
[56] | predicting dwell time in a museum | facial expression, body movement, interaction trace logs
[27] | affect recognition | EEG, ECG
[8] | predicting students’ performance and effort | EEG, face videos, arousal data from wristband
[71] | predicting take-over time | head position, body posture, simulation data
[30] | predicting liking a video | infrared thermal image, heart rate, facial expression
[32] | predicting user confidence | time
[25] | predicting reaction to ads | gender, age, survey, time, ad parameters, behavior connected with an ad (e.g., sharing)
[13] | predicting readability | text features
[17] | predicting SAT score | time
[36] | predicting the emotion of an observed person | EEG, Empatica wristband
[32] | predicting social plane of interaction | EEG, accelerometer, audio, video
[33] | detecting user confusion | mouse actions, distance of the user’s head from the screen
[72] | predicting mental workload | reaction time
[42] | detecting people with dyslexia | age, text characteristics
[74] | predicting reduced driver alertness | EEG
[19] | predicting learning curve | perceptual speed, verbal working memory, visual working memory, locus of control
[37] | classifying emotions in pictures | image
[89] | predicting eye movement | distance between the object and the distractor
[41] | predicting Parkinson symptoms’ development | age, sex, duration of the disease
[23] | emotion estimation | head movement, body movement, audio, video of the face
AI methods applied to eye tracking data in the surveyed studies, with the number of participants (N) and the reported accuracy (ns: not specified; ext.: an external, pre-existing dataset was used):

Ref. | Task | AI Method | N | Accuracy
---|---|---|---|---
[47] | detecting the type of behavior when using a laptop | SVM | ns | 99.77%
[67] | detecting driver distraction | Semi-Supervised Extreme Learning Machine | 34 | 97.2%
[95] | gaze estimation | Random Forest | 10 | 97.2%
[37] | classifying emotions in pictures | Strengthened Deep Belief Network | 40 | 97.1%
[45] | predicting neurological diseases | Random Forest | 96 | 96.88%
[12] | predicting type of reading | SVM | 30 | 96.69%
[81] | detecting eye gestures | Naive Bayes | ext. | 95.0%
[11] | reading behavior recognition | Hidden Markov Model | 4 | 95.0%
[92] | gaze estimation | ANN | 29 | 94.1%
[61] | predicting targets | MLP | 5 | 94.0%
[48] | detecting computer activity type | CNN | 150 | 93.15%
[76] | authentication | CNN | 26 | 93.14%
[63] | detecting attention | SVM | 10 | 93.1%
[80] | detecting eye gestures | SVM | 5 | 93.0%
[46] | detecting organs | AdaBoost | 10 | 92.5%
[29] | predicting excitement | DNN | 20 | 92.0%
[75] | smartphone authentication | Random Sample Consensus | 21 | 91.6%
[82] | eye gestures for patients | Recurrent Neural Network | 270 | 91.4%
[69] | detecting mental fatigue | SVM | 18 | 91.0%
[54] | predicting ethical decision-making | MLP | 75 | 90.7%
[22] | predicting social plane of interaction | Gradient Boosted Decision Tree | 1 | 90.6%
[59] | predicting movement intention | CNN | 24 | 88.37%
[32] | predicting user confidence | Random Forest | 23 | 88.0%
[73] | detecting operator overload | Linear Discriminant Analysis | 20 | 87.91%
[39] | detecting people with ASD | SVM | 130 | 86.89%
[13] | predicting readability | MLP | ext. | 86.62%
[50] | identifying children’s behavior | Random Forest | 32 | 84.0%
[38] | distinguishing children with ASD | Logistic Regression | 33 | 83.9%
[62] | predicting web user click intention | ANN | 25 | 82.0%
[83] | choosing direction | Viola–Jones algorithm with Haar cascade classifiers | ns | 82.0%
[30] | predicting liking a video | ANN | 30 | 81.8%
[35] | detecting satisfaction | SVM | 30 | 80.53%
[42] | detecting people with dyslexia | SVM | 97 | 80.18%
[20] | detecting speed of learning | Random Forest | 161 | 80.0%
[99] | distinguishing Chinese ethnic groups | SVM | 35 | 80.0%
[58] | detecting eye contact | SVM | ns | 80.0%
[41] | predicting Parkinson symptoms’ development | Decomposition Tree | 10 | 79.5%
[40] | detecting schizophrenia | GMM-based method | 44 | 79.2%
[70] | detecting task demand | Random Forest | 48 | 79.0%
[72] | predicting mental workload | Ensemble | 20 | 78.0%
[19] | predicting learning curve | Random Forest | 95 | 77.0%
[17] | predicting SAT score | Decision Tree | 30 | 76.67%
[15] | predicting word understanding | SVM | 16 | 75.6%
[68] | detecting mind wandering | SVM | 178 | 74.0%
[65] | detecting cognitive distraction | SVM | 18 | 73.0%
[27] | affect recognition | Long Short-Term Memory Network | 130 | 72.8%
[57] | automatic surgery skills assessment | Random Forest | 9 | 69.0%
[96] | gaze estimation | ANN | 10 | 68.31%
[18] | predicting user’s cognitive abilities | Random Forest | 166 | 66.1%
[60] | predicting intention | SVM | 20 | 64.0%
[9] | predicting student’s performance | Logistic Regression | 95 | 63.0%
[32] | detecting user confusion | Random Forest | 136 | 61.0%
[49] | predicting type of task | Hidden Markov Model | 8 | 57.0%
[21] | predicting the prior disclosure type from eye data | Gradient Boosted Decision Tree | 20 | 53.5%
[88] | predicting the duration of gaze fixation | SVM | ext. | 49.1%
[36] | predicting the emotion of an observed person | MLP | 44 | 42.7%