Eye Tracking: Cognition, Computation and Challenges
A special issue of Future Internet (ISSN 1999-5903). This special issue belongs to the section "Big Data and Augmented Intelligence".
Deadline for manuscript submissions: closed (22 August 2022)
Special Issue Editor
Interests: multimedia security; steganography; secret sharing; information hiding; blockchain; digital forensics; deep learning
Special Issue Information
Dear Colleagues,
Eye tracking is an important method of measuring a person's eye movements in order to capture their attentional behavior, and some researchers consider it a window into the brain and mind. It has been widely used for decades across many research areas, such as visual attention, digital forensics, driving, gaming, and medicine. Despite these achievements, however, the high cost and limited capabilities of eye tracking equipment and techniques still make eye tracking quite challenging.
Machine learning (ML) techniques have proven effective in computer vision and have attracted considerable attention in recent years. Typical examples are deep learning models such as deep convolutional neural networks, which have achieved great success in basic computer vision tasks such as image classification and object detection. Given this promising performance, it is natural to extend these techniques to eye tracking.
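To make the contrast concrete, a minimal toy sketch (not taken from the call) of the kind of hand-crafted step that learning-based methods aim to replace: classical eye tracking pipelines often begin by locating the pupil as the darkest region of an eye image, whereas deep models learn such features directly from data. All names and the synthetic image below are illustrative assumptions.

```python
import numpy as np

def estimate_pupil_center(image):
    """Naive pupil localization: centroid of the darkest pixels.

    A hand-crafted baseline only; appearance-based deep models
    would instead regress gaze or pupil position from raw pixels.
    """
    # Threshold halfway between the darkest and brightest pixel.
    threshold = 0.5 * (image.min() + image.max())
    rows, cols = np.nonzero(image <= threshold)
    return float(rows.mean()), float(cols.mean())

# Synthetic 64x64 grayscale "eye": bright background with a dark
# disc (the pupil) centered at row 40, column 22.
img = np.full((64, 64), 200.0)
yy, xx = np.mgrid[:64, :64]
img[(yy - 40) ** 2 + (xx - 22) ** 2 <= 25] = 20.0

center = estimate_pupil_center(img)  # approximately (40.0, 22.0)
```

Such intensity-threshold heuristics break down under the varying illumination and image quality that motivate the learning-based approaches this Special Issue solicits.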
The aim of this Special Issue is to invite authors to submit original manuscripts that explore deep-learning-related techniques and methodologies and their applications in eye tracking.
Prof. Dr. Zhili Zhou
Guest Editor
Manuscript Submission Information
Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.
Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Future Internet is an international peer-reviewed open access monthly journal published by MDPI.
Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and written in good English. Authors may use MDPI's English editing service prior to publication or during author revisions.
Keywords
- machine learning
- information security
- digital forensics
- deep learning
- eye tracking
Benefits of Publishing in a Special Issue
- Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
- Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
- Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
- External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
- Reprint: MDPI Books provides the opportunity to republish successful Special Issues in book format, both online and in print.
Further information on MDPI's Special Issue policies can be found here.