Search Results (7)

Search Parameters:
Keywords = mobile and wearable eye tracking

18 pages, 1039 KiB  
Review
Identifying Pertinent Digital Health Topics to Incorporate into Self-Care Pharmacy Education
by Jason C. Wong, Luiza Hekimyan, Francheska Anne Cruz and Taylor Brower
Pharmacy 2024, 12(3), 96; https://doi.org/10.3390/pharmacy12030096 - 20 Jun 2024
Cited by 2 | Viewed by 2540
Abstract
The ever-evolving landscape of digital health technology has dramatically enhanced patients’ ability to manage their health effectively through self-care. These advancements have created various categories of self-care products, including medication management, health tracking, and wellness. There is no published research regarding integrating digital health into pharmacy self-care courses. This study aims to identify pertinent digital health devices and applications to incorporate into self-care course education. Digital health limitations, challenges in incorporating digital health into self-care pharmacy education, and potential solutions are also reviewed. In conducting this research, many resources, including PubMed, APhA, ASHP, fda.gov, and digital.health, were reviewed in March 2024 to gather information on digital health devices and applications. To supplement this, targeted keyword searches were conducted on topics such as “digital health”, “devices”, “applications”, “technology”, and “self-care” across various online platforms. We identified digital health devices and applications suitable for self-care education across eight topics, as follows: screening, insomnia, reproductive disorders, eye disorders, home medical equipment, GI disorders, pediatrics, and respiratory disorders. Among these topics, wellness screening had the most digital health products available. For all other topics, at least three products were identified as relevant to the self-care curriculum. By equipping students with digital health knowledge, they can effectively apply it in patient care throughout their rotations and future practice. Many digital health products, including telemedicine, electronic health records, mobile health applications, and wearable devices, are ideal for inclusion in the pharmacy curriculum as future educational material. Future research is needed to develop the best strategies for incorporating relevant digital health into self-care education and defining the best student-learning strategies.
(This article belongs to the Special Issue Digital Health in Pharmacy Practice and Education)

22 pages, 4564 KiB  
Article
Exploring Eye Movement Biometrics in Real-World Activities: A Case Study of Wayfinding
by Hua Liao, Wendi Zhao, Changbo Zhang and Weihua Dong
Sensors 2022, 22(8), 2949; https://doi.org/10.3390/s22082949 - 12 Apr 2022
Cited by 9 | Viewed by 4348
Abstract
Eye movement biometrics can enable continuous verification for highly secure environments such as financial transactions and defense establishments, as well as a more personalized and tailored experience in gaze-based human–computer interactions. However, there are numerous challenges to recognizing people in real environments using eye movements, such as implicitness and stimulus independence. Taking wayfinding as a case study, this research investigates implicit and stimulus-independent eye movement biometrics in real-world situations. We collected 39 subjects’ eye movement data from real-world wayfinding experiments and derived five sets of eye movement features (basic statistical, pupillary response, fixation density, fixation semantic, and saccade encoding features). We adopted a random forest and performed biometric recognition for both identification and verification scenarios. The best accuracy we obtained in the identification scenario was 78% (equal error rate, EER = 6.3%) with 10-fold classification and 64% (EER = 12.1%) with leave-one-route-out classification. The best accuracy we achieved in the verification scenario was 89% (EER = 9.1%). Additionally, we tested performance across the 5 feature sets and 20 time window sizes. The results showed that verification accuracy was insensitive to increases in time window size. These findings are the first indication of the viability of performing implicit and stimulus-independent biometric recognition in real-world settings using wearable eye tracking.
(This article belongs to the Special Issue Wearable Technologies and Applications for Eye Tracking)

22 pages, 10247 KiB  
Review
HMD-Based VR Tool for Traffic Psychological Examination: Conceptualization and Design Proposition
by Vojtěch Juřík, Václav Linkov, Petr Děcký, Sára Klečková and Edita Chvojková
Appl. Sci. 2021, 11(19), 8832; https://doi.org/10.3390/app11198832 - 23 Sep 2021
Cited by 3 | Viewed by 4321
Abstract
In the present theoretical paper, the current body of knowledge regarding the use of wearable virtual reality (VR) technologies for traffic psychological examination (TPE) is introduced and critically discussed, and a specific application is suggested. The combination of wearable head-mounted displays for VR with an interactive and cost-effective haptic driving interface is emphasized as a valid and viable platform for psychological assessment of driving skills, which is in several aspects superior to standard TPE as well as driving simulators. For this purpose, existing psychological examination methods and psychological phenomena relevant to the process of driving are discussed together with the properties and options of VR technology. Special focus is dedicated to situation awareness as a crucial but currently hardly measurable construct, for which VR in combination with embedded eye-tracking (ET) technology represents a promising solution. Furthermore, the suitability and possibilities of these VR tools for valid traffic psychological examination are analyzed and discussed. Additionally, potentially desirable measures for driving assessment based on recent advances in VR are outlined and practical applications are suggested. The aim of this article is to bring together recent advances in TPE, VR, and ET; review previous relevant studies in the field; and propose a concept for a cost-effective, mobile, and expandable HMD-based driving simulator suitable for ecologically valid driving assessment and follow-up TPE in common practice.
(This article belongs to the Special Issue AR, VR: From Latest Technologies to Novel Applications)

32 pages, 10830 KiB  
Article
Gaze Tracking and Point Estimation Using Low-Cost Head-Mounted Devices
by Ko-Feng Lee, Yen-Lin Chen, Chao-Wei Yu, Kai-Yi Chin and Chen-Han Wu
Sensors 2020, 20(7), 1917; https://doi.org/10.3390/s20071917 - 30 Mar 2020
Cited by 20 | Viewed by 7912
Abstract
In this study, a head-mounted device was developed to track the gaze of the eyes and estimate the gaze point on the user’s visual plane. To provide a cost-effective vision tracking solution, this head-mounted device combines a small endoscope camera, infrared light, and a mobile phone; the parts are also produced via 3D printing to reduce costs. Based on the proposed image pre-processing techniques, the system can efficiently extract and estimate the pupil ellipse from the camera module. A 3D eye model was also developed to effectively locate eye gaze points from extracted eye images. In the experimental results, the accuracy, precision, and recall rates of the proposed system averaged over 97%, demonstrating its efficiency. The system can be widely used in the Internet of Things, virtual reality, assistive devices, and human–computer interaction applications.

2 pages, 39 KiB  
Article
From Lab-Based Studies to Eye-Tracking in Virtual and Real Worlds: Conceptual and Methodological Problems and Solutions. Symposium 4 at the 20th European Conference on Eye Movement Research (ECEM) in Alicante, 20.8.2019
by Ignace T. C. Hooge, Roy S. Hessels, Diederick C. Niehorster, Gabriel J. Diaz, Andrew T. Duchowski and Jeff B. Pelz
J. Eye Mov. Res. 2019, 12(7), 1-2; https://doi.org/10.16910/jemr.12.7.8 - 25 Nov 2019
Cited by 5 | Viewed by 144
Abstract
Wearable mobile eye trackers have great potential as they allow the measurement of eye movements during daily activities such as driving, navigating the world, and doing groceries. Although mobile eye trackers have been around for some time, developing and operating these eye trackers was generally a highly technical affair. As such, mobile eye-tracking research was not feasible for most labs. Nowadays, many mobile eye trackers are available from eye-tracking manufacturers (e.g., Tobii, Pupil Labs, SMI, Ergoneers) and various implementations in virtual/augmented reality have recently been released. The wide availability has caused the number of publications using a mobile eye tracker to increase quickly. Mobile eye tracking is now applied in vision science, educational science, developmental psychology, marketing research (using virtual and real supermarkets), clinical psychology, usability, architecture, medicine, and more. Yet, transitioning from lab-based studies where eye trackers are fixed to the world to studies where eye trackers are fixed to the head presents researchers with a number of problems. These problems range from the conceptual frameworks used in world-fixed and head-fixed eye tracking and how they relate to each other, to the lack of data quality comparisons and field tests of the different mobile eye trackers, and how the gaze signal can be classified or mapped to the visual stimulus. Such problems need to be addressed in order to understand how world-fixed and head-fixed eye-tracking research can be compared and to understand the full potential and limits of what mobile eye tracking can deliver. In this symposium, we bring together presenting researchers from five different institutions (Lund University, Utrecht University, Clemson University, Birkbeck University of London, and Rochester Institute of Technology) addressing problems and innovative solutions across the entire breadth of mobile eye-tracking research. Hooge, presenting the paper by Hessels et al., focuses on the definitions of fixations and saccades held by researchers in the eye-movement field and argues how they need to be clarified in order to allow comparisons between world-fixed and head-fixed eye-tracking research. Diaz et al. introduce machine-learning techniques for classifying the gaze signal in mobile eye-tracking contexts where head and body are unrestrained. Niehorster et al. compare data quality of mobile eye trackers during natural behavior and discuss the application range of these eye trackers. Duchowski et al. introduce a method for automatically mapping gaze to faces using computer vision techniques. Pelz et al. employ state-of-the-art techniques to map fixations to objects of interest in the scene video and align grasp and eye-movement data in the same reference frame to investigate the guidance of eye movements during manual interaction.
31 pages, 10178 KiB  
Article
A New Integrated System for Assistance in Communicating with and Telemonitoring Severely Disabled Patients
by Radu Gabriel Bozomitu, Lucian Niţă, Vlad Cehan, Ioana Dana Alexa, Adina Carmen Ilie, Alexandru Păsărică and Cristian Rotariu
Sensors 2019, 19(9), 2026; https://doi.org/10.3390/s19092026 - 30 Apr 2019
Cited by 13 | Viewed by 5424
Abstract
In this paper, we present a new complex electronic system for facilitating communication with severely disabled patients and telemonitoring their physiological parameters. The proposed assistive system includes three subsystems (Patient, Server, and Caretaker) connected to each other via the Internet. The two-way communication function is based on keyword technology using a web application implemented at the server level, and the application is accessed remotely from the patient’s laptop/tablet PC. The patient’s needs can be detected by using different switch-type sensors that are adapted to the patient’s physical condition or by using eye-tracking interfaces. The telemonitoring function is based on a wearable wireless sensor network, organized around the Internet of Things concept, and the sensors acquire different physiological parameters of the patients according to their needs. The mobile Caretaker device is a smartphone that uses an Android application for communicating with patients and performing real-time monitoring of their physiological parameters. The prototype of the proposed assistive system was tested in the “Dr. C.I. Parhon” Clinical Hospital of Iaşi, Romania, on hospitalized patients from the Clinic of Geriatrics and Gerontology. The system contributes to an increase in the level of care and treatment for disabled patients, which ultimately lowers costs in the healthcare system.
(This article belongs to the Special Issue Sensors and Wearable Assistive Devices)

13 pages, 1625 KiB  
Article
The Central Bias in Day-to-Day Viewing
by Flora Ioannidou, Frouke Hermens and Timothy L. Hodgson
J. Eye Mov. Res. 2016, 9(6), 1-13; https://doi.org/10.16910/jemr.9.6.6 - 30 Sep 2016
Cited by 14 | Viewed by 212
Abstract
Eye tracking studies have suggested that, when viewing images centrally presented on a computer screen, observers tend to fixate the middle of the image. This so-called ‘central bias’ was later also observed in mobile eye tracking during outdoor navigation, where observers were found to fixate the middle of the head-centered video image. It is unclear, however, whether the extension of the central bias to mobile eye tracking in outdoor navigation may have been due to the relatively long viewing distances towards objects in this task and the constant turning of the body in the direction of motion, both of which may have reduced the need for large-amplitude eye movements. To examine whether the central bias in day-to-day viewing is related to the viewing distances involved, we here compare eye movements in three tasks (indoor navigation, tea making, and card sorting), each associated with interactions with objects at different viewing distances. Analysis of gaze positions showed a central bias for all three tasks that was independent of the task performed. These results confirm earlier observations of the central bias in mobile eye-tracking data and suggest that differences in the typical viewing distance during different tasks have little effect on the bias. The results could have interesting technological applications, in which the bias is used to estimate the direction of gaze from head-centered video images, such as those obtained from wearable technology.
