Eyeglass-Type Switch: A Wearable Eye-Movement and Blink Switch for ALS Nurse Call
Abstract
1. Introduction
- Integrated eyeglass-type design for low-light operation: A near-infrared (NIR) imaging system and lightweight CNN-based estimator are embedded in an eyeglass-type form factor to achieve robust pupil and eyelid tracking even in dark, bedside environments.
- Configurable caregiver-centered interaction framework: We introduce a switch operation model that allows caregivers to flexibly adjust geometric and temporal thresholds (e.g., off-area shape, duration parameters) to match individual user conditions, thereby improving safety and personalization.
- Comprehensive validation across real-world settings: The system’s performance was verified through controlled experiments with healthy participants and actual use by multiple ALS patients, as well as public demonstrations including the Osaka–Kansai Expo, confirming its robustness and usability.
2. Related Research and Products
2.1. Alternative Input Interfaces
2.2. Algorithms for Detecting Gaze, Pupil, and Blinks
2.3. Wearable Infrared Tracking for Low-Light and Nighttime Use
2.4. Integration with Nurse-Call Systems and Safety Design
2.5. Positioning of This Work
3. System Architecture
3.1. Design Rationale and Hardware Configuration
3.2. Software Architecture
3.3. Eye-Image Processing Pipeline
- ROI extraction: To suppress background intrusion specific to wearable imaging, the caregiver manually specifies the periocular region of interest (ROI).
- Two-stage CNN estimation: We consider two alternatives for the estimation pipeline (Figure 2). Our default pipeline estimates eight eyelid/eye contour landmarks with CNN1 and the pupil center with CNN2. Eye closure is gated either by a binary open/closed classifier (CNN3) or by a threshold on the angle computed from the CNN1 landmarks.
- User interface (UI) and visualization: Figure 3 shows the main window of the eyeglass-type switch. The upper-left pane overlays the IR eye image with the estimated landmarks and pupil center, allowing the caregiver to verify tracking quality in real time. The upper-right area presents system status (current user, binary eye state, and transmission flag). Below it, a key-hint menu lists caregiver operations such as mode switching, placement/toggling of the off-area map, ROI setting, and user switching. Along the bottom of the window, time thresholds (e.g., long-press and closure times) and runtime messages are displayed for rapid tuning. All on-screen text is presented in Japanese to suit the intended patient/caregiver population; localization to other languages is straightforward. This layout prioritizes caregiver-centered parameter adjustment during setup and bedside operation.
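The landmark-angle gating described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the landmark indexing (inner corner at index 0, upper- and lower-lid midpoints at indices 2 and 6 of the eight CNN1 landmarks) and the angle threshold are assumptions chosen for the example.

```python
import math

def opening_angle(corner, upper_mid, lower_mid):
    """Angle (degrees) at the eye corner between the upper- and
    lower-eyelid midpoint landmarks; a proxy for eyelid aperture."""
    ux, uy = upper_mid[0] - corner[0], upper_mid[1] - corner[1]
    lx, ly = lower_mid[0] - corner[0], lower_mid[1] - corner[1]
    cos_a = (ux * lx + uy * ly) / (math.hypot(ux, uy) * math.hypot(lx, ly))
    return math.degrees(math.acos(cos_a))

def is_closed(landmarks, angle_thresh_deg=8.0):
    """Gate eye closure by thresholding the corner angle computed
    from the 8 CNN1 eyelid/contour landmarks (indexing illustrative)."""
    ang = opening_angle(landmarks[0], landmarks[2], landmarks[6])
    return ang < angle_thresh_deg
```

A wide-open eye yields a large corner angle and a nearly closed lid a small one, so a single scalar threshold suffices; the alternative CNN3 open/closed classifier plays the same gating role.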


3.4. Thresholds, Output Modes, and Operation
4. Algorithms and Interaction Design
4.1. Input and ROI Extraction
4.2. Two-Stage CNN for Feature Estimation
4.3. Design of the Threshold Region (Off-Area)
4.4. Decision Logic for Switch Output
4.4.1. Oculomotor-Based Decision
- Single: Transmit exactly once at the moment of crossing; further transmissions are inhibited until the gaze returns.
- Continuous: Transmit periodically for as long as the gaze remains beyond the boundary; holding gaze alone suffices for operation.
- Long-press: Transmit when the dwell beyond the boundary exceeds the configured dwell time, then transmit once more upon return. Combined with relay latching, this realizes long-press behavior. The dwell time is adjustable from 1.5 s to 3.5 s in 0.5 s steps.
- Hold-to-activate: Transmit exactly once when the boundary crossing persists longer than the configured dwell time; no additional transmission occurs on return. The dwell time is adjustable from 1.5 s to 3.5 s in 0.5 s steps. This mode prevents false activations that can arise in single-shot mode from pupil-center jitter or incidental, unintended glances.
4.4.2. Eyelid-Based Decision (Blink Input)
4.5. Temporal Stabilization and Operational Feedback
4.6. Handling Alignment Error and Individual Differences
5. Experiments
5.1. Feature-Point Detection Experiment
5.2. Switch Operation Experiment
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References






| CNN Model | Number of Points | Training Method | Error [Pixel] |
|---|---|---|---|
| Xception | 9 | FT | 2.84 |
| Xception | 17 | FT | 2.98 |
| Xception | 8 | FT | 3.44 |
| VGG16 | 9 | FT | 5.04 |
| VGG16-D | 8 | FT | 5.05 |

| CNN Model | Number of Points | Training Method | Error [Pixel] |
|---|---|---|---|
| VGG16-BN | 1 | D1+D2 | 1.33 |
| VGG16-D | 1 | D1 | 1.33 |
| VGG16 | 1 | FT | 1.41 |
| Xception | 1 | D1+D2 | 1.44 |
| Xception | 1 | FT | 1.46 |

| Mode | TP | FN | FP | R | P | F |
|---|---|---|---|---|---|---|
| # 1 | 326 | 4 | 27 | 0.988 | 0.924 | 0.955 |
| # 2 | 321 | 9 | 1 | 0.973 | 0.997 | 0.985 |
| right | 132 | 0 | 2 | 1.000 | 0.985 | 0.992 |
| left | 131 | 1 | 0 | 0.992 | 1.000 | 0.996 |
| up | 132 | 0 | 9 | 1.000 | 0.936 | 0.967 |
| down | 120 | 12 | 16 | 0.909 | 0.882 | 0.896 |
| close [eyes] | 132 | 0 | 1 | 1.000 | 0.992 | 0.996 |
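Assuming the three count columns are true positives, false negatives, and false positives (an inference that the tabulated R/P/F values bear out row by row), the metric columns follow the standard recall/precision/F-measure definitions:

```python
def prf(tp, fn, fp):
    """Recall, precision, and F-measure from detection counts,
    rounded to three decimals as in the table."""
    r = tp / (tp + fn)          # recall
    p = tp / (tp + fp)          # precision
    f = 2 * p * r / (p + r)     # harmonic mean (F-measure)
    return round(r, 3), round(p, 3), round(f, 3)
```

For example, the mode #2 row (TP = 321, FN = 9, FP = 1) reproduces R = 0.973, P = 0.997, F = 0.985.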
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Tamai, R.; Saitoh, T.; Itoh, K.; Zhang, H. Eyeglass-Type Switch: A Wearable Eye-Movement and Blink Switch for ALS Nurse Call. Electronics 2025, 14, 4201. https://doi.org/10.3390/electronics14214201

