Continuous Behavioral Biometric Authentication for Secure Metaverse Workspaces in Digital Environments
Abstract
1. Introduction
- To address unauthorized access by unauthenticated users in the Metaverse office, we propose an active and continuous user authentication framework based on virtual hand movements and dwell time.
- Based on a systematic literature review, we analyze prior studies on user identification or authentication for the Metaverse and define the requirements for authentication.
- To validate the feasibility of the proposed framework, we implemented it as a prototype. Its performance was evaluated with several metrics across various feature combinations and compared with existing studies against the defined requirements to demonstrate its strengths.
2. Literature Review
2.1. Literature Review Process
2.2. Knowledge-Based Authentication
2.3. Physiological Biometric Authentication
2.4. Behavioral Biometric Authentication
3. Continuous Behavioral Biometric Authentication for Secure Metaverse Workspaces
3.1. Requirements for User Authentication in the Metaverse
- Continuity. In general, users perform various activities while maintaining long sessions in the Metaverse; thus, they should be verified continuously by the authentication process during active sessions. This process is referred to as continuous user authentication, which is essential because an attacker can intercept sessions authenticated through one-time authentication and access sensitive data without sufficient credentials.
- Universality. The Metaverse is increasingly used across a wide range of fields, so its user authentication methods should be designed for broad applicability; focusing on specific domains or environments is inappropriate. Thus, tasks that are common or implicit across various environments should be adopted in the authentication process. Complex tasks that may induce user fatigue are also inappropriate, because the authentication process may need to be repeated to satisfy the continuity requirement. Authentication built on such implicit, common tasks is generally referred to as active authentication, which is performed without the user’s awareness.
- Non-additional sensor. Immersive devices enable social activities in the Metaverse through various mounted sensors that offer more realistic experiences. However, these sensors make the devices heavy, which causes user discomfort during extended use. Given their current weight, adopting additional sensors is a considerable challenge, as any further increase in weight would only worsen this discomfort during prolonged use. Thus, authentication methods should operate using the sensors already built into immersive devices, without requiring additional ones.
- Resistance to shoulder-surfing. As mentioned earlier, immersive devices are widely used in the Metaverse. To enhance the realism of virtual spaces, they provide more immersive experiences by occluding the user’s field of view and showing only the virtual environment on the display. However, this occlusion makes it difficult for users to notice whether an attacker in the physical space is spying on them while they perform confidential tasks. Even if the pass-through function is enabled during confidential actions or input, users must still continuously check whether an attacker is behind them, because the function only covers the gaze direction. Accordingly, methods for authenticating users in the Metaverse should resist shoulder-surfing attacks.
- High acceptability. Acceptability is closely related to privacy and refers to the extent to which users are willing to provide a given authentication factor. According to a previous study [32], biometric factors have lower acceptance than knowledge factors, with a few exceptions such as fingerprints. In addition, privacy concerns about collecting physiological biometrics have been raised. In the Metaverse, this low acceptance may become more pronounced when adopting active and continuous authentication, which collects biometric data repeatedly and implicitly. These concerns persist because the limited computing power of immersive devices may require sharing sensitive data with third-party servers. Thus, user authentication methods should adopt factors with relatively high acceptability that do not involve sensitive data.
3.2. Assumption
3.3. Overview
- Metaverse user. Metaverse users are workers in a company that permits remote work. In the Metaverse office, this work involves handling documents or joining meetings without time and space constraints. In particular, a user can work with a keyboard and an immersive device at home or in a coworking space without going to the physical company office. While performing document tasks, the user is continuously verified by comparing the current typing patterns, derived from the virtual hand movement data collected by the data collection module, with the preregistered typing patterns.
- Data collection module. To support the authentication process, the data collection module continuously collects virtual-hand-related data while users perform document work. These data are behavioral patterns related to input and hand movement; specifically, the module collects the spatial data of the user’s virtual right and left hands, the input values, and the dwell times during keyboard typing. It preprocesses the collected hand data and then transmits the processed data to the ML classifier to request continuous user authentication. In the Metaverse office, the module operates alongside the ML classifier for each active user session.
- Machine learning classifier. The ML classifier is a core component of the proposed framework. While a user session is active, the ML classifier continuously validates the user, who was previously verified via the initial authentication process, using the data processed by the data collection module (i.e., hand movements); a minimal sketch of this feature-extraction and classification step is given after this list. The classifier also manages each user’s authentication profile: it creates a profile when a new user is registered and updates or deletes profiles when users reregister or leave the company.
- Metaverse office. The Metaverse office is a virtual workspace that facilitates remote work by enabling employees to collaborate or perform business tasks free from space and time constraints. Because the Metaverse office handles many business and confidential documents, if the ML classifier detects a potentially unauthenticated user during an active session, the office temporarily restricts that user’s access and prompts additional authentication. As the central component of the proposed framework, the Metaverse office serves as the environment in which all other components operate to enable continuous user authentication.
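A minimal sketch of the feature-extraction and verification step referenced in the list above is shown below. It assumes scikit-learn’s RandomForestClassifier (the model named in the implementation table) and an illustrative per-keystroke event format; all identifiers (KeyEvent, extract_features, the wrist at joint index 0, the 0.5 threshold) are assumptions for illustration, not the authors’ implementation.

```python
# Minimal sketch of the data collection module's feature extraction and the
# ML classifier's continuous check. All names (KeyEvent, extract_features,
# wrist at joint index 0, the 0.5 threshold) are illustrative assumptions.
from dataclasses import dataclass
from typing import List

import numpy as np
from sklearn.ensemble import RandomForestClassifier


@dataclass
class KeyEvent:
    key: str                 # typed character (input value)
    press_t: float           # key-press timestamp in seconds
    release_t: float         # key-release timestamp in seconds
    right_hand: np.ndarray   # (J, 3) absolute joint positions of the virtual right hand
    left_hand: np.ndarray    # (J, 3) absolute joint positions of the virtual left hand


def extract_features(ev: KeyEvent) -> np.ndarray:
    """Dwell time plus hand-joint coordinates expressed relative to each wrist."""
    dwell = ev.release_t - ev.press_t
    rel_right = ev.right_hand - ev.right_hand[0]  # joint 0 assumed to be the wrist
    rel_left = ev.left_hand - ev.left_hand[0]
    return np.concatenate(([dwell], rel_right.ravel(), rel_left.ravel()))


def train_verifier(genuine: List[KeyEvent], impostors: List[KeyEvent]) -> RandomForestClassifier:
    """Train a binary Random Forest for one enrolled user: genuine (1) vs. others (0)."""
    X = np.array([extract_features(e) for e in genuine + impostors])
    y = np.array([1] * len(genuine) + [0] * len(impostors))
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X, y)
    return clf


def is_genuine(clf: RandomForestClassifier, window: List[KeyEvent], threshold: float = 0.5) -> bool:
    """Continuous check: average the genuine-class probability over a keystroke window."""
    X = np.array([extract_features(e) for e in window])
    probs = clf.predict_proba(X)[:, 1]
    return float(probs.mean()) >= threshold
```

In this sketch, the Metaverse office would prompt additional authentication whenever is_genuine returns False over a window of recent keystrokes, mirroring the temporary access restriction described above.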
3.4. Sequences
4. Implementing Continuous User Authentication Framework
4.1. Implementation
4.2. Participants and Procedure
5. Evaluation
5.1. Data Preprocessing
5.2. Performance Analysis
5.2.1. Study 1: Relative Coordinates
5.2.2. Study 2: Each Spatial Attribute
5.2.3. Study 3: Significant Features
5.2.4. Study 4: False Acceptance Rate (FAR) and False Rejection Rate (FRR)
5.3. Comparison with Prior Work
6. Discussion
6.1. Assumptions for the Proposed Framework
6.2. Participants in the User Study
6.3. Evaluation of the Proposed Framework
7. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Abbreviations
Abbreviation | Definition
---|---
HMD | Head-mounted display
IoT | Internet of Things
OTP | One-time password
ML | Machine learning
IMU | Inertial measurement unit
IRB | Institutional review board
FAR | False acceptance rate
FRR | False rejection rate
EER | Equal error rate
References
- Great Expectations: Making Hybrid Work Work. Available online: https://www.microsoft.com/en-us/worklab/work-trend-index/great-expectations-making-hybrid-work-work (accessed on 2 June 2025).
- Mahindru, R.; Bapat, G.; Bhoyar, P.; Abishek, G.D.; Kumar, A.; Vaz, S. Redefining Workspaces: Young Entrepreneurs Thriving in the Metaverse’s Remote Realm. Eng. Proc. 2024, 59, 209.
- Dale, G.; Wilson, H.; Tucker, M. What is healthy hybrid work? Exploring employee perceptions on well-being and hybrid work arrangements. Int. J. Workplace Health Manag. 2024, 17, 335–352.
- Dwivedi, Y.K.; Hughes, L.; Baabdullah, A.M.; Ribeiro-Navarrete, S.; Giannakis, M.; Al-Debei, M.M.; Dennehy, D.; Metri, B.; Buhalis, D.; Cheung, C.M.K.; et al. Metaverse beyond the hype: Multidisciplinary perspectives on emerging challenges, opportunities, and agenda for research, practice and policy. Int. J. Inf. Manag. 2022, 66, 102542.
- Kang, G.; Koo, J.; Kim, Y.-G. Security and privacy requirements for the metaverse: A metaverse applications perspective. IEEE Commun. Mag. 2024, 62, 148–154.
- Turkmen, R.; Nwagu, C.; Rawat, P.; Riddle, P.; Sunday, K.; Machuca, M.B. Put your glasses on: A voxel-based 3D authentication system in VR using eye-gaze. In Proceedings of the 2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), Shanghai, China, 25–29 March 2023.
- Wazir, W.; Khattak, H.A.; Almogren, A.; Khan, M.A.; Din, I.U. Doodle-based authentication technique using augmented reality. IEEE Access 2020, 8, 4022–4034.
- Zhong, H.; Huang, C.; Zhang, X. Metaverse CAN: Embracing Continuous, Active, and Non-intrusive Biometric Authentication. IEEE Netw. 2023, 37, 67–73.
- Li, S.; Savaliya, S.; Marino, L.; Leider, A.M.; Tappert, C.C. Brain Signal Authentication for Human-Computer Interaction in Virtual Reality. In Proceedings of the 2019 IEEE International Conference on Computational Science and Engineering (CSE) and the 2019 IEEE International Conference on Embedded and Ubiquitous Computing (EUC), New York, NY, USA, 1–3 August 2019.
- Salturk, S.; Kahraman, N. Deep learning-powered multimodal biometric authentication: Integrating dynamic signatures and facial data for enhanced online security. Neural Comput. Appl. 2024, 36, 11311–11322.
- Li, F.; Zhao, J.; Yang, H.; Yu, D.; Zhou, Y.; Shen, Y. Vibhead: An authentication scheme for smart headsets through vibration. ACM Trans. Sens. Netw. 2024, 20, 91.
- Seok, C.L.; Song, Y.D.; An, B.S.; Lee, E.C. Photoplethysmogram biometric authentication using a 1D siamese network. Sensors 2023, 23, 4634.
- Boutros, F.; Damer, N.; Raja, K.; Ramachandra, R.; Kirchbuchner, F.; Kuijper, A. Iris and periocular biometrics for head mounted displays: Segmentation, recognition, and synthetic data generation. Image Vis. Comput. 2020, 104, 104007.
- Gupta, B.B.; Gaurav, A.; Arya, V. Fuzzy logic and biometric-based lightweight cryptographic authentication for metaverse security. Appl. Soft Comput. 2024, 164, 111973.
- Wang, M.; Qin, Y.; Liu, J.; Li, W. Identifying personal physiological data risks to the Internet of Everything: The case of facial data breach risks. Humanit. Soc. Sci. Commun. 2023, 10, 216.
- Shen, Y.; Wen, H.; Luo, C.; Xu, W.; Zhang, T.; Hu, W.; Rus, D. GaitLock: Protect virtual and augmented reality headsets using gait. IEEE Trans. Dependable Secure Comput. 2018, 16, 484–497.
- LaRubbio, K.; Wright, J.; David-John, B.; Enqvist, A.; Jain, E. Who do you look like?-Gaze-based authentication for workers in VR. In Proceedings of the 2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), Christchurch, New Zealand, 12–16 March 2022.
- Miller, R.; Ajit, A.; Banerjee, N.K.; Banerjee, S. Realtime behavior-based continual authentication of users in virtual reality environments. In Proceedings of the 2019 IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR), San Diego, CA, USA, 9–11 December 2019.
- Mai, Z.; He, Y.; Feng, J.; Tu, H.; Weng, J. Behavioral authentication with head-tilt based locomotion for metaverse. In Proceedings of the 2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), Shanghai, China, 25–29 March 2023.
- Lu, D.; Deng, Y.; Huang, D. Global feature analysis and comparative evaluation of freestyle in-air-handwriting passcode for user authentication. In Proceedings of the 37th Annual Computer Security Applications Conference (ACSAC), Online, 6–10 December 2021.
- Gopal, S.R.K.; Gyreyiri, P.; Shukla, D. HM-Auth: Redefining User Authentication in Immersive Virtual World Through Hand Movement Signatures. In Proceedings of the 18th IEEE International Conference on Automatic Face and Gesture Recognition (FG), Istanbul, Türkiye, 27–31 May 2024.
- Suzuki, M.; Iijima, R.; Nomoto, K.; Ohki, T.; Mori, T. PinchKey: A Natural and User-Friendly Approach to VR User Authentication. In Proceedings of the 2023 European Symposium on Usable Security (EuroUSEC), Copenhagen, Denmark, 16–17 October 2023.
- Jitpanyoyos, T.; Sato, Y.; Maeda, S.; Nishigaki, M.; Ohki, T. ExpressionAuth: Utilizing Avatar Expression Blendshapes for Behavioral Biometrics in VR. In Proceedings of the 2024 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), Orlando, FL, USA, 16–21 March 2024.
- Lu, X.; Zhang, S.; Hui, P.; Lio, P. Continuous authentication by free-text keystroke based on CNN and RNN. Comput. Secur. 2020, 96, 101861.
- Darabseh, A.; Pal, D. Performance analysis of keystroke dynamics using classification algorithms. In Proceedings of the 2020 3rd International Conference on Information and Computer Technologies (ICICT), San Jose, CA, USA, 9–12 March 2020.
- Tsai, C.-J.; Huang, P.-H. Keyword-based approach for recognizing fraudulent messages by keystroke dynamics. Pattern Recognit. 2020, 98, 107067.
- Rahman, A.; Chowdhury, M.E.H.; Khandakar, A.; Kiranyaz, S.; Zaman, K.S.; Reaz, M.B.I.; Islam, M.T.; Ezeddin, M.; Kadir, M.A. Multimodal EEG and keystroke dynamics based biometric system using machine learning algorithms. IEEE Access 2021, 9, 94625–94643.
- Bu, Z.; Zheng, H.; Xin, W.; Zhang, Y.; Liu, Z.; Luo, W.; Gao, B. Secure authentication with 3d manipulation in dynamic layout for virtual reality. In Proceedings of the 2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), Shanghai, China, 25–29 March 2023.
- Abdelrahman, Y.; Mathis, F.; Knierim, P.; Kettler, A.; Alt, F.; Khamis, M. Cuevr: Studying the usability of cue-based authentication for virtual reality. In Proceedings of the 2022 International Conference on Advanced Visual Interfaces (AVI), Frascati, Rome, Italy, 6–10 June 2022.
- Rupp, D.; Grießer, P.; Bönsch, A.; Kuhlen, T.W. Authentication in Immersive Virtual Environments through Gesture-Based Interaction with a Virtual Agent. In Proceedings of the 2024 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), Orlando, FL, USA, 16–21 March 2024.
- Li, P.; Pan, L.; Chen, F.; Hoang, T.; Wang, R. TOTPAuth: A Time-based One Time Password Authentication Proof-of-Concept against Metaverse User Identity Theft. In Proceedings of the 2023 IEEE International Conference on Metaverse Computing, Networking and Applications (MetaCom), Kyoto, Japan, 26–28 June 2023.
- Alrawili, R.; AlQahtani, A.A.S.; Khan, M.K. Comprehensive survey: Biometric user authentication application, evaluation, and discussion. Comput. Electr. Eng. 2024, 119, 109485.
- Chen, Z. Metaverse office: Exploring future teleworking model. Kybernetes 2024, 53, 2029–2045.
- Park, H.; Ahn, D.; Lee, J. Towards a metaverse workspace: Opportunities, challenges, and design implications. In Proceedings of the ACM CHI Conference on Human Factors in Computing Systems (CHI), Hamburg, Germany, 23–28 April 2023.
- A-FRAME. Available online: https://aframe.io (accessed on 2 June 2025).
- Aframe-Keyboard. Available online: https://github.com/WandererOU/aframe-keyboard (accessed on 2 June 2025).
- Aframe-Hand-Tracking-Controls-Extras. Available online: https://github.com/gftruj/aframe-hand-tracking-controls-extras (accessed on 2 June 2025).
- WebXR Hand Input Module—Level 1. Available online: https://immersive-web.github.io/webxr-hand-input/#skeleton-joints-section (accessed on 2 June 2025).
- Murphy, C.; Huang, J.; Hou, D.; Schuckers, S. Shared dataset on natural human-computer interaction to support continuous authentication research. In Proceedings of the 2017 IEEE International Joint Conference on Biometrics (IJCB), Denver, CO, USA, 1–4 October 2017.
- Liebers, J.; Brockel, S.; Gruenefeld, U.; Schneegass, S. Identifying users by their hand tracking data in augmented and virtual reality. Int. J. Hum.-Comput. Interact. 2022, 40, 409–424.
- Meta Develops Virtual Keyboard for Meta Quest. Available online: https://gadgetadvisor.com/gadgets/ar-vr/meta-develops-virtual-keyboard-for-meta-quest (accessed on 2 June 2025).
References | Strength | Weakness
---|---|---
Shen et al. [16] | High recognition accuracy (i.e., approximately 98%), and additional biometric sensors are not required. | Users wearing immersive devices can experience physical limitations (e.g., lightheadedness) when walking and may therefore refuse to perform this behavior.
LaRubbio et al. [17] | Investigated gaze-based user authentication during random dot viewing, 360-degree image viewing, and nuclear training simulations. | Consistent data collection can be difficult, as user focus may decrease because the method requires extended gaze collection. |
Miller et al. [18] | This system continuously authenticates users as they interact in the virtual environment, allowing them to continue activities. | Trajectory data are utilized during ball-throwing, which may limit applicability in various domains. |
Mai et al. [19] | They explored the potential of head-based movement as a method for user authentication. | Head movement may be uncomfortable for some users and cause fatigue over long sessions. |
Lu et al. [20] | The users’ passcodes and unique handwriting styles are adopted for multi-factor authentication. | Their method requires additional sensors to capture users’ handwriting patterns. |
Gopal et al. [21] | This approach verifies the secret code and user’s hand movements during code entry, preventing attackers from impersonating users even if the password is obtained. | Not suitable for use in the Metaverse office due to one-time authentication. |
Suzuki et al. [22] | PinchKey allows for authentication utilizing small finger movements without larger movements. | Pinch gestures are difficult to apply for continuous authentication because they are only used in limited situations. |
Jitpanyoyos et al. [23] | Their scheme resists shoulder surfing by authenticating users based on facial expressions obscured by the HMD. | Continuous authentication is impossible because the user must have a smiling expression to be authenticated. |
Our framework | This framework achieves active and continuous user authentication by adopting the most common office task across many domains. | The framework needs validation on large-scale datasets and evaluation using diverse metrics. |
Type | Specification
---|---
Processor | AMD Ryzen 5 5600G with Radeon Graphics |
Memory | 8 GB |
Development stack | Python 3.12, Node.js, JavaScript, HTML
A-Frame | Ver. 1.5.0 |
Machine learning model | Random Forest |
Bluetooth Keyboard | Logitech K380 |
Head-mounted display | Meta Quest 3 |
Per-user authentication accuracy (%) by coordinate type:

Coordinates | User1 | User4 | User5 | User6 | User7 | User8 | User9 | User12 | User13 | User14 | User15
---|---|---|---|---|---|---|---|---|---|---|---
Relative | 94.05 | 93.91 | 92.98 | 90.25 | 93.07 | 99.08 | 99.17 | 95.31 | 99.77 | 95.96 | 91.46
Absolute | 0.05 | 19.98 | 4.43 | 26.9 | 99.23 | 79.02 | 0 | 62.6 | 0 | 100 | 82.79
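The two rows above compare hand-movement features expressed in absolute coordinates with those expressed relative to a reference point. As a brief illustration of the distinction only (the (N, J, 3) array shape and the reference joint at index 0 are assumptions, not the study’s exact preprocessing), the two representations could be derived as follows:

```python
# Minimal sketch of the two feature representations compared above. Assumes
# `joints` is an (N, J, 3) array of absolute joint positions per sample and
# that the reference joint (e.g., the wrist) sits at index 0; both are
# illustrative assumptions.
import numpy as np


def absolute_features(joints: np.ndarray) -> np.ndarray:
    """Flatten absolute joint positions into (N, J*3) feature vectors."""
    return joints.reshape(len(joints), -1)


def relative_features(joints: np.ndarray, ref_idx: int = 0) -> np.ndarray:
    """Express each joint relative to a reference joint, removing the user's position in the room."""
    rel = joints - joints[:, ref_idx:ref_idx + 1, :]
    return rel.reshape(len(joints), -1)
```

Either representation can then be fed to the same classifier to produce the per-user results reported above.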
FAR and FRR per user (%):

Metric | User1 | User4 | User5 | User6 | User7 | User8 | User9 | User12 | User13 | User14 | User15 | Average
---|---|---|---|---|---|---|---|---|---|---|---|---
FAR | 0.84 | 0.16 | 0.25 | 0.13 | 0.61 | 0.07 | 0.33 | 0.97 | 0.24 | 0.47 | 0.42 | 0.41
FRR | 1.58 | 5.67 | 5.97 | 9.31 | 6.78 | 0.41 | 0.41 | 3.42 | 0.23 | 1.92 | 8.54 | 4.02
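For reference, FAR is the proportion of impostor samples incorrectly accepted as the genuine user, and FRR is the proportion of genuine samples incorrectly rejected. The following is a minimal sketch with toy data (illustrative names, not the study’s evaluation code) of how the per-user rates above could be computed from binary decisions:

```python
# Minimal sketch: FAR and FRR from binary decisions, where label 1 denotes the
# genuine (enrolled) user and 0 an impostor. Toy data and names are illustrative.
import numpy as np


def far_frr(y_true: np.ndarray, y_pred: np.ndarray) -> tuple[float, float]:
    genuine = y_true == 1
    impostor = ~genuine
    far = float(np.mean(y_pred[impostor] == 1)) if impostor.any() else 0.0  # impostors accepted
    frr = float(np.mean(y_pred[genuine] == 0)) if genuine.any() else 0.0    # genuine users rejected
    return far, frr


y_true = np.array([1, 1, 1, 1, 0, 0, 0, 0])
y_pred = np.array([1, 1, 0, 1, 0, 0, 1, 0])
far, frr = far_frr(y_true, y_pred)
print(f"FAR = {far * 100:.2f}%, FRR = {frr * 100:.2f}%")  # FAR = 25.00%, FRR = 25.00%
```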
Comparison with prior work against the requirements for user authentication in the Metaverse:

References | Continuity | Universality | Non-Additional Sensor | Resistance to Shoulder Surfing | High Acceptability
---|---|---|---|---|---
Shen et al. [16] | ○ | - | ● | ● | ● |
LaRubbio et al. [17] | ○ | ● | - | ● | - |
Miller et al. [18] | ○ | - | ● | ● | ● |
Mai et al. [19] | ○ | ● | ● | ○ | ● |
Lu et al. [20] | ● | ○ | - | ○ | ○ |
Gopal et al. [21] | ○ | ○ | - | ○ | ● |
Suzuki et al. [22] | ● | ○ | ● | ○ | ● |
Jitpanyoyos et al. [23] | - | - | - | ● | - |
Our framework | ● | ● | ● | ● | ● |