Exploring Preferential Ring-Based Gesture Interaction Across 2D Screen and Spatial Interface Environments
Featured Application
Abstract
1. Introduction
- Compare the user experiences and performance of hand-gesture-based interactions for a TV-platform-based 2D interface versus an XR-platform-based 3D spatial interface;
- Determine which gesture types (e.g., hand/finger swipes, point and click, grab and release) are relatively preferred in each contextual condition;
- Assess user comfort levels and usability issues (physical fatigue, cognitive load, and social discomfort) associated with different gestures on the two platforms.
2. Related Studies
2.1. Natural User Interfaces (NUIs)
2.2. Gesture Interactions in TV Interfaces
2.3. Gesture Interactions in XR (VR/AR) Interfaces
2.4. Smart Ring Interactions
3. Methods
3.1. Overview
3.2. The Rationale for the Selection of the Gesture Types
3.3. The Experimental Design
3.4. The Data Collection Methods
3.4.1. Observational and Performance Data
3.4.2. Surveys and Questionnaires
3.4.3. Interviews
3.5. The Data Analysis Methods
4. Results
4.1. The Evaluation of the Gesture Types in the TV Environment
4.2. The Evaluation of the Gesture Types in the XR Environment
4.3. Comparative Analysis of the Results Between the TV and XR Environments
4.4. The Evaluation Results for Specific Gesture Actions by Type
4.5. The Evaluation Results for Gesture Execution Times: An Analysis of the Correlation with the UEQ-S Scores
5. Discussion
5.1. [Objective #1] Does a Difference Arise in the Participants’ Gesture Interaction Experiences Between the TV and XR Environments?
5.2. [Objective #2] Do Users Have Preferred Gesture Types Depending on TV and XR Environments?
- Environment-specific gesture design is crucial. Gestures optimized for each environmental context significantly enhance the user experience.
- Spatial awareness must be considered. Gesture designs should incorporate the characteristics of the interaction space, using two-dimensional or three-dimensional elements as the environmental setting demands.
- UX research should examine both categorical preferences and usage behavior, including combinations and context-based adaptations. These insights should be reflected in gesture interface development to support real-world usage patterns more comprehensively.
5.3. [Objective #3] What Actions Are Preferred or Found Uncomfortable for Each Type of Gesture?
5.4. Challenges and Opportunities in Designing Gesture Interfaces
6. Conclusions
6.1. Executive Summary
6.2. Limitations and Future Work
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
Abbreviation | Meaning
---|---
XR | Extended Reality
GET | Gesture Execution Time
NUI | Natural User Interface
HCI | Human–Computer Interaction
HMD | Head-Mounted Display
SDK | Software Development Kit
VR | Virtual Reality
MR | Mixed Reality
IMU | Inertial Measurement Unit
EMG | Electromyography
UI | User Interface
AR | Augmented Reality
UX | User Experience
UEQ-S | User Experience Questionnaire, Short Version
OLS | Ordinary Least Squares
Gesture Type | Quality Scale | Mean | Std. Dev. | Grade | Cronbach's α
---|---|---|---|---|---
Surface-Touch Gesture | Pragmatic | 1.40 | 1.15 | Above Average (Top 25–50%) | 0.90
Surface-Touch Gesture | Hedonic | 0.73 | 0.91 | Below Average (Bottom 50% or more) | 0.72
Surface-Touch Gesture | Overall Satisfaction | 1.07 | 0.83 | Above Average (Top 25–50%) | -
Mid-Air Gesture | Pragmatic | 0.42 | 1.22 | Bad (Bottom 10%) | 0.80
Mid-Air Gesture | Hedonic | 0.98 | 0.99 | Above Average (Top 25–50%) | 0.85
Mid-Air Gesture | Overall Satisfaction | 0.70 | 0.88 | Below Average (Bottom 50% or more) | -
Micro Finger-Touch Gesture | Pragmatic | 0.69 | 1.44 | Bad (Bottom 10%) | 0.92
Micro Finger-Touch Gesture | Hedonic | 1.40 | 0.90 | Good (Top 10–25%) | 0.84
Micro Finger-Touch Gesture | Overall Satisfaction | 1.05 | 0.99 | Above Average (Top 25–50%) | -
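For readers reproducing this kind of scale scoring, the following minimal Python sketch computes UEQ-S scale means and Cronbach's α. It assumes the standard UEQ-S structure from Schrepp et al. (eight 7-point semantic-differential items rescaled to −3..+3, items 1–4 pragmatic, items 5–8 hedonic); the response matrix is placeholder data, not the study's.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    sum_item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - sum_item_var / total_var)

# UEQ-S: eight 7-point items rescaled to -3..+3; per Schrepp et al.,
# items 1-4 form the pragmatic scale and items 5-8 the hedonic scale.
rng = np.random.default_rng(42)
responses = rng.integers(1, 8, size=(30, 8))  # placeholder answers, 30 participants
scores = responses - 4                        # rescale 1..7 -> -3..+3

pragmatic, hedonic = scores[:, :4], scores[:, 4:]
print(f"Pragmatic: mean={pragmatic.mean():.2f}, alpha={cronbach_alpha(pragmatic):.2f}")
print(f"Hedonic:   mean={hedonic.mean():.2f}, alpha={cronbach_alpha(hedonic):.2f}")
print(f"Overall satisfaction: mean={scores.mean():.2f}")
```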
Environment | Quality Scale | Stats | DF | p-Value | Post Hoc Result
---|---|---|---|---|---
TV | Pragmatic | 8.70 | 2 | <0.01 ** | Mid-Air vs. Surface-Touch (corrected p < 0.05 *)
TV | Hedonic | 12.55 | 2 | <0.05 * | Micro Finger-Touch vs. Mid-Air (corrected p < 0.05 *); Micro Finger-Touch vs. Surface-Touch (corrected p < 0.01 **)
TV | Overall Satisfaction | 4.78 | 2 | <0.1 # | Mid-Air vs. Surface-Touch (corrected p < 0.05 *)
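Friedman statistics with corrected post hoc comparisons of the kind reported above can be produced with a scipy pipeline like the sketch below. The per-participant scores are placeholders, and Bonferroni correction is an assumption, since the paper reports only "corrected p" without naming the procedure.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 30  # illustrative sample size
# Placeholder per-participant scores for the three gesture types.
surface = rng.normal(1.4, 1.1, n)
mid_air = rng.normal(0.4, 1.2, n)
micro = rng.normal(0.7, 1.4, n)

# Friedman test across the three within-subject conditions (df = k - 1 = 2).
chi2, p = stats.friedmanchisquare(surface, mid_air, micro)
print(f"Friedman chi^2 = {chi2:.2f}, p = {p:.4f}")

# Post hoc pairwise Wilcoxon signed-rank tests with (assumed) Bonferroni
# correction: multiply each raw p by the number of comparisons.
pairs = [("Mid-Air vs. Surface-Touch", mid_air, surface),
         ("Micro Finger-Touch vs. Mid-Air", micro, mid_air),
         ("Micro Finger-Touch vs. Surface-Touch", micro, surface)]
for label, a, b in pairs:
    _, p_raw = stats.wilcoxon(a, b)
    print(f"{label}: corrected p = {min(p_raw * len(pairs), 1.0):.4f}")
```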
Gesture Type | Quality Scale | Mean | Std. Dev. | Grade | Cronbach's α
---|---|---|---|---|---
Surface-Touch Gesture | Pragmatic | 1.58 | 1.12 | Good (Top 10–25%) | 0.90
Surface-Touch Gesture | Hedonic | 0.77 | 1.13 | Below Average (Bottom 50% or more) | 0.90
Surface-Touch Gesture | Overall Satisfaction | 1.17 | 0.95 | Above Average (Top 25–50%) | -
Mid-Air Gesture | Pragmatic | 1.36 | 1.11 | Above Average (Top 25–50%) | 0.90
Mid-Air Gesture | Hedonic | 1.08 | 0.95 | Above Average (Top 25–50%) | 0.93
Mid-Air Gesture | Overall Satisfaction | 1.22 | 0.91 | Above Average (Top 25–50%) | -
Micro Finger-Touch Gesture | Pragmatic | 1.10 | 1.32 | Below Average (Bottom 50% or more) | 0.93
Micro Finger-Touch Gesture | Hedonic | 1.29 | 1.01 | Good (Top 10–25%) | 0.90
Micro Finger-Touch Gesture | Overall Satisfaction | 1.20 | 1.05 | Above Average (Top 25–50%) | -
Environment | Quality Scale (Test) | Stats | DF | p-Value | Post Hoc Result
---|---|---|---|---|---
XR | Pragmatic (Friedman) | 1.94 | 2 | >0.05 ns | -
XR | Hedonic (Repeated Measures ANOVA) | 4.45 | 2, 58 | <0.05 * | Micro Finger-Touch vs. Surface-Touch (corrected p < 0.05 *)
XR | Overall Satisfaction (Repeated Measures ANOVA) | 0.04 | 2, 58 | >0.05 ns | -
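Because the XR analysis applies Friedman to one scale and repeated measures ANOVA to the others, a normality screen presumably decided between the two tests. The helper below is a minimal sketch of one such decision rule; Shapiro-Wilk as the screen is an assumption, and the hand-rolled RM-ANOVA uses the classic sums-of-squares partition (its degrees of freedom, 2 and 58, match the table for 30 participants and 3 conditions).

```python
import numpy as np
from scipy import stats

def compare_three_conditions(a, b, c, alpha=0.05):
    """Return (test_name, statistic, p) for three within-subject samples:
    repeated measures ANOVA when every condition passes Shapiro-Wilk
    normality (the assumed screen), Friedman otherwise."""
    if all(stats.shapiro(x).pvalue > alpha for x in (a, b, c)):
        data = np.column_stack([a, b, c])            # shape (n, k)
        n, k = data.shape
        grand = data.mean()
        ss_cond = n * ((data.mean(axis=0) - grand) ** 2).sum()
        # Residual after removing condition and subject effects.
        resid = data - data.mean(axis=0) - data.mean(axis=1, keepdims=True) + grand
        ss_err = (resid ** 2).sum()
        df_cond, df_err = k - 1, (k - 1) * (n - 1)   # 2 and 58 when n = 30
        f = (ss_cond / df_cond) / (ss_err / df_err)
        return "RM-ANOVA", f, stats.f.sf(f, df_cond, df_err)
    chi2, p = stats.friedmanchisquare(a, b, c)
    return "Friedman", chi2, p
```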
Scale | Gesture Type | Test | Stat | p-Value
---|---|---|---|---
Pragmatic Quality | Surface-Touch Gesture | Paired t-test | −1.002 | >0.05 ns
Pragmatic Quality | Mid-Air Gesture | Paired t-test | −4.275 | <0.01 **
Pragmatic Quality | Micro Finger-Touch Gesture | Wilcoxon | 131.5 | >0.05 ns
Overall Satisfaction | Surface-Touch Gesture | Paired t-test | −0.832 | >0.05 ns
Overall Satisfaction | Mid-Air Gesture | Paired t-test | −3.768 | <0.001 **
Overall Satisfaction | Micro Finger-Touch Gesture | Wilcoxon | 164.0 | >0.05 ns
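The mix of paired t-tests and Wilcoxon signed-rank tests in this table likewise suggests a per-gesture normality check on the paired differences. A minimal sketch follows; the Shapiro-Wilk criterion and the function name are assumptions, and `tv`/`xr` are NumPy arrays of one participant score each per environment.

```python
import numpy as np
from scipy import stats

def compare_environments(tv: np.ndarray, xr: np.ndarray, alpha=0.05):
    """Paired TV-vs-XR comparison for one gesture type: a paired t-test
    when the difference scores look normal (Shapiro-Wilk, an assumed
    criterion), otherwise the Wilcoxon signed-rank test."""
    if stats.shapiro(xr - tv).pvalue > alpha:
        res = stats.ttest_rel(tv, xr)
        return "Paired t-test", res.statistic, res.pvalue
    res = stats.wilcoxon(tv, xr)
    return "Wilcoxon", res.statistic, res.pvalue
```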
Environment | UEQ-S Scale | Coefficient | R² | p-Value
---|---|---|---|---
Whole (TV + XR) | Pragmatic Quality | −0.004 | 0.001 | >0.05 ns
Whole (TV + XR) | Hedonic Quality | −0.015 | 0.034 | <0.05 *
Whole (TV + XR) | Overall Satisfaction | −0.009 | 0.015 | >0.05 ns
TV | Pragmatic Quality | 0.001 | 0.000 | >0.05 ns
TV | Hedonic Quality | −0.021 | 0.078 | <0.01 **
TV | Overall Satisfaction | −0.001 | 0.019 | >0.05 ns
XR | Pragmatic Quality | −0.009 | 0.009 | >0.05 ns
XR | Hedonic Quality | −0.008 | 0.010 | >0.05 ns
XR | Overall Satisfaction | −0.008 | 0.013 | >0.05 ns
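These coefficients come from ordinary least squares fits of each UEQ-S scale on gesture execution time (GET). The statsmodels sketch below shows the shape of such an analysis; the data, variable names, and time unit are illustrative placeholders, not the study's measurements.

```python
import numpy as np
import statsmodels.api as sm

# Placeholder data: per-trial gesture execution times and the
# corresponding hedonic-quality ratings.
rng = np.random.default_rng(1)
get_time = rng.uniform(2.0, 40.0, size=90)
hedonic = 1.2 - 0.02 * get_time + rng.normal(0.0, 0.9, size=90)

# OLS of a UEQ-S scale score on GET; the slope, R^2, and p mirror the
# Coefficient, R-squared, and p-Value columns of the table above.
model = sm.OLS(hedonic, sm.add_constant(get_time)).fit()
print(f"slope = {model.params[1]:.4f}, R^2 = {model.rsquared:.3f}, "
      f"p = {model.pvalues[1]:.4f}")
```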
Category | Analysis Results and Participant Feedback
---|---
Analysis Results | In the TV environment, all indicators of the UEQ-S showed significance (see Section 4.1), whereas in the XR environment, significance was found only in the hedonic dimension (see Section 4.2).
Related Interviews | [P23] "XR makes me feel like my movements become larger compared to TV. Maybe that's why the gesture interaction experience feels different between TV and XR."
Related Interviews | [P25] "We're so used to the remote control when using a TV, so gestures feel awkward. But with XR, there's no pre-learned control method, so using gestures felt more natural and comfortable."
Related Interviews | [P26] "I think the difference in gesture interaction experience comes from the contrast between the flat surface of the TV and the spatial aspect of XR."
Category | Analysis Results and Participant Feedback
---|---
Analysis Results | According to the UEQ-S results, the mid-air gesture type is the only one that shows statistically significant differences between the TV and XR environments (see Table 5 in Section 4.3).
Analysis Results | Looking at the participants' gesture rankings by environment (see Figure 9 in Section 4.3), the first and third rankings switch places between the TV and XR environments.
Related Interviews | [P15] "Since the TV screen doesn't feel spatial, I find Surface gestures projected onto the 2D plane more comfortable, whereas in the XR environment, 3D interactions like Mid-Air Gestures feel more intuitive and fitting to the space."
Related Interviews | [P28] "When watching TV, I liked Micro Finger Gestures because I could place my hands anywhere, but in the XR environment, since it feels like the screen is right in front of me—unlike the distant TV screen—Mid-Air Gestures felt easier and more convenient."
Category | Analysis Results and Participant Feedback
---|---
Analysis Results | Gestures that used finger movements or tapping actions similar to existing touch interactions received positive feedback. However, new actions such as palm taps, gripping, or shaking were evaluated negatively (refer to Section 4.4).
Related Interviews (Positively Evaluated Actions) | [P24] (in the Surface-Touch gesture scenario) "The Small Move motion felt intuitive because the hand movement matched the actual screen movement."
Related Interviews (Positively Evaluated Actions) | [P25] (in the Micro Finger-Touch gesture scenario) "Actions like Small Move or Selection were the clearest and most intuitive, so I liked them."
Related Interviews (Negatively Evaluated Actions) | [P09] (in the Surface-Touch gesture scenario) "I found the Return/Back action a bit confusing, whether it's swiping up or swiping down."
Related Interviews (Negatively Evaluated Actions) | [P11] (in the Mid-Air gesture scenario) "The Home motion felt like grabbing something with my hand, so it didn't seem very connected to its function."