Evaluating a Metahuman-Integrated Computer-Based Training Tool for Nursing Interventions: Usability and Expert Heuristic Analysis
Abstract
1. Introduction
2. Materials and Methods
2.1. Development of CBT-Based DT Program Using Unreal Engine and Metahuman Technology
2.2. Participants
2.3. Research Tools
2.4. Data Collection Methods and Ethical Considerations
2.5. Data Analysis Methods
3. Results
3.1. Heuristic Evaluation
3.2. Usability Evaluation
4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
| Abbreviation | Definition |
|---|---|
| CBT | computer-based training |
| DT | digital textbook |
| IRB | Institutional Review Board |
| UI | user interface |
| uMARS | user version of the Mobile App Rating Scale |
Appendix A
| Category | Item | Students (n = 12), M ± SD | Experts (n = 4), M ± SD | Summary of Findings | Examples of Qualitative Evaluation Narratives (Users) | Examples of Qualitative Evaluation Narratives (Experts) |
|---|---|---|---|---|---|---|
| Engagement | Entertainment | 4.25 ± 0.45 | 3.75 ± 0.50 | Initially difficult but immersive once familiar, though lacking engaging design | S1 a “It felt unfamiliar at first, but I was quite immersed once I got used to it.” S2 “It felt like actually performing nursing, so it was fun.” | E1 b “Initially difficult, but OK once familiar.” E2 “Apart from basic interactions, the design does not sufficiently promote learner engagement.” |
| | Interest | 4.50 ± 0.52 | 4.50 ± 0.58 | | | |
| | Customization | 3.67 ± 0.98 | 3.00 ± 1.41 | | | |
| | Interactivity | 4.08 ± 0.79 | 3.00 ± 1.63 | | | |
| | Target Group | 4.25 ± 0.45 | 4.50 ± 0.58 | | | |
| | Subscale mean | 4.15 ± 0.51 | 3.75 ± 0.44 | | | |
| Functionality | Performance | 4.33 ± 0.78 | 3.25 ± 0.96 | Functions are intuitive and easy to use but require more refinement | S8 “The buttons are intuitive, so I could quickly learn how to use them.” S6 “It would be more convenient if there was a function to rotate the direction of the nurse’s viewpoint with the mouse.” | E4 “Almost no personalization features.” E3 “Performance is OK, but there’s a slight delay,” “Functional consistency is poor, and navigation is limited.” |
| | Ease of Use | 4.00 ± 0.74 | 3.75 ± 0.96 | | | |
| | Navigation | 4.25 ± 0.75 | 3.75 ± 0.96 | | | |
| | Gestural Design | 4.50 ± 0.52 | 4.00 ± 0.82 | | | |
| | Subscale mean | 4.27 ± 0.43 | 3.69 ± 0.63 | | | |
| Aesthetics | Layout | 4.75 ± 0.45 | 3.50 ± 1.00 | Design is attractive and visually satisfying, but simplification is needed | S10 “The design is clean, and the colors are good, so I could concentrate well.” S11 “The design was nice, and it helped me a lot because I could learn the sequence of nursing procedures.” | E2 “The design is satisfactory, but more simplification is needed for professionals.” |
| | Graphics | 4.75 ± 0.45 | 4.25 ± 0.96 | | | |
| | Visual Appeal | 4.42 ± 0.51 | 4.00 ± 0.82 | | | |
| | Subscale mean | 4.64 ± 0.30 | 3.92 ± 0.88 | | | |
| Information | Quality of Information | 4.67 ± 0.49 | 4.00 ± 0.82 | Information structure is easy to understand, but more detailed explanations are needed | S11 “The content was beneficial, but more detailed explanations would be good.” S8 “It would be good if patient information could be viewed simultaneously when writing SBAR.” | E1 “Source citation is insufficient, and detailed explanations are inadequate.” E2 “Difficult to feel engagement because of insufficient interaction design,” “Information with weak credibility needs improvement.” E4 “Information structure is appropriate, but source citation is insufficient.” |
| | Quantity of Information | 3.83 ± 1.47 | 3.75 ± 0.96 | | | |
| | Visual Information | 4.17 ± 1.40 | 3.75 ± 1.89 | | | |
| | Credibility of Source | 3.92 ± 1.31 | 3.25 ± 2.22 | | | |
| | Subscale mean | 4.15 ± 0.98 | 3.69 ± 1.20 | | | |
| Subjective Quality | Recommendation | 4.33 ± 0.49 | 4.25 ± 0.96 | Positive responses that it is worth trying, but still insufficient for commercialization | S2 “Overall, it was satisfactory and seems recommendable to others.” S9 “Overall, the design was user-friendly and posed no difficulties for use.” | E4 “It is worth trying but needs improvement.” E2 “It is still insufficient for the commercialization stage.” |
| | Usage Intent | 3.75 ± 0.62 | 2.75 ± 0.50 | | | |
| | Willingness to Pay | 3.58 ± 0.79 | 3.00 ± 0.00 | | | |
| | Overall Satisfaction | 4.42 ± 0.51 | 4.00 ± 0.82 | | | |
| | Subscale mean | 4.02 ± 0.46 | 3.50 ± 0.54 | | | |
| Total | | 4.25 ± 0.24 | 3.71 ± 0.15 | | | |
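The subscale and total scores reported above are aggregates of the item-level scores. The following minimal sketch is illustrative only and is not part of the original study materials; it assumes each subscale mean is the arithmetic mean of its item means and that the total is the mean of the subscale means. Small deviations from the published values are possible because those were computed from raw responses rather than from the rounded item means shown here.

```python
# Illustrative sketch (not from the article): approximates the student
# subscale means and total in the table above, assuming each subscale mean
# is the arithmetic mean of its item means and the total is the mean of
# the subscale means.
from statistics import mean

# Student (n = 12) item means copied from the table above.
student_item_means = {
    "Engagement":         [4.25, 4.50, 3.67, 4.08, 4.25],
    "Functionality":      [4.33, 4.00, 4.25, 4.50],
    "Aesthetics":         [4.75, 4.75, 4.42],
    "Information":        [4.67, 3.83, 4.17, 3.92],
    "Subjective Quality": [4.33, 3.75, 3.58, 4.42],
}

subscale_means = {name: mean(items) for name, items in student_item_means.items()}
total = mean(subscale_means.values())

for name, value in subscale_means.items():
    print(f"{name}: {value:.2f}")   # e.g., Engagement: 4.15, Functionality: 4.27
print(f"Total: {total:.2f}")        # approximately 4.25, as in the Total row
```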
References
- Sharma, D.; Sharma, J. The potential of virtual cloud character creation technology-meta human creator: A review. AIP Conf. Proc. 2023, 2782, 020153.
- Fang, Z.; Cai, L.; Wang, G. MetaHuman Creator: The starting point of the metaverse. In Proceedings of the 2021 International Symposium on Computer Technology and Information Science (ISCTIS), Guilin, China, 4–6 June 2021; pp. 154–157.
- Chojnowski, O.; Eberhard, A.; Schiffmann, M.; Müller, A.; Richert, A. Human-like nonverbal behavior with metahumans in real-world interaction studies: An architecture using generative methods and motion capture. In Proceedings of the 2025 20th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Melbourne, Australia, 3–6 March 2025.
- Salehi, P.; Sheshkal, S.A.; Thambawita, V.; Halvorsen, P. From flat to feeling: A feasibility and impact study on dynamic facial emotions in AI-generated avatars. arXiv 2025.
- Schunk, D.H.; Zimmerman, B.J. Self-Regulation of Learning and Performance: Issues and Educational Applications, 3rd ed.; Routledge: New York, NY, USA, 2023; pp. 255–280. ISBN 9780805813357.
- Zhang, D.; Zhou, L.; Briggs, R.O.; Nunamaker, J.F., Jr. Instructional video in e-learning: Assessing the impact of interactive video on learning effectiveness. Inf. Manag. 2006, 43, 15–27.
- Taylor, D.L.; Yeung, M.; Bashet, A.Z. Personalized and adaptive learning. In Innovative Learning Environments in STEM Higher Education: Opportunities, Challenges, and Looking Forward; Ryoo, K., Winkelmann, K., Eds.; Springer: Cham, Switzerland, 2021; pp. 17–34. ISBN 9783030589479.
- Yaseen, H.; Mohammad, A.S.; Ashal, N.; Abusaimeh, H.; Ali, A.; Sharabati, A.-A.A. The impact of adaptive learning technologies, personalized feedback, and interactive AI tools on student engagement: The moderating role of digital literacy. Sustainability 2025, 17, 1133.
- Nielsen, J. Usability Engineering; Morgan Kaufmann: San Francisco, CA, USA, 1994; ISBN 0125184069.
- Bertini, E.; Gabrielli, S.; Kimani, S.; Catarci, T.; Santucci, G. Appropriating and assessing heuristics for mobile computing. In Proceedings of the Working Conference on Advanced Visual Interfaces (AVI ’06), Venezia, Italy, 23–26 May 2006.
- Choi, H.; Tak, S.H.; Lee, D. Nursing students’ learning flow, self-efficacy and satisfaction in virtual clinical simulation and clinical case seminar. BMC Nurs. 2023, 22, 454.
- Saab, M.M.; McCarthy, M.; O’Mahony, B.; Cooke, E.; Hegarty, J.; Murphy, D.; Walshe, N.; Noonan, B. Virtual reality simulation in nursing and midwifery education: A usability study. CIN Comput. Inform. Nurs. 2023, 41, 815–824.
- Choi, K.S. Virtual reality simulation for learning wound dressing: Acceptance and usability. Clin. Simul. Nurs. 2022, 68, 49–57.
- Pour, M.E.; Gießer, C.; Schmitt, J.; Brück, R. Evaluation of metahumans as digital tutors in augmented reality in medical training. Curr. Dir. Biomed. Eng. 2023, 9, 41–44.
- Ivory, M.Y.; Hearst, M.A. The state of the art in automating usability evaluation of user interfaces. ACM Comput. Surv. 2001, 33, 470–516.
- Silva, A.G.; Caravau, H.; Martins, A.; Almeida, A.M.P.; Silva, T.; Ribeiro, Ó.; Santinha, G.; Rocha, N.P. Procedures of user-centered usability assessment for digital solutions: Scoping review of reviews reporting on digital solutions relevant for older adults. JMIR Hum. Factors 2021, 8, e22774.
- Cho, H.; Yen, P.Y.; Dowding, D.; Merrill, J.A.; Schnall, R. A multi-level usability evaluation of mobile health applications: A case study. J. Biomed. Inform. 2018, 86, 79–89.
- Hwang, W.; Salvendy, G. Number of people required for usability evaluation: The 10 ± 2 rule. Commun. ACM 2010, 53, 130–133.
- Stoyanov, S.R.; Hides, L.; Kavanagh, D.J.; Zelenko, O.; Tjondronegoro, D.; Mani, M. Mobile app rating scale: A new tool for assessing the quality of health mobile apps. JMIR mHealth uHealth 2015, 3, e27.
- IBM Corp. IBM SPSS Statistics for Windows, Version 25.0; IBM Corp.: Armonk, NY, USA, 2017.
- Jeffries, R.; Miller, J.R.; Wharton, C.; Uyeda, K.M. User interface evaluation in the real world: A comparison of four techniques. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’91), New Orleans, LA, USA, 27 April–2 May 1991; pp. 119–124.
- Elo, S.; Kyngäs, H. The qualitative content analysis process. J. Adv. Nurs. 2008, 62, 107–115.
- Novak, E.; McDaniel, K.; Li, J. Factors that impact student frustration in digital learning environments. Comput. Educ. Open 2023, 5, 100153.
- Nielsen, J. 10 Usability Heuristics for User Interface Design. Available online: https://www.nngroup.com/articles/ten-usability-heuristics/ (accessed on 7 August 2025).
- Liu, Z.; Zhang, Q.; Liu, W. Perceptions and needs for a community nursing virtual simulation system: A qualitative study. Heliyon 2024, 10, e28473.
- Kiegaldie, D.; Shaw, L. Virtual reality simulation for nursing education: Effectiveness and feasibility. BMC Nurs. 2023, 22, 488.
- Lee, J.; Soylu, M.Y.; Ou, C. Exploring insights from online students: Enhancing the design and development of intelligent textbooks for the future of online education. Int. J. Innov. Online Edu. 2023, 7, 29–54.


| Characteristics | Categories | n (%) or M ± SD |
|---|---|---|
| Students (n = 12) | | |
| Age (years) | | 21.07 ± 1.03 |
| Experience of using digital textbooks | Yes | 7 (58.3) |
| | No | 5 (41.7) |
| Learning experience in nursing care for patients with upper gastrointestinal bleeding | Yes | 8 (66.7) |
| | No | 4 (33.3) |
| Learning experience in nursing care for patients with ileus | Yes | 5 (41.7) |
| | No | 7 (58.3) |
| Experts (n = 4) | | |
| Age (years) | | 48.25 ± 6.29 |
| Area of expertise | Adult nursing | 3 (75.0) |
| | Critical care nursing | 1 (25.0) |
| Research field | Web-based education | 1 (25.0) |
| | Mobile-based education | 1 (25.0) |
| | Virtual reality-based education | 1 (25.0) |
| | Smartphone application-based education | 1 (25.0) |
| Clinical experience (years) | | 4.91 ± 2.10 |
| Educational experience (years) | | 12.00 ± 8.03 |

| Heuristics | Severity Score (E1) | Severity Score (E2) | Severity Score (E3) | Severity Score (E4) | Examples of Qualitative Evaluation Narratives |
|---|---|---|---|---|---|
| Visibility of System Status | 0 | 0 | 0 | 0 | Location-based learning guide function: E4 a: “The app did not clearly indicate the current status or progress shown to users.” |
| Match Between the System and the Real World | 0 | 2 | 1 | 0 | None |
| Consistency and Standards | 0 | 4 | 2 | 0 | Insufficient screen transition or button connectivity: E1: “The unclear Back or Home buttons make navigation within the app difficult.” E1: “During learning, there is a lack of a function to control content back and forth or go back.” |
| User Control and Freedom | 0 | 3 | 0 | 1 | E4: “Providing error hints and corrective feedback would improve learning.” |
| Recognition Rather than Recall | 1 | 0 | 2 | 0 | E1: “Sound settings were missing, and navigation was difficult.” |
| Flexibility and Efficiency of Use | 0 | 0 | 3 | 0 | E1: “Poor navigation hindered efficient use.” |
| Aesthetic and Minimalist Design | 0 | 0 | 0 | 0 | None |
| Error Prevention | 2 | 3 | 0 | 3 | Lack of automatic saving and interface status display: E3: “Because automatic saving is not available, input data may be lost easily.” E4: “It would be good to provide hints and enable correction when errors occur.” E2: “It would be good to provide ways to correct errors.” |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

