Evaluating Interaction Capability in a Serious Game for Children with ASD: An Operability-Based Approach Aligned with ISO/IEC 25010:2023
Abstract
1. Introduction
- Search strategy: Systematic searches were conducted in PubMed, IEEE Xplore, ACM Digital Library, and Scopus using the terms (“serious games” OR “educational games”) AND (“autism” OR “ASD”) AND (“usability” OR “interaction”), restricted to 2018–2024;
- Inclusion criteria: Peer-reviewed studies evaluating serious games for children with ASD, reporting usability or interaction metrics;
- Quality assessment: Studies were evaluated using the Effective Public Health Practice Project (EPHPP) quality assessment tool. The review identified 25 relevant studies, with 22 (88%) demonstrating significant improvements in at least one measured outcome, consistent with recent meta-analyses showing large effect sizes (Hedges’ g = 0.62) for digital interventions in ASD populations [9].
2. Materials and Methods
2.1. Study Design and Sample Size Justification
- Enable detailed longitudinal analysis of learning patterns and adaptation;
- Provide sufficient within-subject data for mixed-effects modeling;
- Capture the heterogeneity characteristic of ASD populations;
- Establish feasibility and generate effect size estimates for future larger-scale investigations.
2.2. Population and Sample
- Institutional Authorization: Official governmental approval was obtained through the Ministry of Education’s formal review process, ensuring compliance with national educational research standards and protection protocols for children with special needs. The authorization required detailed project documentation and institutional coordination protocols.
- Consent from Parent or Guardian: Informed consent was obtained from all parents/guardians after a comprehensive discussion of the study’s methods, risks, benefits, and the voluntary nature of participation. Parents were informed of their right to withdraw their child at any time without repercussions, as stipulated in the governmental authorization.
- Child Assent Protocol: Age-appropriate assent was obtained from all participants using a multi-modal approach adapted for ASD communication needs [25]:
- Visual assent materials: Pictorial cards showing study activities (playing games, being observed) with simple yes/no response options;
- Verbal explanation: Therapist-mediated explanation using familiar language and allowing processing time;
- Behavioral indicators: Continuous monitoring for signs of distress, withdrawal, or non-compliance as indicators of withdrawn assent;
- Ongoing consent: Assent was reconfirmed at each session, with immediate discontinuation if the child showed reluctance.
2.3. Statistical Analysis Plan
- Linear mixed-effects models (LMMs) were employed to analyze the relationship between operability and gameplay variables, accounting for the nesting of sessions within participants. Models were fitted using restricted maximum likelihood (REML) estimation with Kenward–Roger adjusted degrees of freedom to provide accurate inference with small samples [27,28];
- The repeated measures design (25 total observations across 10 participants) provides sufficient power for mixed-effects modeling despite the small between-subject sample size [29];
- Visual inspection of residuals and Q-Q plots indicated an acceptable approximation to normality for most variables, with robust standard errors providing additional protection against violations of distributional assumptions.
2.3.1. Effect Size Calculation
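As a concrete illustration, the Hedges’ g reported in Section 3.1.1 (mean operability 45.07, SD 10.52, n = 25, benchmarked against the interpretive threshold of 40) can be approximately reproduced with the standard small-sample correction. The sketch below assumes a simple one-sample formulation and is not necessarily the authors’ exact computation:

```python
def hedges_g_one_sample(mean, sd, n, threshold):
    """One-sample Cohen's d against a fixed benchmark, with Hedges'
    small-sample correction J = 1 - 3/(4*df - 1) (an approximation)."""
    d = (mean - threshold) / sd
    df = n - 1
    correction = 1 - 3 / (4 * df - 1)
    return d * correction

# Values taken from the descriptive statistics table (Section 3.1.1)
g = hedges_g_one_sample(45.07, 10.52, 25, 40)
print(round(g, 2))  # 0.47, matching the reported Hedges' g vs. threshold 40
```

The same function applies to any benchmark comparison; only the threshold argument changes.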
2.3.2. Model Specification
2.3.3. Assumption Checking
2.4. Software Design and Development
2.5. Operability Evaluation
2.5.1. Operability Framework
- The distribution of empirical scores obtained from 25 gameplay sessions.
- Percentile-based analysis, where 40 points roughly aligned with the first quartile (Q1).
- Expert consultation with ASD-specialized educators and therapists.
- Comparison with similar interpretative schemes in usability and accessibility evaluations.
2.5.2. Custom-Designed Parametric Model
- P1: Ease of learning;
- P2: User control over the interface;
- P3: Conformance with interface conventions;
- P4: Comprehension of system messages;
- wi: Weights assigned to each parameter based on expert judgment and relevance.
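Taken together, these parameters suggest a weighted-sum composite. The formula below is a plausible reconstruction from the listed components; the exact scaling of each P_i is not shown in this section and is therefore an assumption:

```latex
\text{Operability} = \sum_{i=1}^{4} w_i \, P_i , \qquad \sum_{i=1}^{4} w_i = 1
```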
2.5.3. Automated Metric-Based Model
- D: Difficulty level (1 = easy, 3 = hard);
- T: Time to complete the activity (in seconds);
- M: Number of movements made;
- P: Final score obtained;
- Weights: w1 = 0.3, w2 = 0.2, w3 = 0.2, w4 = 0.3.
- Therapeutic relevance: components directly linked to ASD intervention goals received higher weights;
- Empirical evidence: factors with strongest research support in autism HCI literature;
- Clinical observation: behaviors most frequently targeted in institutional therapy sessions;
- Accessibility impact: components most critical for inclusive digital design.
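A minimal sketch of such a metric-based composite is given below. The normalization caps (T_max, M_max, P_max) and the directions of the terms (faster completion and fewer movements treated as better) are explicitly hypothetical; the paper specifies the components and weights, but not the scaling reproduced here:

```python
def operability_score(D, T, M, P, T_max=300.0, M_max=100.0, P_max=100.0,
                      weights=(0.3, 0.2, 0.2, 0.3)):
    """Weighted composite over difficulty (D), time (T), movements (M),
    and final score (P). Caps and scaling are hypothetical."""
    w1, w2, w3, w4 = weights
    d_norm = (D - 1) / 2                  # 1..3 mapped onto [0, 1]
    t_norm = 1 - min(T, T_max) / T_max    # faster completion scores higher
    m_norm = 1 - min(M, M_max) / M_max    # fewer movements score higher
    p_norm = min(P, P_max) / P_max        # final score relative to maximum
    return 100 * (w1 * d_norm + w2 * t_norm + w3 * m_norm + w4 * p_norm)

print(round(operability_score(D=2, T=150, M=50, P=80), 2))  # 59.0 under these assumptions
```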
2.6. Validation Strategy
- Content Validity: A panel of three ASD specialists evaluated each operability component for relevance and appropriateness on a 4-point scale, achieving consensus on all items.
- Internal Consistency: Split-half reliability was calculated by randomly dividing the 25 sessions and correlating operability scores between halves, with Spearman–Brown correction applied.
- Construct Validation: Partial correlation analysis examined relationships between operability components while controlling for overall game score, to demonstrate that the metric captures unique variance beyond performance alone.
- Convergent Validity: Operability scores were correlated with observational data on engagement behaviors recorded during sessions by trained observers blind to the operability scores.
- Exploratory Factor Analysis: To examine the structural validity of the operability construct, exploratory factor analysis (EFA) was conducted on the four operability components using principal axis factoring with oblique rotation. The Kaiser–Meyer–Olkin measure verified sampling adequacy, and Bartlett’s test of sphericity assessed the appropriateness of the correlation matrix for factor analysis. This analysis determined whether the operability components form a coherent unidimensional construct or represent multiple underlying factors, providing evidence for the theoretical conceptualization of the composite score approach.
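The internal-consistency step described above (random split of the 25 sessions, correlation between halves, Spearman–Brown correction) can be sketched as follows. This is one reading of that procedure; the pairwise matching of the shuffled halves and the fixed seed are illustrative assumptions:

```python
import random

def pearson_r(x, y):
    """Plain Pearson correlation, written out to stay dependency-free."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def spearman_brown(r):
    """Spearman-Brown prophecy formula for a test of doubled length."""
    return 2 * r / (1 + r)

def split_half_reliability(scores, seed=0):
    """Shuffle session scores, split them into two halves, correlate the
    halves pairwise, and step the correlation up with Spearman-Brown."""
    rng = random.Random(seed)
    idx = list(range(len(scores)))
    rng.shuffle(idx)
    half = len(idx) // 2
    a = [scores[i] for i in idx[:half]]
    b = [scores[i] for i in idx[half:2 * half]]
    return spearman_brown(pearson_r(a, b))
```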
3. Results
3.1. Results—Puzzle Game
3.1.1. Descriptive Statistics and Individual Variability
3.1.2. Individual Trajectories and Between-Subject Variability
3.1.3. Categorical Interpretation of Operability
3.1.4. Validation Results
Internal Consistency
Construct Validity
Comprehensive Validation Evidence
- Easy vs. Medium: z = −2.47, p_adj = 0.041;
- Easy vs. Hard: z = −3.12, p_adj = 0.005;
- Medium vs. Hard: z = −1.85, p_adj = 0.192.
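For context, these adjusted p-values are approximately recovered by converting each z statistic to a two-sided normal p-value and applying a Bonferroni correction across the three comparisons. This reconstruction is an assumption, not necessarily the authors’ exact procedure:

```python
from math import erf, sqrt

def two_sided_p(z):
    """Two-sided p-value for a standard-normal test statistic."""
    upper_tail = 1 - 0.5 * (1 + erf(abs(z) / sqrt(2)))
    return 2 * upper_tail

def bonferroni(pvals):
    """Multiply each p-value by the number of comparisons, capped at 1."""
    m = len(pvals)
    return [min(1.0, p * m) for p in pvals]

z_stats = [-2.47, -3.12, -1.85]         # Easy/Medium, Easy/Hard, Medium/Hard
adjusted = bonferroni([two_sided_p(z) for z in z_stats])
print([round(p, 3) for p in adjusted])  # close to the reported 0.041, 0.005, 0.192
```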
3.1.5. Between-Group Differences (Non-Parametric Analysis)
4. Discussion
4.1. Methodological Strengths and Limitations
4.2. Interpretation of Correlational Patterns
4.3. Implications and Future Directions
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Correction Statement
References
- Hodges, E.K.; Kuentzel, J.G.; Hook, J.N. Pediatric Neuropsychology: Perspectives from the Ambulatory Care Setting, 1st ed.; Routledge: New York, NY, USA, 2022; ISBN 978-1-003-34907-5. [Google Scholar]
- Salloum-Asfar, S.; Zawia, N.; Abdulla, S.A. Retracing Our Steps: A Review on Autism Research in Children, Its Limitation and Impending Pharmacological Interventions. Pharmacol. Ther. 2024, 253, 108564. [Google Scholar] [CrossRef] [PubMed]
- AlSalehi, S.M.; Alhifthy, E.H. Autism Spectrum Disorder. In Clinical Child Neurology; Salih, M.A.M., Ed.; Springer International Publishing: Cham, Switzerland, 2020; pp. 275–292. ISBN 978-3-319-43152-9. [Google Scholar]
- Autism Spectrum Disorder. In Harris’ Developmental Neuropsychiatry: The Interface with Cognitive and Social Neuroscience; Harris, J.C., Coyle, J.T., Eds.; Oxford University Press: New York, NY, USA, 2024; pp. 445–516. ISBN 978-0-19-992811-8. [Google Scholar]
- Sauer, A.K.; Stanton, J.E.; Hans, S.; Grabrucker, A.M. Autism Spectrum Disorders: Etiology and Pathology. In Autism Spectrum Disorders; Grabrucker, A.M., Ed.; Exon Publications: Brisbane, Australia, 2021; pp. 1–16. ISBN 978-0-6450017-8-5. [Google Scholar]
- Corcoran, J.; Wolk, C.B. Autism Spectrum Disorder: Jacqueline Corcoran, Julie Worley, and Courtney Benjamin Wolk. In Child and Adolescent Mental Health in Social Work; Oxford University Press: New York, NY, USA, 2023; pp. 87–106. ISBN 978-0-19-765356-2. [Google Scholar]
- Greydanus, D.E.; Patel, D.R.; Rowland, D.C. Autism Spectrum Disorder. In Comprehensive Pharmacology; Elsevier: Amsterdam, The Netherlands, 2022; pp. 396–434. ISBN 978-0-12-820876-2. [Google Scholar]
- Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E.; et al. The PRISMA 2020 Statement: An Updated Guideline for Reporting Systematic Reviews. BMJ 2021, 372, n71. [Google Scholar] [CrossRef]
- Sandgreen, H.; Frederiksen, L.H.; Bilenberg, N. Digital Interventions for Autism Spectrum Disorder: A Meta-Analysis. J. Autism Dev. Disord. 2021, 51, 3138–3152. [Google Scholar] [CrossRef]
- Löytömäki, J.; Ohtonen, P.; Huttunen, K. Serious Game the Emotion Detectives Helps to Improve Social–Emotional Skills of Children with Neurodevelopmental Disorders. Br. J. Educ. Technol. 2024, 55, 1126–1144. [Google Scholar] [CrossRef]
- Dalwadi, R.D.; Nalawade, S.; Mazumdar, P.; Chetia, B. Comprehensive Study on Serious Game Design for Autistic Children. In Proceedings of the 2023 IEEE 11th Region 10 Humanitarian Technology Conference (R10-HTC), Rajkot, India, 16 October 2023; pp. 990–996. [Google Scholar]
- Abd El-Sattar, H.K.H.; Omar, M.; Mohamady, H. Developing a Participatory Research Framework through Serious Games to Promote Learning for Children with Autism. Front. Educ. 2024, 9, 1453327. [Google Scholar] [CrossRef]
- Costello, R.; Donovan, J. How Game Designers Can Account for Those with Autism Spectrum Disorder (ASD) When Designing Game Experiences. In Research Anthology on Physical and Intellectual Disabilities in an Inclusive Society; Information Resources Management Association, Ed.; IGI Global: Hershey, PA, USA, 2022; pp. 202–224. ISBN 978-1-6684-3542-7. [Google Scholar]
- Muneeb, S.; Sitbon, L.; Ahmad, F. Opportunities for Serious Game Technologies to Engage Children with Autism in a Pakistani Sociocultural and Institutional Context: An Investigation of the Design Space for Serious Game Technologies to Enhance Engagement of Children with Autism and to Facilitate External Support Provided. In Proceedings of the 34th Australian Conference on Human-Computer Interaction, Canberra, Australia, 29 November 2022; ACM: New York, NY, USA, 2022; pp. 338–347. [Google Scholar]
- Suryapranata, L.K.P.; Soewito, B.; Kusuma, G.P.; Gaol, F.L.; Warnars, H.L.H.S. Quality Measurement for Serious Games. In Proceedings of the 2017 International Conference on Applied Computer and Communication Technologies (ComCom), Jakarta, Indonesia, 17–18 May 2017; pp. 1–4. [Google Scholar]
- Wibawa, R.C.; Rochimah, S.; Anggoro, R. A Development of Quality Model for Online Games Based on ISO/IEC 25010. In Proceedings of the 2019 12th International Conference on Information & Communication Technology and System (ICTS), Surabaya, Indonesia, 18 July 2019; pp. 215–218. [Google Scholar]
- Lee, M.; Shin, S.; Lee, M.; Hong, E. Educational Outcomes of Digital Serious Games in Nursing Education: A Systematic Review and Meta-Analysis of Randomized Controlled Trials. BMC Med. Educ. 2024, 24, 1458. [Google Scholar] [CrossRef]
- ISO/IEC 25010:2023; Systems and Software Engineering—Systems and Software Quality Requirements and Evaluation (SQuaRE)—System and Software Quality Models 2023. International Organization for Standardization, International Electrotechnical Commission: Geneva, Switzerland, 2023.
- Estrada Molina, O.; Fuentes-Cancell, D.R.; García-Hernández, A. Evaluating Usability in Educational Technology: A Systematic Review from the Teaching of Mathematics. LUMAT 2022, 10. [Google Scholar] [CrossRef]
- Berkovits, L.; Eisenhower, A.; Blacher, J. Emotion Regulation in Young Children with Autism Spectrum Disorders. J. Autism Dev. Disord. 2017, 47, 68–79. [Google Scholar] [CrossRef]
- Mack, K.; McDonnell, E.; Jain, D.; Wang, L.; Froehlich, J.E.; Findlater, L. What Do We Mean by “Accessibility Research”? A Literature Survey of Accessibility Papers in CHI and ASSETS from 1994 to 2019. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, Yokohama, Japan (online), 8–13 May 2021; ACM: New York, NY, USA, 2021; pp. 1–18. [Google Scholar]
- Brysbaert, M. How Many Participants Do We Have to Include in Properly Powered Experiments? A Tutorial of Power Analysis with Reference Tables. J. Cogn. 2019, 2, 16. [Google Scholar] [CrossRef] [PubMed]
- Tackett, J.L.; Lilienfeld, S.O.; Patrick, C.J.; Johnson, S.L.; Krueger, R.F.; Miller, J.D.; Oltmanns, T.F.; Shrout, P.E. It’s Time to Broaden the Replicability Conversation: Thoughts for and From Clinical Psychological Science. Perspect. Psychol. Sci. 2017, 12, 742–756. [Google Scholar] [CrossRef]
- Cascio, M.A.; Weiss, J.A.; Racine, E.; the Autism Research Ethics Task Force. Person-Oriented Ethics for Autism Research: Creating Best Practices through Engagement with Autism and Autistic Communities. Autism 2020, 24, 1676–1690. [Google Scholar] [CrossRef]
- Morris, C.; Detrick, J.J.; Peterson, S.M. Participant Assent in Behavior Analytic Research: Considerations for Participants with Autism and Developmental Disabilities. J. Appl. Behav. Anal. 2021, 54, 1300–1316. [Google Scholar] [CrossRef]
- El Shemy, I.; Jaccheri, L.; Giannakos, M.; Vulchanova, M. Participatory Design of Augmented Reality Games for Word Learning in Autistic Children: The Parental Perspective. Entertain. Comput. 2025, 52, 100756. [Google Scholar] [CrossRef]
- Magezi, D.A. Linear Mixed-Effects Models for within-Participant Psychology Experiments: An Introductory Tutorial and Free, Graphical User Interface (LMMgui). Front. Psychol. 2015, 6, 2. [Google Scholar] [CrossRef]
- Luke, S.G. Evaluating Significance in Linear Mixed-Effects Models in R. Behav. Res. 2017, 49, 1494–1502. [Google Scholar] [CrossRef]
- Barr, D.J.; Levy, R.; Scheepers, C.; Tily, H.J. Random Effects Structure for Confirmatory Hypothesis Testing: Keep It Maximal. J. Mem. Lang. 2013, 68, 255–278. [Google Scholar] [CrossRef]
- Lakens, D. Calculating and Reporting Effect Sizes to Facilitate Cumulative Science: A Practical Primer for t-Tests and ANOVAs. Front. Psychol. 2013, 4, 863. [Google Scholar] [CrossRef]
- Kirby, K.N.; Gerlanc, D. BootES: An R Package for Bootstrap Confidence Intervals on Effect Sizes. Behav. Res. 2013, 45, 905–927. [Google Scholar] [CrossRef]
- Rousselet, G.A.; Pernet, C.R.; Wilcox, R.R. Beyond Differences in Means: Robust Graphical Methods to Compare Two Groups in Neuroscience. Eur. J. Neurosci. 2017, 46, 1738–1748. [Google Scholar] [CrossRef]
- Ghasemi, A.; Zahediasl, S. Normality Tests for Statistical Analysis: A Guide for Non-Statisticians. Int. J. Endocrinol. Metab. 2012, 10, 486–489. [Google Scholar] [CrossRef]
- Zimmerman, D.W. A Note on Preliminary Tests of Equality of Variances. Br. J. Math. Stat. Psychol. 2004, 57, 173–181. [Google Scholar] [CrossRef] [PubMed]
- Lami, G.; Spagnolo, G. A Lightweight Software Product Quality Evaluation Method. In Proceedings of the 17th International Conference on Software Technologies, Lisbon, Portugal, 11–13 July 2022; SCITEPRESS—Science and Technology Publications: Lisbon, Portugal, 2022; pp. 524–531. [Google Scholar]
- Carneiro, T.; Carvalho, A.; Frota, S.; Filipe, M.G. Serious Games for Developing Social Skills in Children and Adolescents with Autism Spectrum Disorder: A Systematic Review. Healthcare 2024, 12, 508. [Google Scholar] [CrossRef] [PubMed]
- Talebi Azadboni, T.; Nasiri, S.; Khenarinezhad, S.; Sadoughi, F. Effectiveness of Serious Games in Social Skills Training to Autistic Individuals: A Systematic Review. Neurosci. Biobehav. Rev. 2024, 161, 105634. [Google Scholar] [CrossRef]
- Valencia, K.; Rusu, C.; Botella, F.; Jamet, E. A Methodology to Evaluate User Experience for People with Autism Spectrum Disorder. Appl. Sci. 2022, 12, 11340. [Google Scholar] [CrossRef]
- Aguiar, Y.P.C.; Galy, E.; Godde, A.; Trémaud, M.; Tardif, C. AutismGuide: A Usability Guidelines to Design Software Solutions for Users with Autism Spectrum Disorder. Behav. Inf. Technol. 2022, 41, 1132–1150. [Google Scholar] [CrossRef]
- Jaramillo-Alcazar, A.; Lujan-Mora, S.; Salvador-Ullauri, L. Accessibility Assessment of Mobile Serious Games for People with Cognitive Impairments. In Proceedings of the 2017 International Conference on Information Systems and Computer Science (INCISCOS), Quito, Ecuador, 23–25 November 2017; pp. 323–328. [Google Scholar]
- Wang, R.K.; Kwong, K.; Liu, K.; Kong, X.-J. New Eye Tracking Metrics System: The Value in Early Diagnosis of Autism Spectrum Disorder. Front. Psychiatry 2024, 15, 1518180. [Google Scholar] [CrossRef]
- Yuan, A.; Sabatos-DeVito, M.; Bey, A.L.; Major, S.; Carpenter, K.L.; Franz, L.; Howard, J.; Vermeer, S.; Simmons, R.; Troy, J.; et al. Automated Movement Tracking of Young Autistic Children during Free Play Is Correlated with Clinical Features Associated with Autism. Autism 2023, 27, 2530–2541. [Google Scholar] [CrossRef]
- Birkeneder, S.L.; Bullen, J.; McIntyre, N.; Zajic, M.C.; Lerro, L.; Solomon, M.; Sparapani, N.; Mundy, P. The Construct Validity of the Childhood Joint Attention Rating Scale (C-JARS) in School-Aged Autistic Children. J. Autism Dev. Disord. 2024, 54, 3347–3363. [Google Scholar] [CrossRef]
- Maun, R.; Fabri, M.; Trevorrow, P. Participatory Methods to Engage Autistic People in the Design of Digital Technology: A Systematic Literature Review. J. Autism Dev. Disord. 2024, 54, 2960–2971. [Google Scholar] [CrossRef]
- Lotfizadeh, A.D.; Gard, B.; Rico, C.; Poling, A.; Choi, K.R. Convergent and Discriminant Validity of the Verbal Behavior Milestones Assessment and Placement Program (VB-MAPP) and the Vineland Adaptive Behavior Scales (VABS). J. Autism Dev. Disord. 2025, 55, 803–811. [Google Scholar] [CrossRef]
- Harrison, A.J.; Madison, M.; Naqvi, N.; Bowman, K.; Campbell, J. The Development of the Autism Stigma and Knowledge Questionnaire, Second Edition (ASK-Q-2), through a Cross-Cultural Psychometric Investigation. Autism 2025, 29, 195–206. [Google Scholar] [CrossRef]
- Turvey, M.T. Coordination Dynamics: Issues and Trends. In Handbook of Sport Psychology; Tenenbaum, G., Eklund, R.C., Eds.; Wiley: Hoboken, NJ, USA, 2020; pp. 477–497. [Google Scholar]
- English, M.C.; Poulsen, R.E.; Maybery, M.T.; McAlpine, D.; Sowman, P.F.; Pellicano, E. Psychometric Evaluation of the Comprehensive Autistic Trait Inventory in Autistic and Non-Autistic Adults. Autism 2025, 13623613251347740. [Google Scholar] [CrossRef] [PubMed]
- Gowen, E.; Taylor, R.; Bleazard, T.; Greenstein, A.; Baimbridge, P.; Poole, D. Guidelines for Conducting Research Studies with the Autism Community. Autism Policy Pr. 2019, 2, 29–45. [Google Scholar]
- Whyte, E.M.; Smyth, J.M.; Scherf, K.S. Designing Serious Game Interventions for Individuals with Autism. J. Autism Dev. Disord. 2015, 45, 3820–3831. [Google Scholar] [CrossRef] [PubMed]
- Yifu, L.; Yan, M.; Libing, H.; Chunling, X.; Tao, D. The Effects of Human-Computer Interaction-Based Interventions for Autism Spectrum Disorder: A Meta-Analysis. Educ. Inf. Technol. 2025, 30, 8353–8372. [Google Scholar] [CrossRef]
- Pronk, T.; Molenaar, D.; Wiers, R.W.; Murre, J. Methods to Split Cognitive Task Data for Estimating Split-Half Reliability: A Comprehensive Review and Systematic Assessment. Psychon. Bull. Rev. 2022, 29, 44–54. [Google Scholar] [CrossRef] [PubMed]
- Schrepp, M.; Hinderks, A.; Thomaschewski, J. Design and Evaluation of a Short Version of the User Experience Questionnaire (UEQ-S). Int. J. Interact. Multimed. Artif. Intell. 2017, 4, 103–108. [Google Scholar] [CrossRef]
- Lewis, J.R.; Sauro, J. Item Benchmarks for the System Usability Scale. J. Usability Stud. 2018, 13, 158–167. [Google Scholar]
- Chow, J.; Zhao, H.; Sandbank, M.; Bottema-Beutel, K.; Woynaroski, T. Empirically-Derived Effect Size Distributions of Interventions for Young Children on the Autism Spectrum. J. Clin. Child Adolesc. Psychol. 2023, 52, 271–283. [Google Scholar] [CrossRef]
- Sandbank, M.; Bottema-Beutel, K.; Crowley LaPoint, S.; Feldman, J.I.; Barrett, D.J.; Caldwell, N.; Dunham, K.; Crank, J.; Albarran, S.; Woynaroski, T. Autism Intervention Meta-Analysis of Early Childhood Studies (Project AIM): Updated Systematic Review and Secondary Analysis. BMJ 2023, 383, e076733. [Google Scholar] [CrossRef]










| Participant ID | Age (Years) | ASD Level (DSM-5) | Gender | Type of Communication | Observations |
|---|---|---|---|---|---|
| P1 | 6 | Level 2 | Male | Verbal with moderate support | Slow response time |
| P2 | 7 | Level 1 | Female | Fluent verbal | Strong visual learning |
| P3 | 8 | Level 3 | Male | Non-verbal, uses pictograms | High sensory sensitivity |
| P4 | 6 | Level 2 | Male | Verbal, simple sentences | Short attention span |
| P5 | 9 | Level 1 | Female | Fluent verbal | Needs repeated instructions |
| P6 | 7 | Level 2 | Male | Basic verbal communication | Mild motor difficulties |
| P7 | 8 | Level 3 | Female | Non-verbal, uses gestures | Avoids eye contact |
| P8 | 9 | Level 2 | Male | Verbal, simple responses | Visual preference |
| P9 | 6 | Level 1 | Female | Fluent verbal | Good concentration |
| P10 | 7 | Level 3 | Female | Non-verbal, uses AAC device | Low frustration tolerance |
| Operability | Interpreted Usability Level |
|---|---|
| ≥60 | High operability |
| 40–59 | Acceptable/moderate operability |
| <40 | Low operability, improvement recommended |
| Statistic | Value | 95% Bootstrap CI |
|---|---|---|
| Count | 25 | - |
| Mean | 45.07 | [41.07, 49.07] |
| Standard Deviation | 10.52 | [8.52, 12.52] |
| Skewness | 0.18 | [−0.25, 0.61] |
| Kurtosis | −0.89 | [−1.47, −0.31] |
| Min | 26.73 | - |
| 25% | 35.77 | - |
| 50% | 47.92 | - |
| 75% | 54.55 | - |
| Max | 62.12 | - |
| Effect Sizes | | |
| Hedges’ g (vs. threshold 40) | 0.47 | [0.09, 0.92] |
| Kendall’s W (concordance) | 0.37 | [0.26, 0.48] |
| Operability Level | Frequency |
|---|---|
| Low | 10 |
| Moderate | 13 |
| High | 2 |
| Level | n | Mean | SD | Range |
|---|---|---|---|---|
| 1 (Easy) | 10 | 37.04 | 9.31 | 26.73–62.12 |
| 2 (Medium) | 10 | 48.71 | 7.49 | 31.41–57.22 |
| 3 (Hard) | 5 | 53.87 | 7.16 | 43.61–61.91 |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Carrión-León, D.I.; Lopez-Ramos, M.P.; Santillan-Valdiviezo, L.G.; Tanguila-Tapuy, D.S.; Morocho-Santos, G.M.; Moyano-Arias, R.J.; Yautibug-Apugllón, M.E.; Chacón-Luna, A.E. Evaluating Interaction Capability in a Serious Game for Children with ASD: An Operability-Based Approach Aligned with ISO/IEC 25010:2023. Computers 2025, 14, 370. https://doi.org/10.3390/computers14090370

