Brief Report

Enhancing Academic Performance in Motor Control: A Structured H5P-Based Multiple-Choice Intervention in Higher Education

1 Sports Sciences Laboratory (LAB-ISCE Human Performance Center), Department of Sport Sciences, ISCE—Polytechnic University of Lisbon and Tagus Valley, 2620-379 Lisbon, Portugal
2 Faculty of Sport Sciences and Physical Education, CIPER, University of Coimbra, 3040-256 Coimbra, Portugal
3 Life Quality Research Center (CIEQV), Complexo Andaluz, Apartado 279, 2001-904 Santarém, Portugal
4 Department of Social Sciences and Humanities, ISCE—Polytechnic University of Lisbon and Tagus Valley, 2620-379 Lisbon, Portugal
5 Department of Business Sciences, ISCE—Polytechnic University of Lisbon and Tagus Valley, 2620-379 Lisbon, Portugal
6 Department of Education, ISCE—Polytechnic University of Lisbon and Tagus Valley, 2620-379 Lisbon, Portugal
* Author to whom correspondence should be addressed.
Educ. Sci. 2026, 16(4), 619; https://doi.org/10.3390/educsci16040619
Submission received: 21 February 2026 / Revised: 6 April 2026 / Accepted: 10 April 2026 / Published: 14 April 2026

Abstract

Background: Interactive learning resources developed with the H5P platform have been progressively adopted to support autonomous learning and conceptual consolidation. However, empirical evidence regarding their impact on academic performance in theoretically demanding university courses remains limited. The primary aim of this study was to examine the effect of the structured integration of a multiple-choice interactive digital pedagogical resource, developed with the H5P platform, on the academic performance of higher education students enrolled in a Motor Control course. Methods: A quasi-experimental study was conducted to compare two independent groups: a control group (CG; n = 90) and an intervention group (IG; n = 115), which had access throughout the semester to a multiple-choice interactive resource developed using the H5P platform. Academic performance was operationalized as the score obtained on a written summative assessment. Baseline equivalence between groups was assessed using an initial diagnostic test. Between-group comparisons were performed using robust non-parametric statistical procedures and further examined using a linear regression model adjusted for relevant covariates. Results: No statistically significant differences were found between groups in the baseline diagnostic test (p > 0.05), indicating comparable starting levels. At the end of the intervention period (≈2 months), the intervention group obtained significantly higher scores in the summative assessment (p < 0.001), with a large effect size (d = 0.87). Conclusions: The findings suggest that the structured integration of H5P-based multiple-choice resources may positively contribute to academic performance when used as a complementary tool alongside traditional teaching. These results reinforce the pedagogical potential of this format to support autonomous learning and conceptual consolidation, while also highlighting the need for future research employing more rigorous experimental designs and process-based measures to better understand the underlying learning mechanisms.

1. Introduction

In recent years, higher education has undergone a significant transformation, characterized by an increasing emphasis on student-centered pedagogical approaches, active learning, and the purposeful integration of digital technologies into teaching and learning processes (Ramos-Azcuy et al., 2025). This transformation has also been accompanied by a growing interest in the development and use of structured digital learning resources, particularly Learning Objects and Open Educational Resources, which emphasize reusability, pedagogical alignment, and accessibility in higher education contexts (Hilton, 2016; Wiley et al., 2014). This shift reflects a growing recognition that effective learning environments require not only the transmission of knowledge but also the active engagement of students in meaningful learning activities supported by appropriate pedagogical design. Within this broader context, several initiatives have emerged to promote pedagogical innovation through the integration of evidence-informed teaching practices and digital resources. One such initiative in Portugal is the Pedagogic XXI Consortium, recognized as a Centre of Excellence in Pedagogical Innovation. The consortium promotes educational approaches that emphasize students’ active participation, self-regulated learning, the pedagogically meaningful use of digital resources, and assessment practices oriented toward learning rather than exclusively transmissive instructional models (Gonçalves et al., 2025). Pedagogic XXI brings together 25 higher education institutions distributed across mainland Portugal and the Autonomous Region of Madeira, encompassing diverse scientific domains such as Health Sciences, Sport Sciences, Psychology, Education, and the Social Sciences and Humanities. A key principle underlying this initiative is the alignment between learning objectives, teaching strategies, and assessment practices. 
This principle is consistent with instructional design frameworks, particularly the concept of constructive alignment, which highlights the need for coherence between intended learning outcomes, learning activities, and assessment strategies (Biggs & Tang, 2011). Importantly, educational research indicates that technology alone does not necessarily lead to improvements in learning outcomes; however, when integrated through coherent pedagogical design and supported by empirical evidence, digital tools can facilitate deeper learning processes and enhance student engagement (Hattie & Timperley, 2007).
Within Sport Sciences, the Motor Control course plays a central role in academic training, providing an essential conceptual foundation for understanding the processes that regulate human movement, motor learning, and adaptation to training and sport practice (Bacmeister et al., 2020; Krakauer et al., 2019). This field is characterized by a high level of theoretical density and a strong interaction between abstract concepts and practical applications, which often represents a considerable pedagogical challenge for students. The complexity of the content, combined with the need to integrate knowledge from motor development, exercise physiology, biomechanics, and motor learning, requires pedagogical strategies that support progressive conceptual consolidation, guided autonomous study, and systematic review of course material, particularly in contexts marked by large class sizes and heterogeneous student profiles.
In this context, interactive digital learning resources (particularly those developed using the H5P platform) have been widely explored as promising strategies to promote student engagement, active learning, and autonomy in studying (Jacob & Centofanti, 2024; Ramos-Azcuy et al., 2025). From an instructional design perspective, such resources can be conceptualized as Learning Objects when they are intentionally designed, structured, and aligned with specific learning objectives and pedagogical functions (Wiley et al., 2014). Recent empirical evidence across different higher education disciplines indicates that H5P activities, when well designed and aligned with clearly defined learning objectives, can contribute to increased cognitive engagement, conceptual consolidation, and, in some cases, improvements in academic performance (Huff & Tseng, 2025). However, the literature also highlights that the effectiveness of such tools depends critically on their pedagogical design, particularly the nature of the proposed tasks, the role of feedback, and the way these resources are integrated into the course assessment ecosystem (Chernikova et al., 2020).
An important aspect of this discussion concerns the use of multiple-choice questions as a pedagogical resource (Greving & Richter, 2022). Traditionally, this format has been criticized for favoring recognition processes rather than active retrieval of previously learned information (Little et al., 2012), and in some contexts has been associated with more superficial forms of learning (Roediger & Karpicke, 2006). More recent research, however, suggests that this perspective may be overly simplistic. When embedded in formative learning activities that require deliberate practice, conceptual elaboration, and informed decision-making, multiple-choice questions can support meaningful and self-regulated learning processes (Butler & Roediger, 2008). In this sense, their pedagogical value may depend less on the format itself and more on how they are integrated within a coherent instructional design that promotes active engagement and repeated interaction with the content. Systematic reviews further indicate that the greatest benefits emerge when such formats are not used solely as assessment tools but rather as study resources integrated within digital environments that promote active engagement and metacognitive reflection (Chernikova et al., 2020).
Despite the increasing adoption of H5P resources in higher education (Huff & Tseng, 2025; Martín-Alguacil et al., 2025; Yang et al., 2025), empirical evidence systematically examining their impact on academic performance in conceptually demanding courses such as Motor Control remains limited. More importantly, there is still insufficient understanding of how specific instructional design decisions embedded within such digital resources, such as structured formative use, repeated engagement, and the absence of immediate feedback, influence learning outcomes beyond the mere presence of technology. In this context, the present study addresses the following research question: Does the structured integration of a multiple-choice H5P-based learning resource influence academic performance in a Motor Control course? Additionally, the study explores the following sub-question: To what extent can this type of instructional design be associated with improved conceptual consolidation in higher education contexts? Therefore, the main objective of the present study was to analyze the effect of the structured integration of an interactive digital pedagogical resource developed with H5P (specifically, multiple-choice activities) on the academic performance of higher education students enrolled in a Motor Control course.

2. Materials and Methods

2.1. Study Design

A quantitative quasi-experimental design with non-equivalent groups was adopted to examine the effect of integrating an interactive digital pedagogical resource on the academic performance of students enrolled in a Motor Control course in a higher education context. Given the characteristics of the educational setting, random allocation of students was not feasible; therefore, a historical comparison group was used. Accordingly, two cohorts were compared: a historical control group (CG) and an intervention group (IG).

2.2. Participants

A total of 205 higher education students enrolled in the Motor Control course participated in this study. Participants were undergraduate students enrolled in a Sport Sciences program at a higher education institution in Portugal and were allocated to groups according to the academic year in which they attended the course. The control group (CG) included students from both daytime and evening classes during the 2024–2025 academic year (n = 90), in which no interactive digital pedagogical resource was implemented. The intervention group (IG) corresponded to the cohort enrolled in the 2025–2026 academic year (n = 115), in which an interactive multiple-choice learning resource developed using the H5P platform was integrated. Students were included if they: (i) were formally enrolled in the course during the respective academic year; (ii) attended the course during the period in which the pedagogical activities were implemented; and (iii) completed the assessment used as the primary outcome of the study. Students were excluded if they did not complete the summative assessment during the regular examination period.
To reduce potential cohort bias and increase comparability between academic years, relevant administrative and academic variables were collected when available, including prior academic grade point average, performance in related courses, and enrolment status. These variables were used to describe the sample and, where possible, to adjust the statistical analyses. The study was conducted in accordance with ethical principles applicable to research in higher education contexts (Calhoun et al., 2020; Goel, 2025). Student participation was voluntary and had no academic consequences, and non-participation did not affect course evaluation. Students were informed about the objectives of the study, the procedures for data collection and use, and the confidentiality and anonymization of the collected information. The analyzed data consisted exclusively of aggregated academic results and records of use of the pedagogical resource, without individual identification. The study followed the principles of the Declaration of Helsinki, adapted for educational research and institutional guidelines for the use of academic data in scientific research (World Medical Association, 2013).

2.3. Pedagogical Intervention

The instructional design of the pedagogical intervention consisted of the structured integration of a set of 60 multiple-choice questions delivered through an interactive digital resource developed using the H5P platform at the beginning of the academic semester. The resource was designed to support autonomous study and improve students’ academic performance in the Motor Control course. It was made available exclusively to students enrolled in the 2025–2026 academic year on the first day of classes through the institutional learning management system and was not implemented in the historical cohort from the 2024–2025 academic year.
The independent variable corresponded to exposure to a multiple-choice questionnaire implemented as an interactive digital pedagogical resource through H5P within a learning management system (Moodle) (Jacob & Centofanti, 2024; Mutawa et al., 2023). Consistent with previous evidence suggesting that multiple-choice formats can support autonomous learning and conceptual consolidation when embedded within formative learning contexts (Wrzesińska et al., 2025), the resource consisted of a total of 60 items distributed across two formats: approximately 80% single-best-answer multiple-choice questions and 20% true/false items. The questions covered the main conceptual domains of Motor Control and were primarily designed to promote conceptual understanding, requiring students to accurately identify and differentiate key theoretical constructs rather than relying solely on factual recall. Although the items did not involve complex scenario-based problem solving or open-ended responses, several required discriminations between closely related concepts (e.g., distinguishing between feedback and feedforward control), thereby engaging cognitive processes beyond simple recognition.
In terms of cognitive demand, most items were aligned with the “understand” level of Bloom’s revised taxonomy (Krathwohl, 2002), with elements of conceptual discrimination that may support early stages of knowledge application. Each question was explicitly aligned with course learning objectives, including: (i) understanding fundamental mechanisms of motor control; (ii) distinguishing between key theoretical models; and (iii) consolidating core conceptual knowledge required for subsequent application in more complex learning contexts. This structured design aimed to promote progressive conceptual consolidation through repeated engagement with course content, supporting autonomous study processes within a self-regulated learning framework.
Students had a two-month period to complete the pedagogical resource and were allowed to consult supporting materials (e.g., textbooks, lecture notes, and other academic resources), following a structured autonomous learning approach. The activity was not supervised and did not constitute a formal examination or summative assessment. Completion of the activity required correctly answering all questions, with unlimited attempts permitted whenever necessary. Importantly, this requirement had a purely formative purpose and was not directly associated with the final course grade. The resource did not provide explicit indications of incorrect answers, nor did it offer hints or suggested responses. This design aimed to encourage a comprehensive review of the course content and discourage trial-and-error strategies, requiring students to actively engage with course materials to verify their responses.
This design is consistent with theoretical frameworks related to self-regulated learning, where learners are required to plan, monitor, and evaluate their own learning processes (Panadero, 2017; Zimmerman, 2002). Additionally, it can be interpreted within the framework of retrieval practice, where active recall and repeated engagement with content are associated with improved retention and learning outcomes (Adesope et al., 2017; Roediger & Karpicke, 2006). Therefore, the absence of immediate feedback constitutes a deliberate instructional design feature intended to foster sustained cognitive engagement and encourage systematic review of the content. The activity had a 96% completion rate, with a mean of 7.4 attempts per student.
Thus, the pedagogical resource functioned as a structured complement to the traditional teaching model, without replacing or modifying the formal assessment instruments of the course, which remained equivalent in both the CG and IG, ensuring comparability of academic outcomes across academic years.

2.4. Instruments, Variables, and Assessment Procedures

Consistent with previous studies, academic performance was operationalized as the score obtained in the summative assessment of the course (Mitra & Barua, 2015), which was administered in two academic years under equivalent conditions. The written test followed a constructed-response format consisting of five open-ended questions designed to evaluate students’ ability to explain, integrate, and apply key theoretical concepts in the field of Motor Control. Although the questions were open-ended, the assessment allowed for consistent and structured evaluation, as responses were graded using a predefined analytic rubric focused on conceptual accuracy, logical coherence, scientific adequacy, and relevance of the information provided.
The grader was not blinded to group membership during the grading process; however, to minimize potential grading bias and ensure comparability between groups, all assessments were evaluated using a predefined analytic rubric, applied consistently across both academic years, and all examinations were graded by the same instructor under standardized conditions. This approach ensured consistency in the interpretation of assessment criteria and in the attribution of scores. Additionally, the overall structure of the examination, the content assessed, and the weighting of the test within the final course grade remained equivalent across the two academic years, further reinforcing the comparability of academic outcomes. Also, to account for potential baseline differences between groups, indicators of prior academic performance were considered. These included the mean score obtained in an initial diagnostic test composed of 10 questions administered to both groups. For the intervention group, additional indicators related to the use of the digital pedagogical resource were collected, such as completion status and number of attempts. These data were analyzed descriptively to characterize engagement with the intervention and were not used as formal evaluation criteria.

2.5. Statistical Analysis

Data are presented as mean ± standard deviation (M ± SD). The assumption of normality was assessed using the Shapiro–Wilk test for each group. The analysis indicated that at least one variable did not meet the normality assumption in both groups. Therefore, differences in academic performance between the two cohorts (2024–2025 and 2025–2026) were analyzed using the Mann–Whitney U test. The level of statistical significance was set at p < 0.05. The magnitude of the observed differences was quantified using the rank-biserial correlation (RRB) with corresponding 95% confidence intervals, allowing a standardized interpretation of the practical relevance of the effects. To further examine the robustness of the findings and account for potential baseline differences between groups, regression models with robust bootstrap estimates were performed. In these models, the exam score (Test) was included as the dependent variable, group as the independent factor, and the initial diagnostic score (Dig) as a covariate. As an additional robustness analysis, an independent-samples robust test (Yuen’s test) with bootstrap estimation was conducted to reduce the influence of non-normal distributions and potential outliers. Results were interpreted considering both statistical significance and practical relevance. All statistical analyses were conducted using Jamovi (Version 2.6).
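The core between-group comparison described above can be sketched in Python with SciPy. The scores below are simulated placeholders, not study data: the 0–20 scale, group means, and standard deviations are illustrative assumptions, with only the group sizes (CG n = 90, IG n = 115) taken from the study.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Simulated exam scores on an assumed 0-20 scale; group sizes match the
# study, but means and SDs are illustrative assumptions only.
cg = rng.normal(11.0, 2.5, 90)    # control cohort
ig = rng.normal(13.2, 2.5, 115)   # intervention cohort

# Mann-Whitney U test (two-sided), as used for the cohort comparison
u, p = stats.mannwhitneyu(ig, cg, alternative="two-sided")

# Rank-biserial correlation derived from U: r_rb = 2U / (n1 * n2) - 1,
# where U is the statistic for the first sample (here, the IG)
r_rb = 2 * u / (len(ig) * len(cg)) - 1

print(f"U = {u:.1f}, p = {p:.2e}, rank-biserial r = {r_rb:.2f}")
```

A bootstrap confidence interval for r_rb, and Yuen's trimmed-means test, would follow the same pattern of resampling the two score vectors; those steps are omitted here for brevity.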

3. Results

The results are presented in relation to the research questions guiding the study, with a focus on group comparability at baseline and the effect of the intervention on academic performance. Baseline equivalence: No statistically significant differences were observed between the control group (CG) and the intervention group (IG) in the initial diagnostic assessment (p = 0.462). This finding was further supported by the trivial effect size identified using the robust Yuen test with bootstrap (p = 0.391; ξ = 0.07), indicating comparable baseline levels between groups and supporting the internal validity of subsequent comparisons.
Effect of the intervention on academic performance: Significant differences were observed in academic performance between groups. Students in the intervention group obtained higher scores in the summative course examination compared with the CG (p < 0.001; Table 1). The magnitude of this difference corresponded to a large effect size (d = 0.87), indicating a substantial improvement in performance associated with the structured integration of the H5P-based learning resource.
Adjusted analysis controlling for baseline performance: To account for potential baseline differences, a linear regression model was conducted, including group as the independent variable and the initial diagnostic score as a covariate. The model showed a moderate fit (adjusted R2 = 0.15). The group effect remained statistically significant (β = 2.87; 95% CI [1.94, 3.79]; p < 0.001), confirming that students in the IG achieved higher exam scores independently of their initial performance level. The diagnostic score was not a significant predictor of academic performance (p = 0.954), suggesting that the observed differences are primarily attributable to the intervention. Taken together, these findings suggest that the structured use of the H5P-based resource may have contributed to improved academic performance, potentially reflecting enhanced conceptual consolidation, although this mechanism was not directly measured.
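The covariate-adjusted comparison reported above can be reproduced in outline with ordinary least squares. The data below are again simulated: a group effect of about 2.9 points and a null diagnostic effect are built into the simulation to mirror the reported pattern (β = 2.87, non-significant covariate), not estimated from the study's records.

```python
import numpy as np

rng = np.random.default_rng(7)

n_cg, n_ig = 90, 115
n = n_cg + n_ig
group = np.r_[np.zeros(n_cg), np.ones(n_ig)]   # 0 = CG, 1 = IG
diag = rng.normal(6.0, 1.5, n)                 # diagnostic covariate (assumed scale)

# Simulate exam scores with a ~2.9-point group effect and no true
# contribution from the diagnostic score, mirroring the reported pattern
exam = 10.5 + 2.9 * group + rng.normal(0.0, 2.5, n)

# OLS fit of: exam ~ intercept + group + diagnostic score
X = np.column_stack([np.ones(n), group, diag])
beta, *_ = np.linalg.lstsq(X, exam, rcond=None)

print(f"adjusted group effect = {beta[1]:.2f}")  # recovers a value near 2.9
```

In practice, a package such as statsmodels would additionally supply bootstrap or heteroscedasticity-robust standard errors and confidence intervals, in line with the robust estimation approach used in the study.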

4. Discussion

This study examined the impact of integrating an interactive digital pedagogical resource developed with H5P on the academic performance of university students enrolled in a Motor Control course. The results revealed a statistically significant improvement in academic performance in the intervention group, accompanied by a large effect size. Importantly, no initial differences were observed between groups in the diagnostic assessment, which strengthens the interpretation that the higher scores observed in the summative examination cannot be attributed to baseline differences between cohorts. From a practical perspective, the findings suggest that formative digital activities designed without immediate feedback may promote more autonomous and systematic engagement with course content, encouraging students to actively review and consolidate their knowledge over time. These findings are consistent with recent studies reporting positive effects associated with the pedagogical use of H5P resources in higher education (Wrzesińska et al., 2025).
Beyond the evaluation of a specific digital tool, the present study contributes to the literature by suggesting that the academic value of H5P-based resources may depend fundamentally on their instructional design. In this regard, the findings support the view that relatively simple digital formats can be educationally meaningful when they are intentionally structured as formative learning activities that promote repeated engagement with content. Within this framework, the study adds to current discussions on digital learning by reinforcing the importance of pedagogical design over technological novelty in conceptually demanding higher education contexts.
The absence of statistically significant differences in the initial diagnostic assessment between groups, accompanied by trivial effect sizes, constitutes a relevant starting point for interpreting the study's findings. The data are consistent with widely accepted methodological recommendations in quasi-experimental educational research, according to which verifying baseline equivalence between groups is essential to reduce selection bias and strengthen the internal validity of group comparisons (Cook & Steiner, 2010; Windle et al., 2025). Furthermore, recent evidence based on learning management system (LMS) data consistently suggests that initial differences in performance, when not controlled, tend to bias the evaluation of the effectiveness of pedagogical interventions (Regueras et al., 2025). In light of these considerations, the convergence observed between non-parametric analyses, robust methods, and adjusted models in the present study reinforces the interpretation that the differences observed in academic performance are associated with the pedagogical intervention, although strong causal inferences should be avoided.
The significant increase in academic performance observed in the intervention group is consistent with recent literature documenting positive effects associated with the use of H5P-based interactive digital resources in higher education, particularly when these are pedagogically and intentionally integrated into the teaching and learning process (Jacob & Centofanti, 2024). Previous studies suggest that well-structured H5P activities tend to be associated with improvements in learning outcomes in online contexts, especially when they function as tools for conceptual consolidation and support for autonomous study, rather than being used exclusively as assessment mechanisms (Ki, 2025).
Complementarily, recent reviews indicate that the strongest effects on academic performance emerge when digital resources promote active cognitive engagement, deliberate practice, and self-regulated learning processes, in contrast to approaches based primarily on passive content consumption (Akpen et al., 2024). This perspective is consistent with the design of the intervention analyzed in the present study, which required students to correctly answer all questions and discouraged trial-and-error strategies, thereby encouraging a comprehensive review of course content and a progressive deepening of conceptual understanding.
It is important, however, to acknowledge that empirical evidence specifically addressing the use of multiple-choice activities without immediate feedback remains limited in higher education contexts. To date, robust experimental studies isolating the effect of this pedagogical design have not been clearly established. Therefore, this approach should be interpreted as a plausible pedagogical hypothesis rather than as a conclusion definitively supported by the existing literature.
Although multiple-choice questions are often associated, in many contexts, with low-level cognitive assessment—particularly when designed to test superficial recognition—this view has been progressively reconsidered. Empirical and meta-analytic evidence suggests that multiple-choice–based tasks can promote active retrieval of previously acquired knowledge, conceptual retention, and knowledge transfer when they require deeper cognitive processing and are integrated formatively within the learning process (Adesope et al., 2017; Haladyna et al., 2002). Within this framework, the pedagogical value of multiple-choice items depends less on the format itself and more on the design of the tasks, including item quality, alignment with learning objectives, cognitive demand, and integration into autonomous study practices (Adesope et al., 2017). Recent empirical evidence in higher education further supports this perspective. Studies comparing strategies such as structured note-taking and the writing or generation of multiple-choice questions have shown that actively generating questions can support performance in summative assessments and foster more active learning processes (Wrzesińska et al., 2025). This line of research is particularly relevant to the present study, as the H5P resource was designed as a repeated formative practice requiring sustained engagement with course content over time, rather than as a one-time verification test.
However, it is important to acknowledge that this type of approach is not without limitations and potential adverse effects. The literature has shown that, in certain contexts, multiple-choice formats may favor recognition processes rather than mechanisms of active information retrieval. The latter have been consistently associated with more durable learning and greater transfer of knowledge to new contexts (Chernikova et al., 2020; Roediger & Karpicke, 2006). Additionally, recent empirical evidence suggests that, under specific instructional conditions, constructed-response formats may lead to greater learning gains than those observed in multiple-choice tasks, particularly when they require conceptual elaboration, integration of content, and deeper cognitive processing (Greving & Richter, 2022). In parallel, research has also highlighted that pedagogical interventions that restrict immediate feedback may, in some contexts, be associated with additional challenges in the learning process. These may include increased frustration, higher levels of extraneous cognitive load, and greater sensitivity to individual differences in students’ self-regulation (Butler & Roediger, 2008; Panadero, 2017).
In the present study, these limitations were considered in the design of the pedagogical intervention. In contrast to high-pressure evaluative approaches with restrictive feedback, the resource was explicitly designed for formative purposes, being made available over an extended period of two months and allowing unlimited access to supporting materials. This approach sought to mitigate potential negative effects associated with the absence of immediate feedback by reducing evaluative pressure and promoting a learning process oriented toward autonomous study, systematic review, and conceptual consolidation rather than simple performance verification. Within this framework, although the observed results are compatible with the promotion of sustained active cognitive engagement, the interpretation of the underlying mechanisms should be made with caution, acknowledging that the available data do not allow direct causal inferences regarding the psychological processes involved. Another plausible interpretation of the findings is that the H5P resource may have contributed to activating self-regulated learning processes by requiring students to plan their study, monitor their progress, and systematically review course content over time. The literature on self-regulated learning suggests that interventions that structure and guide autonomous study tend to produce more consistent and sustained gains than approaches predominantly based on content transmission (Luo & Zhou, 2024; Zimmerman, 2002).
Nevertheless, it is important to emphasize that the present study did not collect direct data on students’ study strategies, actual time devoted to learning, perceived cognitive load, or affective states associated with the learning process. For this reason, the explanation proposed here should be understood as a theoretical interpretation consistent with the existing literature rather than as a direct empirical inference. In this sense, although the H5P resource used in this study had a formative purpose and was not directly linked to the course grade, future research could benefit from adopting a broader approach by systematically examining the impact of such interventions on dimensions such as perceived cognitive load, academic anxiety, and the balance between pedagogical demands and student support.
One of the key methodological strengths of the present study lies in the combined application of robust statistical approaches, including non-parametric tests, robust analyses with bootstrap procedures, and covariate-adjusted regression models (Hopkins et al., 2009). This combination is particularly appropriate in quasi-experimental designs and in applied educational research contexts, where violations of classical statistical assumptions, such as asymmetric distributions and heteroscedasticity, are common (Christiansen et al., 1996; Glass et al., 1972). In line with methodological recommendations, the use of these robust techniques strengthens the reliability and interpretability of the findings.
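To make the bootstrap logic concrete, the sketch below illustrates a percentile bootstrap confidence interval for a between-group mean difference. This is not the study's actual analysis code: the two groups are simulated here from the summary statistics reported in Table 1 (CG: 11.4 ± 3.87, n = 90; IG: 14.3 ± 2.82, n = 115), purely for illustration.

```python
import random
import statistics

random.seed(42)

# Illustrative draws based on the summary statistics in Table 1,
# NOT the study's real data.
cg = [random.gauss(11.4, 3.87) for _ in range(90)]    # control group
ig = [random.gauss(14.3, 2.82) for _ in range(115)]   # intervention group

def bootstrap_mean_diff_ci(a, b, n_boot=5000, alpha=0.05):
    """Percentile bootstrap CI for the difference in means (b - a)."""
    diffs = []
    for _ in range(n_boot):
        ra = random.choices(a, k=len(a))  # resample with replacement
        rb = random.choices(b, k=len(b))
        diffs.append(statistics.fmean(rb) - statistics.fmean(ra))
    diffs.sort()
    lo = diffs[int((alpha / 2) * n_boot)]
    hi = diffs[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

lo, hi = bootstrap_mean_diff_ci(cg, ig)
# With a difference this large relative to its standard error,
# the 95% CI should exclude zero.
print(f"95% bootstrap CI for mean difference: [{lo:.2f}, {hi:.2f}]")
```

The percentile bootstrap makes no normality assumption about the score distributions, which is why this family of methods suits educational data with asymmetric distributions and unequal variances.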
From a practical perspective, the present findings suggest that higher education instructors may benefit from integrating structured digital formative resources that prompt repeated interaction with course content across time, particularly in conceptually demanding subjects. When positioned as low-stakes complements to formal teaching, such activities may help students engage in more regular review, strengthen retrieval of key concepts, and support more autonomous study routines. In this context, designs that do not immediately disclose the correct answer may be pedagogically useful when they are embedded within a broader formative framework, provide sufficient time for completion, and allow students access to supporting materials. Rather than being interpreted as a universally preferable strategy, this type of design may be considered a viable instructional option for promoting systematic content review and conceptual consolidation in higher education.
Despite the positive findings, the study presents several important limitations. First, the quasi-experimental design prevents strong causal inferences. Uncontrolled contextual factors, such as subtle differences in cohort characteristics, institutional conditions, or parallel academic experiences, may have influenced the results. Second, the evaluation focused exclusively on academic performance, which did not allow the analysis of potential effects on engagement, motivation, or long-term learning outcomes. Moreover, the generalization of the findings should be approached with caution. Comparative studies examining traditional and digital approaches across different disciplinary areas indicate that the effects of educational technologies are highly dependent on context, pedagogical design, and implementation quality (Novakovich et al., 2017).

5. Conclusions

The findings of this study indicate that the structured integration of an interactive digital pedagogical resource based on H5P may represent a meaningful instructional strategy in higher education, particularly when it is intentionally aligned with learning objectives and embedded within a coherent pedagogical framework. The absence of initial differences between groups strengthens the consistency of the comparison and supports the interpretation that the observed effect is plausibly associated with the pedagogical intervention. Importantly, the results should be interpreted within the specific context of a formative instructional design, in which the resource was used as a structured complement to traditional teaching rather than as a standalone or summative assessment tool. In this sense, the findings suggest that such designs may support more systematic engagement with course content and contribute to the progressive consolidation of conceptual knowledge over time.
Considering the limitations outlined above, future research should adopt more robust experimental designs, including randomized controlled trials where feasible, and incorporate process-oriented measures that allow a deeper understanding of the mechanisms underlying learning. Further studies should examine the role of instructional design features, such as feedback conditions, task structure, and levels of cognitive demand, as well as students’ perceptions, cognitive load, and engagement patterns within digital learning environments. Overall, the present study contributes to the growing body of research on digital learning in higher education by suggesting that the educational value of tools such as H5P lies not in the technology itself, but in the way they are pedagogically designed and integrated into the learning process.

Author Contributions

R.M.-B.: Conceptualization; Methodology; Formal analysis; Investigation; Data curation; Writing—original draft; Visualization; Project administration. A.C.: Methodology; Validation; Investigation; Resources; Writing—review and editing. V.P.: Validation; Investigation; Resources; Writing—review and editing. F.C.: Formal analysis; Investigation; Data curation; Writing—review and editing. A.N.: Formal analysis; Investigation; Data curation; Writing—review and editing. N.A.: Formal analysis; Investigation; Data curation; Writing—review and editing. P.F.: Formal analysis; Investigation; Data curation; Writing—review and editing. C.R.: Formal analysis; Investigation; Data curation; Writing—review and editing. I.R.: Formal analysis; Investigation; Data curation; Writing—review and editing. L.P.: Formal analysis; Investigation; Data curation; Writing—review and editing. R.M.: Conceptualization; Writing—review and editing; Validation; Funding acquisition. P.S.: Conceptualization; Methodology; Investigation; Resources; Supervision; Project administration; Writing—review and editing. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki. Ethical review and approval were waived for this study because the research was conducted in an educational context using anonymized academic data and did not involve sensitive personal information.

Informed Consent Statement

Participation was voluntary and students were informed about the use of anonymized data for research purposes.

Data Availability Statement

The data supporting the findings of this study are available from the corresponding author upon reasonable request. The data are not publicly available due to privacy restrictions. Detailed information regarding the assessment instruments and the H5P resource is available upon request from the authors.

Conflicts of Interest

The authors declare no conflicts of interest.

References

1. Adesope, O. O., Trevisan, D. A., & Sundararajan, N. (2017). Rethinking the use of tests: A meta-analysis of practice testing. Review of Educational Research, 87(3), 659–701. (Erratum in 2017, Review of Educational Research, 87(3), NP1).
2. Akpen, C. N., Asaolu, S., Atobatele, S., Okagbue, H., & Sampson, S. (2024). Impact of online learning on student’s performance and engagement: A systematic review. Discover Education, 3(1), 205.
3. Bacmeister, C. M., Barr, H. J., McClain, C. R., Thornton, M. A., Nettles, D., Welle, C. G., & Hughes, E. G. (2020). Motor learning promotes remyelination via new and surviving oligodendrocytes. Nature Neuroscience, 23(7), 819–831.
4. Biggs, J., & Tang, C. (2011). Teaching for quality learning at university. McGraw-Hill Education.
5. Butler, A. C., & Roediger, H. L. (2008). Feedback enhances the positive effects and reduces the negative effects of multiple-choice testing. Memory & Cognition, 36(3), 604–616.
6. Calhoun, A. W., Pian-Smith, M., Shah, A., Levine, A., Gaba, D., DeMaria, S., Goldberg, A., & Meyer, E. C. (2020). Guidelines for the responsible use of deception in simulation: Ethical and educational considerations. Simulation in Healthcare: Journal of the Society for Simulation in Healthcare, 15(4), 282–288.
7. Chernikova, O., Heitzmann, N., Stadler, M., Holzberger, D., Seidel, T., & Fischer, F. (2020). Simulation-based learning in higher education: A meta-analysis. Review of Educational Research, 90(4), 499–541.
8. Christiansen, N. D., Lovejoy, M. C., Szymanski, J., & Lang, A. (1996). Evaluating the structural validity of measures of hierarchical models: An illustrative example using the social problem-solving inventory. Educational and Psychological Measurement, 56(4), 600–625.
9. Cook, T. D., & Steiner, P. M. (2010). Case matching and the reduction of selection bias in quasi-experiments: The relative importance of pretest measures of outcome, of unreliable measurement, and of mode of data analysis. Psychological Methods, 15(1), 56–68.
10. Glass, G. V., Peckham, P. D., & Sanders, J. R. (1972). Consequences of failure to meet assumptions underlying the fixed effects analyses of variance and covariance. Review of Educational Research, 42(3), 237–288.
11. Goel, S. (2025). Earnings management is “theses management” in management educational research: A review of ethics for behavioural psychology. Acta Psychologica, 258, 105216.
12. Gonçalves, D., Cruz, J., Orvalho, L., Sousa, M., Torres, J., & Ribeiros, I. (2025). Pedagogia XXI: Boas práticas de inovação e de formação na educação superior. Escola Superior de Educação de Paula Frassinetti (ESEPF).
13. Greving, S., & Richter, T. (2022). Practicing retrieval in university teaching: Short-answer questions are beneficial, whereas multiple-choice questions are not. Journal of Cognitive Psychology, 34(5), 657–674.
14. Haladyna, T. M., Downing, S. M., & Rodriguez, M. C. (2002). A review of multiple-choice item-writing guidelines for classroom assessment. Applied Measurement in Education, 15(3), 309–334.
15. Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77(1), 81–112.
16. Hilton, J. (2016). Open educational resources and college textbook choices: A review of research on efficacy and perceptions. Educational Technology Research and Development, 64(4), 573–590.
17. Hopkins, W. G., Marshall, S. W., Batterham, A. M., & Hanin, J. (2009). Progressive statistics for studies in sports medicine and exercise science. Medicine & Science in Sports & Exercise, 41(1), 3–12.
18. Huff, T., & Tseng, D. C. Y. (2025). Active learning in Open Digital Textbooks: Designing for learner engagement and motivation using H5P. TechTrends, 70(1), 240–252.
19. Jacob, T., & Centofanti, S. (2024). Effectiveness of H5P in improving student learning outcomes in an online tertiary education setting. Journal of Computing in Higher Education, 36(2), 469–485.
20. Ki, Y. (2025). Enhancing student engagement and learning outcomes in higher education using H5P interactive learning tools: A systematic literature review. International Journal of Modern Education, 7, 969–990.
21. Krakauer, J. W., Hadjiosif, A. M., Xu, J., Wong, A. L., & Haith, A. M. (2019). Motor learning. Comprehensive Physiology, 9(2), 613–663.
22. Krathwohl, D. R. (2002). A revision of Bloom’s taxonomy: An overview. Theory Into Practice, 41(4), 212–218.
23. Little, J. L., Bjork, E. L., Bjork, R. A., & Angello, G. (2012). Multiple-choice tests exonerated, at least of some charges: Fostering test-induced learning and avoiding test-induced forgetting. Psychological Science, 23(11), 1337–1344.
24. Luo, R.-Z., & Zhou, Y.-L. (2024). The effectiveness of self-regulated learning strategies in higher education blended learning: A five years systematic review. Journal of Computer Assisted Learning, 40(6), 3005–3029.
25. Martín-Alguacil, N., Mota-Blanco, R., Avedillo, L., Marañón-Almendros, M., & Gallego-Agundez, M. (2025). Timing, tools, and thinking: H5P-driven engagement in flipped veterinary education. Veterinary Sciences, 12(10), 1013.
26. Mitra, N. K., & Barua, A. (2015). Effect of online formative assessment on summative performance in integrated musculoskeletal system module. BMC Medical Education, 15(1), 29.
27. Mutawa, A. M., Al Muttawa, J. A. K., & Sruthi, S. (2023). The effectiveness of using H5P for undergraduate students in the asynchronous distance learning environment. Applied Sciences, 13(8), 4983.
28. Novakovich, J., Miah, S., & Shaw, S. (2017). Designing curriculum to shape professional social media skills and identity in virtual communities of practice. Computers & Education, 104, 65–90.
29. Panadero, E. (2017). A review of self-regulated learning: Six models and four directions for research. Frontiers in Psychology, 8, 422.
30. Ramos-Azcuy, F. J., Rodríguez-Gámez, M., Benavides-Bailón, J. M., Bonilla-Jiménez, M. M., & Arroba-Cárdenas, Á. E. (2025). Igniting student engagement: H5P’s transformative potential in higher education. RIED-Revista Iberoamericana de Educación a Distancia, 28(2), 379–400.
31. Regueras, L. M., Verdú, M. J., de Castro, J. P., & Álvarez-Álvarez, S. (2025). Techno-pedagogical approaches and academic performance: A quantitative study based on LMS log data. Education Sciences, 15(11), 1533.
32. Roediger, H. L., & Karpicke, J. D. (2006). Test-enhanced learning: Taking memory tests improves long-term retention. Psychological Science, 17(3), 249–255.
33. Wiley, D., Bliss, T. J., & McEwen, M. (2014). Open educational resources: A review of the literature. In J. M. Spector, M. D. Merrill, J. Elen, & M. J. Bishop (Eds.), Handbook of research on educational communications and technology (pp. 781–789). Springer.
34. Windle, S. B., Harper, S., Arneja, J., Socha, P., & Nandi, A. (2025). Systematic reviews of quasi-experimental studies: Challenges and considerations. Journal of Clinical Epidemiology, 191, 112121.
35. World Medical Association. (2013). World Medical Association Declaration of Helsinki: Ethical principles for medical research involving human subjects. JAMA, 310(20), 2191–2194.
36. Wrzesińska, M. A., Rakoczy, J., Binder-Olibrowska, K. W., Kostyła, M., Allott, V. E., Harris, S. R., Walsh, J. L., Handa, A., & Harris, B. H. (2025). Comparing structured note-taking and multiple-choice question writing as learning strategies among physiotherapy students. BMC Medical Education, 25(1), 1665.
37. Yang, G., Kim, J., Young, D., Chen, X. H., Li, N., & Purwanto, E. (2025). Conceptual model to examine students’ use of H5P technology in online learning environments: The integration of the unified theory of acceptance and use of technology model and perceived pedagogical value model. European Journal of Education, 60(4), e70286.
38. Zimmerman, B. J. (2002). Becoming a self-regulated learner: An overview. Theory Into Practice, 41(2), 64–70.
Table 1. Descriptive statistics and between-group comparisons for academic performance at baseline and post-intervention.

              CG             IG             Mean Diff.   p        ξ      d
Diagnostic    5.57 ± 1.25    5.51 ± 1.70    0.06         0.462    0.07   0.04
Test          11.4 ± 3.87    14.3 ± 2.82    2.9          <0.001   0.58   0.87

Values are mean ± SD. CG = control group; IG = intervention group; ξ = robust effect size; d = Cohen’s d.
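The large effect size in Table 1 can be reproduced from the reported summary statistics alone. The short check below assumes the standard pooled-SD Cohen’s d for two independent groups, using the group sizes from the Abstract (CG n = 90, IG n = 115); it is an arithmetic illustration, not the study’s analysis script.

```python
import math

# Post-intervention summary statistics reported in Table 1
mean_cg, sd_cg, n_cg = 11.4, 3.87, 90    # control group
mean_ig, sd_ig, n_ig = 14.3, 2.82, 115   # intervention group

# Pooled standard deviation for two independent groups
sp = math.sqrt(((n_cg - 1) * sd_cg**2 + (n_ig - 1) * sd_ig**2)
               / (n_cg + n_ig - 2))
d = (mean_ig - mean_cg) / sp

print(f"Cohen's d = {d:.2f}")  # ~0.87, consistent with Table 1
```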
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Montoro-Bombú, R.; Costa, A.; Pinheiro, V.; Coelhoso, F.; Nascimento, A.; Abranja, N.; Farinho, P.; Rosa, C.; Ribeiros, I.; Picado, L.; et al. Enhancing Academic Performance in Motor Control: A Structured H5P-Based Multiple-Choice Intervention in Higher Education. Educ. Sci. 2026, 16, 619. https://doi.org/10.3390/educsci16040619
