Trends in Higher Education
  • Article
  • Open Access

19 January 2026

A Case Study on Formative Assessment in Physical Education Teacher Training in Uruguay

1 Departamento de Educación, Universidad de los Lagos, Osorno 5290000, Chile
2 Facultad de Educación y Ciencias Sociales, Universidad Andrés Bello, Santiago 6513491, Chile
3 Instituto Superior de Educación Física, Universidad de la República, Montevideo 11200, Uruguay
4 Departamento de Ciencias de la Actividad Física, Universidad de los Lagos, Osorno 5311157, Chile

Abstract

Several authors emphasize that assessment is a key tool for teachers to guide and verify learning, improve their practice, and contribute to deeper student learning. Beyond its technical function, assessment enables the creation of a meaningful pedagogical relationship with the central actor of the educational process, “the student”. This study aimed to understand how students value a system guided by the principles of formative assessment and its impact on the self-perception of acquired competencies. The “Questionnaire on the Experience of Good Practice” and the “Scale for the Self-Perception of Student Competencies” were applied to a sample of 74 students (mean age 26.4 ± 4.5 years) from a public university in Uruguay. The results show that the assessment system was positively rated in terms of usefulness, innovation, and replicability, although limitations were observed in terms of sustainability and fairness in grading. In addition, a significant decrease was observed in the self-perception of technical competencies and an increase in those related to pedagogical reflection and attention to diversity, suggesting a more critical and realistic view on the part of the students of their own professional performance.

1. Introduction

1.1. The Initial Teacher Training and Assessment

Learning assessment holds a strategic position in current pedagogical practice, as it not only serves a certifying function but, more importantly, fulfills a pedagogical role in guiding and regulating the teaching–learning (T–L) process. It contributes to informed decision-making, although its effectiveness depends on task design, the quality of feedback, and the opportunities it provides to redirect student activity [1]. The evidence suggests that initial teacher education should aim to develop in preservice teachers the knowledge, skills, and attitudes that make them capable of designing, implementing, and interpreting valid, reliable, and ethically responsible assessment processes. To this end, preservice teachers must acquire the competencies that will allow them, in their future teaching practice, to use evidence to provide feedback and adjust T–L processes [2]. Therefore, a shift in the focus of assessment processes, one that goes beyond the merely certifying role and moves toward diversified processes (self-assessment, peer assessment, and hetero-assessment), is essential; such processes are also known to foster metacognition and promote self-regulation, provided that explicit criteria and opportunities for review and improvement are offered [3].
Despite the evidence, there is still a gap between the discourse on assessment and actual practices in initial teacher training (ITT): feedback often arrives late or is of low quality, and the instruments used frequently rely on short-answer tasks with poorly defined criteria [1]. In this regard, competency-based training in the Latin American university context has been consolidated as a framework that integrates knowledge, skills, and values, aiming for their appropriation in situated contexts, with a focus on transferability. However, for this framework to succeed, key attributes such as clear graduate profiles and authentic assessment with explicit criteria are necessary [4]; these tend to be difficult to achieve in practice owing to inconsistency in competency mapping, weak articulation of evidence in assessment instruments, and the tendency to maintain traditional summative practices [5].
Studies analyzing the scope of competency-based curricula identify multiple perspectives (technical, critical, outcome-oriented, humanistic, hybrid). This highlights the need to clarify the purposes and epistemological assumptions behind each curriculum design, especially considering the employability focus each curriculum aims to address [6]. When discussing competency-based designs in ITT, therefore, the critical core lies in aligning observable performances with authentic assessments and opportunities for improvement, which for a preservice teacher is an enabling condition for translating this approach into daily future practice [7]. The competency-based shift thus faces the dual challenge of avoiding instrumentalization and standardization (assessment practices centered on checklists) while moving toward situated pedagogical autonomy (with formative assessment that provides opportunities for improvement during the course, not just at the end) [8].

1.2. The Uruguayan Higher Education System

Initial Physical Education training in Uruguay requires deeper consideration of assessment as a formative process within higher education, shaping professional competence and reflective practice [9,10].
The higher education system in Uruguay presents particular structural features within the region, characterized by a strong historical presence of public universities, the provision of free education, and open access without selective admission processes [11]. In terms of quality assurance, the Uruguayan university system has followed a trajectory marked by discontinuous progress in the evaluation and accreditation of its institutions, with a significant delay in the adoption of quality assurance mechanisms compared to other MERCOSUR countries [12]. Undoubtedly, the absence of regulatory frameworks for quality assurance affects governance and the way institutions manage the curriculum [13]. In the specific case of the University of the Republic (UdelaR), characteristics such as free tuition and tradition result in a very high student population, which has a significant impact on course organization and the evaluations that are implemented [14]. In this context, the ideals of individualized formative assessment and timely feedback are challenged, as group size limits the ability to provide detailed feedback and observe performance in a personalized manner. The literature shows that as cohort sizes grow, universities tend to standardize instruments, reducing the qualitative contribution of assessment judgment [15].

1.3. Initial Teacher Training in Physical Education (ITEPE) in Uruguay

Regarding ITEPE in Uruguay, reports mention the same historical tensions observed in other contexts concerning curricular architecture (mainly traditional approaches are observed) [16]. It is precisely here that assessment is positioned as a tool that can help promote social justice approaches, and where peer assessment, when designed as formative feedback with the possibility of product revision, can also enhance evaluative judgment [17]. Likewise, recent studies show that effective peer feedback designs improve performance and self-regulation [18]. The same occurs when supervised assessment is incorporated to accompany and provide feedback on practicum processes in real Physical Education teaching contexts [19]. In this sense, ITEPE in Uruguay is called to implement a competency-based curriculum with authentic, situated, and individualized tasks, placing students at the center of the process. For this reason, the incorporation of shared criteria and situated tasks is essential [20,21].
Therefore, the incorporation of assessment initiatives with a formative approach has been widely recommended in contexts such as Spain [9]. Likewise, successful experiences can be found following its implementation in Latin American countries such as Chile [22,23], Colombia [24,25], and Brazil [26], among others. Based on all the above, the objective of this study was as follows:
To analyze the self-perception of the competencies acquired and the evaluation made by students in initial teacher education in Physical Education (ITEPE) regarding a formative assessment system implemented in a course of their training.

2. Materials and Methods

This study adheres to a quantitative approach with a descriptive scope. It is non-experimental in nature, meaning that the variables were not manipulated; rather, they were observed at a single point in time in order to report the perceptions of the students who participated in the experience [27]. The sample consisted of 74 students with a mean age of 26.4 ± 4.5 years, who took part in the experience as part of the course “Team Sports III”, corresponding to the sixth semester of the core curriculum of the ITEPE at a public Uruguayan university during the second academic semester of 2023.
The “Scale for Student Self-Perception of Competence” was applied before and after the assessment experience. To respond to the items, the instrument used a four-point Likert scale (where 1 = None; 2 = Little; 3 = Quite a Bit; and 4 = A Lot). The scale had been previously validated [28]. For the analysis of data between the pre-test and post-test, inferential statistics were used, specifically the Mann–Whitney U test, with a significance level of p ≤ 0.05. Additionally, at the end of the semester, to understand the students’ evaluation of the assessment system used, the “Questionnaire on the Good Practice Assessment Experience” was administered. This instrument used a five-point Likert scale to record responses (0 = None; 1 = Little; 2 = Somewhat; 3 = Quite a Bit; 4 = A Lot), and it had also been validated [29]. The data collection protocol had been previously approved by the Scientific Ethics Committee of the Universidad de Los Lagos, Chile (CEC-Ulagos). Descriptive statistics were used to analyze the data, calculating the arithmetic mean (M) and standard deviation (SD) for each variable in the “Questionnaire on the Good Practice Assessment Experience”.
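For readers who wish to reproduce this type of analysis, the descriptive and inferential procedures described above can be sketched in Python with NumPy and SciPy. The data below are randomly generated placeholders standing in for one Likert item (the study's data are available only on request from the corresponding author), and all variable names are illustrative rather than taken from the authors' analysis.

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Hypothetical pre-test/post-test ratings on the 1-4 Likert scale for one
# competence item, for the reported sample of n = 74 students.
rng = np.random.default_rng(0)
pre = rng.integers(1, 5, size=74)
post = rng.integers(1, 5, size=74)

# Descriptive statistics as reported in the paper: mean (M) and sample SD.
print(f"pre:  M = {pre.mean():.2f}, SD = {pre.std(ddof=1):.2f}")
print(f"post: M = {post.mean():.2f}, SD = {post.std(ddof=1):.2f}")

# Mann-Whitney U test with significance level p <= 0.05, as in the Methods.
u_stat, p_value = mannwhitneyu(pre, post, alternative="two-sided")
significant = p_value <= 0.05
print(f"U = {u_stat:.1f}, p = {p_value:.3f}, significant: {significant}")
```

This sketch applies the test item by item; in a full analysis it would be repeated over every item of the scale, one comparison per competence.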
The experience was carried out within the course Team Sports III. The core contents of the subject were directly linked to the formative assessment system implemented. Students were required to justify their decisions, provide peer feedback, and apply explicit criteria in their analyses. This alignment between content and assessment fostered reflective judgment and promoted deeper, more meaningful learning.
An assessment system was designed in accordance with the principles of formative assessment, based on the multiple experiences that highlight the positive effects of such systems on T–L processes [30]. The designed assessment system aimed to be embedded in the T–L process, seeking to strengthen and guide student learning while respecting the seven procedural principles for implementing formative assessment systems [30]. Table 1 shows the relationships between the assessment activities carried out throughout the semester. Considering that the experience was implemented in Uruguay, the grading scale ranged from 0 to 12, with a passing grade beginning at 5.
Table 1. Learning activities during the assessment experience.

3. Results

Table 2 shows the level of agreement with a series of statements regarding the assessment system.
Table 2. Means and SD. Factor “Advantages of the Assessment System”.
The results of Table 2 reveal students’ perceptions of the advantages attributed to the assessment system implemented in the course. In general, the results show a positive evaluation, ranging from “somewhat” to “a lot”, although with notable differences among the items. The highest level of agreement is observed for the statement “Do you consider what you learned from this experience useful?”, which suggests that the assessment system was able to generate meaning and a sense of applicability. On the other hand, the lowest evaluation corresponds to the initial negotiation of the experience within the course framework, related to the statement “Was the use of this experience negotiated in the course at the beginning of the semester?”, which reflects the limited participation of students in defining the assessment processes.
Table 3 presents the level of agreement regarding the usefulness of the evaluative experience.
Table 3. Usefulness of the assessment experience.
Table 3 presents a series of statements related to the learning outcomes attributed to the experience, considering four attributes: innovation, effectiveness, sustainability, and replicability. In general terms, the results show moderately positive evaluations, with means ranging from 2.75 to 3.08, which allows us to affirm that the experience is recognized as valuable, although there are differences in how each item is rated. Table 4 displays students’ perceptions regarding the support received and their satisfaction with the assessment system.
Table 4. Support received and satisfaction with the assessment system.
The data in Table 4 reveal students’ perceptions regarding the support received during the experience and their level of satisfaction with the assessment system implemented. In general terms, a medium rating is observed, with the highest value corresponding to the statement “support received from peers”, which may be linked to collaboration dynamics, teamwork, or peer assessment instances. Meanwhile, “support provided by the teacher” also received a medium rating, but it was the item with the lowest perceived value. Table 5 presents the results for items related to the advantages and possible disadvantages of the assessment system.
Table 5. Means and SD. “Advantages and Possible Drawbacks of the Assessment System”.
Regarding the statements related to the advantages of the assessment system presented in Table 5, it can be seen that three statements stand out for receiving high ratings. These are linked to the perception that the assessment experience enables active learning, collaborative work, and theory–practice interrelation. The statement with the lowest rating was “more individualized follow-up is provided”, which shows that, despite the pedagogical benefits of the system, students do not perceive significant improvements in terms of equity or personalized attention.
Regarding the statements linked to the possible disadvantages of the assessment system, the highest ratings from students correspond to the mandatory active attendance and the demand for continuity implied by the assessment system. The lowest-rated items were statements such as “there is difficulty working in groups” and “the working dynamic is unfamiliar, for lack of habit”.
In summary, Table 5 shows how students value the assessment system for its ability to promote collaborative and meaningful learning, although they express concerns about equity, the clarity of grading, and the associated workload. Table 6 presents the results of the pre-test vs. post-test related to students’ self-perception of the acquisition of transversal, general teaching, and specific Physical Education teaching competencies.
Table 6. Self-perception of transversal, general, and specific competencies in Physical Education.
The results from Table 6 reveal significant variations in the self-perception of competencies following the assessment experience. Regarding transversal competencies, a significant decrease is observed in most items, particularly in analyzing and synthesizing, organizing and planning, use of information technologies, and communication in a foreign language. Interpersonal and personal skills also show a decline, whereas teamwork and creativity remain stable, with no significant differences. In terms of general teaching competencies, improvements are noted in developing change proposals and in strategies for addressing diversity, suggesting increased pedagogical awareness related to inclusion. However, scores decrease for competencies related to communication with families and participation in school life. Other items, such as designing learning situations and educational innovation, show no significant changes.
Finally, for specific physical education competencies, there are notable decreases in using play as a didactic resource, school sports initiation, healthy living, and outdoor activities, along with a slight reduction in biological and physiological foundations. Overall, the results reflect a more critical and realistic self-perception by students regarding their competencies, with lower ratings in technical and instrumental dimensions, but improvements in pedagogical aspects and attention to diversity, consistent with the principles of formative assessment.

4. Discussion

The results obtained show that the evaluations from the students who participated in the assessment process reveal a positive stance regarding the advantages of the assessment system, indicating that it contributed to the acquisition of competencies and had a positive impact on their learning. This aligns with the international literature on formative assessment in ITEPE [31,32]. However, students perceived fewer opportunities for negotiation, which remains a persistent tension in the literature [33]. Studies have shown that the effectiveness of formative assessment depends not only on the instruments used but also on student participation in defining the assessment criteria [17,19]. Additionally, the incorporation of peer assessment and feedback promotes autonomy and critical thinking in future teachers [34,35]. Therefore, incorporating shared assessment negotiation spaces remains a pending task. This is especially relevant considering that when assessment practices are perceived as minimally negotiated or more technical than dialogical, there is a risk that students may develop a reductionist conception of assessment, focused more on grading than on learning improvement [1].
The findings related to the usefulness of the assessment system, based on students’ perceptions, reveal that they believe that the assessment experience holds potential for transfer and applicability in other contexts. Studies highlight that good assessment practices are especially valuable when they are replicable [36]. Similarly, the perceptions of effectiveness and innovation align with research emphasizing that assessment systems integrating authentic tasks and formative feedback are recognized by students as learning experiences with tangible and creative impacts [37,38]. However, the lowest rating was given to sustainability, which may hinder the consolidation of the learning outcomes generated by the experience. In this regard, the literature warns that the sustainability of assessment learning heavily depends on institutional assessment coherence. Therefore, if innovative and effective practices remain isolated to one-off activities, their impact may fade over time [39,40].
In relation to students’ expectations and the role played by the teaching staff in the assessment experience, a low rating was observed regarding the support received from the teacher, something that, due to the specific characteristics of the Uruguayan university context, has also been noted in other studies [10,41]. However, during the learning activities, the experience was planned in advance with a strong presence of group work, explicitly including peer feedback situations. This is reflected in a higher perception of peer support, suggesting that the collaborative dimension gained greater prominence in the construction of learning.
Recent studies confirm that quality teacher feedback is a key predictor of satisfaction and self-regulation in higher education [42]. However, when such feedback cannot be provided (e.g., due to teacher overload), it tends to be compensated by horizontal collaboration strategies among students [43]. In the field of ITEPE, peer assessment has become an effective tool for fostering autonomy and evaluative judgment, as long as coherence and validity in the criteria are ensured [17,44].
Regarding the advantages and possible drawbacks of the assessment system, a paradox emerges, one that is common in the literature on learning assessment in higher education. While the pedagogical richness of collaborative, competency-based approaches is acknowledged, perceptions of unfairness and subjectivity in grading persist [45,46,47]. In ITEPE, international research has shown that combining situated learning experiences with shared assessment fosters motivation and ownership of the learning process [48]. However, its success depends on the transparency of the criteria and consistent teacher support [49]. This is reflected in the present experience, as its design focused on the incorporation of collaborative processes rather than a teacher-centered approach. The high score on mandatory attendance and continuity confirms that formative assessment systems demand greater commitment and effort from students, which may be perceived as a burden [50]. However, numerous studies suggest that this perceived workload is compensated by deeper learning and is, in fact, recognized as part of the self-directed hours built into the course structure [51].
The results show a significant decrease in the self-perception of some competencies following the assessment experience, suggesting that students developed a more critical and realistic view of their own professional performance. This phenomenon has been described by authors who argue that formative and shared assessment promotes processes of self-regulation and reflection, which can lead to initial perceptions of lower mastery by increasing awareness of the complexity of teaching competencies [33,52]. This reinterpretation of one’s own capabilities constitutes a positive indicator of professional maturity and the internalization of formative principles, something even promoted by ministerial regulations such as Chile’s “Decree 67°”.
The increase in the value placed on competencies related to attention to diversity reinforces the importance of inclusive education. Accordingly, authors emphasize that it is through reflective teaching that equity and pedagogical adaptation can be integrated as central pillars of teaching performance [53,54]. Likewise, the observed decrease in the self-perception of competencies related to technical dimensions aligns with the arguments of Backman and Barker (2020) [55], who point out that university-level Physical Education tends to overvalue motor skills over pedagogical ones, which partly explains this critical stance. The results reaffirm that participation in authentic assessment processes helps to consolidate meaningful learning by promoting the integration of reflection, practice, and ethical commitment [56,57].

5. Conclusions

In conclusion, this study shows that the implemented assessment system was positively valued by students in aspects such as usefulness, innovation, and its potential for replication in other contexts. These evaluations reinforce the idea that assessment practices aligned with T–L processes and offering possibilities of transfer beyond the specific course represent a relevant path to improve the quality of formative processes. The importance students placed on peer collaboration is also highlighted; this finding suggests that, when adequately promoted, horizontal dynamics become a central source of support and knowledge construction, contributing to the consolidation of learning.
Nevertheless, the results also highlight certain limitations of the implemented assessment system, which are related to a perceived low level of student participation in the negotiation of criteria. This reveals that, despite progress toward formative approaches, a gap still exists in the incorporation of students in evaluative decision-making processes. In relation to competence acquisition, the results indicate that students developed a more critical and realistic self-perception of their professional abilities, showing progress in reflective, pedagogical, and inclusive dimensions, while reassessing technical aspects. This suggests that formative assessment experiences promote a deeper understanding and self-awareness of teaching competencies in ITEPE contexts.
In summary, the study reveals the benefits of moving toward collaborative, innovative, and replicable practices, but it also highlights limitations related to the need for greater student participation in the definition of criteria, as well as the need to complement peer assessment and feedback processes with more interaction from the teaching staff. As guidelines for future assessment experiences, (a) at the practical level, it is necessary to strengthen the participatory component of assessment by creating opportunities for negotiating criteria, ensuring transparency and clarity in grading procedures through consensual rubrics and continuous feedback processes; (b) at the theoretical level, this study confirms that assessment in ITEPE cannot be reduced to a technical dimension, but must instead be understood as a pedagogical space where conceptions of justice, equity, and educational democracy are at stake.
It is important to clarify that, at the time of the experience, participants did not have prior or ongoing engagement in authentic teaching practice, nor previous experience with formative assessment systems of this nature. Therefore, the findings of this study should be interpreted in terms of students’ self-perceived development of professional competencies, as assessed through a validated questionnaire, rather than as evidence of demonstrated teaching performance in real school contexts. This limitation is acknowledged, as self-perceptions may reflect theoretical understanding and familiarity with coursework rather than fully consolidated professional competence. Nevertheless, analyzing self-perception constitutes a relevant first step in understanding how future teachers interpret and value formative assessment processes within their initial training.

Author Contributions

All authors contributed to the design, writing, and final review of the study. F.G.-F. and M.C.-F. were responsible for the design of the initial draft. C.M.-A. and B.C.-T. contributed to the methodological design and data analysis. J.G.-F. provided the first comprehensive review and preparation of the original draft. Finally, all authors contributed to a round of review and editing of the final version of the manuscript. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Agencia Nacional de Investigación y Desarrollo, Programa Fondecyt N° 1230609 and N° 1240883.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and the published data were collected as part of the project “ANID, FONDECYT REGULAR 2023 (1230609),” titled “When school regulations change, should Teacher Training also change? The impact of the new decree on assessment, grading, and school promotion in the preparation of future Physical Education teachers.” This project was approved under Code “H009/2023,” issued by the Scientific Ethics Committee of the University of Los Lagos (CEC-Ulagos) on 28 March 2023.

Data Availability Statement

The data from this study are available upon request from the corresponding author in a version that maintains the anonymity of the participants.

Acknowledgments

This article is linked to the project “ANID, FONDECYT REGULAR 2023 (1230609)”, titled “When school regulations change, should Teacher Training also change? The impact of the new decree on assessment, grading, and school promotion in the preparation of future Physical Education teachers”.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Morris, R.; Perry, T.; Wardle, L. Formative assessment and feedback for learning in higher education: A systematic review. Rev. Educ. 2021, 9, e3292. [Google Scholar] [CrossRef]
  2. Pastore, S. Teacher assessment literacy: A systematic review. Front. Educ. 2023, 8, 1217167. [Google Scholar] [CrossRef]
  3. Gao, X.; Noroozi, O.; Gulikers, J.; Biemans, H.; Banihashem, S. A systematic review of the key components of online peer feedback practices in higher education. Educ. Res. Rev. 2024, 42, 100588. [Google Scholar] [CrossRef]
  4. Brauer, S. Towards competence-oriented higher education: A systematic literature review of the different perspectives on successful exit profiles. Educ. Train. 2021, 63, 1376–1390. [Google Scholar] [CrossRef]
  5. Torres-Miranda, T. Particularities of the curriculum of university formation in the context of Latin America. Alternativas 2023, 22, 22–26. [Google Scholar] [CrossRef]
  6. Tahirsylaj, A.; Sundberg, D. Five visions of competence-based education and curricula as travelling policies: A systematic research review 1997–2022. J. Curric. Stud. 2025, 1–26. [Google Scholar] [CrossRef]
  7. Perdomo, J.; Perdomo, T.; Perdomo, T. Implementation of the educational model based on competencies and the challenges of the evaluative stage. Rev. Acciones Médicas 2022, 1, 66–76. [Google Scholar] [CrossRef]
  8. Vargas, H.; Arredondo, E.; Heradio, R.; Torre, L. Standardizing course assessment in competency-based higher education: An experience report. Front. Educ. 2025, 10, 1579124. [Google Scholar] [CrossRef]
  9. López-Pastor, V.; Pérez-Pueyo, Á. Formative and Shared Assessment in Education: Successful Experiences at All Educational Stages; Universidad de León: León, Spain, 2017. [Google Scholar]
  10. Sarni-Muñiz, M.; Corbo-Bruno, J.L. La evaluación estandarizada y sus efectos en la educación en Uruguay. Estud. Em Avaliação Educ. 2024, 35, e10946. [Google Scholar] [CrossRef]
  11. Romero, C. A university system without restrictions of access: The case of Uruguay. Rev. Iberoam. Evaluación Educ. 2010, 3, 76–89. [Google Scholar]
  12. Martínez-Larrechea, E.; Chiancone, A. Higher education quality assurance in Uruguay: Balance and perspective of a delayed policy. Qual. Assur. Educ. 2022, 30, 304–318. [Google Scholar] [CrossRef]
  13. Moulton, A.; Mcnicoll, Y.; Luff, A. Governing Education, Educating the Governors. In Collaboration, Communities and Competition; Dent, S., Lane, L., Strike, T., Eds.; SensePublishers: Rotterdam, The Netherlands, 2017; pp. 73–89. [Google Scholar] [CrossRef]
  14. Failache, E.; Fiori, N.; Katzkowicz, N.; Machado, A.; Méndez, L. Impact of COVID-19 on higher education for a developing country: Evidence from Uruguay. Int. J. Educ. Dev. 2025, 117, 103374. [Google Scholar] [CrossRef]
  15. Sánchez-Mendiola, M.; Manzano-Patiño, A.; García-Minjares, M.; Casanova, E.; Herrera Penilla, C.; Goytia-Rodríguez, K.; Martínez-González, A. Large-scale diagnostic assessment in first-year university students: Pre and transpandemic comparison. Educ. Assess. Eval. Account. 2023, 35, 503–523. [Google Scholar] [CrossRef]
  16. Dogliotti, P.; Páez, S. Physical education in the educational policies of the last decades in Uruguay: Continuities and ruptures between the progressive and the conservative. Educ. Policy Anal. Arch. 2024, 32, 1–23. [Google Scholar] [CrossRef]
  17. Backman, E.; Quennerstedt, M.; Tolgfors, B.; Nyberg, G. Peer assessment in physical education teacher education—A complex process making social and physical capital visible. Curric. Stud. Health Phys. Educ. 2024, 15, 274–288. [Google Scholar] [CrossRef]
  18. Kerman, N.; Banihashem, S.; Karami, M.; Er, E.; Van Ginkel, S.; Noroozi, O. Online peer feedback in higher education: A synthesis of the literature. Educ. Inf. Technol. 2024, 29, 763–813. [Google Scholar] [CrossRef]
  19. Pardo, R.; García-Pérez, D.; Panadero, E. Shaping the assessors of tomorrow: How practicum experiences develop assessment literacy in secondary education pre-service teachers. Teach. Teach. Educ. 2024, 152, 104798. [Google Scholar] [CrossRef]
  20. Fuentes-Merino, P.; Valenzuela-Rettig, P.; Canuiqueo-Vargas, A. Type of assessment’s objective and perceived impact on academic performance in Physical Education Teaching program students under a competency-based curriculum. Retos 2022, 46, 739–744. [Google Scholar] [CrossRef]
  21. Grilli-Silva, J.; Dalmas, D.; Prado, A. Evaluation of Competencies in the Initial Training of Science Teachers in Uruguay: A Globalizing Interdisciplinary End-of-Course Experience. Rev. Andin. Educ. 2024, 8, 000814. [Google Scholar] [CrossRef]
  22. Gallardo-Fuentes, F. Effects of the Use of Formative Assessment Processes on Physical Education Teacher Education Students at the University of Los Lagos (Chile). Doctoral Dissertation, Universidad de Valladolid, Valladolid, Spain, 2018. [Google Scholar] [CrossRef]
  23. Gallardo-Fuentes, F.; Carter-Thuillier, B.; López-Pastor, V. Formative and Shared Assessment in the Initial Training of Physical Education Teachers: Results after Four Years of Implementation in a Chilean Public University. Rev. Iberoam. Evaluación Educ. 2019, 12, 139–155. [Google Scholar]
  24. Giraldo-Ruiz, C. Experience of formative and shared assessment in the area of physical education in school transition grade. Rev. Infanc. Educ. Aprendiz. 2024, 10, 24–30. [Google Scholar] [CrossRef]
  25. Morales, Y. Formative and shared assessment for the development of investigative competences in university students. Educere 2019, 23, 499–508. [Google Scholar]
  26. Santos, W.; Vieira, A.; Stieg, R.; Mathias, B.; Cassani, J. Práticas avaliativas de professores de educação física: Inventariando possibilidades. J. Phys. Educ. 2018, 30, 3005. [Google Scholar] [CrossRef]
  27. Nwabuko, O. An Overview of Research Study Designs in Quantitative Research Methodology. Am. J. Med. Clin. Res. Rev. 2024, 3, 1–6. [Google Scholar] [CrossRef]
  28. Salcines-Talledo, I.; González-Fernández, N.; Ramírez-García, A.; Martínez-Mínguez, L. Validation of Self-Perception Scale of Transversal and Professional Competences of Higher Education Students. Profr. Rev. Curric. Form. Profr. 2018, 22, 31–51. [Google Scholar]
  29. Castejón-Oliva, F.; Santos-Pastor, M.; Palacios-Picos, A. Questionnaire on methodology and assessment in physical education initial training. Rev. Int. Med. Cienc. Act. Física Deporte 2015, 15, 245–267. [Google Scholar]
  30. López-Pastor, V. Formative and shared assessment in higher education. Clarifying concepts and proposing interventions from the Formative and Shared Assessment Network. Psychol. Soc. Educ. 2012, 4, 117–130. [Google Scholar]
  31. Gallardo-Fuentes, F.; Carter-Thuillier, B.; Peña-Troncoso, S.; Martínez-Angulo, C.; López-Pastor, V. Critically analyzing the incorporation of the current regulations on “evaluation, grading and school promotion” in the initial training of physical education teachers in Chile. Interciencia 2023, 48, 544–551. [Google Scholar]
  32. López-Pastor, V. Developing Formative and Shared Assessment Systems in University Teaching: Analysis of the Results of Their Implementation in Initial Teacher Education. Eur. J. Teach. Educ. 2008, 31, 293–311. [Google Scholar] [CrossRef]
  33. Molina-Soria, M.; Pascual-Arias, C.; Hortigüela-Alcalá, D.; Fernández-Garcimartín, C. Development Analysis of Students’ Perception of Their Learning in Shared Assessment Systems. Rev. Iberoam. Evaluación Educ. 2022, 15, 43–60. [Google Scholar] [CrossRef]
  34. Lynch, R.; McNamara, P.; Seery, N. Promoting deep learning in a teacher education programme through self- and peer-assessment and feedback. Eur. J. Teach. Educ. 2012, 35, 179–197. [Google Scholar] [CrossRef]
  35. Ortega-Ruipérez, B.; Pereles-López, A.; Lázaro, M. Impact of a Digital Tool to Improve Metacognitive Strategies for Self-Regulation During Text Reading in Online Teacher Education. J. Inf. Technol. Educ. Innov. Pract. 2024, 23, 007. [Google Scholar] [CrossRef] [PubMed]
  36. Stein, S.; Goodchild, A.; Moskal, A.; Terry, S.; McDonald, J. Student perceptions of student evaluations: Enabling student voice and meaningful engagement. Assess. Eval. High. Educ. 2021, 46, 837–851. [Google Scholar] [CrossRef]
  37. Guadamud-Muñoz, J.; Chiriboga-Palacios, I.; Zumba-Juela, J.; Briceño-Salazar, R.; Jiménez-Vargas, J.; Palma-Candelario, Á. Innovations and trends in educational evaluation systems. LATAM Rev. Latinoam. Cienc. Soc. Humanidades 2024, 5, 1724–1733. [Google Scholar] [CrossRef]
  38. Putri, V.; Palupi, Y.; Laili, Y.; Pradana, D. The Role of Formative Feedback in Curriculum Materials: Improving Learning Outcomes Through Continuous Assessment. J. Technol. Educ. Teach. (J-TECH) 2025, 1, 102–107. [Google Scholar] [CrossRef]
  39. O’Connell, B.; Stupans, I.; Jollands, M. A new sustainable change theoretical framework for the professional disciplines. High. Educ. Q. 2023, 77, 311–326. [Google Scholar] [CrossRef]
  40. Pereira-Silva, E. Avaliação sustentável no processo de ensino-aprendizagem no ensino superior. Estud. Em Avaliação Educ. 2024, 35, e10026. [Google Scholar] [CrossRef]
  41. Capuñay-Uceda, O.; Zuñe-Chero, L. Perception and evaluative practices of university professors. Rev. Univ. Zulia 2022, 13, 747–762. [Google Scholar] [CrossRef]
  42. Hernández-Rivero, V.; Santana-Bonilla, P.; Sosa-Alonso, J. Feedback and self-regulated learning in higher education. Rev. Investig. Educ. 2021, 39, 227–248. [Google Scholar] [CrossRef]
  43. Brasil-Irala, V.; Kerkhoff-Cristofari, A. Feedbacks e autorregulação da aprendizagem no ensino superior: Uma revisão de escopo. Rev. Teias 2022, 23, 414–433. [Google Scholar] [CrossRef]
  44. Alqassab, M.; Panadero, E. Peer Assessment; Routledge: Oxfordshire, UK, 2022. [Google Scholar] [CrossRef]
  45. Cañadas, L. Formative assessment in university context: Opportunities and proposals for action. Rev. Digit. Investig. Docencia Univ. 2020, 14, e1214. [Google Scholar] [CrossRef]
  46. Nunes, A.; Batista, P.; Mendes, R.; Almeida, E. Avaliação do processo ensino aprendizagem na graduação: Uma revisão de literatura. Contrib. A Las Cienc. Soc. 2025, 18, e18306. [Google Scholar] [CrossRef]
  47. Páez-Herrera, J.; Hurtado-Almonacid, J.; Reyes-Amigo, T.; Rolle-Cáceres, G.; Yáñez-Sepúlveda, R. Evaluation of learning in competency-based curricular models in higher education. Cienc. Lat. Rev. Científica Multidiscip. 2023, 7, 3041–3056. [Google Scholar] [CrossRef]
  48. Maldonado-Diaz, C.; Nuñez-Diaz, C. Initial Training of Teachers and Co-Teaching Practices: What does the International Research of the last 20 Years Say? Pensam. Educ. Rev. Investig. Educ. Latinoam. 2023, 60, 1–16. [Google Scholar] [CrossRef]
  49. Abril-Gallego, A.; Peinado, M. Initial teacher training in community of learning to promote PBL. Profr. Rev. Currículum Form. Profr. 2023, 27, 1–20. [Google Scholar] [CrossRef]
  50. Otero-Saborido, F.; Rodriguez-Bies, E.; Gallardo-López, J.; López-Noguero, F. Percepción del alumnado de Educación Física sobre la carga de trabajo y Evaluación Formativa en Flipped Learning (Student perception of workload and Formative Assessment in Flipped Learning). Retos 2023, 50, 298–305. [Google Scholar] [CrossRef]
  51. Custodio-Carbajal, L.; Hernández-Fernández, B.; Centurion-Larrea, A. Formative assessment in learning: A systematic review. Horizontes. Rev. Investig. Cienc. Educ. 2025, 9, 2909–2923. [Google Scholar] [CrossRef]
  52. Nieva-Boza, C.; Martínez-Mínguez, L.; Moya-Prados, L. Formative assessment in Project of Co-Oriented Psychomotor Learning (PCo-OPL): Student perceptions on acquisition of professional skills. Sport. Sci. J. Sch. Sport Phys. Educ. Psychomot. 2020, 6, 327–346. [Google Scholar] [CrossRef]
  53. Maklakova, N.; Maklakov, I. Reflection in teaching: Goals, techniques, contradictions. Alma Mater. Vestnik Vysshey Shkoly 2024, 84–90. [Google Scholar] [CrossRef]
  54. Alves, M.; Oliveira, G. Reflexão da Prática Pedagógica na perspectiva de uma Formação Docente Contextualizada. ID Line Rev. Psicol. 2016, 10, 182–193. [Google Scholar] [CrossRef]
  55. Backman, E.; Barker, D. Re-thinking pedagogical content knowledge for physical education teachers–implications for physical education teacher education. Phys. Educ. Sport Pedagog. 2020, 25, 451–463. [Google Scholar] [CrossRef]
  56. Sanahuja-Ribés, A.; Escobedo-Peiró, P.; Traver-Martí, J.A.; García, E. Participatory assessment in higher education: A case study of university students in education. Cult. Educ. 2024, 37, 61–87. [Google Scholar] [CrossRef]
  57. Guillem-Gómez, T.; Cedeño-Zambrano, L.; Pérez-Delgado, F.G.; Quezada-Briones, F. Marcos de referencia y acción de la evaluación auténtica en el contexto universitario. Cienciamatria 2023, 9, 193–211. [Google Scholar] [CrossRef]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
