Competency-Based Assessment Practices in Higher Education: Lessons from the Pandemics
Abstract
1. Introduction
1.1. Competency-Based Approach
1.2. How Competencies Could Be Embedded into Higher Education Syllabus
- Designing the Degree’s competency profile, including generic and specific competencies. Several stakeholders intervene in this profiling: external professional standards, governments, quality agencies, professional associations, etc. Each Higher Education institution should list and describe each competency and monitor its accomplishment periodically [36].
- Creating a rubric with the competency standards to be achieved by the students throughout the Degree. Professional standards can provide the framework, so that the maximum level achieved on completing the Degree corresponds to the entry level of the professional competency standards [37].
- Distributing the competencies to be achieved across several subjects throughout the Degree. Specific subjects (e.g., internships, final projects, service-learning projects, integrated subjects) are built into the syllabus to foster competency development [38].
- Establishing a system for qualifying competencies, embedded in tasks and courses [37] (a minimal sketch of how such a profile might be encoded follows this list).
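As an illustration of the four steps above, a Degree’s competency profile, its achievement levels, and its distribution across subjects can be encoded in a simple data structure. The sketch below is illustrative only; the competency codes, level descriptors, and subject names are hypothetical examples, not taken from any actual Degree profile.

```python
# Illustrative sketch of a Degree competency profile (hypothetical data).
from dataclasses import dataclass, field

@dataclass
class Competency:
    code: str   # competency identifier, e.g., a generic competency code
    name: str
    # Achievement levels: the highest level would correspond to the
    # entry level of the relevant professional standard.
    levels: dict = field(default_factory=dict)

degree_profile = [
    Competency("GC1", "Teamwork", {
        1: "Participates in group tasks when prompted",
        2: "Coordinates own work with the group plan",
        3: "Organizes and evaluates collaborative work (professional entry level)",
    }),
    Competency("GC2", "Communication abilities", {
        1: "Communicates ideas with guidance",
        2: "Communicates clearly in academic settings",
        3: "Adapts communication to professional audiences",
    }),
]

# Step 3: distribute competencies across subjects in the syllabus.
subject_map = {
    "Internship": ["GC1", "GC2"],
    "Final Project": ["GC2"],
}

# Step 4: a qualification system could then check, per subject and task,
# which level of each mapped competency a student has evidenced.
for subject, codes in subject_map.items():
    print(subject, "->", codes)
```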
1.3. Assessment of Competencies
1.4. Main Aim of the Study
2. Materials and Methods
- i. To describe the most frequent assessment practices in line with the generic competencies, from the teachers’ and students’ perspectives.
- ii. To detect the primary purposes of the assessment practices in online teaching–learning environments forced by lockdown, from the teachers’ and students’ perspectives.
- iii. To analyze the characteristics of the proposals that both teachers and students consider most beneficial for developing generic competencies.
- iv. To understand the degree to which both teachers and students use the Learning Analytics available on the institutional virtual campus (LMS Moodle) and the purposes for which they are used.
- v. To identify the use of digital tools by teachers for competency assessment and the perception of their usefulness for assessing generic competencies.
2.1. Sample
2.2. Procedures
3. Results
3.1. Results from Instructors’ Perspective
3.1.1. Goal 1. To Describe the Most Frequent Assessment Practices in Line with Generic Competencies from the Instructors’ Perspective
3.1.2. Goal 2. To Detect the Primary Purposes of the Assessment Practices in Lockdown-Forced Online Teaching–Learning Environments from the Instructors’ Perspective
3.1.3. Goal 3. To Analyze the Characteristics of the Proposals That Instructors Consider Most Beneficial to Develop Generic Competencies
3.1.4. Goal 4. To Explore How and for What Purpose Instructors Use the Learning Analytics Resources Available on the Institutional Virtual Campus (LMS Moodle)
3.1.5. Goal 5. To Identify Instructors’ Use of Digital Tools for Competency Assessment
3.2. Results from Students’ Perspective
3.2.1. Goal 1. To Describe the Most Frequent Assessment Practices in Line with Generic Competencies from the Students’ Perspective
3.2.2. Goal 2. To Detect the Primary Purposes of the Assessment Practices in Lockdown-Forced Online Teaching–Learning Environments from the Students’ Perspective
3.2.3. Goal 3. To Analyze the Characteristics of the Proposals That Students Consider Most Beneficial to Develop Generic Competencies
3.2.4. Goal 4. To Explore How and for What Purpose Students Use Learning Analytics Resources Available on the Institutional Virtual Campus (LMS Moodle)
3.2.5. Goal 5. To Identify Students’ Use of Digital Tools for Competency Assessment
4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
1. Villarroel, V.; Bruna, D. ¿Evaluamos lo que realmente importa? El desafío de la Evaluación Auténtica en Educación Superior. Calid. En La Educ. 2019, 50, 492–509.
2. Krstikj, A.; Sosa, J.; García Bañuelos, L.; González Peña, O.I.; Quintero, H.N.; Urbina, P.D.; Vanoye, A.Y. Analysis of Competency Assessment of Educational Innovation in Upper Secondary School and Higher Education: A Mapping Review. Sustainability 2022, 14, 8089.
3. Sánchez Santamaría, J. Evaluación de los aprendizajes universitarios: Una comparación sobre sus posibilidades y limitaciones en el Espacio Europeo de Educación Superior. Rev. De Form. E Innovación Educ. Univ. (REFIEDU) 2011, 4, 40–54.
4. Baughan, P.; Carless, D.R.; Moody, J.; Stoakes, G. Re-considering assessment and feedback practices in light of the COVID-19 pandemic. In On Your Marks: Learner-Focused Feedback Practices and Feedback Literacy; Baughan, P., Ed.; Advance HE: York, UK, 2020; pp. 171–191.
5. García-Peñalvo, F.J.; Corell, A.; Abella, V.; Grande, M. Online assessment in higher education in the time of COVID-19. Educ. Knowl. Soc. 2020, 21, 26.
6. Pokhrel, S.; Chhetri, R. A literature review on impact of COVID-19 pandemic on teaching and learning. High. Educ. Future 2021, 8, 133–141.
7. Jena, P.K. Impact of COVID-19 on higher education in India. Int. J. Adv. Educ. Res. (IJAER) 2020, 77–81.
8. Yau, A.H.Y.; Yeung, M.W.L.; Lee, C.Y.P. A co-orientation analysis of teachers’ and students’ perceptions of online teaching and learning in Hong Kong higher education during the COVID-19 pandemic. Stud. Educ. Eval. 2022, 72, 101128.
9. Oliveira, G.; Teixeira, J.G.; Torres, A.; Morais, C. An exploratory study on the emergency remote education experience of higher education students and teachers during the COVID-19 pandemic. Br. J. Educ. Technol. 2021, 52, 1357–1376.
10. Gonzalez, T.; De La Rubia, M.A.; Hincz, K.P.; Comas-Lopez, M.; Subirats, L.; Fort, S.; Sacha, G.M. Influence of COVID-19 confinement on students’ performance in higher education. PLoS ONE 2020, 15, e0239490.
11. Jehl, T.; Khan, R.; Dos Santos, H.; Majzoub, N. Effect of COVID-19 outbreak on anxiety among students of higher education: A review of literature. Curr. Psychol. 2022.
12. Piyatamrong, T.; Derrick, J.; Nyamapfene, A. Technology-Mediated Higher Education Provision during the COVID-19 Pandemic: A Qualitative Assessment of Engineering Student Experiences and Sentiments. J. Eng. Educ. Transform. 2021, 34, 290–297.
13. Karademir, A.; Yaman, F.; Saatçioglu, Ö. Challenges of Higher Education Institutions against COVID-19: The Case of Turkey. J. Pedagog. Res. 2020, 4, 453–474.
14. Guangul, F.M.; Suhail, A.H.; Khalit, M.I.; Khidir, B.A. Challenges of remote assessment in higher education in the context of COVID-19: A case study of Middle East College. Educ. Assess. Eval. Account. 2020, 32, 519–535.
15. Şenel, S.; Şenel, H.C. Remote assessment in higher education during COVID-19 pandemic. Int. J. Assess. Tools Educ. 2021, 8, 181–199.
16. Montenegro, M.; Luque, A.; Sarasola, J.L.; Fernández-Cerero, J. Assessment in Higher Education during the COVID-19 Pandemic: A Systematic Review. Sustainability 2021, 13, 10509.
17. Cano, E. Retos de futuro en la evaluación por competencias. In Evaluación por Competencias: La Perspectiva de las Primeras Promociones de Graduados en el EEES; Cano, E., Fernández, M., Eds.; Octaedro: Barcelona, Spain, 2016; pp. 139–148.
18. Tuah, N.A.A.; Naing, L. Is online assessment in higher education institutions during COVID-19 pandemic reliable? Siriraj Med. J. 2021, 73, 61–68.
19. Bakhmat, L.; Babakina, O.; Belmaz, Y. Assessing online education during the COVID-19 pandemic: A survey of lecturers in Ukraine. J. Phys. Conf. Ser. 2021, 1840, 012050.
20. Biggs, J. Using assessment strategically to change the way students learn. In Assessment Matters in Higher Education: Choosing and Using Diverse Approaches; Brown, S., Glasner, A., Eds.; Society for Research into Higher Education: Buckingham, UK, 2003; pp. 41–54.
21. St-Onge, C.; Ouellet, K.; Lakhal, S.; Dubé, T.; Marceau, M. COVID-19 as the tipping point for integrating e-assessment in higher education practices. Br. J. Educ. Technol. 2022, 53, 349–366.
22. Quinlan, K.M.; Pitt, E. Towards signature assessment and feedback practices: A taxonomy of discipline-specific elements of assessment for learning. Assess. Educ. Princ. Policy Pract. 2021, 28, 191–207.
23. Pitt, E.; Quinlan, K.M. Signature assessment and feedback practices in the disciplines. Assess. Educ. Princ. Policy Pract. 2021, 28, 97–100.
24. Khan, S.; Kambris, M.E.K.; Alfalahi, H. Perspectives of University Students and Faculty on remote education experiences during COVID-19—A qualitative study. Educ. Inf. Technol. 2022, 27, 4141–4169.
25. Iqbal, S.A.; Ashiq, M.; Rehman, S.U.; Rashid, S.; Tayyab, N. Students’ perceptions and experiences of online education in Pakistani Universities and Higher Education Institutes during COVID-19. Educ. Sci. 2022, 12, 166.
26. Maatuk, A.M.; Elberkawi, E.K.; Aljawarneh, S.; Rashaideh, H.; Alharbi, H. The COVID-19 pandemic and E-learning: Challenges and opportunities from the perspective of students and instructors. J. Comput. High. Educ. 2022, 34, 21–38.
27. Bindé, J. Hacia las Sociedades del Conocimiento: Informe Mundial de la UNESCO; UNESCO: London, UK, 2015.
28. Strijbos, J.; Engels, N.; Struyven, K. Criteria and standards of generic competencies at bachelor degree level: A review study. Educ. Res. Rev. 2015, 14, 18–32.
29. Boud, D.; Falchikov, N. Aligning assessment with long-term learning. Assess. Eval. High. Educ. 2006, 31, 399–413.
30. European Parliament. Recommendation of the European Parliament and of the Council of 18 December 2006 on Key Competencies for Lifelong Learning; European Parliament: Strasbourg, France, 2006.
31. European Council. Council Recommendation of 22 May 2018 on Key Competencies for Lifelong Learning; European Council: Brussels, Belgium, 2018.
32. European Commission. Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions on a Renewed EU Agenda for Higher Education; COM/2017/0247; European Commission: Brussels, Belgium, 2017.
33. De Miguel, M. Modalidades de Enseñanza Centradas en el Desarrollo de Competencias. Orientaciones para Promover el Cambio Metodológico en el EEES; Ministerio de Educación y Ciencia/Universidad de Oviedo: Madrid, Spain, 2005.
34. Strijbos, J.W.; Narciss, S.; Dünnebier, K. Peer feedback content and sender’s competency level in academic writing revision tasks: Are they critical for feedback perceptions and efficiency? Learn. Instr. 2010, 20, 291–303.
35. García-San Pedro, M.J.; Gairín, J. Los mapas de competencias: Una herramienta para mejorar la calidad de la formación universitaria. Rev. Electrónica De Investig. En Cienc. Económicas (REICE) 2011, 9, 84–102.
36. Gulikers, J.T.M.; Baartman, L.K.J.; Biemans, H.J.A. Facilitating Evaluations of Innovative, Competency-based Assessments: Creating Understanding and Involving Multiple Stakeholders. Eval. Program Plan. 2010, 33, 120–127.
37. Tejada, J.; Ruiz, C. Evaluación de competencias profesionales en educación superior: Retos e implicaciones. Educ. XXI 2016, 19, 17–38.
38. Cano, E. Presentación del monográfico: Evaluación por Competencias en la Educación Superior: Buenas Prácticas ante los Actuales Retos. Rev. Iberoam. De Evaluación Educ. 2019, 12, 5–8.
39. Ibarra, M.S.; Rodríguez-Gómez, G.; Boud, D. The quality of assessment tasks as a determinant of learning. Assess. Eval. High. Educ. 2020, 46, 943–955.
40. Kennedy, D. Writing and Using Learning Outcomes: A Practical Guide; University College Cork: Cork, Ireland, 2007.
41. Monereo, C. La evaluación del conocimiento estratégico a través de tareas auténticas. Pensam. Educ. Rev. De Investig. Latinoam. (PEL) 2003, 32, 71–89.
42. Trujillo, F. Competencia en Comunicación Lingüística Nunha Europa Plurilingüe e Pluricultural. Ensinanza de Linguas no Contexto Europeo: Tendencias e Propostas. November 2008. Available online: https://docplayer.es/52379208-Competencia-en-comunicacion-linguistica-nunha-europa-plurilingue-e-pluricultural.html (accessed on 14 September 2021).
43. Black, P.; Wiliam, D. Assessment and classroom learning. Assess. Educ. Princ. Policy Pract. 1998, 5, 7–74.
44. Creswell, J.W. Controversies in mixed methods research. In The Sage Handbook of Qualitative Research; Denzin, N.K., Lincoln, Y.S., Eds.; SAGE Publishers: Thousand Oaks, CA, USA, 2011; pp. 269–284.
45. Hernández-Sampieri, R.; Mendoza, C. Metodología de la Investigación. Las Rutas Cuantitativa, Cualitativa y Mixta; McGraw-Hill: Ciudad de México, México, 2018.
46. Braun, V.; Clarke, V. Thematic analysis. In APA Handbook of Research Methods in Psychology, Vol. 2: Research Designs: Quantitative, Qualitative, Neuropsychological, and Biological; Cooper, H., Camic, P.M., Long, D.L., Panter, A.T., Rindskopf, D., Sher, K.J., Eds.; American Psychological Association: Washington, DC, USA, 2012; pp. 57–71.
47. Guba, E.; Lincoln, Y. Epistemological and Methodological Bases of Naturalistic Inquiry. Educ. Commun. Technol. 1982, 30, 233–252.
48. Gómez-Poyato, M.J.; Eito, A.; Mira, D.C.; Matías, A. Digital Skills, ICTs and Students’ Needs: A Case Study in Social Work Degree, University of Zaragoza (Aragón-Spain). Educ. Sci. 2022, 12, 443.
49. Del Arco, I.; Flores, Ò.; González-Rubio, J.; Araneda, D.S.; Olivos, C.L. Workloads and Emotional Factors Derived from the Transition towards Online and/or Hybrid Teaching among Postgraduate Professors: Review of the Lessons Learned. Educ. Sci. 2022, 12, 666.
50. Ramos, A.; Reese, L.; Arce, C.; Balladares, J.; Fiallos, B. Teaching Online: Lessons Learned about Methodological Strategies in Postgraduate Studies. Educ. Sci. 2022, 12, 688.
51. Remesal, A. Primary and secondary teachers’ conceptions of assessment: A qualitative study. Teach. Teach. Educ. 2011, 27, 472–482.
52. Fernández-Ruiz, J.; Panadero, E. Comparison between conceptions and assessment practices among secondary education teachers: More differences than similarities. J. Study Educ. Dev. 2020, 43, 309–346.
53. Brown, G.T.L. Student Conceptions of Assessment: Regulatory Responses to Our Practices. ECNU Rev. Educ. 2021, 5, 116–139.
54. Juan, N.; Villach, M.J.R.; Remesal, A.; De Salvador, N. Qué dificultades perciben los futuros maestros y sus profesores acerca del feedback recibido durante el trabajo final de grado. Perspect. Educ. Form. De Profr. 2018, 57, 24–49.
55. Portillo, J.; Garay, U.; Tejada, E.; Bilbao, N. Self-Perception of the Digital Competence of Educators during the COVID-19 Pandemic: A Cross-Analysis of Different Educational Stages. Sustainability 2020, 12, 10128.
56. Bhagat, K.K.; Spector, J.M. Formative Assessment in Complex Problem-Solving Domains: The Emerging Role of Assessment Technologies. J. Educ. Technol. Soc. 2017, 20, 312–317.
57. Pinto, M.; Leite, C. Digital technologies in support of students learning in Higher Education: Literature review. Digit. Educ. Rev. 2020, 37, 343–360.
58. Trujillo, F.; Fernández-Navas, M.; Montes, M.; Segura, A.; Alaminos, F.J.; Postigo, A.Y. Panorama de la Educación en España tras la Pandemia de COVID-19: La Opinión de la Comunidad Educativa; Fundación de Ayuda contra la Drogadicción (Fad): Madrid, Spain, 2020.
59. Looney, A.; Cumming, J.; Van Der Kleij, F.; Harris, K. Reconceptualising the role of teachers as assessors: Teacher assessment identity. Assess. Educ. Princ. Policy Pract. 2018, 25, 442–467.
60. Jiang, L.; Yu, S. Understanding changes in EFL teachers’ feedback practice during COVID-19: Implications for teacher feedback literacy at a time of crisis. Asia-Pac. Educ. Res. 2021, 30, 509–518.
61. Winstone, N.E.; Nash, R.A.; Parker, M.; Rowntree, J. Supporting Learners’ Agentic Engagement with Feedback: A Systematic Review and a Taxonomy of Recipience Processes. Educ. Psychol. 2017, 52, 17–37.
62. Carless, D.; Boud, D. The development of student feedback literacy: Enabling uptake of feedback. Assess. Eval. High. Educ. 2018, 43, 1315–1325.
Characteristics | Instructors | Students
---|---|---
Age | |
18–25 | - | 15
40–50 | 10 | -
50+ | 6 | -
Sex | |
Female | 10 | 8
Male | 6 | 7
Academic course | |
Second (4th semester) | NA | 8
Fourth (8th semester) | NA | 7
Discipline | |
Archaeology/History | 2 | 2
Audiovisual Communication | 2 | 1
Biology | 2 | 2
Computer Engineering | - | 1
Management and Public Administration | 2 | 1
Mathematics | 2 | 2
Pharmacy | 2 | 1
Psychology | 4 | 2
Teacher Education | - | 1
Specific Objectives | Dimensions of the Students’ Interview Script | Dimensions of the Instructors’ Interview Script |
---|---|---|
To describe the assessment practices most used in line with the generic competencies, from the perspective of the instructors and the students. | Type of assessment tasks undertaken. Differences between assessment tasks before and during the COVID-19 period. Knowledge about generic competencies. Relation between the assessment tasks and the development of generic competencies. | Assessment tasks proposed. Differences between assessment tasks before and during the COVID-19 period. Information given to students about generic competencies at the beginning of the course. Work on generic competencies.
To understand the main purposes of the assessment practices carried out in lockdown-forced online teaching environments. | Main purposes of the assessment practices developed. Differences between the main purposes of the assessment practices before and during the COVID-19 period. Main concerns about the online assessment process. | Main purposes of the assessment practices designed. Differences between the main purposes of the assessment practices before and during the COVID-19 period. Main concerns about the online assessment process.
To analyze the characteristics of the proposals that both instructors and students consider most useful for developing generic competencies. | Description of the different assessment tasks performed during the mixed teaching period. Characteristics of the assessment tasks considered most useful for developing generic competencies. | Description of an assessment task considered especially good and successful during the mixed teaching period, and why it is considered useful for developing generic competencies. Characteristics of the designed assessment tasks considered most useful for developing generic competencies.
To explore how and for what purpose both instructors and students use the Learning Analytics resources available on the Virtual Campus. | Knowledge about Learning Analytics. Use of Learning Analytics. Purpose of the Learning Analytics used. | Knowledge about Learning Analytics in the Moodle Virtual Campus and/or in external tools. Use of Learning Analytics. Purpose of the Learning Analytics used. Usefulness of Learning Analytics.
To identify the use that instructors make of digital tools for competency assessment. | | List of digital tools used for competency assessment. Digital tools (from the Moodle Virtual Campus or external pages) considered most useful for assessing generic competencies.
To identify students’ perception of the usefulness of digital tools for the assessment of competencies. | Digital tools (from the Moodle Virtual Campus and from external pages) considered most useful for developing and assessing generic competencies. |
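The “mentions” tables that follow report how many participants raised each theme, with percentages over the respective subsample. As a minimal illustration of how such counts can be tallied once interview transcripts have been thematically coded (the coded segments below are hypothetical, not the study’s real data), a sketch might look like this:

```python
# Illustrative tally of "mentions" from thematically coded interviews.
from collections import Counter

# One (participant, code) pair per coded segment, as produced during
# thematic analysis. Hypothetical example data.
coded_segments = [
    ("instructor_01", "Teamwork"),
    ("instructor_02", "Teamwork"),
    ("instructor_02", "Communication abilities"),
    ("instructor_03", "Self-regulated learning and responsibility"),
]

N_INSTRUCTORS = 16  # subsample size, as in the tables that follow

# A participant "mentions" a theme at most once, so deduplicate pairs first.
mentions = Counter(code for _, code in set(coded_segments))
for code, count in mentions.most_common():
    print(f"{code}: {count} ({count / N_INSTRUCTORS:.2%})")
```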
Generic Competencies | Mentions |
---|---|
Teamwork | 8 (50%) |
Creativity and entrepreneurial capability | 1 (6.25%) |
Self-regulated learning and responsibility | 4 (25%) |
Communication abilities | 6 (37.5%) |
Uses | Mentions |
---|---|
Non-teaching, non-assessment | 4 (25.00%) |
Only teaching | 5 (31.25%) |
Specific teaching and assessment | 6 (37.50%) |
Assessment without teaching | 1 (6.25%) |
Reasons | Mentions |
---|---|
Relevant in the professional profile | 3 (18.75%) |
Appears in syllabus | 3 (18.75%) |
Fostered by the teaching methodology used | 2 (12.50%) |
No reason | 8 (50.00%) |
Purposes | Mentions |
---|---|
No explicit purpose | 1 (6.25%) |
Diagnostic purpose | 0 |
Summative purpose | 6 (37.5%) |
Formative purpose | 3 (18.75%) |
Formative and summative purposes | 6 (37.5%) |
Reasons | Mentions |
---|---|
Continuous assessment | 9 (62.5%) |
Contextualized assessment | 6 (37.5%) |
Authentic assessment | 5 (31.25%) |
Knowledge of Learning Analytics (by Teachers) | Mentions
---|---
Without knowledge | 2 (12.5%)
Known but not used | 6 (37.5%)

Use of Learning Analytics (by Teachers) | Mentions
---|---
With verified use | 5 (31.25%)
With regulatory use | 1 (6.25%)
With formative use | 2 (12.5%)
Communication Tools | Mentions | Creation Tools | Mentions | Moodle Activities | Mentions | Quizzes and Interactive Tools | Mentions
---|---|---|---|---|---|---|---
BBCollaborate | 10 | Google Drive | 2 | Moodle Tasks | 11 | Kahoot | 2
Teams | 1 | | | Quizzes | 11 | Mentimeter | 2
Skype | 2 | | | Forums | 3 | Chat | 1
| | | | Lesson | 1 | Youtube | 2
| | | | Personalized Learning Designer | 1 | |
| | | | Workshop | 1 | |
Type of Tools | Mentions |
---|---|
Content management (videos, ppt, Genially, etc.) | 7 (19.4%) |
Participant management and participation: apps that facilitate the creation of workgroups (e.g., Moodle choices), the group function in BBCollaborate, or menti.com, which allows interaction | 6 (16.67%) |
Student–content relationship: learning resources that allow the student to process and interact with the contents (tests, lesson, Kahoot…) | 10 (27.78%) |
Student–student relationship: resources that allow direct contact between students, e.g., Moodle forums in “separate groups” mode, resources that allow co-evaluation such as the Moodle workshop | 9 (25%) |
Student learning management: resources that can “guide” the student in the learning process, e.g., autonomous practice tests through which students can follow their own progress | 4 (11.1%) |
Use of Tools | Mentions |
---|---|
Control of attendance/participation rates | 9 |
Monitor the students’ progress | 8 |
Give collective feedback | 7 |
Allow the collective construction of knowledge by students | 11 |
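The uses reported above (controlling attendance/participation, monitoring progress) correspond to fairly simple analytics over the activity log that Moodle can export. The sketch below is illustrative only: it assumes a standard Moodle log export saved as a hypothetical course_log.csv, and the column names (“Time”, “User full name”, “Event name”) should be checked against the headers of the actual export.

```python
# Illustrative sketch of lightweight Learning Analytics over a Moodle
# activity-log export. File name and column names are assumptions.
import pandas as pd

log = pd.read_csv("course_log.csv")  # hypothetical Moodle log export

# Participation: share of logged course events generated by each student.
events_per_user = log.groupby("User full name").size()
participation = (events_per_user / events_per_user.sum()).sort_values(ascending=False)
print(participation.head())

# Progress proxy: submission-type events per student, aggregated by week.
log["Time"] = pd.to_datetime(log["Time"], dayfirst=True, errors="coerce")
submissions = log[log["Event name"].str.contains("submitted", case=False, na=False)]
weekly = submissions.groupby([pd.Grouper(key="Time", freq="W"), "User full name"]).size()
print(weekly.tail())
```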
Generic Competencies | Mentions |
---|---|
Teamwork | 12 (80%) |
Creativity and entrepreneurial capability | 9 (60%) |
Self-regulated learning and responsibility | 7 (46.7%) |
Communication abilities | 7 (46.7%) |
Ethical commitment | 5 (33.3%) |
Sustainability | 1 (6.7%) |
Purpose of Assessment | Mentions |
---|---|
Without awareness of purpose | 1 (6.67%) |
Diagnostic purpose | 0 |
Summative purpose | 10 (66.67%) |
Formative purpose | 0 |
Summative and formative purpose | 4 (26.67%) |
Keywords | Mentions |
---|---|
Knowledge | 7 (46.67%) |
Learning | 4 (26.67%) |
Understanding | 3 (20%) |
Application | 2 (13.33%) |
Continuous evaluation | 1 (6.67%) |
Knowledge of Learning Analytics (by Students) | Frequency of Mentions
---|---
Do not know | 10 (66.67%)
Know | 5 (33.33%)

Use of Learning Analytics (by Students Who Know Them) | Frequency of Mentions
---|---
Verified use | 2 (13.33%)
Regulatory use | 3 (20.00%)
Communication Tools | Mentions | Creation Tools | Mentions | Other Tools | Mentions
---|---|---|---|---|---
BBCollaborate | 9 | Google Drive | 7 | Youtube | 3
Zoom | 5 | Moodle Virtual Campus | 6 | GitHub | 1
| 5 | Office 365 | 2 | CamScanner | 1
Skype | 4 | Institutional library tools | 2 | |
| 2 | OneDrive | 1 | |
Facetime | 1 | | | |
Discord | 1 | | | |
| 1 | | | |
Google Meet | 1 | | | |