Article

Competency-Based Assessment Practices in Higher Education: Lessons from the Pandemics

1 Faculty of Education, University of Barcelona, 08035 Barcelona, Spain
2 Faculty of Psychology, University of Barcelona, 08035 Barcelona, Spain
* Author to whom correspondence should be addressed.
Trends High. Educ. 2023, 2(1), 238-254; https://doi.org/10.3390/higheredu2010012
Submission received: 31 January 2023 / Revised: 24 February 2023 / Accepted: 8 March 2023 / Published: 10 March 2023

Abstract

This article reports on a research project on university teaching and learning in the context of pandemics. Sixteen university professors and fifteen bachelor’s degree students were interviewed regarding “emergency e-assessment practices” during the first lockdown semester at a Spanish institution. The research aimed to understand their perception of how generic competencies were being assessed. Data were generated in semi-structured individual interviews. The main findings are: (a) generic competencies are not explicitly considered in e-assessment practices; (b) online assessment practices follow mainly a summative purpose; (c) digital technologies are not considered for the instructional design; (d) both instructors and students lack assessment literacy. Furthermore, there are difficulties in reaching a shared understanding regarding what competency-based assessment means and its implications for daily praxis. The results underline the challenge of using digital technologies for fostering and assessing generic competencies, as well as the need for assessment literacy on both sides, teachers and students.

1. Introduction

Competency-based designs have been common ground in instructional planning documents throughout Higher Education institutions in Europe since 1999. However, the literature indicates that teaching has not yet fully met the principles of competency development; in particular, many assessment practices still follow a traditional approach. The implementation of Competency-Based Assessment (CBA) practices, with complex, authentic, real-context tasks [1], is still scarce, both in secondary and in higher education [2]. We find reasons for this lack of alignment in (a) institutional impediments (e.g., in the process of setting learning plans or in the lack of coordination of teaching teams) [3] and (b) instructors’ poor assessment literacy and counterproductive conceptions of assessment [4]. If this was already problematic in face-to-face settings, the emergency changes forced during the first COVID-19 lockdown only emphasized these issues [5].
The COVID-19 pandemic, with the sudden closure of all face-to-face universities, created a new personal and professional scenario [6]. Despite the advancements of the 21st century, many face-to-face universities had not fully implemented online teaching–learning practices by March 2020. New practices may have been incorporated into habitual higher education teaching after the lockdown semester, and learning about such successful cases is particularly interesting. Indeed, we can learn from studies in many different national and cross-national contexts (e.g., [7,8,9]) about how participants at higher education institutions coped with a variety of challenges at cognitive/performance (e.g., [10]), motivational or emotional (e.g., [11,12]), and institutional levels (e.g., [13,14]).
After the first pandemic semester, universities eventually had to incorporate blended learning, with both synchronous and asynchronous scenarios, devices, and strategies, in anticipation of future short-term lockdowns or even individual confinements. Consequently, assessment practices had to be designed following e-learning principles rather than as a direct transposition of face-to-face practices [15,16]. Synchronous sessions, for example, should adjust their duration and contents and favor peer interaction with debates and small-group activities instead of expository discourse. In addition, assessment activities had to integrate the formative purpose of lifelong learning—i.e., assessment for learning—and the peculiarities of e-learning [17]. Therefore, having valuable showcases of high-quality assessment practices can be helpful. Without underestimating other indicators, in this study we paid attention to students’ and instructors’ perceptions of and criteria for good online assessment practices to pursue generic competencies (as established by the University of Barcelona).
During the lockdown semester and the following academic year, instructors’ interest in online assessment sometimes focused more on e-proctoring devices and strategies than on promoting formative assessment and self-regulated learning in the virtual context [18,19]. Indeed, we need a change of assessment culture that overcomes the debate on avoiding cheating or possible pitfalls and focuses instead on designing productive (and not reproductive) assessment proposals. First, there is a need to promote technology-supported assessment systems [20,21] aligned with the active methodologies that many instructors already use in face-to-face settings (e.g., problem-based learning, project-based learning, challenge-based learning, flipped classroom, cooperative learning). Second, the pandemic highlighted the importance of developing generic competencies. Although the discourse of generic competencies (also known as soft skills) is common among employers and also frequent in academia, 20 years after the Bologna Declaration some university faculty remain skeptical about the competency-based curricular approach [22,23]. Alternatively, they prefer to encourage specific competencies but may not be mindful of generic ones. However, competencies such as lifelong learning, responsibility, and communication, among others, have become central in this period.
Previous studies on students’ and instructors’ perspectives have looked into their perceptions of the lockdown teaching experience, mostly regarding challenges with learning technologies for videoconferencing and asynchronous learning, interpersonal communication (both student–instructor and peer-to-peer), and teaching and learning activities [24,25,26]. In this study, we focus on a different component: the promotion of generic competencies and their formative assessment.

1.1. Competency-Based Approach

Due to information and communication technologies, the knowledge society is characterized by the possibility of modifying productive activities and transforming social, cultural, and economic relations within the framework of sustainable development [27]. This paradigm shift goes along with the need to develop lifelong learning skills and adaptability to a rapidly changing world.
In 1999, the European Higher Education Area (EHEA) introduced a change of syllabi and curricular designs based on a competency profile. Even though competency-based designs are standard in the planning documents, research indicates that teaching is not always in line with this competency approach [28]. Indeed, assessment practices still often follow a traditional approach [29].
Following the recommendation of the European Parliament and of the Council of 18 December 2006 on key competencies for lifelong learning, competencies are a “combination of knowledge, skills, and attitudes appropriate to the context” [30] (p. 13). In this recommendation, which was a key reference document for developing competency-based education, member states were asked to develop key competencies for all as part of their lifelong learning strategies. The European Council Recommendation of 22 May 2018 on key competencies for lifelong learning indicates that “people need the right set of skills and competencies to sustain current standards of living, support high rates of employment and foster social cohesion in the light of tomorrow’s society and world of work” [31] (p. 1). Specifically, in higher education, the Communication from the Commission to the European Parliament regarding a renewed EU agenda for higher education confirms that “too many students graduate with poor basic skills (literacy, numeracy, digital) and without the range of generic skills (problem-solving, communication, etc.) they need for resilience in a changing world” [32] (p. 3). This communication asks for all students to acquire advanced generic skills and key competencies that will allow them to thrive: “High-level digital competencies, numeracy, autonomy, critical thinking and a capacity for problem-solving are increasingly crucial attributes” [32] (p. 4).

1.2. How Competencies Could Be Embedded into the Higher Education Syllabus

To apply a competency-based approach in Higher Education, some top-down and bottom-up steps need to be undertaken [33,34,35]. From a top-down perspective, the four main steps are:
  • Designing the Degree’s competencies profile, including generic and specific competencies. Several stakeholders intervene in this profiling: external professional standards, government, agencies, professional associations, etc. Each Higher Education institution should list and describe each competency and monitor its accomplishment periodically [36].
  • Creating a rubric with the competency standards to be achieved by the students throughout the Degree. Professional standards could serve as the framework to draw from, so that the maximum level achieved upon completing the degree corresponds to the entry level of the professional standards for professional competencies [37].
  • Distributing competencies to be achieved through several subjects along the Degree. Specific subjects (e.g., internship, final project, learning-service projects, integrated subjects) are fixed into the syllabus to foster competencies development [38].
  • Establishing a system of competency grading embedded in tasks and courses [37].
This institutional initiative needs to be complemented with specific actions to implement the competency-based approach. From a bottom-up perspective, in-service teacher training is a must. Teachers need to learn how to design complex tasks [39], how to apply authentic evaluation processes [1], how to write learning outcomes that include content and competencies [40], and how to apply assessment criteria aligned with learning outcomes and competencies [20]. This training allows teachers to actively contribute to constructing the new syllabus and to commit realistically to the institutional project, because they can then reflect on what competencies would be desirable and participate in designing proposals that allow such development.

1.3. Assessment of Competencies

A proper assessment of competencies requires the design of complex and authentic tasks, similar to those found in workplace contexts and civic life, and the evaluation of these tasks with criteria aligned with those competencies. However, implementing competency-based assessment (CBA) practices based on complex, authentic, interdisciplinary tasks [40,41,42] has not yet become widespread. The central features of CBA are well known: “Assessment exercises should faithfully reflect the main learning aims and should be designed to evoke evidence about learning needs; the main purpose for assessment is the formative purpose […and] the focus of attention is the individual learner” [43] (p. 44). These practices still seem to be infrequent. This could be due both to institutional barriers (e.g., in study plan designs, in the coordination of the teaching team) [36] and to difficulties derived from teachers’ conceptions of assessment and their lack of assessment literacy [4]. If this was already difficult in face-to-face settings, COVID-19 forced an overnight change towards online teaching, and this difficulty was only stressed [5].

1.4. Main Aim of the Study

According to the preceding literature review, the main aim of our study was to ascertain to what extent students and instructors perceived and evaluated the incorporation of competency-based assessment practices during the emergency-teaching semester of the COVID-19 pandemic. Concrete goals are presented in the next section.

2. Materials and Methods

This project adopts a non-experimental descriptive design, since it does not intend to modify any independent variable in order to understand how it affects the assessment practices developed. The research follows a mixed methodology: we collected and analyzed quantitative and qualitative data in an integrated way to better understand the object of the study [44]. The mixed methodology therefore goes beyond quantitative and qualitative methods per se, as it involves a combination of approaches. Thus, the project collected both quantitative data, which allowed us to explore the assessment practices and teachers’ and students’ perceptions of them, and qualitative data, which helped us understand these practices in depth, how they impact students’ development of generic competencies, and the connections that students and teachers draw between assessment practices and the development of generic competencies. Results from the quantitative analysis offering a thorough description of assessment practices have been presented elsewhere. In this paper, we focus on the qualitative accounts of students’ and instructors’ experiences and reports of that first semester of lockdown.
The specific objectives of the study were:
i. To describe the most frequent assessment practices in line with the generic competencies from the teachers’ and students’ perspective.
ii. To detect the primary purposes of the assessment practices in online teaching–learning environments forced by lockdown from the teachers’ and students’ perspective.
iii. To analyze the characteristics of the proposals that both teachers and students consider most beneficial for developing generic competencies.
iv. To understand the degree of use that both teachers and students make of the Learning Analytics available on the institutional virtual campus (LMS Moodle) and the purpose for which they are used.
v. To identify the use of digital tools by teachers for competency assessment and the perception of their usefulness to assess generic competencies.
This article will focus on the results of the qualitative data collected and analyzed.

2.1. Sample

The participants of this study were 16 professors and 15 students from nine different Bachelor’s Degrees at the University of Barcelona: Pharmacy, Archaeology, Primary Education, Computer Engineering, Mathematics, Audio-visual Communications and Media Studies (AVCMS), Psychology, Biology, and Management and Public Administration (MPA).
Different sampling criteria were considered, apart from availability and accessibility. Regarding professors, prior teaching experience in online contexts (versus lack thereof) was the main selection criterion. As for students, only second-year to final-year students with an average academic record could participate, thus avoiding first-year students without prior university experience as well as outstanding and low-achieving students.
With the support of the faculty, students were sent an email invitation to collaborate, and informed consent was requested beforehand from all participants, following responsible research practices. The data are confidential and stored on secure devices. Altogether, sixteen professors and fifteen students were interviewed. Table 1 presents the demographic composition of the interviewee set. These data are presented only for characterization and contextualization; personal variables were not considered in the analysis due to the great heterogeneity and low number of participants.

2.2. Procedures

The overall objective was to analyze assessment practices to enhance generic competencies in mixed or blended teaching environments from the perspective of instructors and students. We carried out semi-structured interviews with instructors and students during the second semester of the academic year 2020–2021. A semi-structured interview is based on a guide of initial issues or questions and offers the interviewer the freedom to introduce additional questions to clarify concepts and obtain more information, adjusting the interview flow to the interviewees’ pace [45].
Specific interview scripts—for students and for instructors—were designed, grounded in the theoretical framework. The scripts started with preliminary questions to help the participants understand the object of the study. The entire interdisciplinary research team contributed to the review of the script design. Validation and contextual adjustment of each final script were ensured by sending it to one instructor of each Degree (Pharmacy, Archaeology, Primary Education, Computer Engineering, Mathematics, Audio-visual Communications and Media Studies, Psychology, Biology, Management and Public Administration), asking them to provide feedback to ensure intelligibility.
The interview script contained different open questions (Table 2), related to the specific objectives of the research. Basic demographic and identification data were collected from the participants: sex, degree, and course (in the case of students); sex, years of teaching experience, degree(s), and previous experience with online teaching before COVID-19 (in the case of instructors).
All the interviews were conducted online through video conferences with Blackboard Collaborate (as provided in the virtual campus). Interviews had an average duration of 35 min (lasting from 24 to 55 min) and were recorded, transcribed, and sent back to the interviewees for content validation.
The authors conducted a thematic analysis of the transcribed interviews [46] following the constant comparison model of Guba and Lincoln [47]. By constantly comparing the ideas expressed in the interviews, the analysts identified codes, which were grouped into categories. All co-authors were equally involved in contrasting results until initial discrepancies were resolved by full consensus. The content analysis occurred in two basic steps. First, two researchers categorized the open data according to the classification criteria through peer review; a third researcher then reviewed these categorizations and settled on a definitive classification in case of discrepancies between the two initial researchers. Second, a frequency count was performed for each code. No particular software was used for the qualitative analysis, because the amount of data was manageable; a common spreadsheet served as the tool for organizing the categorization by the analysts.
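As an illustration only, the final counting step can be reproduced in a few lines of Python. The sketch below is not the authors’ tooling (they worked in a spreadsheet), and the coded excerpts, participant identifiers, and code labels are invented for the example; it simply tallies how many distinct interviewees mention each code, which is the kind of n and percentage reported in the result tables.

from collections import Counter

# Hypothetical coded excerpts: (participant_id, code) pairs agreed on by the
# two initial analysts after consensus. All data below are invented.
coded_excerpts = [
    ("instructor_03", "teamwork"),
    ("instructor_03", "continuous_assessment"),
    ("instructor_07", "teamwork"),
    ("student_11", "summative_purpose"),
    ("student_02", "teamwork"),
]

# Frequency count per code (step two of the content analysis).
code_counts = Counter(code for _, code in coded_excerpts)

# Number of distinct participants mentioning each code, i.e., the n and %
# of interviewees reported in the tables.
participants_per_code = {
    code: len({pid for pid, c in coded_excerpts if c == code})
    for code in code_counts
}

total_participants = len({pid for pid, _ in coded_excerpts})
for code, n in sorted(participants_per_code.items()):
    print(f"{code}: n = {n} ({100 * n / total_participants:.1f}%)")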

3. Results

First, we present results concerning the instructors’ perspective. Second, we present the students’ points of view. Finally, in the discussion section, both views are contrasted.

3.1. Results from Instructors’ Perspective

Results are organized with respect to each particular research goal. In short, the main findings are: (a) generic competencies are mainly absent in assessment practices; (b) online assessment practices have a main summative purpose; and (c) teachers’ instructional design does not include digital technologies explicitly. In other words, the main features of CBA seem not to be present. Concrete results are presented below:

3.1.1. Goal 1. To Describe the Most Frequent Assessment Practices in Line with Generic Competencies from the Instructors’ Perspective

When asked about the assessment practices concerning generic competencies, three instructors (19%) spontaneously described such practices, identifying concrete competencies; ten interviewees (62%) responded only after specific prompting by the interviewer; and the last three (19%) did not mention any particular assessment practice, even after the interviewer’s hint. Table 3 presents the competencies referred to:
Not only which generic competencies are referred to is important, but also how these competencies are considered: Are generic competencies taught and assessed? Are they only taught? Are they assessed without being taught? Table 4 presents the results concerning the embeddedness of competencies in the curricula.
Half of the instructors gave no reason for selecting one or another generic competency (n = 8). Three of them attributed this selection to their relevance in connection with the professional profile; another three pointed to the teaching plan as the formal reason for their selection. Finally, two instructors declared that the choice of generic competencies was intrinsically linked to the teaching methodology (based on collaborative learning). Table 5 presents these results.

3.1.2. Goal 2. To Detect the Primary Purposes of the Assessment Practices in Lockdown-Forced Online Teaching–Learning Environments from the Instructors’ Perspective

We considered three primary purposes of assessment in the analysis process: diagnostic, summative, and formative. To what extent do the instructors declare one purpose or another? From instructors’ responses, we learn that assessment practices are closely related to assessment conceptions and literacy. Remarkably, none of the participants referred to diagnostic purposes, and one of them did not declare any specific purpose at all. Of the remaining interviewees, the majority declared pursuing summative purposes, either exclusively or together with formative purposes; specific formative purposes were pointed out in only three cases. Table 6 presents these results.

3.1.3. Goal 3. To Analyze the Characteristics of the Proposals That Instructors Consider Most Beneficial to Develop Generic Competencies

Regarding the characteristics of the most useful competency-based assessment practices, instructors emphasized that they should be continuous (n = 9), contextualized (n = 6), and authentic (n = 5) (see Table 7).

3.1.4. Goal 4. To Explore How and for What Purpose Instructors Use the Learning Analytics Resources Available on the Institutional Virtual Campus (LMS Moodle)

The knowledge and use of Learning Analytics (LA) tools appear in Table 8. All but two instructors knew about the existence of LA tools on the virtual campus. However, instructors usually do not use them, describing them as very user-unfriendly (data are challenging to find, generate, and interpret), or they use them merely to check a student’s activity when particular cases raise performance doubts.

3.1.5. Goal 5. To Identify Instructors’ Use of Digital Tools for Competency Assessment

Regarding the digital tools used by the instructors in assessment practices, they mainly referred to “tasks”, “quizzes”, and “videoconference” (BB Collaborate), as included in the virtual campus. Less frequently, instructors mentioned the use of the “Forum” (another tool embedded in the virtual campus) and other external tools such as Kahoot, Mentimeter, YouTube, or Google Drive. Table 9 presents these results.
We categorized the digital tools regarding their pedagogical potential, with respect to the promotion of peer interaction, student–instructor interaction or student–content interaction. As the results in Table 10 show, the digital tools were basically used for two purposes: (1) to facilitate the students’ access to the learning content (student–content interaction), which does not, however, necessarily guarantee any active engagement by the student, and (2) to promote peer interaction.
Regarding the use given to digital tools (see Table 11), the priority is the collective construction of knowledge under a constructive perspective of learning. Secondly, teachers declared using the digital tools to control attendance and participation. Half of the interviewees use digital tools to monitor students’ progress, and, finally, the least frequently mentioned purpose corresponds to the provision of feedback (also related to the formative purpose being the least frequent among those assessment practices, as presented earlier).

3.2. Results from Students’ Perspective

As in the previous section on the results from instructors’ interviews, the results concerning students are organized into as many sections as research goals and interview sections.

3.2.1. Goal 1. To Describe the Most Frequent Assessment Practices in Line with Generic Competencies from the Students’ Perspective

The first important result concerning our first research goal points to the fact that students were mostly unaware of the generic competencies. In all cases, the interviewer had to raise the specific question and give some open hints for the students to be able to reflect on them. Therefore, the focus of analysis shifted to a different question, namely: “Provided the list of generic competencies set by the institution, which ones do the students recognize during the interview (as experienced in assessment practices)?”. Indeed, none of the students recognized the concept of ‘generic competency’, but all of them were later able to recognize at least one of them from the list provided by the interviewer. Table 12 shows how often each of the generic competencies was recognized by the students in the interview, after explicit hints (reading out a list of generic competencies).
Twelve students (80%) considered working in groups a typical activity to develop teamwork skills; however, none of them identified it as being specifically assessed. In addition, nine students (60%) identified the purpose of developing creativity and entrepreneurship by means of the learning activities they typically have to solve, but again, they did not recognize these as the focus of assessment. Nearly half of the interviewees (n = 7; 46.7%) identified the instructors’ interest in the students developing the general competency of self-regulated learning, but only one of these students explicitly recognized his own personal interest in developing such a competency and becoming responsible for his learning process. In addition, just under half (n = 7; 46.7%) referred to communicative skills, noting that the pandemic situation, with forced online learning, had clearly impacted peer interaction and student–instructor interaction negatively. Only five students identified ethical commitment as a general skill being developed during learning activities, particularly online debates, but still not being the focus of assessment. Finally, only one of the students identified having worked on sustainability; however, her reference was to the online learning situation allowing the reduction of consumables.

3.2.2. Goal 2. To Detect the Primary Purposes of the Assessment Practices in Lockdown-Forced Online Teaching–Learning Environments from the Students’ Perspective

Generally, students see the assessment activities as a final action in each subject; that is, they perceive a summative purpose in the evaluation processes, as Table 13 shows. Only four students (out of 15) talked about formative aims.
Students have a critical point of view concerning the relationship between assessment activities and actual learning. They perceive that assessment practices are aimed at proving their final learning (summative goal), but they do not perceive them as learning tools (formative goal). By analyzing the terms students used to express their perceptions regarding the purposes of the evaluation (see Table 14), we discover how aware they are of the assessment intentions. Altogether, the interviewed students presented a pessimistic vision of their experience during the lockdown learning period, and many referred to a learning experience focused on contents rather than on the students.

3.2.3. Goal 3. To Analyze the Characteristics of the Proposals That Students Consider Most Beneficial to Develop Generic Competencies

Students’ answers mostly pointed to assessment tasks concerned with content processing. Eight students (53.3%) talked about assessment activities based on conceptual learning, e.g., conceptual exams, portfolios, and presentations. Four students (26.7%) especially highlighted peer interaction and teamwork as an element that helped them in their learning process and, at the same time, maintained their engagement with the academic course. The experience of maintaining teamwork in online learning was fundamental, although some recognized diversity in the level of peers’ engagement. Finally, only two students underlined that the support for self-regulated learning activities helped them learn the most.

3.2.4. Goal 4. To Explore How and for What Purpose Students Use Learning Analytics Resources Available on the Institutional Virtual Campus (LMS Moodle)

Only five of the fifteen students (33.3%) had heard of the term Learning Analytics (LA), so ten undergraduates did not know of or had not used the virtual campus options related to Learning Analytics by the time they were interviewed (see Table 15). Of those knowledgeable students, three used the LA-provided information as a proto-regulation tool for making decisions concerning their study agenda, while two used it only for accountability-verification purposes.

3.2.5. Goal 5. To Identify Students’ Use of Digital Tools for Competency Assessment

The students referred to the digital tools introduced by instructors and to tools they used in their natural communicative environment to carry out teamwork or interact with their peers to solve academic tasks. Concerning digital tools, videoconferencing systems (BB Collaborate, Zoom, Google Meet, Discord, FaceTime, Skype), named more than 20 times in students’ interviews, and messaging systems such as WhatsApp stand out. Other communication systems are email and Facebook, although to a lesser extent. However, the students also mentioned other tools that helped them work through the first lockdown term: on the one hand, collaborative content creation tools such as Google Drive, MS Office 365, and OneDrive, and, on the other hand, general tools of the Virtual Campus (LMS Moodle) and those provided by the institutional library system for research and document management. The tools students used and considered most beneficial for their learning are those that facilitated peer interaction and communication with instructors and those that enabled them to create content collaboratively (see Table 16).

4. Discussion

Our study complements previous reports on diverse aspects concerning the impact of the pandemic on university teaching and learning processes [6,48,49,50]. Our research focuses on the assessment practices during the pandemic [14]. Results showed that the generic competencies mentioned by instructors were similar to those identified by students. They talked about teamwork (50% of instructors, 80% of students), communication abilities (37.5% of instructors, 46.7% of students), and self-regulated learning and responsibility (25% of instructors, 46.7% of students). Therefore, we argue that these three competencies are the easiest, most common, most visible, and most instrumental in competency assessment practices in higher education. A majority (80%) of the students considered working in groups a typical activity, but none of them identified it as specifically assessed.
With these data, we can infer that the development of generic competencies is still far from meeting expectations. Indeed, only 37.5% of the interviewed instructors acknowledged teaching and assessing those generic competencies, 31.25% indicated only teaching those competencies, 25% reckoned that they were neither teaching nor assessing them, and 6.3% recognized that they assessed without previously teaching those generic competencies. In contrast, despite only 6.5% of instructors mentioning creativity and entrepreneurial ability as a generic competency, most of the students (60%) mentioned it. Furthermore, students mentioned generic competencies that differed from those of the instructors: ethical commitment (30%) and sustainability (6.7%).
The second goal of our study was to understand the main purposes of assessment practices in the lockdown online teaching environment. No diagnostic goal appears in either group. While more than half of the interviewed instructors (56.25%) talked about either a formative purpose or a double formative/summative purpose, students mentioned only a summative purpose in 66.67% of cases and a mixed goal in barely 26.67% of cases. Instructors declared an interest in facilitating students’ learning; however, students perceived instructors’ actions as focused on learning content. As a matter of fact, there is a conflict of assessment purposes, as previous research also reports [51]. These findings also resonate with other studies [52], where participants claimed to have a formative conception of assessment while providing evidence of summative assessment practices. Besides that, students and instructors present different points of view on these purposes [53,54]. For example, students believe that the aim of assessment practices is to prove that they know some content, but they do not perceive them as tools to learn, better understand, or apply learning outside the “class-box”. These results are similar to previous studies that reported simplistic assessment practices during the pandemic, mainly focused on multiple-choice tests and proctoring measures [14,15,18].
When further investigating participants’ interpretations of assessment experiences during the pandemic, according to the third research goal, students talked about assessment activities based on conceptual learning (53.3%)—such as conceptual exams and presentations—and highlighted peer interaction and teamwork (26.7%); instructors, in contrast, emphasized continuous assessment (62.5%) and, to a lesser extent, context-driven assignments (37.5%) and authentic assessment (31%). Although it might be considered a positive result to have instructors underlining continuous assessment, it is essential to remember that “continuous assessment” corresponds to an official (administrative) requirement of the University of Barcelona. Hence, it could be seen as a formal requirement and a discursive facade, often materializing as a series of fragmented summative practices. Some instructors seem to have a greater understanding of competency-based assessment. In fact, those who identified authentic assessment as one feature of their assessment practices are those who highlighted the competencies of the degree profile. Therefore, their assessment practices seem more likely to help achieve this profile.
Our fourth research goal concerned the knowledge, use, and purpose of the Learning Analytics (LA) available to participants on the Virtual Campus. On the one hand, results show that almost all teachers declared knowing what LA are. However, 37.5% of teachers rejected using LA due to what they reported as an unfriendly interface, which hinders extracting and interpreting data. This may help explain why most students (66.67%) did not even know about the LA tools. On the other hand, those teachers using LA mentioned a verification goal, checking (31.25%) students’ activity only in particular doubtful cases. Students also mentioned this use, but to a lesser extent (13.33%). Nevertheless, students also mentioned a somewhat regulatory use (33.33%) for deciding on their study agenda.
Finally, as assessment tools, Moodle activities such as tasks and quizzes and videoconference platforms were most frequently referred to by both collectives, students and instructors. However, while instructors mentioned virtual campus activities, quizzes, and interactive tools (e.g., Kahoot), students more often mentioned communication tools used in their natural communicative environment to carry out teamwork or interact with their peers to solve academic tasks. Creation tools (e.g., Google Drive) and other tools (e.g., YouTube) external to the institutional virtual campus were selected by students, contrasting with instructors, who talked about virtual campus activities as tools to assess competencies.
Furthermore, these results are related to the use of digital tools in virtual teaching and learning, which is also permeated by so-called digital competencies [55]. As these authors report, digital competencies were indeed scarce among faculty members at the time of lockdown. Results indicate that, most frequently (62.5%), these tools focus on student–content relationships (resources that allow students to process and interact with the content, such as quizzes and Kahoot, among others). Additionally, digital tools were frequently used for the collective construction of knowledge (68.75%). Moreover, less frequently, 25% of teachers mentioned a student learning-management goal (e.g., autonomous practice tests, with which the students themselves can monitor their progress).
As these results show, the digital tools have been mainly used by students to access learning content, which does not necessarily guarantee active learning engagement. To a lesser extent, instructors reported on relationships between students and mentioned tools that allow direct contact between peers. Furthermore, from the students’ point of view, when asked about the digital tools used, they mainly mentioned the tools they considered most beneficial for their learning, such as those that allowed them to communicate with other participants, peers and instructors, and those that enabled them to create content collaboratively.
The principal limitation of this study is the small sample size. A representative sample of the different Bachelor’s Degrees at the University of Barcelona could not be ensured. Participants took part on a voluntary basis, and positive responses to the calls to participate were rare during the lockdown semester and the subsequent semester of the following academic year. Another limitation is that the results refer to self-reported perceptions, not actual behaviors. As such, interpretations must be correspondingly cautious.
The results of our study not only describe what happened during the pandemic but also seem to be consistent with previous findings in the literature. In fact, they offer an essential reflection concerning habitual teaching and learning practices in different disciplines before the pandemic [29,37,38]. Teaching practices developed in online teaching environments in the days of COVID-19 merely transferred what instructors were already doing offline. Furthermore, difficulties in gaining a common understanding regarding what competency-based assessment means and its implications had already been pointed out by Gulikers et al. [36]. Moreover, it seems that technologies do not imply a fundamental change at the pedagogical level. Digital tools remain in superficial, non-reflective use. Instructors want easier management and faster reporting, but real pedagogical innovation is not actually implemented [56,57].
Moreover, the purposes of the assessment practices reported in our study show a primarily summative use, not a formative purpose (lacking diagnostic evaluation, missing students’ engagement with assessment criteria, and losing the opportunity to reflect on feedback to guide the student in the learning process). Perhaps instructors felt over-challenged by the sudden lockdown and worried about students’ structural and personal conditions. Thus, tracking student access and providing opportunities to interact appeared to be their main concerns to combat a likely digital gap [58]. Another aspect to consider is the lack of assessment literacy [59,60]. The need for assessment literacy has been systematically claimed in the recent literature [61,62].
To the usual difficulties of competency-based assessment [38], the problems derived from the lockdown situation must be added. In that context, the alignment between the tasks, the assessment criteria, and the learning outcomes, which include competencies, was even harder to achieve.

5. Conclusions

The main concerns of instructors during the time of COVID-19 seemed to be the following: to avoid the digital gap and ensure student connectivity; to track students attending online classes; and to encourage student interaction. Regarding the assessment process, the effort focused on avoiding cheating and installing e-proctoring measures. Due to the emergency, assessment practices were direct adaptations of the usual assessment strategies, not designed to match online instructional designs. The most frequent device was the online quiz, with multiple-choice questions under strict time control, looking for the most objective and reliable data.
Our findings in the context of COVID-19 appear to be quite similar to those of the previous literature. Nevertheless, we believe that a deeper understanding of the processes underlying the challenges regarding the assessment of generic competencies has been achieved. Additionally, concerning Learning Analytics, this research points to the need to continue researching their adoption and use, with a more formative orientation that contributes to the self-regulation of learning. COVID-19 posed the sudden challenge of demanding changes, but eventually, teachers replicated face-to-face practices and focused on proctoring strategies. The main difficulties in this process are explained and discussed in light of the theoretical framework, and the implications for future CBA in online learning designs are discussed.

Author Contributions

Conceptualization, E.C., L.L., M.G. and A.R.; methodology, L.L. and M.G.; validation, E.C., L.L., M.G. and A.R.; formal analysis, E.C., L.L., M.G. and A.R.; investigation, E.C.; resources, E.C.; data curation, L.L.; writing—original draft preparation, E.C.; writing—review and editing, M.G. and A.R.; visualization, L.L.; supervision, E.C.; project administration, E.C.; funding acquisition, E.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the University of Barcelona, grant number REDICE20-2380.

Institutional Review Board Statement

Ethical review and approval were waived for this study due to the anonymity of personal data.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data supporting the reported results can be requested from the corresponding author.

Conflicts of Interest

The authors declare no conflict of interest. The funder had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

  1. Villarroel, V.; Bruna, D. ¿Evaluamos lo que realmente importa? El desafío de la Evaluación Auténtica en Educación Superior. Calid. En La Educ. 2019, 50, 492–509. [Google Scholar] [CrossRef] [Green Version]
  2. Krstikj, A.; Sosa, J.; García Bañuelos, L.; González Peña, O.I.; Quintero, H.N.; Urbina, P.D.; Vanoye, A.Y. Analysis of Competency Assessment of Educational Innovation in Upper Secondary School and Higher Education: A Mapping Review. Sustainability 2022, 14, 8089. [Google Scholar] [CrossRef]
  3. Sánchez Santamaría, J. Evaluación de los aprendizajes universitarios: Una comparación sobre sus posibilidades y limitaciones en el Espacio Europeo de Educación Superior. Rev. De Form. E Innovación Educ. Univ. (REFIEDU) 2011, 4, 40–54. [Google Scholar]
  4. Baughan, P.; Carless, D.R.; Moody, J.; Stoakes, G. Re-considering assessment and feedback practices in light of the COVID-19 pandemic. In On Your Marks: Learner-Focused Feedback Practices and Feedback Literacy; Baughan, P., Ed.; Advance in Higher Education: York, UK, 2020; pp. 171–191. [Google Scholar]
  5. García-Peñalvo, F.J.; Corell, A.; Abella, V.; Grande, M. Online assessment in higher education in the time of COVID-19. Educ. Knowl. Soc. 2020, 21, 26. [Google Scholar] [CrossRef]
  6. Pokhrel, S.; Chhetri, R. A literature review on impact of COVID-19 pandemic on teaching and learning. High. Educ. Future 2021, 8, 133–141. [Google Scholar] [CrossRef]
  7. Jena, P.K. Impact of COVID-19 on higher education in India. Int. J. Adv. Educ. Res. (IJAER) 2020, 77–81. [Google Scholar]
  8. Yau, A.H.Y.; Yeung, M.W.L.; Lee, C.Y.P. A co-orientation analysis of teachers’ and students’ perceptions of online teaching and learning in Hong Kong higher education during the COVID-19 pandemic. Stud. Educ. Eval. 2022, 72, 101128. [Google Scholar] [CrossRef]
  9. Oliveira, G.; Teixeira, J.G.; Torres, A.; Morais, C. An exploratory study on the emergency remote education experience of higher education students and teachers during the COVID-19 pandemic. Br. J. Educ. Technol. 2021, 52, 1357–1376. [Google Scholar] [CrossRef]
  10. Gonzalez, T.; De La Rubia, M.A.; Hincz, K.P.; Comas-Lopez, M.; Subirats, L.; Fort, S.; Sacha, G.M. Influence of COVID-19 confinement on students’ performance in higher education. PLoS ONE 2020, 15, e0239490. [Google Scholar] [CrossRef]
  11. Jehl, T.; Khan, R.; Dos Santos, H.; Majzoub, N. Effect of COVID-19 outbreak on anxiety among students of higher education; A review of literature. Curr. Psychol. 2022. [Google Scholar] [CrossRef]
  12. Piyatamrong, T.; Derrick, J.; Nyamapfene, A. Technology-Mediated Higher Education Provision during the COVID-19 Pandemic: A Qualitative Assessment of Engineering Student Experiences and Sentiments. J. Eng. Educ. Transform. 2021, 34, 290–297. [Google Scholar] [CrossRef]
  13. Karademir, A.; Yaman, F.; Saatçioglu, Ö. Challenges of Higher Education Institutions against COVID-19: The Case of Turkey. J. Pedagog. Res. 2020, 4, 453–474. [Google Scholar] [CrossRef]
  14. Guangul, F.M.; Suhail, A.H.; Khalit, M.I.; Khidir, B.A. Challenges of remote assessment in higher education in the context of COVID-19: A case study of Middle East College. Educ. Assess. Eval. Account. 2020, 32, 519–535. [Google Scholar] [CrossRef] [PubMed]
  15. Şenel, S.; Şenel, H.C. Remote assessment in higher education during COVID-19 pandemic. Int. J. Assess. Tools Educ. 2021, 8, 181–199. [Google Scholar] [CrossRef]
  16. Montenegro, M.; Luque, A.; Sarasola, J.L.; Fernández-Cerero, J. Assessment in Higher Education during the COVID-19 Pandemic: A Systematic Review. Sustainability 2021, 13, 10509. [Google Scholar] [CrossRef]
  17. Cano, E. Retos de futuro en la evaluación por competencias. In Evaluación por Competencias: La Perspectiva de las Primeras Promociones de Graduados en el EEES; Cano, E., Fernández, M., Eds.; Octaedro: Barcelona, Spain, 2016; pp. 139–148. [Google Scholar]
  18. Tuah, N.A.A.; Naing, L. Is online assessment in higher education institutions during COVID-19 pandemic reliable? Siriraj Med. J. 2021, 73, 61–68. [Google Scholar] [CrossRef]
  19. Bakhmat, L.; Babakina, O.; Belmaz, Y. Assessing online education during the COVID-19 pandemic: A survey of lecturers in Ukraine. J. Phys. Conf. Ser. 2021, 1840, 012050. [Google Scholar] [CrossRef]
  20. Biggs, J. Using assessment strategically to change the way students learn. In Assessment Matters in Higher Education. Choosing and Using Diverse Approaches; Brown, S., Glasner, A., Eds.; Society for Research into Higher Education: Buckingham, UK, 2003; pp. 41–54. [Google Scholar]
  21. St-Onge, C.; Ouellet, K.; Lakhal, S.; Dubé, T.; Marceau, M. COVID-19 as the tipping point for integrating e-assessment in higher education practices. Br. J. Educ. Technol. 2022, 53, 349–366. [Google Scholar] [CrossRef]
  22. Quinlan, K.M.; Pitt, E. Towards signature assessment and feedback practices: A taxonomy of discipline-specific elements of assessment for learning. Assess. Educ. Princ. Policy Pract. 2021, 28, 191–207. [Google Scholar] [CrossRef]
  23. Pitt, E.; Quinlan, K.M. Signature assessment and feedback practices in the disciplines. Assess. Educ. Princ. Policy Pract. 2021, 28, 97–100. [Google Scholar] [CrossRef]
  24. Khan, S.; Kambris, M.E.K.; Alfalahi, H. Perspectives of University Students and Faculty on remote education experiences during COVID-19—A qualitative study. Educ. Inf. Technol. 2022, 27, 4141–4169. [Google Scholar] [CrossRef] [PubMed]
  25. Iqbal, S.A.; Ashiq, M.; Rehman, S.U.; Rashid, S.; Tayyab, N. Students’ perceptions and experiences of online education in Pakistani Universities and Higher Education Institutes during COVID-19. Educ. Sci. 2022, 12, 166. [Google Scholar] [CrossRef]
  26. Maatuk, A.M.; Elberkawi, E.K.; Aljawarneh, S.; Rashaideh, H.; Alharbi, H. The COVID-19 pandemic and E-learning: Challenges and opportunities from the perspective of students and instructors. J. Comput. High. Educ. 2022, 34, 21–38. [Google Scholar] [CrossRef] [PubMed]
  27. Bindé, J. Hacia las Sociedades del Conocimiento: Informe Mundial de la UNESCO; UNESCO: London, UK, 2015. [Google Scholar]
  28. Strijbos, J.; Engels, N.; Struyven, K. Criteria and standards of generic competencies at bachelor degree level: A review study. Educ. Res. Rev. 2015, 14, 18–32. [Google Scholar] [CrossRef]
  29. Boud, D.; Falchikov, N. Aligning assessment with long-term learning. Assess. Eval. High. Educ. 2006, 31, 399–413. [Google Scholar] [CrossRef] [Green Version]
  30. European Parliament. Recommendation of the European Parliament and of the Council of 18 December 2006 on Key Competencies for Lifelong Learning; European Parliament: Strasbourg, France, 2006. [Google Scholar]
  31. European Council. Council Recommendation of 22 May 2018 on Key Competencies for Lifelong Learning; European Council: Brussels, Belgium, 2018. [Google Scholar]
  32. European Commission. Communication from the Commission to the European Parliament, The Council, The European Economic and Social Committee and the Committee of the Regions on a Renewed EU Agenda for Higher Education; COM/2017/0247; European Commission: Brussels, Belgium, 2017. [Google Scholar]
  33. De Miguel, M. Modalidades de Enseñanza Centradas en el Desarrollo de Competencias. Orientaciones para Promover el Cambio Metodológico en el EEES; Ministerio de Educación y Ciencia/Universidad de Oviedo: Madrid, Spain, 2005. [Google Scholar]
  34. Strijbos, J.W.; Narciss, S.; Dünnebier, K. Peer feedback content and sender’s competency level in academic writing revision tasks: Are they critical for feedback perceptions and efficiency? Learn. Instr. 2010, 20, 291–303. [Google Scholar] [CrossRef] [Green Version]
  35. García-San Pedro, M.J.; Gairín, J. Los mapas de competencias: Una herramienta para mejorar la calidad de la formación universitaria. Rev. Electrónica De Investig. En Cienc. Económicas REICE 2011, 9, 84–102. [Google Scholar]
  36. Gulikers, J.T.M.; Baartman, L.K.J.; Biemans, H.J.A. Facilitating Evaluations of Innovative, Competency-based Assessments: Creating Understanding and Involving Multiple Stakeholders. Eval. Program Plan. 2010, 33, 120–127. [Google Scholar] [CrossRef]
  37. Tejada, J.; Ruiz, C. Evaluación de competencias profesionales en educación superior: Retos e implicaciones. Educ. XXI 2016, 19, 17–38. [Google Scholar]
  38. Cano, E. Presentación del monográfico: Evaluación por Competencias en la Educación Superior: Buenas Prácticas ante los Actuales Retos. Rev. Iberoam. De Evaluación Educ. 2019, 12, 5–8. [Google Scholar]
  39. Ibarra, M.S.; Rodríguez-Gómez, G.; Boud, D. The quality of assessment tasks as a determinant of learning. Assess. Eval. High. Educ. 2020, 46, 943–955. [Google Scholar] [CrossRef]
  40. Kennedy, D. Writing and Using Learning Outcomes. A Practical Guide; University College Cork: Cork, Ireland, 2007. [Google Scholar]
  41. Monereo, C. La evaluación del conocimiento estratégico a través de tareas auténticas. Pensam. Educ. Rev. De Investig. Latinoam. (PEL) 2003, 32, 71–89. [Google Scholar]
  42. Trujillo, F. Competencia en Comunicación Lingüística Nunha Europa Plurilingüe e Pluricultural. Ensinanza de Linguas no Contexto Europeo: Tendencias e Propostas. November 2008. Available online: https://docplayer.es/52379208-Competencia-en-comunicacion-linguistica-nunha-europa-plurilingue-e-pluricultural.html (accessed on 14 September 2021).
  43. Black, P.; Wiliam, D. Assessment and classroom learning. Assess. Educ. Princ. Policy Pract. 1998, 5, 7–74. [Google Scholar] [CrossRef]
  44. Creswell, J.W. Controversies in mixed methods research. In The Sage Handbook of Qualitative Research; Denzin, N.K., Lincoln, Y.S., Eds.; SAGE Publishers: Thousand Oaks, CA, USA, 2011; pp. 269–284. [Google Scholar]
  45. Hernández-Sampieri, R.; Mendoza, C. Metodología de la Investigación. Las Rutas Cuantitativa, Cualitativa y Mixta; McGraw-Hill: Ciudad de México, México, 2018. [Google Scholar]
  46. Braun, V.; Clarke, V. Thematic analysis. In APA Handbook of Research Methods in Psychology, Vol 2: Research Designs: Quantitative, Qualitative, Neuropsychological, and Biological; Cooper, H., Camic, P.M., Long, D.L., Panter, A.T., Rindskopf, D., Sher, K.J., Eds.; American Psychological Association: New York, NY, USA, 2012; pp. 57–71. [Google Scholar] [CrossRef]
  47. Guba, E.; Lincoln, Y. Epistemological and Methodological Bases of Naturalistic Inquiry. Educ. Commun. Technol. 1982, 30, 233–252. [Google Scholar] [CrossRef]
  48. Gómez-Poyato, M.J.; Eito, A.; Mira, D.C.; Matías, A. Digital Skills, ICTs and Students’ Needs: A Case Study in Social Work Degree, University of Zaragoza (Aragón-Spain). Educ. Sci. 2022, 12, 443. [Google Scholar] [CrossRef]
  49. Del Arco, I.; Flores, Ò.; González-Rubio, J.; Araneda, D.S.; Olivos, C.L. Workloads and Emotional Factors Derived from the Transition towards Online and/or Hybrid Teaching among Postgraduate Professors: Review of the Lessons Learned. Educ. Sci. 2022, 12, 666. [Google Scholar] [CrossRef]
  50. Ramos, A.; Reese, L.; Arce, C.; Balladares, J.; Fiallos, B. Teaching Online: Lessons Learned about Methodological Strategies in Postgraduate Studies. Educ. Sci. 2022, 12, 688. [Google Scholar] [CrossRef]
  51. Remesal, A. Primary and secondary teachers’ conceptions of assessment: A qualitative study. J. Teach. Teach. Educ. 2011, 27, 472–482. [Google Scholar] [CrossRef]
  52. Fernández-Ruiz, J.; Panadero, E. Comparison between conceptions and assessment practices among secondary education teachers: More differences than similarities. J. Study Educ. Dev. 2020, 43, 309–346. [Google Scholar] [CrossRef]
  53. Brown, G.T.L. Student Conceptions of Assessment: Regulatory Responses to Our Practices. ECNU Rev. Educ. 2021, 5, 116–139. [Google Scholar] [CrossRef]
  54. Juan, N.; Villach, M.J.R.; Remesal, A.; De Salvador, N. Qué dificultades perciben los futuros maestros y sus profesores acerca del feedback recibido durante el trabajo final de grado. Perspect. Educ. Form. De Profr. 2018, 57, 24–49. [Google Scholar]
  55. Portillo, J.; Garay, U.; Tejada, E.; Bilbao, N. Self-Perception of the Digital Competence of Educators during the COVID-19 Pandemic: A Cross-Analysis of Different Educational Stages. Sustainability 2020, 12, 10128. [Google Scholar] [CrossRef]
  56. Bhagat, K.K.; Spector, J.M. International Forum of Educational Technology & Society Formative Assessment in Complex Problem-Solving Domains: The Emerging Role of Assessment Technologies. J. Educ. Technol. Soc. 2017, 20, 312–317. [Google Scholar]
  57. Pinto, M.; Leite, C. Digital technologies in support of students learning in Higher Education: Literature review. Digit. Educ. Rev. 2020, 37, 343–360. [Google Scholar] [CrossRef]
  58. Trujillo, F.; Fernández-Navas, M.; Montes, M.; Segura, A.; Alaminos, F.J.; Postigo, A.Y. Panorama de la Educación en España tras la Pandemia de COVID-19: La Opinión de la Comunidad Educativa; Fundación de Ayuda contra la Drogadicción (Fad): Madrid, Spain, 2020. [Google Scholar]
  59. Looney, A.; Cumming, J.; Van Der Kleij, F.; Harris, K. Reconceptualising the role of teachers as assessors: Teacher assessment identity. Assess. Educ. Princ. Policy Pract. 2018, 25, 442–467. [Google Scholar] [CrossRef]
  60. Jiang, L.; Yu, S. Understanding changes in EFL teachers’ feedback practice during COVID-19: Implications for teacher feedback literacy at a time of crisis. Asia-Pac. Educ. Res. 2021, 30, 509–518. [Google Scholar] [CrossRef]
  61. Winstone, N.E.; Nash, R.A.; Parker, M.; Rowntree, J. Supporting Learners’ Agentic Engagement with Feedback: A Systematic Review and a Taxonomy of Recipience Processes. Educ. Psychol. 2017, 52, 17–37. [Google Scholar] [CrossRef] [Green Version]
  62. Carless, D.; Boud, D. The development of student feedback literacy: Enabling uptake of feedback. Assess. Eval. High. Educ. 2018, 43, 1315–1325. [Google Scholar] [CrossRef] [Green Version]
Table 1. Demographic characterization of participants.
Characteristics | Instructors | Students
Age
18–25 | - | 15
40–50 | 10 | -
50+ | 6 | -
Sex
Female | 10 | 8
Male | 6 | 7
Academic course
Second (4th semester) | NA | 8
Fourth (8th semester) | NA | 7
Discipline
Archaeology/History | 2 | 2
Audiovisual Communication | 2 | 1
Biology | 2 | 2
Computer Engineering | - | 1
Management and Public Administration | 2 | 1
Mathematics | 2 | 2
Pharmacy | 2 | 1
Psychology | 4 | 2
Teacher Education | - | 1
Table 2. Interview script dimensions aligned with the specific objectives.
Objective: To describe the assessment practices mostly used in line with the generic competencies from the perspective of the instructors and the students.
- Students' interview script: Type of assessment tasks undertaken. Differences between assessment tasks before and during the COVID-19 period. Knowledge about generic competencies. Relation between the assessment tasks and the development of generic competencies.
- Instructors' interview script: Assessment tasks proposed. Differences between assessment tasks before and during the COVID-19 period. Information transferred to students about generic competencies at the beginning of the course. Work on generic competencies.
Objective: To understand the main purposes of the assessment practices carried out in lockdown-forced online teaching environments.
- Students' interview script: Main purposes of the assessment practices developed. Differences between the main purposes of the assessment practices before and during the COVID-19 period. Main concerns about the online assessment process.
- Instructors' interview script: Main purposes of the assessment practices designed. Differences between the main purposes of the assessment practices before and during the COVID-19 period. Main concerns about the online assessment process.
Objective: To analyze the characteristics of the proposals that both instructors and students consider most useful to develop generic competencies.
- Students' interview script: Description of the different assessment tasks performed during the mixed teaching period. Characteristics of the assessment tasks developed that are considered most useful to develop generic competencies.
- Instructors' interview script: Description of an assessment task considered especially good and successful during the mixed teaching period, and why it is considered useful to develop generic competencies. Characteristics of the assessment tasks designed that are considered most useful to develop generic competencies.
Objective: To explore how and for what purpose both instructors and students use Learning Analytics resources available on the Virtual Campus.
- Students' interview script: Knowledge about Learning Analytics. Use of Learning Analytics. Purpose of the Learning Analytics used.
- Instructors' interview script: Knowledge about Learning Analytics in Moodle–Virtual Campus and/or in external tools. Use of Learning Analytics. Purpose of the Learning Analytics used. Usefulness of Learning Analytics.
Objective: To identify the use that instructors make of digital tools for competency assessment.
- Instructors' interview script: List of digital tools used for competency assessment. Digital tools (from Moodle–Virtual Campus or external pages) considered most useful to assess generic competencies.
Objective: To identify students' perception of the usefulness of digital tools for the assessment of competencies.
- Students' interview script: Digital tools (from Moodle–Virtual Campus and from external pages) considered most useful to develop and to assess generic competencies.
Table 3. Instructors mentioning generic competencies (n = 16). Frequency of mentions.
Generic Competencies | Mentions
Teamwork | 8 (50%)
Creativity and entrepreneurial capability | 1 (6.25%)
Self-regulated learning and responsibility | 4 (25%)
Communication abilities | 6 (37.5%)
Table 4. To what extent generic competencies are embedded into the curricula by instructors (n = 16). Frequency of mentions.
Uses | Mentions
Non-teaching, non-assessment | 4 (25.00%)
Only teaching | 5 (31.25%)
Specific teaching and assessment | 6 (37.50%)
Assessment without teaching | 1 (6.25%)
Table 5. Instructors' reasons for their choice of competencies (n = 16). Frequency of mentions.
Reasons | Mentions
Relevant in the professional profile | 3 (18.75%)
Appears in the syllabus | 3 (18.75%)
It is what the methodology used fosters | 2 (12.50%)
No reason | 8 (50.00%)
Table 6. Purposes of assessment practices from instructors' view (n = 16). Frequency of mentions.
Purposes | Mentions
No explicit purpose | 1 (6.25%)
Diagnostic purpose | 0
Summative purpose | 6 (37.5%)
Formative purpose | 3 (18.75%)
Formative and summative purposes | 6 (37.5%)
Table 7. Characteristics of instructors' competency-based proposals (n = 16). Frequency of mentions.
Characteristics | Mentions
Continuous assessment | 9 (62.5%)
Contextualized assessment | 6 (37.5%)
Authentic assessment | 5 (31.25%)
Table 8. To what extent Learning Analytics (LA) are known and used by instructors (n = 16). Frequency of mentions.
Knowledge of Learning Analytics (by teachers) | Mentions
Without knowledge | 2 (12.5%)
They are known but not used | 6 (37.5%)
Use of Learning Analytics (by teachers) | Mentions
With verified use | 5 (31.25%)
With regulatory use | 1 (6.25%)
For a formative use | 2 (12.5%)
Table 9. Digital tools used by instructors for competency assessment (n = 16). Frequency of mentions.
Communication tools | BBCollaborate (10), Teams (1), Skype (2)
Creation tools | Google Drive (2)
Moodle activities | Moodle Tasks (11), Quizzes (11), Forums (3), Lesson (1), Personalized Learning Designer (1), Workshop (1)
Quizzes and interactive tools | Kahoot (2), Mentimeter (2), Chat (1), YouTube (2)
Table 10. Categorization of the pedagogical potential of digital tools as used by instructors (n = 16). Frequency of mentions.
Type of Tools | Mentions
Content management (videos, ppt, prices, genially, change, etc.) | 7 (19.4%)
Participant management or participation: apps that facilitate the creation of workgroups (e.g., Moodle queries), the group function in BBCollaborate, or menti.com, which allows interaction | 6 (16.67%)
Student–content relationship: learning resources that allow the student to process and interact with the contents (tests, Lesson, Kahoot, etc.) | 10 (27.78%)
Student–student relationship: resources that allow direct contact between students, e.g., Moodle forums in "separate groups" mode, or resources that allow co-evaluation, such as the Moodle Workshop | 9 (25%)
Student–learning management: resources that can "guide" the student in the learning process, e.g., autonomous practice tests through which students can follow their own progress | 4 (11.1%)
Table 11. Digital tools' use from the instructors' side (n = 16). Frequency of mentions.
Use of Tools | Mentions
Control of attendance/participation rates | 9
Monitor the students' progress | 8
Give collective feedback | 7
Allow the collective construction of knowledge by students | 11
Table 12. Students' mention of generic competencies (n = 15). Frequency of mentions.
Generic Competencies | Mentions
Teamwork | 12 (80%)
Creativity and entrepreneurial capability | 9 (60%)
Self-regulated learning and responsibility | 7 (46.7%)
Communication abilities | 7 (46.7%)
Ethical commitment | 5 (33.3%)
Sustainability | 1 (6.7%)
Table 13. Purpose of assessment from the students' point of view (n = 15). Frequency of mentions.
Purpose of Assessment | Mentions
Without awareness of purpose | 1 (6.67%)
Diagnostic purpose | 0
Summative purpose | 10 (66.67%)
Formative purpose | 0
Summative and formative purpose | 4 (26.67%)
Table 14. Keywords from students' interviews about assessment purposes (n = 15). Frequency of mentions.
Keywords | Mentions
Knowledge | 7 (46.67%)
Learning | 4 (26.67%)
Understanding | 3 (20%)
Application | 2 (13.33%)
Continuous evaluation | 1 (6.67%)
Table 15. Knowledge and use of Learning Analytics (n = 15 students). Frequency of mentions.
Knowledge of Learning Analytics (by students) | Mentions
Do not know | 10 (66.67%)
Know | 5 (33.33%)
Use of Learning Analytics (by students who know them) | Mentions
Verifier use | 2 (13.33%)
Regulatory use | 3 (20.00%)
Table 16. Helpful digital tools used by students (n = 15). Frequency of mentions.
Communication tools | BBCollaborate (9), Zoom (5), WhatsApp (5), Skype (4), eMail (2), FaceTime (1), Discord (1), Facebook (1), Google Meet (1)
Creation tools | Google Drive (7), Moodle Virtual Campus (6), Office 365 (2), Institutional library tools (2), OneDrive (1)
Other tools | YouTube (3), GitHub (1), CamScanner (1)
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
