Article

HCI-Centered Experiences of ICT Integration and Its Impact on Professional Competencies Supporting Formative Assessment in Higher Education e-Learning

1 Faculty of Educational Sciences, Mohammed V University, Rabat 10000, Morocco
2 Faculty of Sciences, Mohammed V University, Rabat 10000, Morocco
* Author to whom correspondence should be addressed.
Multimodal Technol. Interact. 2026, 10(2), 14; https://doi.org/10.3390/mti10020014
Submission received: 23 November 2025 / Revised: 21 January 2026 / Accepted: 30 January 2026 / Published: 2 February 2026

Abstract

As universities expand their e-learning systems, it becomes increasingly important to understand how the use of information and communication technologies (ICTs) changes the skills needed for effective formative assessment. This study uses the principles of human–computer interaction (HCI) to create a framework for examining how digital tools, interfaces, and modes of interaction influence the way teachers assess students in higher education. The research relies on the responses of 115 Mohammed V University teachers, who filled out a competency-based assessment grid regarding online assessment practices. The results remain exploratory and context-dependent and make no claim of statistical representativeness beyond the studied institutional context. The findings point to the benefits of digital technology for methodological and techno-pedagogical skills, while revealing serious shortcomings in semio-ethical and evaluative skills. Feedback can usefully be leveraged to correct imperfections in evaluation practices and make them more responsive to digital interfaces. It is becoming imperative to rethink professional skills as the regulatory core of the online formative assessment system, in order to build a more synergistic framework that gives better visibility to virtual classrooms.

1. Introduction

Advancements in technology have led to creative pedagogical models and services for diverse learners; however, assessment continues to be a limitation within distance and digitally mediated higher education. Online formative e-assessment, first adopted in response to massification and then boosted by the COVID-19 pandemic, is now a crucial part of hybrid learning environments, as it provides opportunities for real-time monitoring and support of students’ learning through ICT. However, its effectiveness is still limited by the enduring problems of pedagogical design, interaction quality, ethical practices, and the importance of providing impactful, actionable feedback. Unless assessment models are re-conceptualized to include the strategic use of feedback matched to pedagogical solutions at digital interfaces, e-assessment tools risk lacking the capacity to ensure valid evaluation and sustained learning improvement. While formative e-assessment issues and ICT integration as such are well described in the literature, a structured multidimensional mapping of teachers’ competencies is not commonly offered in current studies. Most research addresses competencies conceptually or focuses narrowly on pedagogical or technical aspects, without integrating interactional and human–computer interaction (HCI)-related dimensions. This lack of operationalization limits the understanding of how teachers perceive and enact formative assessment in digitally mediated environments.
In this dynamic, human–computer interaction (HCI) offers conceptual tools to study these changes, including how interface structure and information complexity affect cognitive load [1,2], how evaluation is distributed across users, artifacts, and systems [3,4], and the way in which the paradoxical changes linked to the introduction of digital tools are redefining professional practices [5,6]. In addition, the principles of usability show that the effectiveness of feedback and the quality of formative feedback depend on the efficiency of the interactions produced by the platforms [7,8]. Recent research into large language models, especially multimodal LLMs, has made clear their increasing importance for educational assessment and learning design. Vision- and multimodal-enabled LLMs merge textual, visual, and interactional data to enable adaptive feedback, attention analysis, and personalized learning in educational contexts, most explicitly within STEM education [9,10]. Analogously, research based on dual coding theory analyzes how multimodal LLMs can generate adaptive visual–textual learning resources while raising important issues concerning cognitive load, content appropriateness, and ethical regulation [9,10]. These approaches are not empirically tested in this study, but they bring to the fore the techno-pedagogical and ethical interactional competencies teachers need for the responsible integration of AI-mediated formative assessment environments.
Taking these elements into account, our objective in this document is to highlight the professional skills mobilized during an online formative evaluation process: those most affected by the integration of digital technologies and most likely to make evaluative practices more effective and more responsive under constraining conditions. Assessment as perceived is often multidimensional, in which innate and acquired skills interact with a purely pedagogical logic that promotes knowledge and understanding [11]. In a formative evaluation, this logic is oriented more toward support than control. By delving into the professional environment, we seek to capitalize on non-formal internal dynamics and exchange ideas on the way forward, so as to assess in a more comfortable and attractive setting. As a result, we can contribute to the public debate, gradually bridging the gap between what is possible and what is necessary, and seeking a balance between what is desirable and what is achievable in the online act of evaluation. As explained in the literature, an act of formative evaluation, through the revealing clues it emits, provides proof of the relevance of our pedagogical choices and of the progress achieved; the results will be certain if the dynamic of student involvement is confirmed [12]. In e-learning, formative assessment facilitates the growth of key professional abilities through ongoing feedback, self-reflection opportunities, and personalized learning. Evidence indicates that formative e-assessment is beneficial for both teacher and student competencies, particularly when feedback is provided in a timely manner and assessment criteria are transparent and discussed with the student [13,14,15,16,17]. With formative assessment embedded in professional development, teachers experience enhanced self-efficacy, knowledge, and use of effective practices, resulting in improved student learning outcomes [17,18,19].
Undeniably, these dynamics serve to feed crucial avenues of reflection for the examination of the pedagogical, ethical, and societal aspects of distance learning.
This study addresses the teachers’ self-reported competencies as the first step for understanding their interactions with advanced digital assessment tools. These competencies can be categorized into dimensions such as techno-pedagogical, methodological, evaluative, semiotic-ethical, and communicational. The HCI-related components will here be understood as learner interaction design, mediation, and regulation competencies applied to digital assessment settings. The study is thus framed by the following research questions:
RQ1. What is the distribution of the teachers’ e-assessment formative competencies among the different competency domains?
RQ2. What association patterns can be found between the criteria that underlie these competencies?
RQ3. How will HCI concepts help in interpreting the competency profiles and guide future AI-mediated assessment environment designs?
This exploratory and cross-sectional study uses “impact” and “evolution” descriptively to characterize observed associations or contrasts between areas of competency without implying causal effects or longitudinal change. It seeks to provide an initial mapping of perceived teacher competencies that might then support further inquiry into designing, adopting, and ethically using advanced digital assessment tools in higher education.

2. Theoretical and Conceptual Framework

2.1. The Place of Digital Technology in the University Model

To guarantee the quality of its interventions, the university has ensured that its components function at a satisfactory level of quality, meeting the standards and requirements set out in the directives of the Ministry of Higher Education, Scientific Research and Innovation. All links in the higher education value chain work together to overcome the constraints facing the university by fully exploiting its internal potential. In principle, seven components function in a mutually interactive way (Figure 1).
Since 2003, Moroccan universities have been aware of the need to reform their methods and products to adapt to international contingencies, in particular: an internationalization movement; the standardization of curricula; the professionalization of courses of study; the emergence of digitalization; the impetus of the knowledge economy; and a plurality of internal and external issues. This new awareness is reflected in their commitment to a new reform, following the guidelines set out in the National Education and Training Charter and the promulgation of Law 01-00 on the organization of higher education [20]. Moroccan universities have resolutely supported the Bologna Process as a fruitful initiative for continuous improvement, aiming to excel in governance, training, and scientific research and to keep pace with these pressing institutional and managerial changes.

2.2. Aim of the Assessment

Formative evaluation is a process that informs teacher-trainers about the quality of the “intermediate product” of a learning sequence so that they can propose solutions to improve the action by triggering regulatory mechanisms. This dynamic helps to reduce the gap between expectations and reality by identifying errors and hypothesizing their sources. It thereby highlights the features of the final “good product” expected at the end of the summative stage [21]. Our evaluative posture is therefore invested in a logic of evaluation for competency management, as a measure of the anticipation of results throughout the teaching–learning process before the summative “assessment of learning”. Beyond the simple notion of control, assessment thus becomes an essential support tool to help learners progress and increase their autonomy. “The assessor detects dysfunctions to regularize or regulate them. This type of evaluation tends to create a system of individualization of pedagogical modes of action and interaction” [22]. Guerda Rodríguez (2021) believes that the importance of summative assessment is limited to valuing previous acquisitions through individual performances in compliance with well-defined rules and grading scales, while formative assessment is more concerned with verifying skills under construction, neutralizing linguistic requirements in favor of stimulating thought and the search for meaning [23].
Significantly, formative assessment fulfils a dual function: to monitor progress against stated objectives and to provide information on what students have learned. “Its concern is not to classify, grade, blame, or reward, but to help students learn. [...] It does not necessarily leave any traces, it is not official, it only concerns the teacher and his students, not the administration.” [24]. In the same vein, Scallon (2004) does not endorse the idea that evaluation is momentary and oriented exclusively towards certification but sees it as an interactive, perennial process of information exchange that offers the potential to address student difficulties of all kinds [25]. On this basis, four procedures feed into formative evaluation: observation, monitoring, analysis, and accompaniment. While the first two provide the information needed to situate learners in their learning trajectories, the last two are geared more towards taking account of their differences in needs, abilities, and learning rhythms, through a categorization of interventions and more or less differentiated remediation plans. In this process, it is important to coordinate these procedures as well as possible and link them together so as to prompt the group to level out learning difficulties or bring them down to an acceptable level. This is a time-consuming task that comes under the heading of mastery pedagogy and requires substantial effort in defining the needs to be met, planning the trajectories to be taken, the families of situations to be dealt with, and the regulatory mechanisms to be convened. Three crucial parameters need to be defined before we can think about evaluation practices: contextual, functional, and methodological. These dynamic parameters provide positioning markers and enable us to establish the links between the referent and the referred that any evaluation presupposes [21].

2.3. Online Assessment in the COVID-19 Era

The COVID-19 health crisis accelerated the shift toward techno-pedagogical environments better suited to the period of confinement. The primary concern was not to provide quality education but to make university e-learning resemble a business model aimed at a mass audience [26]. As a result of the pandemic, the education and training sector became muddled, and the signs of these effects have not gone unnoticed. Cancellations, work interruptions, reduced services, massive deployment of telecommuting, and changes in managerial and pedagogical modes are all forms of adaptation to the restrictive context imposed by this force majeure. During this turmoil, technological resources were in high demand, since they represented a real bulwark against the adverse effects of the pandemic on educational continuity. Under normal conditions, face-to-face assessment struggles to find its place in the teaching–learning process, given the cumbersome devices it requires and its tedious nature. The situation is even worse in distance learning, where assessment is less controllable and more controversial. Digital assessment devices are indeed supposed to bring greater credibility to both the design and production of knowledge and the delivery of pedagogical services, since they make it possible to reach pedagogical verdicts on skills being built up. On this premise, perceptions of assessment devices and techniques differ widely from one teacher to another, owing to differences in several elements: the personal toolbox, the evaluative experience, the training modality, and the network for sharing and exchanging experience.
It is therefore important to iron out the imperfections of e-learning systems through sharing and feedback to adapt them to different assessment contexts. Innovative practices must incorporate new technologies and use these devices to shape curricula, accelerate pedagogical designs, improve the quality of delivery, broaden the scope of information exchanged, and as a result, make the university experience relevant to learners’ everyday lives. Given that formative assessment generally covers the entire teaching–learning process, it becomes vital to seize this key “freeze frame” moment to set benchmarks on the evolution of the acquisitional level of the whole class. In addition to the assessment tools provided by platform designers, these key moments of support, remediation, monitoring, and control need to mobilize a combination of more dynamic devices, better adapted to the needs of the target audience and capable of overcoming the mechanistic postulate of simple measurement, in order to increase the frequency of interactivity with students; eradicate or reduce unhealthy phenomena; improve the objectivity of assessments; and combat monotony and keep learners alert.

3. Research Methodology

3.1. Scope and Nature of the Study

Advances in large language models (LLMs) and multimodal AI have expanded the possibilities of formative e-assessment through automated adaptive feedback, multimodal interaction, and personalized learning pathways. These technologies were not directly assessed in the present study but their development highlights the importance of examining the teachers’ competencies related to interaction design, feedback mediation, and ethical regulation within digitally mediated assessment contexts. Insights from LLMs and multimodal AI can inform future research on how teacher practices interact with advanced AI-mediated assessment tools and platforms.
This research work, whose scope was limited to Mohammed V University in Rabat, is based on an exploratory study, within an evaluative logic, of the effect of ICT integration on the evolution of five professional competencies deemed most influential in a higher education online assessment device. The quantitative method relied on an evaluation grid aimed at exposing the virtues and shortcomings of the technological input by measuring changes in formative assessment practices via the professional skills that underpin them. From there, and following the integration of ICTs, we seek, by means of practical benchmarks, to improve the pedagogical services that accompany the university towards excellence: bringing to light what operates in the shadows and quantifying the most cross-cutting phenomena in online formative assessment in order to seize the opportunities that inspire change. Feedback on the use of ICT in assessment practices will therefore enable us to capitalize on points of convergence and enhance the value of successful practices (Figure 2). This research thus follows a purely exploratory and descriptive approach. Its findings apply only within this particular context; the aim is not to establish causal links, assess relative strengths and weaknesses, or confirm underlying constructs, but rather to outline perceived competency profiles and identify patterns of association within a clearly defined institutional setting. Future studies may consider multi-institutional samples, longitudinal designs, and system-generated interaction data.

3.2. Evaluation Tool

Without claiming to be exhaustive, the assessment instrument was designed as a grid of 35 items covering five competencies, as follows (Appendix A, Table A1: Summary of the Five Competency Domains and Their Items):
  • Techno-pedagogical skills: from variable T1 to T7.
  • Methodological skills: from variable M8 to M14.
  • Collaborative skills: from variable C15 to C21.
  • Semio-ethical skills: from variable S22 to S28.
  • Evaluative skills: from variable E29 to E35.
Each competency is operationalized with 7 criteria measured on a 5-level ordinal scale whose points are weighted from 0 to 10 (Figure 3). This grid captures the various dimensions of teachers’ competencies with respect to formative e-assessment. Each criterion relates to a particular dimension of the teachers’ professional practice. Existing literature on digital pedagogy and online assessment, as well as the authors’ constructs, guided the item formulation. The grid construction included educational technology experts as reviewers and a pilot study.
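The structure just described (five domains of seven criteria each, with ordinal answers weighted from 0 to 10) can be sketched in code. This is a minimal illustration only: the level-to-weight mapping and all names below are assumptions for the sketch, not the instrument’s actual coding scheme.

```python
# Illustrative sketch of the grid: 5 competency domains x 7 criteria = 35 items,
# each answered on a 5-level ordinal scale weighted from 0 to 10.
# The equal-step LEVEL_WEIGHTS mapping is an assumption of this sketch.

LEVEL_WEIGHTS = {1: 0.0, 2: 2.5, 3: 5.0, 4: 7.5, 5: 10.0}

DOMAINS = {
    "techno_pedagogical": [f"T{i}" for i in range(1, 8)],    # T1..T7
    "methodological":     [f"M{i}" for i in range(8, 15)],   # M8..M14
    "collaborative":      [f"C{i}" for i in range(15, 22)],  # C15..C21
    "semio_ethical":      [f"S{i}" for i in range(22, 29)],  # S22..S28
    "evaluative":         [f"E{i}" for i in range(29, 36)],  # E29..E35
}

def score_response(response: dict) -> dict:
    """Convert one teacher's ordinal answers (item -> level 1..5)
    into a mean 0-10 score per competency domain."""
    scores = {}
    for domain, items in DOMAINS.items():
        weights = [LEVEL_WEIGHTS[response[item]] for item in items]
        scores[domain] = sum(weights) / len(weights)
    return scores

# A respondent answering level 4 on every item scores 7.5 in each domain.
uniform = {item: 4 for items in DOMAINS.values() for item in items}
print(score_response(uniform)["evaluative"])  # 7.5
```

Per-domain means on this 0–10 scale are what the frequency analyses in Section 4 summarize.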
The competence grid was designed as a criterion-based, formative evaluation system. The observed indicators were intended to describe areas of practice conjointly, not to reflect an underlying latent variable. As a result, indices of internal consistency were not used as evidence of measurement quality. Exploratory analyses involving these indicators were consequently restricted to a study of associations and were interpreted descriptively rather than as confirmatory evidence of construct validity.

3.3. Participants and Sampling Procedure

Given that our research is exploratory, a reasoned non-probabilistic sample of 200 teacher-researchers spread across the higher education establishments of Rabat’s Mohammed V University was selected. Given that “it is probably more reliable to seek to identify innovative individuals by measuring the various manifestations of innovative behavior” [27], the choice of a population of teacher-researchers heavily involved in distance learning seems judicious and revealing of the interference that occurs, in terms of virtues and shortcomings, in the contact zone between the pedagogical act and the technological solution. Responses were kept confidential, and participation was completely voluntary. A total of 120 questionnaires were received, of which 115 were complete, constituting a usable response rate of 57.5%; these were retained for analysis. Questionnaires that were incomplete or had a significant amount of missing information were excluded, and no missing data were estimated. The eligibility criteria were experience in online or hybrid instruction and in the use of digital assessment tools.

3.4. Data Collection and Scoring

Data were collected via an online self-administered questionnaire. The participants answered the assessment grid as a self-assessment, based on their own teaching and experiences related to formative assessment in digital or hybrid contexts. Respondents were instructed to rate their answers based on the most recent experience they had with online assessment tools in teaching. The questionnaire was administered asynchronously over a period of four weeks.

3.5. Ethical Considerations

Before participation, participants were informed about the objectives of the study and gave their consent. The data were collected anonymously and used solely for research purposes, in compliance with the institutional ethical standards for educational research.

3.6. Data Analysis

Descriptive statistics (mean, standard deviation, and confidence interval) were calculated for all of the criteria. Correlation analyses were performed using Spearman’s rank correlation coefficient because of the ordinal nature of the scale. Principal component analysis (PCA) was utilized as an exploratory tool to investigate patterns of association among the criteria. These analyses are descriptive and interpreted with caution due to sample size and exploratory study objectives. The thresholds that were used to indicate higher (>8) and lower (<3) mean scores are heuristic markers meant to assist with interpretation rather than being normative performance benchmarks.
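The pipeline above (descriptive statistics with confidence intervals, Spearman rank correlations for ordinal data, and an exploratory PCA) can be sketched as follows. The data here are a synthetic stand-in for the 115 × 35 response matrix, and the normal-approximation 95% CI and the tie-handling ranking are implementation assumptions of this sketch.

```python
import numpy as np

# Synthetic stand-in for the survey data: 115 teachers x 35 criteria,
# each criterion on the 0-10 weighted scale (real responses not reproduced).
rng = np.random.default_rng(0)
scores = rng.integers(0, 11, size=(115, 35)).astype(float)

# Descriptive statistics per criterion: mean, SD, and a 95% CI for the
# mean (normal approximation, an assumption of this sketch).
mean = scores.mean(axis=0)
sd = scores.std(axis=0, ddof=1)
half = 1.96 * sd / np.sqrt(scores.shape[0])
ci = np.stack([mean - half, mean + half], axis=1)  # shape (35, 2)

def avg_rank(x):
    """Ranks with tied values replaced by their average rank, as required
    by Spearman's coefficient on ordinal data."""
    ranks = np.empty(len(x))
    ranks[np.argsort(x, kind="stable")] = np.arange(1, len(x) + 1)
    for v in np.unique(x):
        ranks[x == v] = ranks[x == v].mean()
    return ranks

# Spearman correlation matrix = Pearson correlation of the ranked columns.
ranked = np.apply_along_axis(avg_rank, 0, scores)
rho = np.corrcoef(ranked, rowvar=False)  # shape (35, 35)

# Exploratory PCA on standardized scores: the eigenvalues of the correlation
# matrix give the proportion of variance carried by each component.
X = (scores - mean) / sd
eigvals = np.linalg.eigvalsh(np.cov(X, rowvar=False))[::-1]
explained = eigvals / eigvals.sum()
```

In practice a library routine such as `scipy.stats.spearmanr` would replace the hand-rolled ranking; it is written out here only to make the ordinal-data rationale explicit.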

4. Results and Discussion

The results provide an exploratory snapshot of the teachers’ perceived competencies. Techno-pedagogical and methodological competencies obtained the highest average scores, while evaluative and semiotic-ethical competencies showed lower average scores in several criteria. Associations among criteria are presented descriptively. All findings are interpreted in terms of relative strengths and weaknesses rather than causal effects or measured change. Interpretations are explicitly grounded in the reported statistical results. References to external literature are used to contextualize observed patterns rather than generalize beyond the empirical data.

4.1. Frequency Analysis

4.1.1. Total Scores and Averages for the Five Professional Skills in an Online Training Mode

The cumulative scores for the five skills contrasted sharply, with a range of variation reaching 3082 pts. This suggests that these skills evolved neither in the same way nor at the same pace in technological interfaces. Techno-pedagogical and methodological skills seemed to be the most affected, while semio-ethical and evaluative skills were still struggling to generate meaning in distance mode (Figure 4).
The area delimited by the averages of the competency variables extended more towards the right-hand side, giving an initial impression of the clear evolution of the three techno-pedagogical, methodological, and collaborative competencies to the detriment of the evaluative and semio-ethical ones. This distribution can be explained by the fact that some of the teachers have only recently encountered the technological tool, or that the period during which they have been familiar with it has been very short overall. As a result, this transition calls for the optimization of the technological toolbox for urgent techno-pedagogical, methodological, and collaborative purposes (Figure 5).

4.1.2. Average Assessment Indicators for the Five Professional Skills

A simple glance at the histograms of the criteria, illustrated in the following figures, revealed the variables that contributed most to the development of professional skills, as well as those that were the most constraining in a formative evaluation process.
Techno-Pedagogical Skills
For techno-pedagogical skills, three blocks of variables were visible as a function of the scores obtained, as shown in Figure 6: a single constraining variable (T7); two with medium observed patterns (T4 and T5); and four contributory ones, of which two had relatively higher mean scores: T1, T2, T3P, and T6P.
Methodological Skills
The methodological skill variables were divided into two more or less balanced blocks: three constraining variables (M8, M11, and M13) and four contributory ones, of which three had relatively higher mean scores: M9P, M10P, M12P, and M14 (Figure 7).
Collaborative Skills
For collaborative skills, the variables were divided into two more or less balanced blocks: four constraining variables, including two with lower mean scores (C15, C16, C20A, and C21A), and three contributory variables, including two with relatively higher mean scores (C17, C18P, and C19P) (Figure 8).
Semio-Ethical Skills
Criticality clearly predominated for semio-ethical skills: all variables showed lower mean scores (S22A, S23A, S25A, S26A, S27A, and S28A), with the sole exception of S24 (Figure 9).
Evaluative Skills
As with semio-ethical skills, criticality also reigned for evaluative skills: all variables showed predominantly lower mean scores (E29A, E30A, E31A, E33A, E34A, and E35A), excluding variable E32 (Figure 10).

4.1.3. Distribution of the Most Influential Variables

The most influential variables were divided into two categories according to the competency to which they referred, defining two distinct strata: a performance stratum, comprising variables with relatively higher mean scores above 8 pts, and a critical stratum, comprising variables with lower mean scores below 3 pts (both thresholds being heuristic). Among the values displayed, T3 is the apogee point and S27 the perigee point (Figure 11).
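The two-strata reading above amounts to a simple thresholding rule on criterion means. The sketch below makes it explicit; the example mean values and the function name are illustrative assumptions, not figures reported by the study.

```python
# Heuristic stratification: criterion means above 8 pts form a "performance"
# stratum, means below 3 pts a "critical" stratum, the rest is intermediate.
# The thresholds mirror the heuristic markers named in Section 3.6.

def stratify(means: dict, high: float = 8.0, low: float = 3.0) -> dict:
    strata = {"performance": [], "critical": [], "intermediate": []}
    for item, value in means.items():
        if value > high:
            strata["performance"].append(item)
        elif value < low:
            strata["critical"].append(item)
        else:
            strata["intermediate"].append(item)
    return strata

# Illustrative means only (not the study's actual values).
example_means = {"T3": 8.9, "M9": 8.2, "S27": 1.8, "E30": 2.4, "C17": 6.1}
print(stratify(example_means))
# T3 and M9 land in the performance stratum, S27 and E30 in the critical one.
```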
Cluster of the Most Constraining Variables
Together, these variables form a force opposing the smooth infusion of ICT into higher education. They attest to the limits of digitization in education and training, slowing the wheels of formative assessment and calling into question the alignment between technology and pedagogy. It is assumed that the more ICT integration contributes to improving the professional skills in question, the more comfortably formative assessment is implemented. There are 14 of these lower-scoring variables, broken down as follows:
  • Six variables for evaluative skills: E29, E30, E31, E33, E34, E35;
  • Six variables for semio-ethical skills: S22, S23, S25, S26, S27, S28;
  • Two variables for collaborative skills: C20, C21.
  • The Constraints of Evaluative Skills
Through this research work, we attempt to highlight the components that generally influence formative evaluation skills in an e-learning process, including “the ability to conduct evaluation activities—that is, ongoing monitoring and self-evaluation—to manage external evaluations, and then to use the results” [28]. There have been many theoretical and pragmatic attempts to make evaluation objective, but in the end they all seem to fail: achieving objectivity seems to be a pipe dream [29]. It would seem wise, therefore, to abandon the idea that assessment outputs are always conditioned by objectivity, given that the perceptions of evaluators, their expectations, and the biases they are subject to are numerous and difficult to anticipate. In a university e-learning context, objectivizing evaluation practices becomes even more difficult for two reasons: firstly, the evaluation does not respond perfectly to the dynamic nature of human relations, and secondly, there is a lack of pedagogical reflection on technological transpositions, in terms of shortcomings and potentials to be exploited, when setting up online evaluations. Whether one operates within a logic of monitoring or of accompanying evaluation, improving evaluation skills offers teachers a means of monitoring the delivery of pedagogical services and gaining valuable feedback on the effectiveness of the efforts invested. As things stand, online assessment remains deficient and lacks visibility: “Even though relatively simple and commonly used academic technologies such as Moodle are used, the management of large groups generates problems of fairness, monitoring, identification, and significant concerns, such as the phenomenon of plagiarism or identity theft.” [30].
Faval’s study supports the idea that problems encountered during active e-learning time translate into increased frustration and confusion among users faced with incompatibilities in the technological process [31]. Given that assessment, and in particular summative assessment, takes place at the end of the pedagogical process, it is only natural that it should be the moment when the harmful effects of poorly conducted formative assessment become eloquently apparent, testifying to the inadequacy of certain technological tools meant to ensure communication and exchange between the two sides of the equation: teachers and students. In this context, the implementation of e-learning has led teaching staff and students to a shared perception of the most prominent constraints on the system, particularly difficulties linked to the conditions for delivering certain courses and tutorials and to the importance of physical presence and interaction between teaching staff and students. As a result, the lack of clear and precise guidelines for the delivery of teaching and assessment methods can only heighten legitimate fears about electronic assessment [32].
  • The Constraints of Semio-Ethical Skills
For semio-ethical skills, Chauvin (2018), partner and co-founder of Groupe Philia, a consultancy promoting integrity within organizations through ethics and compliance services, offers a definition broad enough to generate meaning for a family of situations: “Ethics is a reflexive process analyzing various specific anchor points—such as laws and rules, norms and codes, personal and community values, social mores and professional practices, etc.—It is a reflective process based on a given problem, considering a specific context, and aimed at informing the making of a singular decision” [33]. Prioritizing self-paced online training means overcoming the difficulties of dealing with student dishonesty and lack of interest without encroaching on active teaching and learning time. As Bonfils (2020) explains, “One of the problems encountered is not being able to maintain a satisfactory quality of exchange during synchronous distanced sessions, particularly when faced with large cohorts of students (lectures in a rather transmissive mode). The result is disaffection or, at the very least, a significant loss of attention on the part of students who are inactive for long hours in front of their screens” [34]. Given the variety of disinterested and dishonest student behaviors that manifest online, such as declining attention spans, discontinuous connectivity, contemplative presenteeism, plagiarism, identity theft and cheating, or failure to meet assignment deadlines, it becomes worth betting on three interrelated measures likely to bring e-learning’s shortcomings down to an acceptable level:
-
Encourage a positive commitment to self-motivation and self-discipline, which makes it possible to grasp the interests and challenges of the digital transition and to overcome its obstacles without confining distance learning in a too-tight straitjacket of cumbersome monitoring procedures that would, in any case, be in vain.
-
Introduce innovative forms of formative assessment that involve a strong visual, vocal, and gestural presence on the part of students. These perennial assessments meet the convergence criteria of coherence, continuity, and methodological complementarity with the pedagogical scenario, based on “three essential characteristics: its place in learning, its relationship with the student, and the remediation it engenders” [35].
-
Foster close and regular collaboration between pedagogical and technological authorities so that digital learning environments can flourish, given that such collaboration can align the efforts of pedagogues and designers in developing digital platforms equipped with tools that also make it possible to evaluate certain aspects of collaboration (interactions, collective activities, etc.).
In this sense, the more restructured the assessment methods, the greater the alleviation of academic dishonesty. Initially, students will have to become accustomed to, and take full advantage of, the functionalities and terminology of techno-pedagogical platforms; their digital and analytical skills will then manifest themselves through their behaviors and interventions. It is therefore of the utmost importance to create exams that are not based entirely on raw Internet content and that foster close, balanced collaboration within the group while allowing individualists to excel. To be effective, the time allotted for feedback must take into account the complexity of the submitted task and other constraints linked to students’ social and technical conditions [36].
  • The Constraints of Collaborative Skills
The more successful digital environments become, the more confidence and legitimacy they gain, and the more we need to prioritize moments of reflexivity on the place of collaborative work in bringing life and engagement to virtual classrooms. It is undeniable that the emergence of digital technology has profoundly transformed the university through improved access to information, innovative teaching practices, managerial practices, and workplace well-being. Such changes have made the technological contribution perceptible, without denying its shortcomings. Faced with this reality of light and shade, the nature and strength of the relational, didactic, and learning fields of the pedagogical triangle seem to have been shaken. The online pedagogical relationship has been hit hardest, as compartmentalization has markedly reduced pedagogical reliability, exchanges, and interactivity between tutor and students, and between peers with less vocal and visual presence. The hard truth is that e-learning communication is evolving into a form of inactivity that is detrimental to training beneficiaries and is often reduced to written exchanges on the Web. “The main thing to remember is that students remember information better when they are actively involved in the learning process. It’s difficult to follow courses that are overloaded with reading texts, translations, or watching videos. It’s better to be in the action, i.e., to be at the center of events, than simply to be an outside spectator.” [37].
High-Performance Variable Clusters
These variables partly drive the change forward. There are seven of them, distributed as follows:
  • Three variables for methodological skills: M9, M10, M12;
  • Two variables for techno-pedagogical skills: T3, T6;
  • Two variables for collaborative skills: C18, C19.
Easy, low-cost access to technological services has accelerated the dissemination and sharing of digital resources. This new situation reflects the teachers’ enthusiasm for the use of information and communication technologies, given their advantages, including flexibility, time savings, and in particular, a range of choices when it comes to educational resources and LMS platforms. Today, many teaching resources are instantly accessible, and most textbooks and courses are available online, accessible anywhere from a computer, tablet, or smartphone. Ultimately, e-learning platforms represent a rallying point where the art of teaching is manifested through the choice of pedagogies, educational resources, and communication techniques so that the teaching–learning process is streamlined and efficient. Therefore, we are witnessing satisfaction with the technological contribution to teaching practices. Teachers can easily cope with heavy workloads and simplify the presentation of information through the design of innovative teaching scenarios. As a result, complex explanations become an everyday exercise, thanks to pedagogical and entertaining techniques that leave nothing to chance. The advent of the Internet and communication technologies has paved the way for the unprecedented success of mass media, such as social networks, which have been transformed into new e-learning platforms. In addition, the e-learning market took off spectacularly in the wake of the COVID-19 health crisis, and its extension into the education and training sector gave ICT an unprecedented societal dimension. From there, the mediatized environment, in a distance learning mode, has favored both the dissemination of digital resources and the use of a panel of interactive, user-friendly tools to communicate, prospect, and respond to the many needs concerning the handling of pedagogical content and spatiotemporal ease. 
As a result, teachers and students alike have acquired the ability to better understand technological jargon and use computer software to promote multimedia products.
Given that technology is constantly evolving, new techno-educational solutions have emerged. Faced with this diversity, the question arises of choosing the right technological tools for a smooth, reasoned switchover between face-to-face and distance learning modes. A successful choice will depend first and foremost on the teaching need, and then on the functionalities of the exchange interface. In this sense, Peltier’s study agrees that we need to dispel the simplistic and hasty idea that designing a distance learning course is simply a complete and faithful transcription of a classic face-to-face course [38]. Above all, it is a long-term process that involves the mediatization of content and presupposes a smooth insertion into the technological paradigm while meeting all its requirements in terms of resources and organizational reconfiguration. It is certainly true that mediatized environments offer timely feedback through one-to-one meetings or personalized comments on students’ progress, providing accurate information on the quality and relevance of their contributions. It seems obvious that information and communication techniques can create a climate conducive to improving collaborative skills if they confer greater empowerment and communication skills. This means overcoming fears of embarrassment when face-to-face with the audience; building excellent relationships with the class; and solving problem situations as a team via various synchronous and asynchronous communication techniques. With ICT, teachers can now revitalize the virtual classroom and turn it into an exciting knowledge gallery. Making classrooms interactive, making courses more enjoyable, and belonging to exchange and communication networks are therefore distinctive attributes of the technological contribution required to foster a spirit of openness, sharing, and the development of pedagogical solutions.

4.2. Inferential Analysis

4.2.1. Type of Measurement Model

To verify the internal consistency of the items, calculating Cronbach’s alpha coefficient was not appropriate, since we started from a formative measurement model, which differs from a reflective model in that the items of our research tool define and characterize the latent variable (formative assessment in an e-learning context) in a specific and non-interchangeable way. What is original here is that the latent variable emerges from the interaction of the technological input with the manifest variables referring to the five professional skills required to implement formative assessment of students in a conducive and serene environment. In short, in a formative model, the measures cause the latent variable, and correlations among indicators can be positive, negative, or null.

4.2.2. Study of Bivariate Correlations

Due to the ordinal nature of the scale and the non-normal distributions, Spearman’s rank correlation coefficient was used throughout. Correlation results were interpreted as exploratory associations and do not imply causal or directional relationships. To interpret the tables, we considered two variables correlated only if the significance level was less than 0.05, while Spearman’s correlation coefficient was used to indicate the strength of the relationship (Table 1).
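As a rough illustration of this decision rule, the sketch below computes Spearman’s rho and its p-value for two ordinal item-score vectors. The scores are invented placeholders, not responses from our assessment grid, and `scipy` is assumed to be available.

```python
# Illustrative sketch of the bivariate procedure described above:
# Spearman's rank correlation with a 0.05 significance threshold.
# The two score vectors are hypothetical, not the study's data.
from scipy.stats import spearmanr

m10_scores = [4, 5, 3, 4, 5, 2, 4, 5, 3, 4]  # hypothetical ratings for item M10
c19_scores = [4, 5, 3, 5, 5, 2, 4, 4, 3, 4]  # hypothetical ratings for item C19

rho, p_value = spearmanr(m10_scores, c19_scores)
if p_value < 0.05:
    print(f"Exploratory association: rho = {rho:.3f}")
else:
    print("No significant association at the 0.05 level")
```

Because the scale is ordinal, ranks rather than raw values drive the coefficient, which is why Spearman’s rho is preferred here over Pearson’s r.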
Correlation: M10—Providing a Variety of Entertaining Materials and C19—Offer Personalized Feedback
Benchmark data:
  • Significance is less than 0.05;
  • Spearman correlation is very strong: 92.4%.
The various types of digital educational resources enable teachers to insert different forms of communication (text, image, video, etc.) into their training space, resulting in rich, diversified content. These include files, image galleries, books, web pages, and URLs. The seamless transition between the elements of this multimedia collection is well positioned to produce the personalized feedback that forms the essence of any formative evaluation. No one can deny the role of digital resources and their pedagogical uses in increasing student motivation and autonomy when they are invested in suitable pedagogical approaches. As explained in the literature, some people process auditory information very quickly, while others tend to have more visual or sensorimotor strengths. Either way, when we have more time to process information, the quality of our thinking and learning improves [39]. Therefore, whatever the personal distinctions within the classroom, there is a strong likelihood that learners will find their learning comfort in one or other of the digital resources on offer, given their diversity and abundance.
Between the moment when digital resources are inserted and the moment when feedback is collected, there are fertile periods of reflection known as “cognitive pauses”, during which mental mechanisms work to accommodate the states of disruption they undergo while assimilating new content. However, “these cognitive pauses should not be seen as a waste of time where nothing happens in the classroom! On the contrary, it’s about giving students a moment to think, consider possibilities, mentally structure the new content, make associations and comparisons, etc.” [40]. New forms of pedagogical resources impact the target audience in a contrasting but widespread way, since these alternating cognitive pauses can trigger ripple effects in the collective, generating revealing feedback on learners’ position in the acquisition trajectory, on the basis of which personalized regulation and remediation measures will be taken. The sensations of loss, isolation, lack of interaction, and loss of concentration remain the major features of online teaching, as teachers struggle to develop strategies for attracting learners, maintaining their attention, and involving them digitally. Another disruptive element is the lack of a calm environment conducive to distance learning, since most of those connected live with their families and are exposed to sources of noise [41]. The same observation has been confirmed elsewhere, pointing out the negative effects that hinder the excellence of e-learning: numerous distractions, the absence of a suitable learning environment, or the lack of contact with the teacher [42].
Correlation: M11—Reduce Preparation Time for Learning Activities and M13—Classroom Management
Benchmark data:
  • Significance is less than 0.05;
  • Spearman correlation is medium to fairly strong: 54.4%.
Unequivocally, technology has provided a significant information load and highly advanced design tools, speeding up the pace of work and reducing the time needed to prepare pedagogical deliverables. This precious time can be used to rethink teaching practices and the implementation of mastery strategies, in order to generate greater motivation and commitment from students. With this in mind, Gaudreau’s study recommends five classroom components to promote the implementation of intervention means and strategies: resource management; setting clear expectations; developing positive social relationships; focusing students’ attention and engagement on the learning object; and managing undisciplined behaviors [43]. It is important to say that the emergence of educational technology has upended teaching practices in terms of the pedagogical approaches adopted, the nature of pedagogical content, and the didactic relationship between teacher and students. What is even more interesting is that these elements are linked to open-access resources, making them an ideal catalyst for innovation. Access to the Internet has made digital resources infinite and technological design tools varied and more professional. As a result, classrooms are no longer perceived as a closed system, but rather as open and connected to libraries and e-learning platforms worldwide. The obvious advantages of a good command of technological production tools for optimizing university time no longer need to be demonstrated. To this end, ensuring that expected pedagogical objectives are achieved in a timely fashion requires audiovisual design skills. This gives teachers a certain credibility when it comes to digitizing knowledge, staging it artistically, and designing it ergonomically.
Thanks to technology, a vast amount of time can be reduced to a well-scripted, well-framed capsule lasting just a few minutes, combining pedagogical aspects and highly entertaining moments simultaneously, thus generating more interactivity and greater student commitment than with traditional methods.
A balanced view has been presented, stating that given the pandemic situation and the wider adoption of digital technologies at the time, the live e-learning mode may become the new norm in the future. However, there are still insurmountable obstacles to the realization of pedagogical services, as the interactive functionalities offered on the platforms remain out of step with the objectives of the teaching body, in particular the design of judicious teaching pedagogies in synchronous mode and the management of different groups of students on the online platform with respect to session preparation time and assessment mode [44]. Zhang (2020), for his part, acknowledges that beyond borders there are other constraints of a political nature that require bypassing firewalls to learn online, as is the case in mainland China [45].
Correlation: T6—Technological Training and Assistance and M12—Innovative Pedagogical Scripting
Benchmark data:
  • Significance is less than 0.05;
  • Spearman correlation is strong: 69.5%.
Creativity can only flourish where teachers are fully engaged and free to experiment with and lead new ways of thinking that offer more singularity than traditional pedagogical scenarios. In all this, ongoing training and technological support are indispensable, as they are the key to developing technical skills. There is no doubt that the key to success in educational provision is mastery of the training process. This requires instructional scripting, one of the basic foundations of pedagogical engineering, since it provides a clear, coherent, and formalized vision of training sequences, from the specifications to the storyboard. Training in e-learning storyboarding helps to identify future gaps by asking the right questions ahead of the course and looking for creative ideas. Once the script has been properly written, development and implementation take over. The most interesting aspect of this process, however, is that sharing experience becomes an easy exercise, thanks to the transferability of scenarios and the reproduction of pedagogical works, which in turn leads to very significant gains in terms of workload reduction. The “instructions” provided to students are at the top of the list of key considerations for educators engaged in an e-learning process. The importance of this consideration lies in ensuring concordance between the pace of progression and the fragmentation of the learning unit [46]. “Online courses need to be very clear and well structured, delivered in manageable chunks, providing a good opportunity for students to practice what needs to be learned and allowing the teacher to see student work and provide timely feedback.” [46]. Martin’s concerns point to the importance of pedagogical scripting, which controls the order and content of pedagogical units, as well as the pace and types of assessment activities [46].
Correlation: T7—Confidence in ICT and C16—Performance of Dynamic and Interactive Tasks
Benchmark data:
  • Significance is less than 0.05;
  • Spearman correlation is very strong: 82.2%.
The tipping point leading to the development of techno-pedagogical skills among teachers is having confidence in information and communication technology. Remaining distrustful and aloof from this contingency factor deprives them of a source of internal synergy. To develop solutions that improve the delivery of educational services through dynamic, interaction-generating tasks, recourse to technology seems an essential step. If confidence in technology is impaired, this is expressed in reluctance to innovate and change teaching practices. In this sense, a balanced vision of the place of information and communication technologies in the educational scene is required: technology must neither be relegated to the margins of pedagogical design nor dominate to the detriment of the value of pedagogical action. Julia Martins asserts that creating an internal dynamic requires long-term work and remains conditioned by peers’ predisposition to collaborate effectively [47]. The group’s evolution towards affinity and cohesion is reflected in the quality of its members’ interactions. The group state can in this case be improved through the gentle and confident integration of new technologies to increase collective performance, reduce monotony, and restore a taste for school. It is important to note, through the study of bivariate correlations, the predominance of methodological variables. This is clear evidence that technology is shaping the way we research, script, and produce digital resources. Indeed, methodological skills are the backbone of any teaching or research service, enabling us to personalize our actions and deal fearlessly with problem situations.

4.2.3. Factor Analysis: PCA

PCA was used as an exploratory method to reduce the data and trace clusters of associations between the indicators. We did not employ PCA here to estimate latent competencies or to determine a robust factorial structure. Sampling adequacy was verified before running the PCA. The Kaiser–Meyer–Olkin (KMO) measure was 0.52, borderline acceptable for sampling adequacy, whereas Bartlett’s test of sphericity was significant (χ2 (528) = 1460.91, p < 0.001), indicating appropriate linkages among variables. Because the KMO value was very close to the cut-off, the PCA results are interpreted as exploratory and descriptive only.
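For readers who wish to reproduce these adequacy checks, the sketch below implements Bartlett’s sphericity test and the overall KMO measure directly from their standard formulas. The synthetic data (115 rows echoing our sample size, 6 illustrative items) are not the study’s responses.

```python
# Minimal sketch of the two pre-PCA adequacy checks reported above,
# computed from their standard formulas on synthetic, illustrative data.
import numpy as np
from scipy.stats import chi2

def bartlett_sphericity(X):
    """Chi-square statistic, degrees of freedom, and p-value for
    H0: the correlation matrix is the identity (no linkage)."""
    n, p = X.shape
    R = np.corrcoef(X, rowvar=False)
    stat = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
    dof = p * (p - 1) / 2
    return stat, dof, chi2.sf(stat, dof)

def kmo_overall(X):
    """Overall Kaiser-Meyer-Olkin measure of sampling adequacy."""
    R = np.corrcoef(X, rowvar=False)
    S = np.linalg.inv(R)
    # Anti-image (partial) correlations from the inverse correlation matrix
    d = np.sqrt(np.outer(np.diag(S), np.diag(S)))
    P = -S / d
    mask = ~np.eye(R.shape[0], dtype=bool)
    r2, p2 = np.sum(R[mask] ** 2), np.sum(P[mask] ** 2)
    return r2 / (r2 + p2)  # always in [0, 1]

rng = np.random.default_rng(0)
X = rng.normal(size=(115, 6))   # 115 respondents, 6 synthetic items
X[:, 1] += 0.8 * X[:, 0]        # induce some shared variance
stat, dof, p_value = bartlett_sphericity(X)
print(f"Bartlett chi2({int(dof)}) = {stat:.2f}, p = {p_value:.4f}")
print(f"Overall KMO = {kmo_overall(X):.2f}")
```

A KMO near the conventional 0.5 cut-off, as in our case, is precisely the situation in which the PCA should be read descriptively rather than inferentially.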
Variable Representation Qualities
The representation qualities (communalities) estimate the proportion of each variable’s variance accounted for by the extracted components. The average representation qualities of the variables were high for all five occupational skills, indicating that the extracted components represent these variables well (Figure 12 and Figure 13).
The items of the five skills contributed to the extracted components in a balanced way, with the most influential variables of the collaborative skill in the foreground:
  • C17: Carry out collaborative designs;
  • C19: Provide personalized feedback;
  • C20: Generate greater pedagogical equity;
  • C16: Perform dynamic tasks: interactions.
Followed by methodological skills variables:
  • M10: Enable the use of fun and varied media: videos, interactive formats, podcasts...;
  • M11: Reduce preparation time for learning activities;
  • M12: Design innovative teaching scenarios.
And with, as a third contributor, the variables of techno-pedagogical skills:
  • T2: Think critically about the advantages and limitations of ICT;
  • T6: Feeling the need for training and technological assistance to provide pedagogical services;
  • T7: Have confidence in the use of ICT.
This seems logical, given that techno-pedagogical, methodological, and collaborative skills are perceived as precursors that enhance the quality of mastery of evaluative and semio-ethical skills. Therefore, we are faced with two major constructs, one influencing the other. This catalytic dependency effect gives the teacher greater visibility over his or her e-learning class and consequently enables him or her to anticipate conditions that are conducive to formative assessment.
Total Variance Explained
The variance explained by the initial solution, the extracted components, and the rotated components is displayed in Appendix B, Table A2 (Total Variance Explained), which shows the initial eigenvalues. The second column shows that 13 components had an eigenvalue greater than 1. The first component alone explained 11.02% of the total variance of the 35 variables in the analysis; taken together, the 13 components explained 74.17% of the variance. To confirm the number of components to extract, we examined where the elbow occurred in Cattell’s scree plot of eigenvalues. A change was visible after the tenth component, beyond which the points lined up almost rectilinearly.
Therefore, the first 10 components were kept for exploratory and descriptive analyses, since this criterion is more rigorous than the eigenvalue rule. With these 10 components, we could explain 64.39% of the total variance and thus account for the variability of the original 35 variables with reference to the five occupational skills, considerably reducing the complexity of the dataset at the cost of an information loss of 35.6% (Figure 14). Given the small sample size relative to the number of indicators, these dimensions represent illustrative patterns of association rather than robust, generalizable entities.
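The retention logic above, comparing Kaiser’s eigenvalue-greater-than-one rule with the cumulative variance of a fixed number of components, can be sketched as follows. The data are synthetic, with 115 rows and 35 columns mirroring the shape of our dataset but none of its content.

```python
# Sketch of the component-retention logic described above: eigenvalues of
# the correlation matrix, the Kaiser criterion, and cumulative variance.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(115, 35))      # 115 teachers, 35 indicators (synthetic)
R = np.corrcoef(X, rowvar=False)
eigenvalues = np.sort(np.linalg.eigvalsh(R))[::-1]  # descending order

kaiser_k = int(np.sum(eigenvalues > 1))   # components kept by the Kaiser rule
explained = eigenvalues / eigenvalues.sum()
cumulative = np.cumsum(explained)

print(f"Kaiser criterion keeps {kaiser_k} components")
print(f"First 10 components explain {cumulative[9]:.1%} of total variance")
```

On real data the scree plot of `eigenvalues` is then inspected visually for the elbow, as we did with Cattell’s criterion.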
Examination of the Component Matrix After Rotation
  • Identifying the Highest Weight for Each Variable
To obtain a simpler factorial representation, we used a VARIMAX rotation, which preserves orthogonality between components. This time, the variables were much better distributed across the components than in the unrotated component matrix. Since, on average, three variables saturate each of the ten components, we can keep them to build scales. We now need to name the components and try to identify the latent construct they measure. Four components were excluded from our analysis, namely components 3, 4, 5, and 6, for the following reasons: on the one hand, component 3 seems insignificant and breaks with our research objective, since it involves only two variables: C20, relating to educational equity, and S32, concerning compliance with deadlines for submitting work. On the other hand, components 4, 5, and 6 exhibit bivariate relationships already revealed by Spearman’s correlation in the previous section: between T7 and C16 for component 4, C19 and M10 for component 5, and T6 and M12 for component 6. Our analysis was therefore limited to six components: 1, 2, 7, 8, 9, and 10 (Appendix B, Table A3, Rotated Component Matrix (PCA)).
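For reference, the VARIMAX step can be sketched in a few lines of numpy. This is a generic implementation of Kaiser’s varimax criterion, not the exact routine used by our statistical software, and the loading matrix shown is illustrative.

```python
# Minimal numpy sketch of a VARIMAX rotation: an orthogonal rotation of the
# loading matrix that concentrates each variable's loading on one component
# while leaving communalities (row sums of squares) unchanged.
import numpy as np

def varimax(loadings, max_iter=100, tol=1e-6):
    """Return orthogonally rotated loadings (Kaiser's varimax criterion)."""
    p, k = loadings.shape
    rotation = np.eye(k)
    objective = 0.0
    for _ in range(max_iter):
        rotated = loadings @ rotation
        # Standard varimax update via SVD of the gradient-like target matrix
        target = rotated ** 3 - rotated @ np.diag(np.sum(rotated ** 2, axis=0)) / p
        u, s, vt = np.linalg.svd(loadings.T @ target)
        rotation = u @ vt
        if s.sum() < objective * (1 + tol):
            break
        objective = s.sum()
    return loadings @ rotation

# Illustrative unrotated loadings for 6 items on 2 components
L = np.array([[0.7, 0.5], [0.6, 0.6], [0.8, 0.4],
              [0.5, -0.6], [0.4, -0.7], [0.6, -0.5]])
L_rot = varimax(L)
# Orthogonality of the rotation preserves each item's communality
print(np.round(np.sum(L_rot ** 2, axis=1), 3))
print(np.round(np.sum(L ** 2, axis=1), 3))
```

The preserved communalities are what justify comparing the rotated matrix with the unrotated one, as done above, without re-estimating the variance each variable shares with the solution.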
  • Components Labelling
It is now important to label the retained components by identifying the latent construct they measure or the logical links that may exist between the variables of each component.
a.
Variables of component 1 (F1):
  • M11: Reduce preparation time for learning activities;
  • M13: Enable efficient class management;
  • M14: Adopt digital tools and practices to develop creative teaching methods;
  • T4: Be able to produce digital resources: capsules, mind maps, etc.;
  • E32: Make assessments more efficient by using benchmarks and communicating invariants.
There is no doubt that there is a rational order or interconnection between these five variables. Indeed, technological progress has remarkably reduced human effort, and heavy tasks for teachers have become an easy exercise thanks to transferability, artificial intelligence, and pedagogical engineering techniques. This time-saving can be invested in experimenting with other innovative pedagogies and introducing more order into the classroom, making it a propitious working environment for pedagogical achievements and the implementation of more efficient assessment devices.
b.
Variables of component 2 (F2):
  • T2: Keep a critical watch on the benefits and limitations of ICTs;
  • C17: Creating collaborative designs;
  • M8: Allowing plenty of room for pedagogical maneuver.
Early designs of digital platforms and tools were archaic and not strongly adapted to the context of education and training, but with the advent of the COVID-19 pandemic, extensive feedback from the user community enabled designers to enrich their commercial offerings with flexible technological solutions, giving teachers more control and generating more interactivity and pedagogical collaboration for the benefit of individuals, small groups, classes, and mass audiences.
c.
Variables of component 7 (F7):
  • C15: Dynamic tasks: Group animation;
  • T5: Learning to cope with technical failures and poor management of digital environments;
  • E31: Making assessment systems more flexible;
  • S24: Identify signs of lack of interest and help resolve them: Connectivity time;
  • T1: Promote the best approaches for the successful infusion of ICT into teaching and learning practices.
It is true that the better the group facilitation, the longer e-learning learners remain connected and attentively present. Group facilitation indeed enables dynamic tasks to be carried out, as several synergies surface, with marked competitive impulses that define a captivating environment favorable to skill building. Furthermore, if teachers are creative in deploying the best methods for optimizing technological functionalities, this reflects their degree of mastery of digital environments and the way they manage technical failures without undermining pedagogical progress. Optimal-quality digital environments for well-managed collaborative work can undoubtedly reduce signs of lack of interest and contribute to their resolution. These assets make it possible to assess under the right conditions, through a wide range of flexible choices.
d.
Variables of component 8 (F8):
  • C21: Improve learner engagement and motivation;
  • S26: Identify the signs of student dishonesty and contribute to its resolution: Plagiarism;
  • E34: Increase the accuracy of results through monitoring and support;
  • E30: Make assessment systems more adaptable.
There is no doubt that motivation has been shaken since the start of the health crisis, and e-learning, during these anxious times, has been described as an adaptation phase because we have been able to maintain pedagogical continuity by creating “school at home”, with all the comfort of the personalized spatio-temporal arrangements it offers. Therefore, if the idea of change inspires us, changing the mindset of learners is one of them. It is through improving learners’ motivation and state of engagement that we can establish reliable predictors of the risk of falling academic performance, and of the components that can reduce interest or impair readiness to learn. An engaged learner integrates well into the new learning system without showing signs of dishonest behavior. As with other components, if everything has been well prepared, the quality of formative evaluation in terms of follow-up, support, gaps to be filled, and adaptation of systems is guaranteed.
e.
Variables of component 9 (F9):
  • E33: Reducing the disruptive effects of correction: over- and under-valuation bias;
  • E29: Make assessment systems more objective.
First and foremost, we must bear in mind that the quest for absolute objectivity in teaching practices is a pipe dream, as there will always be human or system-imposed biases that stand in the way of this perfectionist state of affairs. However, efforts should be made to widen the margin of objectivity by improving measurement and monitoring systems. In principle, the better the invariants—criteria and indicators—are developed, the more accurate the level of evaluative analysis of learner output.
f.
Variables of component 10 (F10):
  • S25: Identify the signs of student dishonesty and contribute to its resolution: Contemplative presenteeism;
  • S27: Identify the signs of student dishonesty and contribute to its resolution: Identity theft;
  • S28: Identify the signs of student dishonesty and contribute to its resolution: Cheating.
The signs of student dishonesty are composite and shifting, from contemplative presenteeism, to identity theft, to fraud. However, the worrying thing is that these manifestations remain difficult to detect and manage in e-learning. Teachers find themselves overwhelmed by the immensity of the phenomenon, which gains ground every time they switch to distance learning. As a result, they have neither the academic time nor the technical expertise to deal with these deviations, which constitute stumbling blocks that are difficult to tackle due to their lack of visibility.
  • Logical Links Between Constructs
Figure 15 shows the possible logical links that could be established to reconstitute the constructs of each factor. What stands out in this representation is that the evaluative skills variables occupy the last link for four out of six components. This shows that the effort invested in the mastery of professional techno-pedagogical, methodological, collaborative, and semio-ethical skills feeds the quality of formative evaluations. This would enable us to catch up on missed skills, identify obstacles to learner performance and provide remedial plans before they reach a critical level of performance loss.

4.3. Limits of the Technological Contribution

Despite the digital revolution, several shortcomings persist, particularly those relating to assessment in terms of objectivity, fairness, pedagogical leeway, and the adaptability of the assessment devices proposed by designers. Three challenges need to be met if e-learning is to consolidate its position: maintaining the quality of teaching and distance learning; coping with a mass audience; and responding rapidly and effectively to the multiple demands of beneficiaries. It therefore seems essential to focus on pedagogical innovation [48]. Without claiming to be exhaustive, among the obstacles to quality evaluation are school curricula that are often overloaded and leave little time for the emergence and experimentation of new evaluation methods that are more stimulating, interactive, and objective. In short, despite the importance of technology for teaching methods, the pace of knowledge acquisition, and assessment practices, it offers no miracle solutions to situations of structural and managerial crisis. In this sense, Raby et al. insist that the added value of ICT varies according to the types of users and their pedagogical needs... [49]. Drawing on many studies carried out over the last 30 years, Endrizzi concludes that complexity and inadequacy remain the major features of the issue of ICT in the service of pedagogy [50].
To promote distance learning courses in French, Canada’s Francophone Distance Education Network (REFAD), through its two annual reports published in 2019 [51] and 2021 [52], has consistently praised the role of pooling expertise and sharing knowledge and practices among stakeholders. These assets have yielded several benefits: on the one hand, reinforcing the optimal mobilization of technological tools in a system that favors interactivity and relationality; on the other, sparking the emergence of innovative, high-quality practices in distance learning and digital pedagogy.

5. Conclusions

All in all, the speed at which digital technologies evolve and are adopted contributes to improving the quality of education and training through pedagogical scenarios, the promotion of educational resources, and the design of rapid, low-burden assessment systems. However, research and experimentation in assessment must be promoted to make it more adaptable to the digital age and to prevent the functionalities of the technological model from evolving in the wrong direction. This study aimed to highlight the importance of feedback and the sharing of professional experience in distance learning, and to verify the extent to which formative evaluation in university e-learning is driven by the development of five professional skills: techno-pedagogical, methodological, collaborative, semio-ethical, and evaluative.
To achieve this objective, data were collected using a criterion-referenced evaluation grid administered to a sample of teachers heavily involved in university e-learning. Frequency analysis showed centers of excellence for the techno-pedagogical, methodological, and collaborative skills, whereas a clear predominance of criticality was observed for the semio-ethical and evaluative competencies. Bivariate correlation analysis revealed four significant monotonic (Spearman) relationships:
-
The first correlation, of very high intensity, links the availability of digital resources with the potential for personalized feedback. Indeed, a highlight of the digital contribution is that it enables each learner to find a comfortable fit in one of the resources on offer, whether static or dynamic.
-
The second correlation, of medium-to-strong intensity, relates the reduction in preparation time for learning activities to good classroom management. On the premise that technology accelerates the pace of work and optimizes academic time, this reclaimed time can be used to rethink teaching practices and implement mastery strategies that generate motivation and firm student commitment. At this level, we assume that the university evolves in an open system connected to the global technological market; so why not take advantage of these trends to produce teaching resources and prepare adaptable, transferable scenarios?
-
The third, strong correlation links the perceived and expressed need for training and technical assistance with the ability to implement innovative teaching scenarios. This seems logical, since creativity and pedagogical innovation are conditional on a firm commitment to new ways of thinking that break with outdated pedagogical practices. Indeed, building technological skills in a constantly evolving context can only be achieved through ongoing training and technical assistance throughout the entire pedagogical engineering process, from analysis to evaluation.
-
The fourth, strong correlation links confidence in ICT with dynamic, interactive task performance. Remaining wary of technological contingencies deprives teachers of an inexhaustible source of internal synergy. Teachers should not delay in adhering to this new paradigm if they are to transmit pedagogical innovations effectively, on the one hand, and communicate successfully with a supposedly well-informed public of learners, on the other.
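The pairwise relationships above rest on Spearman’s rank correlation (Table 1), which is the Pearson correlation computed on rank-transformed scores. As an illustration only, the coefficient can be recomputed from two sets of ordinal grid scores; the data below are hypothetical and not taken from the study:

```python
def average_ranks(xs):
    """Assign 1-based ranks, averaging ranks within tie groups (midrank method)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        # extend j to cover the whole tie group starting at i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        mid = (i + j) / 2.0 + 1.0  # average rank of the tie group
        for k in range(i, j + 1):
            ranks[order[k]] = mid
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman's rho: Pearson correlation of the rank-transformed data."""
    rx, ry = average_ranks(x), average_ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical Likert-type scores for two grid items (illustration only)
m10 = [4, 5, 3, 4, 2, 5, 1, 3]  # e.g., variety of media made available
c19 = [4, 5, 2, 4, 2, 5, 1, 3]  # e.g., personalized feedback offered
print(round(spearman_rho(m10, c19), 3))
```

In practice the study’s coefficients (e.g., 0.924 for M10–C19) come from statistical software; `scipy.stats.spearmanr` produces the same rank-based result as this hand-rolled sketch.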
Given the exploratory and cross-sectional nature of the study, the findings have been interpreted as descriptive associations rather than causal effects. Inferential analysis showed that the quality of representation at extraction for the techno-pedagogical, methodological, and collaborative variables was good compared with that of the semio-ethical and evaluative competencies. We therefore assumed that the quality of the first three skills depends on the quality of the last two. By examining the total variance explained, we reduced the number of components to thirteen, and then to ten, which together explain 64.39% of the total variance. Thus, the ten components selected explain the variability of the 35 original variables, which cover the five professional skills required to enhance the quality of e-learning formative evaluations. Our analysis was limited to six components, since four of the ten selected were not processed, either for lack of compatibility or because they revealed constructs already captured by the Spearman bivariate correlations. Using variable labelling, the trend analysis revealed logical links structuring the relationships among the variables of each construct. What stands out in the representation of constructs is that the evaluative skills variables were positioned at the end of the chain of relationships for four of the six components. This shows that the effort invested in mastering techno-pedagogical, methodological, collaborative, and semio-ethical professional skills feeds the quality of formative evaluations.
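The component-retention reasoning (Kaiser criterion, cumulative variance explained) can be reproduced directly from the eigenvalues reported in Appendix B, Table A2. A minimal Python sketch follows; the eigenvalue list is transcribed from that table (components 17–34 are elided there and here), and small last-digit differences from the published percentages reflect the rounding of the eigenvalues:

```python
# Eigenvalues of the first 16 (of 35) principal components, from Table A2
eigenvalues = [3.860, 2.880, 2.652, 2.403, 2.340, 1.989, 1.764, 1.724,
               1.570, 1.356, 1.189, 1.143, 1.091, 0.959, 0.877, 0.820]
N_VARS = 35  # 35 standardized items, so the total variance equals 35

# Percentage of total variance carried by each component
pct_variance = [100.0 * ev / N_VARS for ev in eigenvalues]

# Cumulative percentage of variance explained
cumulative = []
total = 0.0
for p in pct_variance:
    total += p
    cumulative.append(total)

# Kaiser criterion: retain components whose eigenvalue exceeds 1
n_kaiser = sum(1 for ev in eigenvalues if ev > 1.0)

print(n_kaiser)                 # 13 components pass the Kaiser criterion
print(round(cumulative[9], 2))  # first 10 components: ~64.39% of variance
```

This reproduces the two retention figures cited in the text: thirteen components with eigenvalues above 1, and ten components jointly explaining about 64.39% of the total variance.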
Successful formative evaluation in online environments incorporates self and peer assessment, feedback, and collaborative reflection [13,16,17,53,54]. Nonetheless, large class sizes, lack of professor training, and poor digital infrastructure can hinder success. These challenges should be addressed through digital training and focused support [55,56,57]. The alignment of learning activities, assessment, and competence goals is essential. Incorporating formative e-assessment into course design models clarifies expectations, steers learning, and aligns assessment with competence [14,58,59]. Tailoring assessment to the specific e-learning context and the professional area at hand will improve relevance and efficacy further [56,58].
In short, making the most of educational technology and harmoniously optimizing its cross-disciplinary integration depends on a fertile cross-fertilization of ideas on emerging evaluation practices through e-learning experience sharing. Professional sharing is the most valuable asset of a living, dynamic, and innovative system, thanks to its ability to generate an inspiring flow of information. It is through the sharing that circulates within the teaching community that we can gauge the evolution of the professional skills fundamental to carrying out formative assessment in e-learning under the right conditions, and to offering greater visibility in virtual classrooms.
This study provides an exploratory mapping of university teachers’ competencies in formative e-assessment within a particular institutional context. The results indicate differentiated competency profiles and some associations among assessment-related practices. The conclusions highlight observed patterns rather than generalizable outcomes. Although useful, these results are limited by their context specificity and do not support causal or generalizable claims; further studies in other institutions, using complementary data sources, will be necessary to generalize these findings.

Author Contributions

Conceptualization, A.B., F.A. and M.A.; methodology, A.B., F.A. and M.A.; software, A.B., F.A. and M.A.; validation, A.B., F.A. and M.A.; formal analysis, A.B., F.A. and M.A.; investigation, A.B., F.A. and M.A.; resources, A.B., F.A. and M.A.; data curation, A.B., F.A. and M.A.; writing—original draft preparation, A.B., F.A. and M.A.; writing—review and editing, A.B., F.A. and M.A.; visualization, A.B., F.A. and M.A.; supervision, A.B., F.A. and M.A.; project administration, A.B., F.A. and M.A.; funding acquisition, A.B., F.A. and M.A. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding. The Article Processing Charge (APC) was covered by the authors.

Institutional Review Board Statement

Ethical review and approval were waived for this study as it was based on an anonymous and confidential questionnaire, involved no intervention, and posed no risk to participants, in accordance with institutional ethical guidelines.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study. Participation was voluntary, and respondents were informed of the anonymous and confidential nature of the data collection.

Data Availability Statement

The data presented in this study are not publicly available due to ethical and confidentiality restrictions related to the anonymous questionnaire and the protection of the participants’ privacy.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
ICT: Information and Communication Technology
HCI: Human–Computer Interaction
LMS: Learning Management System
PCA: Principal Component Analysis
KMO: Kaiser–Meyer–Olkin
F: Factor

Appendix A

Table A1. Summary of the Five Competency Domains and Their Items.
Techno-pedagogical skills
T1: Promote the best approaches for the successful infusion of ICT into teaching and learning practices.
T2: Keep a critical watch on the benefits and limitations of ICTs.
T3: Use techno-pedagogical structures to plan spatio-temporal arrangements adapted to training courses.
T4: Be able to produce digital resources: capsules, mind maps, etc.
T5: Learn to cope with technical failures and poor management of digital environments.
T6: Identify the need for training and technological assistance in the delivery of educational services.
T7: Confidence in the use of ICT.
Methodological skills
M8: Allow plenty of room for pedagogical maneuver.
M9: Provide the resources needed to understand and develop the teaching process.
M10: Make a variety of entertaining media available: videos, interactive formats, podcasts, etc.
M11: Reduce preparation time for learning activities.
M12: Design innovative teaching scenarios.
M13: Enable efficient class management.
M14: Adopt digital tools and practices to develop creative teaching methods.
Collaborative skills
C15: Perform dynamic tasks: Group animation.
C16: Perform dynamic tasks: Interactions.
C17: Create collaborative designs.
C18: Contribute to reinforcing student autonomy.
C19: Offer personalized feedback.
C20: Generate greater educational equity.
C21: Improve learner engagement and motivation.
Semio-ethical skills
S22: Identify signs of lack-of-interest behavior and contribute to its resolution: Student attention spans.
S23: Identify the signs of CMI and contribute to their resolution: Respecting work handover deadlines.
S24: Identify the signs of student dishonesty and contribute to their resolution: Connectivity time.
S25: Identify the signs of EBD and contribute to their resolution: Contemplative presenteeism.
S26: Identify the signs of GCE and contribute to their resolution: Plagiarism.
S27: Identify the signs of CEM and contribute to their resolution: Identity theft.
S28: Identify the signs of GCE and help resolve them: Cheating.
Evaluative skills
E29: Make assessment systems more objective.
E30: Make assessment systems more adaptable.
E31: Make assessment systems more flexible.
E32: Make assessments more efficient by using benchmarks and communicating invariants.
E33: Reduce the disruptive effects of correction: Over- and under-evaluation bias.
E34: Increase the accuracy of results through monitoring and support.
E35: Expose the personal potential of appraisees.

Appendix B

Detailed PCA outputs are provided in Appendix B, Table A2 and Table A3 for transparency and reproducibility.
Table A2. Total Variance Explained.
Component | Initial eigenvalues (Total / % of variance / Cumulative %) | Extraction sums of squared loadings (Total / % of variance / Cumulative %)
1 | 3.860 / 11.029 / 11.029 | 3.860 / 11.029 / 11.029
2 | 2.880 / 8.230 / 19.258 | 2.880 / 8.230 / 19.258
3 | 2.652 / 7.576 / 26.835 | 2.652 / 7.576 / 26.835
4 | 2.403 / 6.866 / 33.701 | 2.403 / 6.866 / 33.701
5 | 2.340 / 6.687 / 40.387 | 2.340 / 6.687 / 40.387
6 | 1.989 / 5.682 / 46.069 | 1.989 / 5.682 / 46.069
7 | 1.764 / 5.039 / 51.109 | 1.764 / 5.039 / 51.109
8 | 1.724 / 4.926 / 56.035 | 1.724 / 4.926 / 56.035
9 | 1.570 / 4.486 / 60.521 | 1.570 / 4.486 / 60.521
10 | 1.356 / 3.875 / 64.396 | 1.356 / 3.875 / 64.396
11 | 1.189 / 3.398 / 67.794 | 1.189 / 3.398 / 67.794
12 | 1.143 / 3.266 / 71.059 | 1.143 / 3.266 / 71.059
13 | 1.091 / 3.118 / 74.177 | 1.091 / 3.118 / 74.177
14 | 0.959 / 2.739 / 76.916 | (not extracted)
15 | 0.877 / 2.505 / 79.421 | (not extracted)
16 | 0.820 / 2.343 / 81.764 | (not extracted)
17–34 | … / … / … |
35 | … / … / 100.000 |
Table A3. Rotated Component Matrix (PCA).
Item | Loadings on components 1–13 (in order)
M11 | −0.860 −0.049 −0.033 −0.033 −0.124 −0.028 0.150 0.074 0.039 0.037 0.075 0.038 0.094
M13 | −0.812 0.048 −0.070 −0.224 −0.190 0.070 −0.030 −0.017 0.116 0.031 0.060 0.030 0.004
M14 | 0.665 −0.068 −0.230 0.074 0.014 0.097 0.163 −0.143 0.340 0.043 −0.060 0.090 −0.090
T4 | 0.611 −0.068 0.185 −0.074 −0.109 −0.012 −0.082 −0.035 0.130 0.136 0.491 0.116 −0.022
E32 | 0.500 0.091 −0.090 0.018 −0.065 −0.193 0.145 0.359 0.226 0.069 −0.015 −0.142 0.014
T2 | −0.034 0.953 −0.021 −0.111 −0.041 −0.041 0.000 0.033 0.001 0.037 −0.029 0.023 −0.073
C17 | −0.034 0.953 −0.021 −0.111 −0.041 −0.041 0.000 0.033 0.001 0.037 −0.029 0.023 −0.073
M8 | 0.105 0.498 −0.230 0.032 −0.080 −0.181 0.032 0.098 0.079 −0.229 0.045 −0.325 0.220
S23 | 0.007 −0.058 0.952 −0.041 −0.025 0.004 0.040 −0.002 0.132 −0.110 0.031 −0.001 −0.022
C20 | 0.007 −0.058 0.952 −0.041 −0.025 0.004 0.040 −0.002 0.132 −0.110 0.031 −0.001 −0.022
T7 | 0.112 −0.105 −0.026 0.914 0.053 −0.082 −0.024 −0.008 −0.012 0.108 0.034 −0.054 0.039
C16 | 0.136 −0.114 −0.056 0.887 0.130 −0.058 −0.054 −0.070 0.056 0.120 0.070 −0.072 0.018
C19 | 0.085 −0.060 −0.012 0.088 0.953 −0.064 −0.002 −0.019 0.035 0.010 0.024 −0.062 −0.032
M10 | 0.093 −0.051 −0.037 0.076 0.950 −0.070 −0.012 0.020 0.053 −0.031 0.036 −0.005 −0.040
T6 | 0.009 −0.068 −0.032 −0.073 −0.063 0.905 −0.065 0.023 0.006 −0.070 −0.042 0.071 −0.056
M12 | −0.033 −0.073 0.046 −0.051 −0.083 0.899 −0.018 0.039 −0.067 −0.003 0.101 −0.017 0.159
C15 | 0.087 0.031 0.090 −0.081 −0.067 −0.072 0.763 0.068 0.025 0.041 0.287 0.075 0.127
T5 | 0.441 −0.089 0.062 0.254 −0.180 0.010 −0.549 0.048 0.114 −0.026 0.178 0.123 −0.247
E31 | 0.275 −0.045 0.106 −0.147 −0.111 −0.017 −0.501 0.199 −0.171 0.134 0.074 0.424 0.290
S24 | 0.155 −0.265 0.061 0.129 −0.006 −0.058 0.501 0.377 0.035 −0.013 0.017 −0.087 0.066
T1 | 0.104 −0.121 −0.081 0.088 0.145 0.086 −0.496 0.043 0.122 −0.138 0.219 −0.264 0.473
C21 | −0.062 −0.012 −0.293 −0.193 0.036 0.052 −0.026 0.705 −0.045 −0.125 0.071 −0.004 0.131
E34 | −0.064 0.111 0.111 0.059 0.042 −0.103 0.042 0.621 −0.406 −0.043 0.203 0.010 0.130
E30 | −0.030 0.054 0.218 0.189 −0.201 0.160 0.185 0.571 0.068 −0.030 0.086 0.417 −0.146
S26 | 0.000 −0.120 −0.187 0.146 −0.062 −0.254 0.021 −0.515 −0.131 −0.117 0.096 0.102 0.396
E33 | 0.083 −0.156 0.166 −0.006 0.133 −0.108 0.118 −0.080 0.766 0.066 −0.029 −0.045 −0.035
E29 | 0.059 0.227 0.175 0.055 −0.035 0.035 −0.120 0.012 0.735 −0.013 −0.135 0.024 0.131
S28 | −0.032 0.066 −0.079 0.282 −0.012 0.116 0.089 −0.067 0.004 0.734 −0.054 −0.020 −0.018
S25 | 0.022 −0.028 −0.215 −0.006 0.002 −0.111 −0.093 0.049 0.151 0.670 0.090 −0.169 −0.035
S27 | 0.179 −0.070 0.109 −0.026 −0.064 −0.338 0.160 −0.177 −0.278 0.542 −0.236 0.062 0.030
E35 | 0.011 −0.107 0.030 0.230 −0.036 0.007 0.022 0.064 −0.211 −0.239 0.683 −0.102 −0.097
S22 | −0.115 0.135 0.024 −0.036 0.186 0.125 0.267 0.188 −0.129 0.136 0.566 0.090 0.142
M9 | 0.045 0.034 −0.032 −0.108 −0.075 0.035 0.017 −0.029 −0.030 −0.399 −0.053 0.735 0.156
T3 | 0.198 0.242 0.201 0.184 −0.105 0.039 0.069 −0.030 −0.133 −0.240 −0.393 −0.460 0.189
C18 | −0.234 −0.040 −0.006 0.016 −0.094 0.114 0.166 0.066 0.065 −0.005 −0.084 0.105 0.770
Extraction method: Principal Component Analysis. Rotation method: Varimax with Kaiser normalization. Rotation converged in 29 iterations.

References

  1. Paas, F.; Sweller, J. Implications of Cognitive Load Theory for Multimedia Learning. In The Cambridge Handbook of Multimedia Learning, 2nd ed.; Mayer, R.E., Ed.; Cambridge University Press: Cambridge, UK, 2014; pp. 27–42. [Google Scholar] [CrossRef]
  2. Sweller, J. Cognitive Load During Problem Solving: Effects on Learning. Cogn. Sci. 1988, 12, 257–285. [Google Scholar] [CrossRef]
  3. Hollan, J.; Hutchins, E.; Kirsh, D. Distributed cognition: Toward a new foundation for human-computer interaction research. ACM Trans. Comput.-Hum. Interact. 2000, 7, 174–196. [Google Scholar] [CrossRef]
  4. Hutchins, E. Cognition in the Wild, 1995. Available online: https://direct.mit.edu/books/monograph/4892/Cognition-in-the-Wild (accessed on 21 November 2025).
  5. Engeström, Y. Learning by Expanding: An Activity-Theoretical Approach to Developmental Research, 2nd ed.; Cambridge University Press: Cambridge, UK, 2014. [Google Scholar] [CrossRef]
  6. Kaptelinin, V.; Nardi, B. Acting with technology: Activity theory and interaction design. First Monday 2007, 12, 4. [Google Scholar] [CrossRef]
  7. Dix, A. (Ed.) Human-Computer Interaction, 3rd ed.; Pearson Prentice-Hall: Harlow, UK, 2010. [Google Scholar]
  8. Nielsen, J. Usability Engineering; Morgan Kaufmann Publishers Inc.: San Francisco, CA, USA, 1994. [Google Scholar]
  9. Marquez-Carpintero, L.; Viejo, D.; Cazorla, M. Enhancing Engineering and STEM Education with Vision and Multimodal Large Language Models to Predict Student Attention. IEEE Access 2025, 13, 114681–114695. [Google Scholar] [CrossRef]
  10. Hang, C.N.; Ho, S.M. Personalized Vocabulary Learning through Images: Harnessing Multimodal Large Language Models for Early Childhood Education. In Proceedings of the 2025 IEEE Integrated STEM Education Conference (ISEC), Princeton, NJ, USA, 15 March 2025; pp. 1–7. [Google Scholar] [CrossRef]
  11. Strivens, J.; Baume, D.; Grant, S.; Owen, C.; Ward, R.; Nicol, D. The Role of e-Portfolios in Formative and Summative Assessment Practices. 2009. Available online: https://www.academia.edu/30102961/The_role_of_e_portfolios_in_formative_and_summative_assessment_practices (accessed on 23 November 2025).
  12. Yerly, G.; Scallon, G. Des savoirs aux compétences. Explorations en évaluation des apprentissages. Montréal, Québec: Éditions du Renouveau Pédagogique Inc. (ERPI). Rev. Sci. L’éducation 2016, 42, 219–220. [Google Scholar] [CrossRef][Green Version]
  13. Chen, L.-L.; Chen, C.-M. Formative E-Assessment Design in Online Learning Environments. Int. J. Educ. 2023, 15, 36–49. [Google Scholar] [CrossRef]
  14. Guerrero-Roldán, A.-E.; Noguera, I. A model for aligning assessment with competences and learning activities in online courses. Internet High. Educ. 2018, 38, 36–46. [Google Scholar] [CrossRef]
  15. Cañadas, L. Contribution of formative assessment for developing teaching competences in teacher education. Eur. J. Teach. Educ. 2023, 46, 516–532. [Google Scholar] [CrossRef]
  16. Parmigiani, D.; Nicchia, E.; Murgia, E.; Ingersoll, M. Formative assessment in higher education: An exploratory study within programs for professionals in education. Front. Educ. 2024, 9, 1366215. [Google Scholar] [CrossRef]
  17. Zenouzagh, Z.M. The effect of online summative and formative teacher assessment on teacher competences. Asia Pac. Educ. Rev. 2019, 20, 343–359. [Google Scholar] [CrossRef]
  18. Balbi, A.; Bonilla, M.; Curione, K.; Ibarra, A.; Menese, P. Impact of a professional development program in formative assessment for mathematics teachers. Prof. Dev. Educ. 2025, 51, 7–22. [Google Scholar] [CrossRef]
  19. Li, J.; Gu, P.Y. Formative assessment for self-regulated learning: Evidence from a teacher continuing professional development programme. System 2024, 125, 103414. [Google Scholar] [CrossRef]
  20. Bulletin Officiel, B.O N° 4800 du 1er Juin 2000, Dahir n° 1-00-199 du 15 Safar 1421 (19 Mai 2000), 2000. Available online: https://www.encgs.ac.ma/wp-content/uploads/2025/01/loi_01-00.pdf (accessed on 22 November 2025).
  21. De Ketele, J.-M. La recherche en évaluation: Propos synthétiques et prospectifs. Mes. Évaluation Éducation 2006, 29, 99. [Google Scholar] [CrossRef]
  22. Vial, M. Se Repérer Dans les Modèles de l’évaluation. Méthodes—Dispositifs—Outils; De Boeck Supérieur: Wallonia, Belgium, 2012. [Google Scholar] [CrossRef]
  23. Rodriguez, E.G. Réflexions sur L’évaluation en Ligne et son Éventuel Potentiel Formatif. Une étude de cas en Anglais LANsad. Available online: https://adjectif.net/spip.php?article555 (accessed on 22 November 2025).
  24. Perrenoud, P. Évaluation Formative: Mais non, ce n’est pas du Chinois, Même les Parents en Font! 1992. Available online: https://www.unige.ch/fapse/SSE/teachers/perrenoud/php_main/php_1992/1992_01.html (accessed on 22 November 2025).
  25. Dionne, E.; Scallon, G. L’évaluation des Apprentissages dans une Approche par Compétences. Montréal: Renouveau Pédagogique. Rev. Sci. L’éducation 2006, 32, 815–816. [Google Scholar] [CrossRef]
  26. Carey, K. Everybody Ready for the Big Migration to Online College? Actually, No. The New York Times. 2020. Available online: https://www.nytimes.com/2020/03/13/upshot/coronavirus-online-college-classes-unprepared.html?smid=url-shar (accessed on 22 November 2025).
  27. Delacroix, E.; Jolibert, A.; Monnot, É.; Jourdan, P. Chapitre 5. La construction d’une échelle de mesure. In Marketing Research: Research Methods and Marketing Studies; Dunod: Malakoff, France, 2021; pp. 121–148. [Google Scholar] [CrossRef]
  28. Rogers, P.; Gervais, M. 12. Le renforcement des capacités en évaluation. In Approches et Pratiques en Évaluation de Programmes; Dagenais, C., Ridde, V., Eds.; Presses de l’Université de Montréal: Montréal, QC, Canada, 2012; pp. 213–232. [Google Scholar] [CrossRef]
  29. Gerard, F.-M. L’indispensable subjectivité de l’évaluation. Antipodes 2002, 156, 26–34. [Google Scholar]
  30. Nizet, I.; Leroux, J.L.; Deaudelin, C.; Béland, S.; Goulet, J. Bilan de pratiques évaluatives des apprentissages à distance en contexte de formation universitaire. Rev. Int. Pédagogie L’enseignement Supérieur 2016, 32. [Google Scholar] [CrossRef]
  31. Favale, T.; Soro, F.; Trevisan, M.; Drago, I.; Mellia, M. Campus Traffic and e-Learning during COVID-19 Pandemic. Comput. Netw. 2020, 176, 107290. [Google Scholar] [CrossRef]
  32. Lassoued, Z.; Alhendawi, M.; Bashitialshaaer, R. An Exploratory Study of the Obstacles for Achieving Quality in Distance Learning during the COVID-19 Pandemic. Educ. Sci. 2020, 10, 232. [Google Scholar] [CrossRef]
  33. Chauvin, C. La Compétence Éthique et le Gestionnaire, Ordre des Administrateurs Agréés du Québec. 2018. Available online: https://www.adma.qc.ca/outils/integration-des-nouveaux-arrivants/outils/ethique-en-gestion-1/la-competence-ethique-et-le-gestionnaire-en-contexte-quebecois/ (accessed on 22 November 2025).
  34. Bonfils, P. Repenser les dispositifs de formation à l’aune de la pandémie? Distance Mediat. Knowl. 2020, 31. [Google Scholar] [CrossRef]
  35. Forestier, A.; Martel, M. Évaluation Formative et Mise en Place d’une Pédagogie Différenciée Efficace, 2012. 58. Available online: https://dumas.ccsd.cnrs.fr/dumas-00756724 (accessed on 22 November 2025).
  36. Lapitan, L.D., Jr.; Tiangco, C.E.; Sumalinog, D.A.G.; Sabarillo, N.S.; Diaz, J.M. An effective blended online teaching and learning strategy during the COVID-19 pandemic. Educ. Chem. Eng. 2021, 35, 116–131. [Google Scholar] [CrossRef]
  37. Salem, P.; Mohamed, A.; Ali, D.; Aal, F.A.A.; Abd El Galiel, A.M.E.S. Effet de l’emploi de l’apprentissage hybride sur le dévéloppement des compétences de la production écrit du français chez les étudiants du cycle secondaire public. BSU-J. Pedagog. Curric. 2022, 1, 39–62. [Google Scholar] [CrossRef]
  38. Peltier, C. Le e-Learning n’est qu’une Simple Transposition d’un Cours Présentiel; Laboratoire d’innovation pédagogique, Université de Genève: Geneva, Switzerland, 2020. Available online: https://www.lip-unige.ch/2020/04/01/infox7-le-e-learning-nest-quune-simple-transposition-dun-cours-presentiel/ (accessed on 22 November 2025).
  39. Jennings, P.A. Jennings—2015—Mindfulness For Teachers PDF|PDF|Mindfulness|Classroom, Scribd. 2015. Available online: https://fr.scribd.com/document/403144983/Jennings-2015-Mindfulness-for-Teachers-pdf (accessed on 22 November 2025).
  40. Carle, S. Ralentir l’enseignement: Des pauses cognitives pour favoriser l’apprentissage. Pédagogie Collégiale 2017. [Google Scholar]
  41. Coman, C.; Țîru, L.G.; Meseșan-Schmitz, L.; Stanciu, C.; Bularca, M.C. Online Teaching and Learning in Higher Education during the Coronavirus Pandemic: Students’ Perspective. Sustainability 2020, 12, 10367. [Google Scholar] [CrossRef]
  42. Owusu-Fordjour, C.; Koomson, C.K.; Hanson, D. The Impact of COVID-19 on Learning—The Perspective of the Ghanaian Student. Eur. J. Educ. Stud. 2020, 7, 88–101. [Google Scholar] [CrossRef]
  43. Gaudreau, N. Gérer Efficacement sa Classe: Les Cinq Ingrédients Essentiels. 2017. Available online: https://www.researchgate.net/publication/341254791_Gerer_efficacement_sa_classe_les_cinq_ingredients_essentiels (accessed on 22 November 2025).
  44. Tang, Y.M.; Chen, P.C.; Law, K.M.Y.; Wu, C.H.; Lau, Y.; Guan, J.; He, D.; Ho, G.T.S. Comparative analysis of Student’s live online learning readiness during the coronavirus (COVID-19) pandemic in the higher education sector. Comput. Educ. 2021, 168, 104211. [Google Scholar] [CrossRef]
  45. Zhang, C. Who bypasses the Great Firewall in China? First Monday 2020, 25. [Google Scholar] [CrossRef]
  46. Martin, A.J. How to Optimize Online Learning in the Age of Coronavirus (COVID-19): A 5-Point Guide for Educators, ResearchGate. 2020. Available online: https://www.researchgate.net/publication/339944395_How_to_Optimize_Online_Learning_in_the_Age_of_Coronavirus_COVID-19_A_5-Point_Guide_for_Educators (accessed on 22 November 2025).
  47. Martins, J. Le Secret d’une Bonne Dynamique D’équipe [2025] • Asana, Asana. 2025. Available online: https://asana.com/fr/resources/improving-group-dynamics (accessed on 22 November 2025).
  48. Liguori, E.; Winkler, C. From Offline to Online: Challenges and Opportunities for Entrepreneurship Education Following the COVID-19 Pandemic. Entrep. Educ. Pedagog. 2020, 3, 346–351. [Google Scholar] [CrossRef]
  49. Raby, C.; Karsenti, T.; Meunier, H.; Villeneuve, S. Usage des TIC en pédagogie universitaire: Point de vue des étudiants. Rev. Int. Technol. Pédagogie Univ. 2011, 8, 6–19. [Google Scholar] [CrossRef]
  50. Endrizzi, L. Les Technologies Numériques Dans L’enseignement Supérieur, Entre Défis et Opportunités. 2012. Available online: https://ens-lyon.hal.science/ensl-01651604 (accessed on 29 January 2026).
  51. Rapport_annuel_REFAD_2018-2019. 2018. Available online: https://www.refad.ca/wp-content/uploads/2019/06/Rapport_annuel_REFAD_2018-2019.pdf (accessed on 22 November 2025).
  52. Activités de Perfectionnement du REFAD, REFAD. 2020. Available online: https://refad.ca/activites-de-perfectionnement/activites-de-perfectionnement-du-refad-2020-2021/ (accessed on 22 November 2025).
  53. Pradnyadewi, D.A.M.; Dewi, N.L.P.E.S.; Marsakawati, N.P.E.; Paramartha, A.A.G.Y. Formative Assessment Practices in Online Learning for Assessing Students’ Writing Competence. Indones. J. Lang. Teach. Linguist. 2022, 7, 101–122. [Google Scholar] [CrossRef]
  54. Ogange, B.O.; Agak, J.O.; Okelo, K.O.; Kiprotich, P. Student Perceptions of the Effectiveness of Formative Assessment in an Online Learning Environment. Open Prax. 2021, 10, 29–39. [Google Scholar] [CrossRef]
  55. Abd Halim, H.; Hamzah, M.I.; Zulkifli, H. Secondary School Teachers’ Formative Assessment Competencies in the Digital Environment: A Systematic Review. Pertanika Proc. 2025, 1, 7–11. [Google Scholar] [CrossRef]
  56. Gikandi, J.; Njuguna, A.M. Formative E-assessment as a Tool for Promoting Competence-Based E-Learning in Universities: A Contextualized Perspective. In Handbook of Research on Digital-Based Assessment and Innovative Practices in Education; IGI Global Scientific Publishing: Hershey, PA, USA, 2020. [Google Scholar]
  57. Shmigirilova, I.B.; Kopnova, O.L. Formative assessment of students in conditions of distance learning. Aktual. Vopr. Obraz. 2021, 121–125. [Google Scholar] [CrossRef]
  58. van Gog, T.; Sluijsmans, D.M.A.; Brinke, D.J.-T.; Prins, F.J. Formative assessment in an online learning environment to support flexible on-the-job learning in complex professional domains. Educ. Tech. Res. Dev. 2010, 58, 311–324. [Google Scholar] [CrossRef]
  59. Sudakova, N.E.; Savina, T.N.; Masalimova, A.R.; Mikhaylovsky, M.N.; Karandeeva, L.G.; Zhdanov, S.P. Online Formative Assessment in Higher Education: Bibliometric Analysis. Educ. Sci. 2022, 12, 209. [Google Scholar] [CrossRef]
Figure 1. Explanatory model of the university components.
Figure 2. ICT and halo of professional skills mobilized during online formative evaluation.
Figure 3. Weighting scale.
Figure 4. Histogram of total scores for the five skills targeted by ICT integration.
Figure 5. Average total scores for the five skills targeted by ICT integration.
Figure 6. Histogram of averages for techno-pedagogical skills variables.
Figure 7. Histogram of averages for methodological skills variables.
Figure 8. Histogram of means for collaborative skills variables.
Figure 9. Histogram of means for semio-ethical skills variables.
Figure 10. Histogram of averages for evaluative skills variables.
Figure 11. Distribution of the variables of the five skills according to their co-occurrence.
Figure 12. Representation qualities of professional skills variables.
Figure 13. Extraction averages for professional skills variables.
Figure 14. Eigenvalue graph.
Figure 15. Logical links defining the constructs of each factor.
Table 1. Bivariate correlations.
M10–C19. Variable M10: Make a variety of entertaining media available: videos, interactive formats, podcasts, etc. Variable C19: Offer personalized feedback. Spearman correlation: 0.924 **; Sig. (two-tailed): 0.000; N = 115.
M11–M13. Variable M11: Reduce preparation time for learning activities. Variable M13: Enable efficient class management. Spearman correlation: 0.544 **; Sig. (two-tailed): 0.000; N = 115.
M12–T6. Variable M12: Design innovative teaching scenarios. Variable T6: Identify the need for training and technological assistance in the delivery of educational services. Spearman correlation: 0.695 **; Sig. (two-tailed): 0.000; N = 115.
T7–C16. Variable T7: Confidence in ICT. Variable C16: Perform dynamic tasks: Interactions. Spearman correlation: 0.822 **; Sig. (two-tailed): 0.000; N = 115.
** Correlation is significant at the 0.01 level (two-tailed).
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Boumahdi, A.; Ammari, F.; Ammari, M. An HCI-Centered Experiences of ICT Integration and Its Impact on Professional Competencies Supporting Formative Assessment in Higher Education e-Learning. Multimodal Technol. Interact. 2026, 10, 14. https://doi.org/10.3390/mti10020014
