Article

Perceptions of Students and Teachers Regarding Remote and Face-to-Face Assessments in the Evolving Higher Education Landscape

by Daniel Humberto Pozza 1,2,*, José Tiago Costa-Pereira 1,2,3 and Isaura Tavares 1,2
1 Experimental Biology Unit, Department of Biomedicine, Faculty of Medicine of Porto, University of Porto, 4200-319 Porto, Portugal
2 Institute for Research and Innovation in Health and IBMC, University of Porto, 4200-135 Porto, Portugal
3 Faculty of Nutrition and Food Sciences, University of Porto, 4150-180 Porto, Portugal
* Author to whom correspondence should be addressed.
Educ. Sci. 2025, 15(3), 360; https://doi.org/10.3390/educsci15030360
Submission received: 12 February 2025 / Revised: 10 March 2025 / Accepted: 11 March 2025 / Published: 13 March 2025

Abstract

In the post-pandemic era, characterized by rapid digital transformation, teaching and evaluation methods must evolve to meet the new reality, as students and educators continue to express concerns about fairness and integrity. The objective of this cross-sectional study was to evaluate the perceptions of students and teachers on the introduction of remote assessments in the evolving higher education landscape, which was significantly disrupted by the 2020 pandemic but retained methods and approaches that are still in use. Data collection comprised a sample of 989 students and 266 teachers. The results demonstrated a current preference for face-to-face assessments, which were considered fairer and more equitable, with less dishonesty, stress, and time consumption. Dishonesty was the main concern related to remote assessments for both students and teachers. Remote assessments were undervalued, likely because the rapid adaptation did not allow enough time for proper models to be developed. Improving remote and hybrid assessments is expected to lead to greater satisfaction and confidence among teachers and students. In the era of artificial intelligence and accelerated advancements in educational technology, this article exposes the limitations and advantages of remote assessment, suggests improvements, and highlights the gap between students’ and teachers’ perceptions of this form of evaluation. The findings underscore the need to reimagine traditional pedagogy to accommodate diverse learning preferences, integrate emerging technologies, and develop the skills needed for a rapidly evolving world. New teaching methodologies that foster critical thinking are crucial for improving student learning and initiating a necessary paradigm shift in evaluation methods to effectively prevent cheating.

1. Introduction

Digital transformation is continuously reshaping the education landscape, with rapid advances in artificial intelligence (AI) playing a pivotal role in enhancing access to information. However, the success of this transformation extends beyond simply understanding how to use technology; it also requires the strategic integration of these tools into the learning process. Several domains are involved, including the use of online educational platforms, understanding their benefits and limitations, and, most importantly, gathering feedback from both students and teachers on the implemented innovations (Baczek et al., 2021; Donia et al., 2022; Dost et al., 2020; Elsalem et al., 2020; Fitzgerald et al., 2021; Garcia-Seoane et al., 2021; Kamalov et al., 2023; Nathaniel et al., 2021; Stain et al., 2005; Zhang & Dong, 2024).
The 2020 pandemic led to significant changes in classic face-to-face teaching and examination methods. As a result, many in-person exams transitioned rapidly to remote formats, with monitoring conducted through virtual video methods and software aimed at maintaining the integrity of the evaluation process (Alsoufi et al., 2020; Baczek et al., 2021; Coe et al., 2020; Fitzgerald et al., 2021; Summers et al., 2022). This shift occurred in Portuguese universities from March 2020, affecting millions of students, teachers, and administrative staff who had to quickly adapt to remote assessment conditions. While universities provided training for teachers on using online tools, the focus was largely on delivering online lectures rather than on assessments. Consequently, teachers faced the challenge of effectively using digital tools to evaluate students, leading to thorough scrutiny of the evaluation process by the students (Alsoufi et al., 2020; Amigud, 2020; Baczek et al., 2021; Dost et al., 2020; Elsalem et al., 2021; Elsalem et al., 2020; Slade et al., 2021; Yorke et al., 2020; Zhang & Dong, 2024).
The transition to remote exams during the pandemic revealed significant stress and behavioral changes induced by e-exams, highlighting the need for supportive measures to mitigate stress (Elsalem et al., 2020). Additionally, there was a marked preference for traditional in-person exams due to concerns regarding fairness and integrity (Elsalem et al., 2021). Online exams have been found to be more susceptible to cheating, potentially undermining the validity of assessment outcomes (Fask et al., 2014). Ensuring integrity in remote assessments requires a combination of pedagogical and technological/digital strategies to enhance trust and reliability. Secure assessment designs are essential to mitigating risks, such as cheating and identity fraud, while balancing security measures with accessibility and user-friendliness, ensuring that integrity measures do not hinder student engagement or learning outcomes (Aristeidou et al., 2024; Hilliger et al., 2022; Kaisara & Bwalya, 2023; Khalil et al., 2022).
Furthermore, student demographics are changing significantly, with an increase in the diversity of cultural and socioeconomic backgrounds. Many students face difficulties attending face-to-face classes and examinations due to living far away, work commitments, family responsibilities, and health issues. These changes require higher education institutions to adapt their pedagogical approaches to meet the needs of a more diverse student population and to offer some flexibility in learning options (Balram, 2019; Balram & Boxall, 2019; Lewohl, 2023; Vasquez et al., 2023; Zhang & Dong, 2024).
The transformation of the educational process, driven by pedagogical research and paradigm shifts aided by new technologies, includes innovative approaches such as flipped classrooms, game-based learning, problem-based learning, problem-solving, and inquiry-based learning. These methods improve engagement and motivation, further enhancing the learning experience, and can be used in student evaluations. This evolution fosters fair evaluations, stimulates critical thinking, and enhances soft skills. Additionally, it significantly reduces, if not eliminates, the possibility of cheating. By grounding these pedagogical innovations in evidence-based principles, the quality and relevance of higher education are enhanced, preparing students for real-world scenarios (Alan & Yurt, 2024; Köpeczi-Bócz, 2024; Lewohl, 2023; Mialkovska et al., 2024; Munna & Kalam, 2021; Ni et al., 2024; Pimdee et al., 2024; Ruslan et al., 2024; Wang, 2024).
Taking into account that traditional pedagogy has to adapt to accommodate diverse learning needs, the objective of this study was to evaluate the perceptions of students and teachers regarding the advantages and disadvantages of remote assessments at a Portuguese university. Remote assessment remains a topic requiring careful consideration, as it presents potential drawbacks, such as computer network instability, e-cheating, and other forms of academic misconduct. However, it also offers advantages, including the potential for more flexible and dynamic modes of evaluation.

2. Materials and Methods

2.1. Study Design and Data Collection

This research employed both quantitative and qualitative methods, with data collection occurring between 31 May and 30 June 2021. A cross-sectional survey was conducted using Google Forms questionnaires, distributed via the Dynamic Email System of the University of Porto, without recording identifying data. Fifteen days after the first email, a reminder was sent to maximize the response rate. To minimize bias, participants were not informed of the project details. Data were acquired through self-administered questionnaires, without intervention from the authors or any other individuals, and no identifying information was collected from participants, in order to maintain confidentiality. Only fully completed questionnaires were included in the data analysis. The inclusion criteria were limited to university teachers and students from the University of Porto, aged 18 years or older, who understood and agreed to participate anonymously by providing informed consent.
The questionnaires were developed based on prior research (Alsoufi et al., 2020; Ashworth et al., 1997; Baczek et al., 2021; Bretag et al., 2019; Elsalem et al., 2021; Elsalem et al., 2020; Franklyn-Stokes & Newstead, 1995; Sattler et al., 2017; Yorke et al., 2020). They included ten multiple-choice questions related to remote exams and one open-ended question designed to gauge overall opinions on remote assessments. Before distribution, the questionnaires were piloted with five faculty members to ensure clarity. The versions for students and teachers were fundamentally similar, with only minor adjustments to accommodate each group’s specific context (see Supplementary Materials). To avoid misinterpretations, the questionnaires were administered solely in Portuguese.

2.2. Data Analysis

The data obtained from the questionnaires were analyzed using the IBM® SPSS® Statistics software, version 27. Descriptive statistics were employed to summarize the demographic characteristics of the sample, such as faculty affiliation and years of teaching experience (teachers) or year of the course (students), as well as the frequency of responses. The Chi-square test was used to compare categorical variables. The responses to the open-ended question were qualitatively analyzed and summarized by two researchers, with the Kappa statistic applied to verify inter-rater concordance.
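As a minimal illustration of the procedures described above, and not the authors’ actual analysis script, the following Python sketch shows how a Chi-square test on a group-by-answer contingency table and a kappa coefficient for inter-rater agreement could be computed with SciPy and scikit-learn; the counts, the category labels, and the assumption that the statistic used was Cohen’s kappa are all hypothetical.

```python
# Minimal sketch of the analyses described above (hypothetical data, not the study dataset).
import numpy as np
from scipy.stats import chi2_contingency
from sklearn.metrics import cohen_kappa_score

# Hypothetical contingency table for one multiple-choice question:
# rows = groups (students, teachers), columns = answer options (Higher, Equal, Lesser).
observed = np.array([
    [348, 475, 166],   # students
    [211,  47,   8],   # teachers
])
chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"Chi-square = {chi2:.2f}, dof = {dof}, p = {p_value:.4f}")

# Hypothetical categorizations of the same open-ended answers by two researchers,
# used to check inter-rater concordance with Cohen's kappa.
rater_1 = ["cheating", "fairness", "cheating", "time", "cheating", "fairness"]
rater_2 = ["cheating", "fairness", "cheating", "time", "time",     "fairness"]
kappa = cohen_kappa_score(rater_1, rater_2)
print(f"Cohen's kappa = {kappa:.3f}")
```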

3. Results

Data collection comprised 1255 questionnaires: 989 from undergraduate students and 266 from teachers at the University of Porto. To facilitate data interpretation, the 14 faculties that constitute the University of Porto were grouped into three main scientific areas according to the Directorate General of Education and Science Statistics of the Ministry of Education (DGEEC: https://estatistica.dgeec.mec.pt/docs/docs_cdr/Classificacao_FOS_VersaoPortuguesa.pdf, accessed on 12 March 2025). These areas were: 1. Humanities/Social Sciences (Faculty of Law, Faculty of Arts and Humanities, Faculty of Fine Arts, Faculty of Architecture, Faculty of Psychology and Education Sciences, and Faculty of Economics); 2. Exact/Natural/Technological Sciences (Faculty of Engineering and Faculty of Sciences); and 3. Health Sciences (Faculty of Dental Medicine, Faculty of Medicine, Faculty of Nutrition and Food Sciences, Faculty of Pharmacy, Faculty of Sports, and Institute of Biomedical Sciences).

3.1. Descriptive Analysis

The analysis of the student data showed that the mean year of the course was 2.5 ± 1.3 (median = 2), with 53.8% of students in the first two years and 46.2% in the third to sixth years of their studies. The distribution of students across the three main areas was: 14.6% in Humanities/Social Sciences, 37.0% in Exact/Natural/Technological Sciences, and 48.4% in Health Sciences.
The mean number of years of teaching experience among teachers was 24.7 ± 11.4 (median = 25). Approximately half of the teachers (50.8%) had 25 or fewer years of teaching experience, while 49.2% had 26 or more years. The distribution of teachers across the three main areas was: 13.5% in Humanities/Social Sciences, 36.8% in Exact/Natural/Technological Sciences, and 49.6% in Health Sciences.

3.2. Comparative Analysis

The multiple-choice answers included in the questionnaires are presented in Table 1, Table 2, Table 3 and Table 4. Table 1 shows the comparisons between teachers and students. The general consensus is that face-to-face exams are preferred over e-exams, as they are considered fairer, more equitable, less prone to dishonesty, less stressful, and less time-consuming. When comparing the groups, question 2 (“Dishonesty in e-exams”) is the only item on which students’ and teachers’ responses did not differ significantly, with both groups believing that dishonesty is higher in remote evaluations. For the remaining questions, several differences emerge. Teachers report spending more time and effort preparing e-exams (question 1), believe more strongly that students think cheating is justifiable (question 3), and perceive that students are more likely to cheat in e-exams (question 4). Additionally, teachers are more likely to prefer face-to-face exams, report having worse technological conditions (question 6), feel less comfortable (question 7), and are more stressed (question 9) during e-exams. Teachers predominantly believe that face-to-face exams are better than remote assessments for the learning process.
In the student group, an analysis was performed by dividing the population into those in the first two years of their course versus those in the last three years (third to sixth). This division was based on the mean/median year of the course (2.5/2), with 53.8% of students in the first two years and 46.2% in the third year or beyond (Table 2). The comparative analysis demonstrated that students in the later years experienced less stress and required less time and effort to prepare for e-exams, and they were more likely to cheat in exams. Although both groups generally preferred face-to-face exams, this preference was more pronounced among students in the first two years.
In the teacher group, an analysis was performed based on the number of years of teaching experience. With a mean teaching duration of 25 years, 50.8% of teachers had 25 or fewer years of experience, while 49.2% had more than 25 years, making 25 years the cut-off point for the statistical comparisons, as shown in Table 3. Only slight differences were observed between the two subgroups. Teachers with 26 or more years of experience reported being less comfortable with technologies and were more likely to suspect that students consider cheating on exams justifiable.
The three main areas of knowledge (Humanities/Social Sciences, Exact/Natural/Technological Sciences, and Health Sciences) were also compared within the student (Table 4) and teacher (Table 5) groups.
Although all the knowledge areas generally preferred face-to-face exams, this preference was especially pronounced among students in the Exact Sciences. Students in the Exact Sciences subgroup also reported that face-to-face exams were fairer and more equitable, and that electronic exams required more time and effort compared with face-to-face assessments. Students in the Humanities/Social Sciences reported being more comfortable with technologies and experiencing less stress during e-exams. The Health Sciences subgroup reported greater perceived honesty in e-exams and observed a smaller difference between the two types of exams regarding their impact on learning.
Regarding teachers across the three scientific areas, there were no statistically significant differences in most comparisons. However, teachers in the Exact Sciences were less comfortable with technologies and tended to report lower satisfaction with their technological equipment.

3.3. Qualitative Analysis

Responses to the open-ended question about general opinions are summarized in the following paragraphs. For teachers, 101 valid responses were collected, revealing several key perceptions of remote evaluations:
  • Cheating: reported 53 times. Teachers felt that it is impossible to ensure honesty in e-exams, and when fraud is detected, it is harder to prove than in face-to-face assessments.
  • Less equitable and fair: reported 23 times. Teachers recognized that some measures that are adopted to prevent cheating (e.g., less time and sequential exams) hinder students’ reasoning. Not all academics have good technological conditions.
  • More difficult and time consuming: reported 21 times. It requires additional technological know-how and time to prepare and evaluate students.
  • Not reliable in all knowledge areas: reported 11 times, specifically in areas such as the arts.
  • A feasible solution during pandemics: reported 8 times.
The Kappa test for this evaluation yielded a score of 0.863, indicating strong agreement.
To illustrate this better, some quotes have been directly translated into English:
“It is a source of inequality between students who are honest and those who are dishonest, which is often the case in remote assessments. In my view, it is not possible to evaluate the real capabilities of students with full integrity. An assessment should never be conducted remotely nowadays, as students feel very comfortable with new technologies and know how to use them skillfully to exchange information with their peers.”
“Although I agree that the way it was performed in most courses facilitated fraud, there were also reliable ways to conduct assessments. Ideally, in the future, there should be a mix of in-person and remote assessments. I also emphasize that the single final assessment model is obsolete, and the adaptations that some professors made proved that with continuous assessment, students learn more and do not have as much stress condensed into a single assessment moment.”
“I felt that some professors adapted better than others, and often it is a matter of creating or adopting new assessment methods and interacting with students.”
For students, 199 valid responses were obtained, and the main perceptions of the remote evaluations were:
  • Harder and more stressful than face-to-face exams: reported 100 times. Students cited a lack of information about the exam, a shorter solution time, the inability to freely navigate between questions, more difficult questions, internet problems, difficulty in communicating with teachers during e-exams, excessive screen time, the inability to use draft paper to construct their answers, and an unfavorable home environment as major complaints.
  • More comfortable: reported 58 times. This was particularly noted by student workers who have less time to go to the University.
  • Cheating: reported 54 times. Students reported that cheating is very easy in remote evaluations, making it unfair for those who are honest.
  • Less equitable and fair: reported 49 times.
  • A feasible solution during pandemics: reported 21 times.
  • Unfavorable home environment: reported 13 times.
  • Not reliable in all knowledge areas: reported 10 times.
The Kappa test for this evaluation yielded a score of 0.946, indicating almost perfect agreement.
To illustrate this better, some quotes have been directly translated into English:
“Remote assessment introduces a series of variables beyond our control (namely, the quality and reliability of the equipment used and the internet connection), contributing to more stress during the exam and leading to greater inequalities among students.”
“Remote assessment can be performed from any location, as long as there is internet access. On the other hand, in-person assessment requires traveling to the university, which puts displaced students at a disadvantage since they have to wake up earlier and, compared with others, experience more stress before the exam due to traveling to the location using public transportation and ensuring punctuality.”
“Remote assessments greatly contribute to the increase in dishonesty during evaluations and to the amplification of the impact of socioeconomic inequalities on academic results.”
“Remote assessments are more convenient and practical, but they put more pressure on us. This is because there is an expectation to achieve better results, as it is assumed that we might cheat. It is also more difficult to concentrate on the evaluation, and students are not on an equal footing.”

4. Discussion

This study highlights some differences between students’ and teachers’ general perceptions of remote evaluations, but both groups considered that the remote assessments are stressful. Disparities were detected between the perceptions of students of the first two years and more advanced students regarding their ability to adapt and cheat in online evaluations. Additionally, the study pointed to some differences between distinct areas of knowledge and considerable mismatches between students and teachers in certain fields.
Both students and teachers expressed a preference for face-to-face examinations, likely due to the challenges posed by the rapid and forced shift to remote assessments. These challenges included inadequate internet access, caregiving responsibilities, and difficulties concentrating on screens, as reported in the open question of the study. With the main exception of some student workers, who reported being more comfortable with e-exams because of the time saved in commuting to the university, face-to-face examinations were considered less time-consuming, less stressful, less prone to dishonesty, and less effort-intensive. This was true mainly for students in the Exact Sciences group, who also reported learning better when studying for face-to-face assessments. It is important to ensure the reliability of online platforms and student equipment (including internet access) to perform equitable online assessments. There are still many disadvantaged students who experience financial or technical difficulties, mainly related to a stable and reliable internet service, when using e-learning platforms. Furthermore, the home environment is not always suitable, as was also reported in our study, since noise and disturbances can affect exam performance (Alsoufi et al., 2020; Dhawan, 2020; Dost et al., 2020; Nathaniel et al., 2021; Summers et al., 2022; Tamrat, 2021). Together, all these factors likely contributed to the stress, dishonesty, excessive effort, and time consumption reported by the students.
Despite the majority of students at our university having good or adequate technological equipment and being comfortable with using it, as previously reported (Alsoufi et al., 2020), face-to-face exams were clearly considered fairer and more equitable, especially by students in the Exact Sciences. Remote assessments were undervalued, probably because the rapid adaptation period did not allow enough time for proper models to be developed. The reported inherent flaws of remote assessments, such as difficulties in ensuring academic integrity and the challenge of accurately measuring student performance, exacerbate these issues. Systemic problems in remote education, such as unequal access to technology and internet connectivity, further compound the problem, leading to disparities in educational outcomes. On the other hand, computer-based testing environments can have practical advantages, and some studies suggest that they can be emotionally better for some students compared with face-to-face examinations (Harley et al., 2021). Therefore, a blended teaching and evaluation process could be designed to incorporate the best of both face-to-face and online modalities in non-pandemic conditions (Ravat et al., 2021). Improving remote and hybrid assessments is expected to lead to greater satisfaction and confidence among teachers and students, especially in the era of AI and accelerated advancements in educational technology. As noted above, a combination of pedagogical and technological/digital strategies needs to be implemented, since the experiences during the pandemic led to an increase in online teaching at universities, namely in postgraduate courses. Several possibilities need to be considered, namely redesigning exams to shift the focus from pure memorization to critical thinking and the application of knowledge to new situations. Along with this pedagogical adjustment, adjusting the duration of the evaluation is important to mitigate the possibility of collaborative responses between students and of extensive searches on the available platforms, since AI tools have improved considerably since the pandemic. These tools may, however, be very useful nowadays: after expanding the question banks, algorithms may be used to randomize the questions and even to tailor the exam to each student, for example by increasing the degree of difficulty of sequential questions. In online courses, replacing the traditional single final exam with regular quizzes in each class should also be considered.
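As an illustration only, and not a method used or proposed in this study, the following Python sketch shows one way a question bank could be randomized per student, with difficulty increasing across sequential questions; the Question class, the 1–5 difficulty scale, and the per-student seeding scheme are all hypothetical.

```python
# Hypothetical sketch of randomized, difficulty-increasing question selection from a question bank.
import random
from dataclasses import dataclass

@dataclass
class Question:
    text: str
    difficulty: int  # assumed scale: 1 (easy) to 5 (hard)

def build_exam(bank: list[Question], num_questions: int, seed: int) -> list[Question]:
    """Draw a randomized exam in which the target difficulty rises with each question."""
    rng = random.Random(seed)            # a per-student seed yields a different exam per student
    remaining = list(bank)
    exam, target = [], 1                 # start at the easiest assumed difficulty level
    for _ in range(num_questions):
        # Prefer questions at the current target difficulty; fall back to anything left.
        candidates = [q for q in remaining if q.difficulty == target] or remaining
        choice = rng.choice(candidates)
        remaining.remove(choice)
        exam.append(choice)
        target = min(target + 1, 5)      # raise the difficulty target for the next question
    return exam

# Example: a tiny hypothetical bank and one student's exam.
bank = [Question(f"Q{i}", difficulty=1 + i % 5) for i in range(20)]
for q in build_exam(bank, num_questions=5, seed=42):
    print(q.text, q.difficulty)
```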
This study also revealed that senior students experienced less stress and required less time and effort to prepare for e-exams. One hypothesis is that as students progress to more advanced educational levels, their ability to adapt to new situations, including becoming more skillful in cheating, is enhanced. We should not discard other possibilities due to the unique circumstances of the pandemic context. For instance, unmeasured factors, such as students’ living conditions during this period, could significantly influence stress and anxiety (Solmi et al., 2025; Tavares-Almeida et al., 2023), potentially confounding our findings. Repeating the study in the current context may therefore be important. These results also indicate that it may be important to define measures to support younger students, helping them cope better with sudden changes (Costa et al., 2022; Morgado et al., 2021; Solmi et al., 2022). Furthermore, it is evident that cheating increases as the courses progress, and honesty should be fostered in the university milieu. Younger and more advanced students may warrant different measures, since the latter are more experienced in the examination process, more likely to prefer e-exams, and report that it is easy to cheat.
This is a highly complex problem that should be addressed on multiple levels and involve various committees, including students and experts in the field. To start, universities should create opportunities for open discussions on the importance of honesty and ethical behavior, ensuring that these conversations are both meaningful and impactful. Given that 16.2% of students admitted that cheating during remote exams is justifiable, it is essential to go beyond merely promoting ethical values and actively engage students in critical reflections on integrity. Identifying vulnerable students and addressing the root causes of dishonesty, such as stress and perceived unfairness, can help to foster a deeper understanding of the long-term professional and ethical consequences of cheating. Notably, rapid changes in academia are making cheating easier due to technological advancements, including AI, when these tools are used by students for this purpose rather than for fostering learning through critical thinking. Easy access to resources, facilitated by AI tools, has made it simpler for students to engage in dishonest practices. Furthermore, there is a constant need to critically integrate emerging technologies in education, and attention should be paid to concerns about excessive remote surveillance, which can increase stress and diminish autonomy for both groups (Ashworth et al., 1997; Bretag et al., 2019; Curtis & Clare, 2017; Scarfe et al., 2024).
In our study, 76.7% of the teachers believed that students were likely to cheat, while only 16.2% of the students felt that cheating was justifiable. It is not possible to make direct comparisons; however, we hypothesize that this significant gap may be attributed to a lack of student honesty or a high level of distrust among teachers, both of which complicate the evaluation process in online exams. Consequently, this distrust could lead professors to design more difficult online exams in an effort to prevent cheating. Despite the lower reported rates of cheating among students, this remains a concerning issue. Many students may not admit to cheating, may engage in it despite knowing it is wrong, or, even worse, may believe that it is acceptable. Previous studies have shown that only 7.9% of students engaged in contract cheating (Curtis & Clare, 2017), while 5.8% admitted to cheating (Bretag et al., 2019). These studies also indicate that cheating is primarily influenced by opportunity. Furthermore, the actual rates of dishonesty are likely much higher than self-reports suggest, with many students admitting to repeatedly cheating as a strategy for completing their studies (Bretag et al., 2019; Curtis et al., 2021).
The average teaching experience in the teacher sample was approximately 25 years. Our results indicate that teachers with more than 25 years of experience reported greater difficulties in adapting to online evaluations compared with their younger colleagues. Additionally, 40% of these teachers were not comfortable with online evaluations, and most reported poor technological conditions. In this context, teachers expressed that the increased time and effort required to prepare e-exams made them more likely to prefer face-to-face exams. Academic aging is a current issue in several universities (Kaskie et al., 2017) and, according to the difficulties reported by teachers with more than 25 years of experience in the present study, it may affect the so-called ‘digital transition’.
Training strategies in technological resources for teachers may increase the reliability of, and trust in, the online evaluation process. The teaching staff should be continuously updated to develop technological skills that enable evaluation through innovative solutions. These solutions may include blended methods, allowing students to be evaluated on their critical thinking skills (Abdellatif et al., 2022; Almasri, 2024; Cheriguene et al., 2021; Darling-Aduana, 2021; Dhawan, 2020; Dost et al., 2020; Juele, 2018; Nathaniel et al., 2021; Scarfe et al., 2024; Shelton et al., 2017; Slade et al., 2021). Additionally, the proper integration of AI into the education and examination process, fostering critical thinking and soft skills, can enhance the fairness and effectiveness of remote and hybrid assessments. This, in turn, can lead to greater satisfaction and confidence among both teachers and students (Abdellatif et al., 2022; Almasri, 2024).
The transformation of the educational process, driven by pedagogical research and paradigm shifts aided by new technologies, is essential to overcome new challenges, such as another pandemic (Costa et al., 2022; Dhawan, 2020; Morgado et al., 2021; Silva Moreira et al., 2021). Before widely implementing online assessments, it is crucial to address these paradigm changes. Teachers must inevitably adapt to new technologies to ensure better and fairer exams, addressing the significant concern of e-cheating. Incorporating innovative teaching methods, such as flipped classrooms and game-based and problem-based learning, among others, stimulates critical thinking and improves engagement and motivation, further enhancing the learning experience. This approach can foster fair evaluations, enhance soft skills, and significantly reduce, if not eliminate, the possibility of cheating (Chang et al., 2022; Chi et al., 2022; Köpeczi-Bócz, 2024; Ruslan et al., 2024). Embracing these changes can help to create a more reliable and trustworthy hybrid evaluation process, ultimately benefiting both educators and students. By grounding these pedagogical innovations in evidence-based principles, educators can enhance the quality and relevance of higher education, better preparing students for real-world scenarios.
In the open question of this study, students complained that e-exams frequently used the same paradigms as face-to-face exams while reducing the time available to answer and making it impossible to return to previous questions. Teachers justified this approach as a way to reduce the possibilities of cheating. However, students reported that this solution did not prevent cheating; instead, it had the opposite effect, as they did not have enough time to think critically about the test questions. This likely increased the perception gap between students and professors, furthering distrust in the evaluation process. Thus, there is a need to transition to an effective and fair remote assessment system. Additionally, there are specific technical requirements to monitor and assist students during online assessments, which can raise both technical and ethical issues (Alsoufi et al., 2020; Cheng et al., 2021; de Boer, 2021; Dost et al., 2020; Potu et al., 2021). Online examinations cannot be performed in the same way as face-to-face exams; they require adaptations, namely avoiding evaluation through a single final assessment, promoting and evaluating active learning activities, choosing better timing, giving students enough time to complete assessments, and using quizzes and problem-solving to motivate students, as well as providing training and gathering student feedback (Alsoufi et al., 2020; Ashworth et al., 1997; Cheng et al., 2021; Curtis & Clare, 2017; Dhawan, 2020; Donia et al., 2022; Dost et al., 2020; Garcia-Seoane et al., 2021; Kumaravel et al., 2021; Looi et al., 2021; Nathaniel et al., 2021; Silva et al., 2021; Slade et al., 2021; Vonderwell & Boboc, 2013).
Online assessments are important in emergency cases to keep the educational process working (Alsoufi et al., 2020; Cheriguene et al., 2021; Dost et al., 2020; Elsalem et al., 2021; Potu et al., 2021), and summative online assessments can achieve student performance comparable to that of face-to-face examinations; nevertheless, they still have some limitations, mainly for subjects that require a more practical and interactive performance. Additionally, students are concerned and less satisfied about how online activities, including assessments, can provide practical experience, and they prefer face-to-face examinations (Looi et al., 2021; Nathaniel et al., 2021), which is in agreement with our study. Thus, the best ways to perform an effective online examination, and whether it is a good choice to substitute face-to-face activities, are still under discussion. The findings underscore the need to reimagine traditional pedagogy to accommodate diverse learning preferences, integrate emerging technologies, and prepare students with the skills needed for the fast-changing world of tomorrow.
The study tool was a concise questionnaire that included questions that we considered most important to ensure a high response rate. Nevertheless, we included an open-ended question at the end of the form for participants who wished to freely discuss the subject. These few questions mark the beginning of a more comprehensive evaluation of this complex process, which must be continuously discussed in the academic environment to improve the educational process and ensure a proper student assessment that is suitable for current generations. Another limitation of this study is that it was not possible to determine the extent of participants’ prior experience with the remote exams. We assumed that even those who had not participated in remote exams had valuable opinions to share. Although the COVID-19 national measures were consistent across all educational institutions, this study was performed at a single university, and the results may not be generalizable. Additionally, longitudinal studies will be important to establish causal associations.
The present survey yielded results that could be useful in preparing better online evaluations, which could suddenly be imposed on a large scale due to the possibility of another pandemic lockdown. These insights may also be applied to future formative evaluations. In this context, the innovations that emerged during this pandemic will be extremely important for revitalizing the educational process, fostering collaborations between students and teachers for the benefit of society.

5. Conclusions

Online assessments should be improved to align with current innovative pedagogical approaches, recognizing that they are not a substitute for face-to-face examinations. Online assessments can also be used in the current non-pandemic scenario, which benefits from the experience gained during COVID-19 and the large increase in remote courses. Currently available AI tools may help to implement and improve remote assessments. Confidence in the assessment process is essential both for teachers and students and for the credibility of universities. Addressing gaps between students and teachers, as well as across different academic disciplines, will be crucial in improving the overall educational process. Recognizing these challenges provides an opportunity for institutions to engage in meaningful discussions and implement diverse strategies to enhance assessment integrity.
Future research should explore the long-term impact of blended evaluation models, integrating innovative teaching methods and emerging AI technologies to enhance student learning outcomes, engagement, and academic integrity across diverse educational contexts. Further studies should also assess the effectiveness of institutional strategies in fostering trust and reducing disparities between students and teachers in online and hybrid assessment environments.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/educsci15030360/s1, PDF S1: Surveys on Remote and Face-to-Face Assessments (English).

Author Contributions

Conceptualization, D.H.P. and I.T.; methodology, D.H.P. and I.T.; software, D.H.P., J.T.C.-P. and I.T.; validation, D.H.P., J.T.C.-P. and I.T.; formal analysis, D.H.P., J.T.C.-P. and I.T.; investigation, D.H.P., J.T.C.-P. and I.T.; resources, D.H.P. and I.T.; data curation, D.H.P., J.T.C.-P. and I.T.; writing—original draft preparation, D.H.P.; writing—review and editing, D.H.P., J.T.C.-P. and I.T.; visualization, D.H.P., J.T.C.-P. and I.T.; supervision, I.T.; project administration, D.H.P. and I.T. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the Ethics and Research Committee of the São João Hospital, Portugal (protocol code 200-21, approved on 21 May 2021).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data that support the findings of this study are available from the corresponding author, D.H.P., upon reasonable request.

Acknowledgments

To all the participants of this study.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Abdellatif, H., Al Mushaiqri, M., Albalushi, H., Al-Zaabi, A. A., Roychoudhury, S., & Das, S. (2022). Teaching, learning and assessing anatomy with artificial intelligence: The road to a better future. International Journal of Environmental Research and Public Health, 19(21), 14209. [Google Scholar] [CrossRef] [PubMed]
  2. Alan, S., & Yurt, E. (2024). Flipped learning: An innovative model for enhancing education through ChatGPT. International Journal of Modern Education Studies, 8(1), 124–148. [Google Scholar] [CrossRef]
  3. Almasri, F. (2024). Exploring the impact of artificial intelligence in teaching and learning of science: A systematic review of empirical research. Research in Science Education, 54(5), 977–997. [Google Scholar] [CrossRef]
  4. Alsoufi, A., Alsuyihili, A., Msherghi, A., Elhadi, A., Atiyah, H., Ashini, A., Ashwieb, A., Ghula, M., Ben Hasan, H., Abudabuos, S., Alameen, H., Abokhdhir, T., Anaiba, M., Nagib, T., Shuwayyah, A., Benothman, R., Arrefae, G., Alkhwayildi, A., Alhadi, A., … Elhadi, M. (2020). Impact of the COVID-19 pandemic on medical education: Medical students’ knowledge, attitudes, and practices regarding electronic learning. PLoS ONE, 15(11), e0242905. [Google Scholar] [CrossRef]
  5. Amigud, A. (2020). Cheaters on Twitter: An analysis of engagement approaches of contract cheating services. Studies in Higher Education, 45(3), 692–705. [Google Scholar] [CrossRef]
  6. Aristeidou, M., Cross, S., Rossade, K.-D., Wood, C., Rees, T., & Paci, P. (2024). Online exams in higher education: Exploring distance learning students’ acceptance and satisfaction. Journal of Computer Assisted Learning, 40(1), 342–359. [Google Scholar] [CrossRef]
  7. Ashworth, P., Bannister, P., Thorne, P., & Students on the Qualitative Research Methods Course Unit. (1997). Guilty in whose eyes? University students’ perceptions of cheating and plagiarism in academic work and assessment. Studies in Higher Education, 22(2), 187–203. [Google Scholar] [CrossRef]
  8. Baczek, M., Zaganczyk-Baczek, M., Szpringer, M., Jaroszynski, A., & Wozakowska-Kaplon, B. (2021). Students’ perception of online learning during the COVID-19 pandemic: A survey study of Polish medical students. Medicine, 100(7), e24821. [Google Scholar] [CrossRef]
  9. Balram, S. (2019). Teaching and learning pedagogies in higher education geographic information science. In S. Balram, & J. Boxall (Eds.), GIScience teaching and learning perspectives (pp. 1–8). Springer International Publishing. [Google Scholar] [CrossRef]
  10. Balram, S., & Boxall, J. (2019). GIScience teaching and learning perspectives (pp. 1–8). Springer. [Google Scholar]
  11. Bretag, T., Harper, R., Burton, M., Ellis, C., Newton, P., Rozenberg, P., Saddiqui, S., & van Haeringen, K. (2019). Contract cheating: A survey of Australian university students. Studies in Higher Education, 44(11), 1837–1856. [Google Scholar] [CrossRef]
  12. Chang, Y.-H., Yan, Y.-C., & Lu, Y.-T. (2022). Effects of combining different collaborative learning strategies with problem-based learning in a flipped classroom on program language learning. Sustainability, 14(9), 5282. [Google Scholar] [CrossRef]
  13. Cheng, X., Chan, L. K., Pan, S. Q., Cai, H., Li, Y. Q., & Yang, X. (2021). Gross anatomy education in China during the COVID-19 pandemic: A national survey. Anatomical Sciences Education, 14(1), 8–18. [Google Scholar] [CrossRef] [PubMed]
  14. Cheriguene, A., Kabache, T., Kerrache, C. A., Calafate, C. T., & Cano, J. C. (2021). NOTA: A novel online teaching and assessment scheme using Blockchain for emergency cases. Education and Information Technologies, 27(1), 115–132. [Google Scholar] [CrossRef]
  15. Chi, M., Wang, N., Wu, Q., Cheng, M., Zhu, C., Wang, X., & Hou, Y. (2022). Implementation of the flipped classroom combined with problem-based learning in a medical nursing course: A quasi-experimental design. Healthcare, 10(12), 2572. [Google Scholar] [CrossRef] [PubMed]
  16. Coe, T. M., Jogerst, K. M., Sell, N. M., Cassidy, D. J., Eurboonyanun, C., Gee, D., Phitayakorn, R., & Petrusa, E. (2020). Practical techniques to adapt surgical resident education to the COVID-19 era. Annals of Surgery, 272(2), e139–e141. [Google Scholar] [CrossRef]
  17. Costa, A. D., Fernandes, A., Ferreira, S., Couto, B., Machado-Sousa, M., Moreira, P., Morgado, P., & Picó-Pérez, M. (2022). How long does adaption last for? An update on the psychological impact of the confinement in Portugal. International Journal of Environmental Research and Public Health, 19(4), 2243. [Google Scholar] [CrossRef]
  18. Curtis, G. J., & Clare, J. (2017). How prevalent is contract cheating and to what extent are students repeat offenders? Journal of Academic Ethics, 15(2), 115–124. [Google Scholar] [CrossRef]
  19. Curtis, G. J., McNeill, M., Slade, C., Tremayne, K., Harper, R., Rundle, K., & Greenaway, R. (2021). Moving beyond self-reports to estimate the prevalence of commercial contract cheating: An Australian study. Studies in Higher Education, 47(9), 1844–1856. [Google Scholar] [CrossRef]
  20. Darling-Aduana, J. (2021). Development and validation of a measure of authentic online work. Educational Technology Research and Development, 69(3), 1729–1752. [Google Scholar] [CrossRef]
  21. de Boer, H. (2021). COVID-19 in Dutch higher education. Studies in Higher Education, 46(1), 96–106. [Google Scholar] [CrossRef]
  22. Dhawan, S. (2020). Online learning: A panacea in the time of COVID-19 crisis. Journal of Educational Technology Systems, 49(1), 5–22. [Google Scholar] [CrossRef]
  23. Donia, M. B. L., Mach, M., O’Neill, T. A., & Brutus, S. (2022). Student satisfaction with use of an online peer feedback system. Assessment & Evaluation in Higher Education, 47(2), 269–283. [Google Scholar] [CrossRef]
  24. Dost, S., Hossain, A., Shehab, M., Abdelwahed, A., & Al-Nusair, L. (2020). Perceptions of medical students towards online teaching during the COVID-19 pandemic: A national cross-sectional survey of 2721 UK medical students. BMJ Open, 10(11), e042378. [Google Scholar] [CrossRef]
  25. Elsalem, L., Al-Azzam, N., Jum’ah, A. A., & Obeidat, N. (2021). Remote E-exams during COVID-19 pandemic: A cross-sectional study of students’ preferences and academic dishonesty in faculties of medical sciences. Annals of Medicine and Surgery, 62, 326–333. [Google Scholar] [CrossRef]
  26. Elsalem, L., Al-Azzam, N., Jum’ah, A. A., Obeidat, N., Sindiani, A. M., & Kheirallah, K. A. (2020). Stress and behavioral changes with remote E-exams during the COVID-19 pandemic: A cross-sectional study among undergraduates of medical sciences. Annals of Medicine and Surgery, 60, 271–279. [Google Scholar] [CrossRef] [PubMed]
  27. Fask, A., Englander, F., & Wang, Z. (2014). Do online exams facilitate cheating? An Experiment designed to separate possible cheating from the effect of the online test taking environment. Journal of Academic Ethics, 12(2), 101–112. [Google Scholar] [CrossRef]
  28. Fitzgerald, D. A., Scott, K. M., & Ryan, M. S. (2021). Blended and e-learning in pediatric education: Harnessing lessons learned from the COVID-19 pandemic. European Journal of Pediatrics, 181, 447–452. [Google Scholar] [CrossRef] [PubMed]
  29. Franklyn-Stokes, A., & Newstead, S. E. (1995). Undergraduate cheating: Who does what and why? Studies in Higher Education, 20(2), 159–172. [Google Scholar] [CrossRef]
  30. Garcia-Seoane, J. J., Ramos-Rincon, J. M., Lara-Munoz, J. P., & CCS-OSCE Working Group of the CNDFME. (2021). Changes in the Objective Structured Clinical Examination (OSCE) of University Schools of Medicine during COVID-19. Experience with a computer-based case simulation OSCE (CCS-OSCE). Revista Clínica Española, 221(8), 456–463. [Google Scholar] [CrossRef]
  31. Harley, J. M., Lou, N. M., Liu, Y., Cutumisu, M., Daniels, L. M., Leighton, J. P., & Nadon, L. (2021). University students’ negative emotions in a computer-based examination: The roles of trait test-emotion, prior test-taking methods and gender. Assessment & Evaluation in Higher Education, 46(6), 956–972. [Google Scholar] [CrossRef]
  32. Hilliger, I., Ruipérez-Valiente, J. A., Alexandron, G., & Gašević, D. (2022). Trustworthy remote assessments: A typology of pedagogical and technological strategies. Journal of Computer Assisted Learning, 38(6), 1507–1520. [Google Scholar] [CrossRef]
  33. Juele, L. (2018, June 25). Authentic assessments: A critical thinking and engagement tool for online courses. EdMedia: World Conference on Educational Media and Technology, Amsterdam, The Netherlands. [Google Scholar]
  34. Kaisara, G., & Bwalya, K. J. (2023). Strategies for enhancing assessment information integrity in mobile learning. Informatics, 10(1), 29. [Google Scholar] [CrossRef]
  35. Kamalov, F., Santandreu Calonge, D., & Gurrib, I. (2023). New Era of artificial intelligence in education: Towards a sustainable multifaceted revolution. Sustainability, 15(16), 12451. [Google Scholar] [CrossRef]
  36. Kaskie, B., Walker, M., & Andersson, M. (2017). Efforts to address the aging academic workforce: Assessing progress through a three-stage model of institutional change. Innovative Higher Education, 42(3), 225–237. [Google Scholar] [CrossRef]
  37. Khalil, M., Prinsloo, P., & Slade, S. (2022). In the nexus of integrity and surveillance: Proctoring (re)considered. Journal of Computer Assisted Learning, 38(6), 1589–1602. [Google Scholar] [CrossRef]
  38. Köpeczi-Bócz, T. (2024). The impact of a combination of flipped classroom and project-based learning on the learning motivation of university students. Education Sciences, 14(3), 240. [Google Scholar] [CrossRef]
  39. Kumaravel, B., Stewart, C., & Ilic, D. (2021). Face-to-face versus online clinically integrated EBM teaching in an undergraduate medical school: A pilot study. BMJ Evidence-Based Medicine, 27(3), 162–168. [Google Scholar] [CrossRef]
  40. Lewohl, J. M. (2023). Exploring student perceptions and use of face-to-face classes, technology-enhanced active learning, and online resources. International Journal of Educational Technology in Higher Education, 20(1), 48. [Google Scholar] [CrossRef]
  41. Looi, J. C. L., Maguire, P., Bonner, D., Reay, R. E., Finlay, A. J. F., Keightley, P., Tedeschi, M., Wardle, C., & Kramer, D. (2021). Conduct and evaluation of final-year medical student summative assessments in Psychiatry and Addiction Medicine during COVID-19: An Australian university medical school experience. Australas Psychiatry, 29(6), 695–698. [Google Scholar] [CrossRef]
  42. Mialkovska, L., Maiboroda, O., Koretska, N., Martyniuk, Y., Haponchuk, O., & Korobchuk, L. (2024). Contemporary management innovations in shaping the educational process: Insights from Europe. Archives Des Sciences, 74(6), 51–58. Available online: https://unige.org/articles/2024%20Issue%206/2024607.pdf (accessed on 12 March 2025). [CrossRef]
  43. Morgado, A. M., Cruz, J., & Peixoto, M. M. (2021). Individual and community psychological experiences of the COVID-19 pandemic: The state of emergency in Portugal. Current Psychology, 42(4), 3213–3223. [Google Scholar] [CrossRef]
  44. Munna, A. S., & Kalam, M. A. (2021). Teaching and learning process to enhance teaching effectiveness: A literature review. International Journal of Humanities and Innovation, 4(1), 1–4. Available online: https://files.eric.ed.gov/fulltext/ED610428.pdf (accessed on 12 March 2025). [CrossRef]
  45. Nathaniel, T. I., Goodwin, R. L., Fowler, L., McPhail, B., & Black, A. C., Jr. (2021). An adaptive blended learning model for the implementation of an integrated medical neuroscience course during the COVID-19 pandemic. Anatomical Sciences Education, 14(6), 699–710. [Google Scholar] [CrossRef]
  46. Ni, Z. H., Huang, J., Yang, D. P., & Wang, J. (2024). Nursing students’ experience of flipped classroom combined with problem-based learning in a paediatric nursing course: A qualitative study. BMC Nursing, 23(1), 88. [Google Scholar] [CrossRef] [PubMed]
  47. Pimdee, P., Sukkamart, A., Nantha, C., Kantathanawat, T., & Leekitchwatana, P. (2024). Enhancing Thai student-teacher problem-solving skills and academic achievement through a blended problem-based learning approach in online flipped classrooms. Heliyon, 10(7), e29172. [Google Scholar] [CrossRef] [PubMed]
  48. Potu, B. K., Atwa, H., Nasr El-Din, W. A., Othman, M. A., Sarwani, N. A., Fatima, A., Deifalla, A., & Fadel, R. A. (2021). Learning anatomy before and during COVID-19 pandemic: Students’ perceptions and exam performance. Morphologie, 106(354), 188–194. [Google Scholar] [CrossRef]
  49. Ravat, S., Barnard-Ashton, P., & Keller, M. M. (2021). Blended teaching versus traditional teaching for undergraduate physiotherapy students at the university of the Witwatersrand. The South African Journal of Physiotherapy, 77(1), 1544. [Google Scholar] [CrossRef]
  50. Ruslan, R., Lu’mu, L. M., Fakhri, M. M., Ahmar, A. S., & Fadhilatunisa, D. (2024). Effectiveness of the flipped project-based learning model based on moodle LMS to improve student communication and problem-solving skills in learning programming. Education Sciences, 14(9), 1021. [Google Scholar] [CrossRef]
  51. Sattler, S., Wiegel, C., & Veen, F. V. (2017). The use frequency of 10 different methods for preventing and detecting academic dishonesty and the factors influencing their use. Studies in Higher Education, 42(6), 1126–1144. [Google Scholar] [CrossRef]
  52. Scarfe, P., Watcham, K., Clarke, A., & Roesch, E. (2024). A real-world test of artificial intelligence infiltration of a university examinations system: A “Turing Test” case study. PLoS ONE, 19(6), e0305354. [Google Scholar] [CrossRef]
  53. Shelton, P. G., Corral, I., & Kyle, B. (2017). Advancements in undergraduate medical education: Meeting the challenges of an evolving world of education, healthcare, and technology. Psychiatric Quarterly, 88(2), 225–234. [Google Scholar] [CrossRef]
  54. Silva, E. C. E., Lino-Neto, T., Ribeiro, E., Rocha, M., & Costa, M. J. (2021). Going virtual and going wide: Comparing team-based learning in-class versus online and across disciplines. Education and Information Technologies, 27(2), 2311–2329. [Google Scholar] [CrossRef]
  55. Silva Moreira, P., Ferreira, S., Couto, B., Machado-Sousa, M., Fernández, M., Raposo-Lima, C., Sousa, N., Picó-Pérez, M., & Morgado, P. (2021). Protective elements of mental health status during the COVID-19 outbreak in the Portuguese population. International Journal of Environmental Research and Public Health, 18(4), 1910. [Google Scholar] [CrossRef] [PubMed]
  56. Slade, C., Lawrie, G., Taptamat, N., Browne, E., Sheppard, K., & Matthews, K. E. (2021). Insights into how academics reframed their assessment during a pandemic: Disciplinary variation and assessment as afterthought. Assessment & Evaluation in Higher Education, 47(4), 588–605. [Google Scholar] [CrossRef]
  57. Solmi, M., Estradé, A., Thompson, T., Agorastos, A., Radua, J., Cortese, S., Dragioti, E., Leisch, F., Vancampfort, D., Thygesen, L. C., Aschauer, H., Schloegelhofer, M., Akimova, E., Schneeberger, A., Huber, C. G., Hasler, G., Conus, P., Cuénod, K. Q. D., von Känel, R., … Correll, C. U. (2022). Physical and mental health impact of COVID-19 on children, adolescents, and their families: The collaborative outcomes study on health and functioning during infection times—Children and adolescents (COH-FIT-C&A). Journal of Affective Disorders, 299, 367–376. [Google Scholar] [CrossRef] [PubMed]
  58. Solmi, M., Thompson, T., Cortese, S., Estrade, A., Agorastos, A., Radua, J., Dragioti, E., Vancampfort, D., Thygesen, L. C., Aschauer, H., Schlogelhofer, M., Aschauer, E., Schneeberger, A., Huber, C. G., Hasler, G., Conus, P., Cuenod, K. Q. D., von Kanel, R., Arrondo, G., … Correll, C. U. (2025). Collaborative outcomes study on health and functioning during infection times (COH-FIT): Insights on modifiable and non-modifiable risk and protective factors for wellbeing and mental health during the COVID-19 pandemic from multivariable and network analyses. European Neuropsychopharmacology, 90, 1–15. [Google Scholar] [CrossRef]
  59. Stain, S. C., Mitchell, M., Belue, R., Mosley, V., Wherry, S., Adams, C. Z., Lomis, K., & Williams, P. C. (2005). Objective assessment of videoconferenced lectures in a surgical clerkship. The American Journal of Surgery, 189(1), 81–84. [Google Scholar] [CrossRef]
  60. Summers, R., Higson, H., & Moores, E. (2022). The impact of disadvantage on higher education engagement during different delivery modes: A pre- versus peri-pandemic comparison of learning analytics data. Assessment & Evaluation in Higher Education, 48(1), 56–66. [Google Scholar] [CrossRef]
  61. Tamrat, W. (2021). Enduring the impacts of COVID-19: Experiences of the private higher education sector in Ethiopia. Studies in Higher Education, 46(1), 59–74. [Google Scholar] [CrossRef]
  62. Tavares-Almeida, S., Moura, D., Madeira, N., & Figueiredo-Braga, M. (2023). Psychological burden in Portuguese university students during the COVID-19 pandemic. Porto Biomedical Journal, 8(2), e200. [Google Scholar] [CrossRef]
  63. Vasquez, A. G., Vasquez, A. R. G., & Maulion, C. Q. (2023). Challenges encountered by the students in the face-to-face class implementation in the post-COVID learning context. International Journal of Research in Engineering and Science, 11(1), 485–4891. [Google Scholar]
  64. Vonderwell, S. K., & Boboc, M. (2013). Promoting formative assessment in online teaching and learning. TechTrends, 57(4), 22–27. [Google Scholar] [CrossRef]
  65. Wang, J. (2024). Research on the flipped classroom + learning community approach and its effectiveness evaluation—Taking college German teaching as a case study. Sustainability, 16(17), 7719. [Google Scholar] [CrossRef]
  66. Yorke, J., Sefcik, L., & Veeran-Colton, T. (2020). Contract cheating and blackmail: A risky business? Studies in Higher Education, 47(1), 53–66. [Google Scholar] [CrossRef]
  67. Zhang, Y., & Dong, C. (2024). Exploring the digital transformation of generative AI-assisted foreign language education: A socio-technical systems perspective based on mixed-methods. Systems, 12(11), 462. [Google Scholar] [CrossRef]
Table 1. Comparative analyses of the responses of students and teachers.

| Question | Answer | Students (%) | Teachers (%) | p Value |
|---|---|---|---|---|
| 1. Time and effort to prepare for e-exams | Higher | 35.2 | 79.3 | <0.001 |
| | Equal | 48.0 | 17.5 | |
| | Lesser | 16.8 | 3.3 | |
| 2. Dishonesty in e-exams | Higher | 57.9 | 57.6 | 0.531 |
| | Equal | 37.7 | 39.6 | |
| | Lesser | 4.4 | 2.8 | |
| 3. Cheating is justifiable | Agree | 16.2 | 76.7 | <0.001 |
| | Disagree | 83.3 | 23.3 | |
| 4. More likely to cheat in e-exams | Agree | 49.0 | 77.3 | <0.001 |
| | Disagree | 51.0 | 22.7 | |
| 5. Preference | Face-to-face | 59.5 | 75.1 | <0.001 |
| | Distance | 15.3 | 5.0 | |
| | Mix | 25.2 | 19.9 | |
| 6. My equipment/internet is | Bad | 4.4 | 8.3 | <0.001 |
| | Regular | 31.4 | 46.9 | |
| | Good | 64.3 | 44.9 | |
| 7. Comfortable with technologies | Agree | 76.4 | 67.5 | 0.005 |
| | Disagree | 23.6 | 32.5 | |
| 8. Learn better in | Face-to-face | 41.3 | 54.5 | <0.001 |
| | Distance | 8.1 | 2.4 | |
| | No differences | 50.6 | 43.1 | |
| 9. More stressed in | Face-to-face | 34.7 | 3.6 | <0.001 |
| | Distance | 38.2 | 53.6 | |
| | No differences | 27.1 | 42.7 | |
| 10. Fairer and more equitable | Face-to-face | 77.7 | 75.7 | 0.001 |
| | Distance | 9.6 | 4.6 | |
| | No differences | 12.7 | 19.9 | |
%—percentage, <—less than.
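For readers who wish to reproduce comparisons of this kind, the sketch below shows how one response distribution from Table 1 could be compared between students and teachers. It is illustrative only: the exact statistical test used in the study is reported in the Methods section rather than restated here, so a Pearson chi-square test on a contingency table is assumed, and the cell counts are reconstructed from the published percentages and the overall sample sizes (989 students, 266 teachers).

```python
import numpy as np
from scipy.stats import chi2_contingency

# Illustrative sketch (assumption: a Pearson chi-square test on a 2 x 3 table).
# Counts are reconstructed from the percentages for Question 1 in Table 1
# ("Time and effort to prepare for e-exams"), so they are approximate.
students_pct = [35.2, 48.0, 16.8]   # Higher, Equal, Lesser
teachers_pct = [79.3, 17.5, 3.3]
n_students, n_teachers = 989, 266

table = np.array([
    [round(p / 100 * n_students) for p in students_pct],
    [round(p / 100 * n_teachers) for p in teachers_pct],
])

chi2, p_value, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p_value:.2e}")
```

With these approximate counts the test reproduces the reported pattern for Question 1 (p < 0.001), although the exact statistics may differ slightly from those obtained with the raw survey data.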
Table 2. Division of the students into two groups according to course year.

| Question | Answer | 1st and 2nd Years (%) | 3rd to 6th Years (%) | p Value |
|---|---|---|---|---|
| 1. Time and effort to prepare for e-exams | Higher | 35.3 | 35.1 | 0.003 |
| | Equal | 44.3 | 52.2 | |
| | Lesser | 20.3 | 12.7 | |
| 2. Dishonesty in e-exams | Higher | 58.8 | 56.8 | 0.660 |
| | Equal | 36.5 | 39.1 | |
| | Lesser | 4.7 | 4.0 | |
| 3. Cheating is justifiable | Agree | 16.1 | 16.2 | 0.977 |
| | Disagree | 83.9 | 83.8 | |
| 4. More likely to cheat in e-exams | Agree | 53.4 | 43.9 | 0.004 |
| | Disagree | 46.6 | 56.1 | |
| 5. Preference | Face-to-face | 61.6 | 57.0 | 0.025 |
| | Distance | 12.4 | 18.7 | |
| | Mix | 26.0 | 24.3 | |
| 6. My equipment/internet is | Bad | 3.6 | 5.3 | 0.366 |
| | Regular | 30.9 | 31.9 | |
| | Good | 65.5 | 62.8 | |
| 7. Comfortable with technologies | Agree | 77.9 | 74.7 | 0.238 |
| | Disagree | 22.1 | 25.3 | |
| 8. Learn better in | Face-to-face | 47.7 | 33.8 | <0.001 |
| | Distance | 6.7 | 9.8 | |
| | No differences | 45.6 | 56.4 | |
| 9. More stressed in | Face-to-face | 35.4 | 41.4 | 0.032 |
| | Distance | 30.4 | 23.3 | |
| | No differences | 34.2 | 35.2 | |
| 10. Fairer and more equitable | Face-to-face | 79.3 | 75.8 | 0.418 |
| | Distance | 8.8 | 10.5 | |
| | No differences | 11.9 | 13.7 | |
%—percentage, <—less than.
Table 3. Division of the teachers into two groups according to years of teaching.

| Question | Answer | 25 Years or Fewer (%) | 26 Years or More (%) | p Value |
|---|---|---|---|---|
| 1. Time and effort to prepare for e-exams | Higher | 76.8 | 81.8 | 0.306 |
| | Equal | 20.8 | 14.0 | |
| | Lesser | 2.4 | 4.1 | |
| 2. Dishonesty in e-exams | Higher | 53.5 | 61.8 | 0.328 |
| | Equal | 44.1 | 35.0 | |
| | Lesser | 2.4 | 3.3 | |
| 3. Cheating is justifiable | Agree | 72.1 | 81.5 | 0.078 |
| | Disagree | 27.9 | 18.5 | |
| 4. More likely to cheat in e-exams | Agree | 75.2 | 79.3 | 0.443 |
| | Disagree | 24.8 | 20.7 | |
| 5. Preference | Face-to-face | 77.9 | 72.3 | 0.579 |
| | Distance | 4.6 | 5.4 | |
| | Mix | 17.6 | 22.3 | |
| 6. My equipment/internet is | Bad | 6.8 | 9.9 | 0.651 |
| | Regular | 48.1 | 45.5 | |
| | Good | 45.1 | 44.6 | |
| 7. Comfortable with technologies | Agree | 74.2 | 59.6 | 0.015 |
| | Disagree | 25.8 | 40.4 | |
| 8. Learn better in | Face-to-face | 54.3 | 54.8 | 0.689 |
| | Distance | 1.6 | 3.2 | |
| | No differences | 44.1 | 42.1 | |
| 9. More stressed in | Face-to-face | 4.0 | 3.2 | 0.582 |
| | Distance | 56.5 | 50.8 | |
| | No differences | 39.5 | 46.0 | |
| 10. Fairer and more equitable | Face-to-face | 78.4 | 72.4 | 0.506 |
| | Distance | 4.5 | 4.7 | |
| | No differences | 17.2 | 22.8 | |
%—percentage.
Table 4. Division of the students into three groups according to knowledge area.

| Question | Answer | Human./Social (%) | Exact (%) | Health (%) | p Value |
|---|---|---|---|---|---|
| 1. Time and effort to prepare for e-exams | Higher | 36.5 | 45.3 | 27.5 | <0.001 |
| | Equal | 38.2 | 38.3 | 60.2 | |
| | Lesser | 25.3 | 16.4 | 12.3 | |
| 2. Dishonesty in e-exams | Higher | 54.1 | 61.1 | 57.8 | 0.057 |
| | Equal | 40.8 | 37.2 | 36.4 | |
| | Lesser | 5.2 | 1.7 | 5.8 | |
| 3. Cheating is justifiable | Agree | 15.7 | 20.6 | 13.6 | 0.063 |
| | Disagree | 84.3 | 79.4 | 86.4 | |
| 4. More likely to cheat in e-exams | Agree | 54.1 | 53.0 | 43.3 | 0.009 |
| | Disagree | 45.9 | 47.0 | 56.7 | |
| 5. Preference | Face-to-face | 49.0 | 67.3 | 59.8 | <0.001 |
| | Distance | 24.7 | 12.2 | 12.3 | |
| | Mix | 26.4 | 20.5 | 27.9 | |
| 6. My equipment/internet is | Bad | 2.9 | 3.9 | 5.5 | 0.363 |
| | Regular | 34.9 | 31.4 | 29.4 | |
| | Good | 62.2 | 64.7 | 65.1 | |
| 7. Comfortable with technologies | Agree | 82.1 | 77.2 | 72.8 | 0.023 |
| | Disagree | 17.9 | 22.8 | 27.2 | |
| 8. Learn better in | Face-to-face | 47.5 | 49.3 | 32.2 | <0.001 |
| | Distance | 11.7 | 7.0 | 6.9 | |
| | No differences | 40.8 | 43.7 | 60.9 | |
| 9. More stressed in | Face-to-face | 46.3 | 35.0 | 28.1 | <0.001 |
| | Distance | 31.3 | 35.0 | 44.2 | |
| | No differences | 22.5 | 30.1 | 27.6 | |
| 10. Fairer and more equitable | Face-to-face | 73.0 | 80.8 | 78.1 | 0.007 |
| | Distance | 15.2 | 8.6 | 7.2 | |
| | No differences | 11.8 | 10.6 | 14.7 | |
Human.—Humanities, %—percentage, <—less than.
Table 5. Division of the teachers into three groups according to knowledge area.

| Question | Answer | Human./Social (%) | Exact (%) | Health (%) | p Value |
|---|---|---|---|---|---|
| 1. Time and effort to prepare for e-exams | Higher | 86.4 | 84.0 | 72.3 | 0.136 |
| | Equal | 11.9 | 14.7 | 22.3 | |
| | Lesser | 1.7 | 1.3 | 5.4 | |
| 2. Dishonesty in e-exams | Higher | 57.1 | 60.3 | 56.0 | 0.675 |
| | Equal | 37.5 | 37.2 | 42.2 | |
| | Lesser | 5.4 | 2.6 | 1.7 | |
| 3. Cheating is justifiable | Agree | 71.9 | 80.2 | 76.5 | 0.523 |
| | Disagree | 28.1 | 19.8 | 23.5 | |
| 4. More likely to cheat in e-exams | Agree | 79.2 | 77.2 | 76.4 | 0.919 |
| | Disagree | 20.8 | 22.8 | 23.6 | |
| 5. Preference | Face-to-face | 74.1 | 76.7 | 74.4 | 0.154 |
| | Distance | 3.4 | 1.2 | 8.5 | |
| | Mix | 22.4 | 22.1 | 17.1 | |
| 6. My equipment/internet is | Bad | 5.1 | 14.1 | 6.0 | 0.080 |
| | Regular | 39.0 | 48.7 | 49.6 | |
| | Good | 55.9 | 37.2 | 44.4 | |
| 7. Comfortable with technologies | Agree | 72.2 | 55.1 | 73.7 | 0.019 |
| | Disagree | 27.8 | 44.9 | 26.3 | |
| 8. Learn better in | Face-to-face | 49.1 | 61.7 | 52.1 | 0.198 |
| | Distance | 5.5 | 2.5 | 0.9 | |
| | No differences | 45.5 | 35.8 | 47.0 | |
| 9. More stressed in | Face-to-face | 1.8 | 3.8 | 4.4 | 0.518 |
| | Distance | 46.4 | 53.2 | 57.5 | |
| | No differences | 51.8 | 43.0 | 38.1 | |
| 10. Fairer and more equitable | Face-to-face | 66.1 | 82.4 | 75.0 | 0.282 |
| | Distance | 7.1 | 3.5 | 4.2 | |
| | No differences | 26.8 | 14.1 | 20.8 | |
%—percentage, Human.—Humanities.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
