Article

Reframing Unseen Exams in Post-Pandemic Pedagogy Based on Student Perceptions

by
Charlotte E. Lyddon
Department of Geography & Planning, University of Liverpool, Liverpool L69 7ZT, UK
Trends High. Educ. 2024, 3(3), 812-826; https://doi.org/10.3390/higheredu3030046
Submission received: 9 July 2024 / Revised: 23 August 2024 / Accepted: 2 September 2024 / Published: 18 September 2024

Abstract

The COVID-19 pandemic had unprecedented impacts, both direct and indirect, on student populations across the UK. As teaching has returned to ‘normal’, in-person exams have made a comeback, and so has the debate about the value of unseen exams as a method of assessment. This research provides a comprehensive insight into student perceptions of exams in light of their COVID-19 educational experiences. It gathers perspectives on unseen exams from a generation of students who have perhaps never sat a regular written exam before, due to the pandemic. Student perceptions are combined with academic staff experience of delivering unseen exams to identify their suitability within curricula that promote authentic assessment and research-led teaching. The thematic analysis of the results identifies that students feel strongly about the purpose, fairness, and authenticity of unseen exams, and that the COVID-19 pandemic is likely to have lasting impacts on students’ perception of their university experience. The themes identify practical considerations for academic staff when considering the inclusion of unseen exams in their teaching, most notably with respect to accessibility, support needs, and assessment design.

1. Introduction and Theoretical Framework

The use of examinations as an assessment tool in higher education has long been a subject of debate. Among the various types of assessments, unseen written exams hold a significant place in universities worldwide. Students sit unseen exams independently, under timed conditions, and without any resources or prior knowledge of exam content. Unseen exams can take the form of multiple-choice, short-answer, or essay questions. There is debate about the effectiveness of written exams as a method of assessment [1,2], and systematic reviews have compared the effectiveness of unseen (closed book) and seen (open book) assessments [3,4], and how the style of assessment can be linked to student success and enhanced learning [5]. Some argue that seen and unseen exams offer breadth to methods of assessment [6] and develop skills, including “examination techniques, writing under pressure, and recall” [7]. However, it can also be argued that written exams are not representative of the potential of each student, and that students can only demonstrate a superficial knowledge of a subject [8]. Unseen exams are rarely written in a way that consistently tests deep, conceptual understanding [9]. Unseen exams have been considered to encourage students to memorise and reproduce factual information, but with limited understanding [10,11,12]. Students’ study profiles, assessment preferences, and achievement have been shown to be linked, with those scoring low on ‘time management’ and ‘unrelated memorising’ preferring seen exams [5]. There is an increasing argument that unseen exams cause undue stress in students, which can particularly aggravate challenges some students already face, including ADHD and anxiety [13,14,15]. Further to this, unseen exams rarely test employability skills, such as communication, teamwork, and flexibility [16]. Unseen exams nonetheless endure, due to concerns about academic dishonesty and a lack of support or time to develop holistic, active assessments.
The method of assessment set by academics in their modules and courses is a critical component of a student’s overall success and satisfaction [17]. Teaching practices and the types of assessment method used can encourage different types of learning [18]. Assessments should be developed that encourage deep learning and conceptual understanding, rather than a surface approach to learning [19]. Different learning and assessment environments significantly affect students’ performance, attitudes, and learning outcomes [20], shaping both how students engage with content and how they perceive the value of their learning. Drawing on constructivist theories, which emphasise the role of students’ active engagement in their learning process [21,22,23], it is understood that the nature of the assessment environment can either foster or hinder deep learning [24,25]. Traditional in-person exams and online exams, for instance, influence not only students’ cognitive processes, such as critical thinking and problem-solving [26], but also their affective responses, including stress, anxiety, and motivation. The transition between these environments, particularly during the COVID-19 pandemic, has likely altered students’ engagement levels, their perceived relevance of assessments, and their overall educational experience, making it essential to understand these influences when shaping effective assessment strategies.
Research has also shown that gender [27] and year of study [28] can strongly influence how students approach tasks and assessment. While unseen exams remain a widely used assessment method in higher education, their effectiveness is increasingly questioned due to concerns about their impact on students’ stress, their limited ability to assess deep understanding and employability skills, and their tendency to promote superficial learning strategies.

1.1. COVID-19 Impact on Students’ Learning

The COVID-19 pandemic has profoundly impacted students on multiple fronts [29]. For students already enrolled at university, the shift to online learning disrupted the traditional classroom experience, leading to challenges such as limited access to resources, reduced interaction with peers and instructors, and increased feelings of isolation [30,31,32]. Students who began their university degree programme during the global COVID-19 pandemic faced unprecedented challenges. The cancellation of A-Levels and GCSEs had implications for students’ mental well-being and long-term effects on their preparedness for higher education [33]. Without the opportunity to sit these standardised exams, students might have lacked the readiness required for their chosen degree programmes, with possible consequences for academic performance. The 2022–2023 academic year saw many institutions return to ‘normal’ teaching, most notably regarding on-campus, in-person teaching and unseen exams under standard exam conditions, i.e., in-person and timed. The author’s personal experience of reintroducing an unseen, timed, in-person exam during semester 1 of the 2022–2023 academic year highlighted a negative attitude among the students who were most impacted by the COVID-19 pandemic. It is worth re-invigorating the debate around the value of unseen exams in an undergraduate degree programme, and understanding students’ perspectives, considering that the current generation of students has perhaps never sat a regular written exam before, due to the pandemic.

1.2. Research Aim

This research aims to evaluate students’ opinions on and experience of undertaking in-person exams within a traditional exam setting, compared to several years of online exams following the COVID-19 pandemic. This research will explore the following research questions:
  • What are students’ perceptions of unseen exams?
    Investigating students’ perceptions of unseen exams is essential to understand how these assessments are viewed in terms of fairness, stress, and educational value. This insight can reveal whether traditional exams effectively measure students’ knowledge and skills, or whether they are seen as outdated and ineffective in today’s learning environment.
  • Has the COVID-19 pandemic influenced students’ attitudes towards unseen exams?
    The rapid shift to online exams during the COVID-19 pandemic may have altered students’ preferences and expectations, impacting attitudes toward traditional in-person exams. Exploring this impact can provide valuable insights into how experiences with online exams have reshaped expectations, preferences, and the perceived effectiveness of unseen exams, potentially influencing future exam formats.
  • What other factors may influence students’ perceptions of unseen exams?
    Beyond the pandemic, various factors, such as exam preparation, stress levels, academic support, and personal learning styles, may influence students’ perceptions of unseen exams. Understanding these influences is crucial for identifying the broader context in which students form opinions about traditional exams, allowing for more informed improvements in assessment design.
  • What does the future look like for unseen exams?
    This question addresses whether traditional in-person exams will remain relevant, or whether alternative assessment methods, informed by recent experiences and technological advancements, will become more prominent in higher education.

1.3. Institutional Setting

The University of Liverpool (UK), where this research took place, is a teaching and research-based institution. The geography (BA and BSc) degree programme intakes are typically around 250 students per year, and the programme is accredited by the Royal Geographical Society, which has clear expectations “That a full range of delivery and assessment methods are in use, which are appropriately challenging and rigorous” [34]. Assessment strategies are designed to align with the University of Liverpool’s 2026 Education Strategy [35] and Curriculum 2021 Framework [36], which promote Liverpool’s Hallmarks for “active learning” and “research-connected teaching”.

2. Research Methods

2.1. Self-Completion Questionnaire

A self-completion questionnaire was designed for third-year undergraduate geography students to gauge their perspectives on and perceptions of unseen exams, and it contained a range of “open” and “closed” questions (Questionnaire S1). The questionnaire was designed to (i) provide context on the students’ backgrounds and module choices; and (ii) measure students’ perceptions of seen and unseen exams, including their attitudes, experiences, and preferences. The questionnaire was designed around key constructs, such as perceived fairness, difficulty, stress, preparation strategies, and learning outcomes, and questions were framed so that answers could be analysed and compared. The questionnaire was refined with colleagues in the Department of Geography & Planning who are involved in programme direction, to ensure all relevant aspects of interest were covered. Third-year students were asked to complete the questionnaire because they would have experienced the cancellation of school exams, a move to online teaching, and the reintroduction of unseen exams. The use of “open” questions enables the respondents (i.e., the students) to provide personal opinions and to control the length and type of answer given [37]. In contrast, the use of “closed”, structured questions enables the researcher to collect standardised data, apply quantitative analysis, and directly compare responses [37]. The questionnaire included 17 questions in total: eight “closed” questions to gather information about degree programme, gender, and modules taken, with a five-point Likert scale to collect information on students’ experience of exams; and nine “open” questions to collect students’ opinions on the role of exams in a university setting and the impact of the COVID-19 pandemic on their academic journey.

Questionnaire Analysis

The questionnaire responses were analysed numerically to determine the strength of the responses. Frequencies and percentages were calculated for each closed question and Likert scale item to understand the distribution of answers and to identify any notable trends or patterns.
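To make this quantitative step concrete, the following is a minimal sketch of how such a tally could be computed programmatically. It is illustrative only: the analysis in this study was not scripted, and the response values below are invented for the example.

```python
from collections import Counter

# Hypothetical Likert responses to one closed question; the study used a
# five-point scale, but these particular values are invented for illustration.
likert = [
    "strongly disagree", "disagree", "disagree", "neutral",
    "disagree", "strongly disagree", "agree", "disagree",
]

counts = Counter(likert)
total = len(likert)

# Report the frequency and percentage of each response category,
# i.e., the distribution of answers for this question.
for category in ["strongly disagree", "disagree", "neutral", "agree", "strongly agree"]:
    n = counts.get(category, 0)
    print(f"{category:>17}: {n} ({100 * n / total:.0f}%)")
```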
Responses to the “open” questions and the semi-structured interview were evaluated using a systematic and rigorous thematic analysis approach, to identify patterns in participants’ experiences, opinions, and perceptions [38]. Braun and Clarke’s (2006) [39] six-phase framework was followed:
  • Become familiar with the data;
  • Generate initial codes;
  • Search for themes;
  • Review themes;
  • Define themes;
  • Write up.
This approach allows underlying ideas or concepts to be explored within the questionnaire responses. Questionnaire responses were reviewed individually, and Microsoft Excel was used for an open-coding approach, in which codes were continuously developed. Responses identified likes and dislikes, emotions, and personal experiences, which helped define the codes. For example, the mental health implications of unseen exams were often mentioned and are relevant to the research. This was a manual process, whereby individual Excel sheets were created for each question, with individual answers placed in each row of the sheet. Labels, or “codes”, were added to relevant data segments in subsequent columns and verified against the answers to confirm they were mentioned by the students. Grouping answers by codes helped to uncover patterns and insights. The codes were examined and then fitted together into subthemes and themes; for example, any codes related to the purpose and design of assessment were grouped together (Table 1). Themes and codes were reviewed to ensure the data supported the themes and there was no overlap. The prevalence of themes and keywords, i.e., how often they occurred in the data set, was also considered. Each theme is considered separately in Section 3, while Section 4 provides recommendations based on these wider questions.
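The tallying stage of this coding process can also be illustrated programmatically. The sketch below is a simplified, hypothetical version of the manual Excel workflow described above: the example responses and the keyword-to-code mapping are invented, and in practice the codes were developed and verified by hand rather than by keyword matching.

```python
from collections import Counter

# Hypothetical free-text questionnaire responses (invented for illustration).
responses = [
    "Exams trigger a lot of stress and anxiety for me.",
    "They only test memory, not understanding or critical thinking.",
    "Open book exams let me reflect and show what I know.",
]

# Hypothetical keyword-to-code mapping, built up iteratively during open coding.
codebook = {
    "mental health implications": ["stress", "anxiety", "anxious", "trigger"],
    "memorisation over understanding": ["memory", "memorise", "recall"],
    "preference for seen exams": ["open book", "seen exam"],
}

# Tally how many responses mention each code, mirroring the column-per-code
# layout used in the manual Excel sheets.
prevalence = Counter()
for response in responses:
    text = response.lower()
    for code, keywords in codebook.items():
        if any(keyword in text for keyword in keywords):
            prevalence[code] += 1

# Prevalence indicates how often each code occurs across the data set.
for code, n in prevalence.most_common():
    print(f"{code}: mentioned in {n} of {len(responses)} responses")
```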

2.2. Semi-Structured Interview with Chair of the Board of Examiners for the School of Environmental Sciences

The impact of the COVID-19 pandemic on methods of assessment was discussed through a 1 h, semi-structured interview with the Chair of the Board of Examiners (BoE) for the School of Environmental Sciences. A series of pre-determined “open” questions were set to prompt discussion, with the opportunity to explore responses and themes further [40,41]. The questions focused on the COVID-19 pandemic’s impact on overall student experience, assessment, and exam results; discussions held in the school about reintroducing exams; what a good assessment looks like; examples of good practice; and whether there is a place for exams. The structure of the interview allowed for informal, improvised follow-up questions based on the participant’s responses and enabled both interviewer and interviewee to identify key themes [42,43].

3. Results and Discussion

A total of 33 questionnaires were completed: 7 by Geography (BA) students, 22 by Geography (BSc) students, and 2 by Environmental Science (BSc) students, while 2 students did not declare a degree programme. Of the respondents, 28 identified as female, 4 as male, and 1 did not provide a response. The split in the participants’ degree programme registrations is likely to reflect the author’s stronger affiliation with the BSc degree programme, as students are more likely to respond to a request from an academic they know [44].
The results may not reflect the diversity of the entire student body in terms of demographics, academic disciplines, or prior experiences with exams. Non-response bias has also been considered: the opinions of those who did not participate might differ from those who did, which could skew the results. The results presented here draw on responses from all students who took part, to try to minimise this effect. Nevertheless, the findings may not capture the full range of perspectives or experiences across all undergraduate degree programmes.

3.1. Closed Question Responses

A total of 20 students sat one exam at the end of semester 1 (ENVS 319 or ENVS 376), 8 students sat two exams (both ENVS 319 and ENVS 376), and 5 students did not sit an exam (Figure 1). Every Geography BSc student took an exam at the end of semester 1, and four out of seven Geography BA students sat an exam.
The majority of students (>75%) strongly disagreed or disagreed that in-person exams are a positive experience, work well as part of a mixed assessment, and are a valuable tool for assessment (Figure 2). The respondents gave the same rating for each question, indicating strong opinions for or against unseen exams. A smaller number of students felt neutral or agreed that unseen exams can be a positive experience, indicating the need for a mixed assessment strategy to address all learning styles.
Students were asked to select two of four given skills that they think exams assess, and two of four skills that they think they should gain from a university degree (Figure 3). Overall, 97% of respondents felt that exams assess memory, and 67% felt that exams assess exam technique. In contrast, 90% of students felt that a degree aims to develop knowledge, and 70% felt it aims to develop effort. This indicates a disconnect between what students feel is the purpose of attending university and the role of in-person exams in supporting this experience and skills development.

3.2. Perception of Exams as a Method of Assessment

Students’ perceptions of unseen exams and the impact of the COVID-19 pandemic on their university experience are analysed in the context of three broad themes.

3.2.1. Purpose of Unseen Exams

Exams should serve as a method to assess actual understanding and knowledge, as well as provide students with an opportunity to demonstrate critical thinking and the ability to construct an argument [45]. From the perspective of an academic at a UK university, the primary objective of university exams is to evaluate students’ mastery of subject-specific knowledge and their ability to synthesise and apply this knowledge to solve problems or engage in analytical discussions. Through revision and practice questions, exams encourage students to engage in independent research, focused study, and critical analysis. Further to this, the ability to memorise and recall facts is a skill that employees may have to demonstrate in the workplace. This widely held view among academics regarding the purpose of exams in a university setting is sometimes reflected in the students’ opinions:
“I like them as it encourages me to memorise what I have learnt within a module, remember it more long-term and build upon it”.
However, it is a widely held view among students that exams prioritise the memorisation of facts and information, rather than assessing deep understanding, knowledge, and critical thinking skills. Students may only have experience of exams as high-pressure environments from school and may not appreciate the wider skill set offered by unseen exams. This also suggests that students are not told why unseen exams are set as a method of assessment and how this method fits into the curriculum.
“if refering to closed book in person exams—I think they don’t truly reflect a student’s understanding and knowledge but rather ability to memorise”.
“They are only effective at testing memory and not necessarily knowledge, application of knowledge and most importantly, critical thinking”.

3.2.2. Student Autonomy in Assessment

The COVID-19 pandemic caused exams to move online and required module leaders to become more innovative when designing assessments. Seen exams became commonplace, and students may have engaged with digital assessments and online technologies. Overall, students have been exposed to a broader range of assessments, so they have opinions on preferred methods of assessment, e.g., unseen versus seen exams. Respondents felt seen exams allow them to demonstrate a wider breadth of skills: “skills/effort/think/utilise/reflect/interpret” were all words commonly used by students to represent a more positive opinion of seen exams.
“I think [seen exams] are better because they assess knowledge and its application, as well as critical thinking, are less anxiety inducing and are a better reflection of my own ability because my train of thought is focused exclusively on drawing from my knowledge and not from having to create an answer that might not fully represent my ability in a rush or panic”.
Some students engaged in the debate about unseen versus seen exams in their own answers, and this highlights how different assessments are needed to assess different skills:
“[Seen] allow people to prepare more so reflects their knowledge better but quick thinking and problem solving skills aren’t tested as much”.
The question about what a good assessment looks like elicited a broad range of thoughtful and reflective answers: 57% of students used the words “mixture/mix/variety/variation” in their answers; 36% of students identified “coursework” as their preferred assessment, with “open book” and “essay” each identified by 27% of respondents as their preference. Less than 10% of the students identified “blogs”, “multiple choice”, “group work”, or “presentations” as their preference. The students also indicated they like to have autonomy over their assessments:
“Liked pair group work, not big groups like 6 etc, and to choose who”.
“A piece of coursework that you get to pick a topic of”.
Alternative forms of assessment, such as portfolios, self- or peer-assessment, and presentations, were perceived as beneficial, as they reward consistent effort rather than last-minute effort [46]; giving students a voice in the co-creation and design of tasks such as assessments [47] can also support learning.

3.2.3. Time for Assessment

The time allowed for an assessment emerged as a factor shaping students’ perceptions of assessment methods. Unseen exams take place under timed conditions, whereas a seen exam can be accessed and completed over a 24–48 h period. Students may view unseen exams as more stressful and challenging, as they must be completed in a set period, which can lead to rushed answers and hinder a student’s ability to showcase their true understanding of the subject matter.
“I’m a person not good under time conditions and pressure”.
“Not all students perform well under stricter time constraints”.
Seen exams may be viewed as less stressful and as providing the opportunity for students to prepare, rather than second-guess the question set. However, students can use the whole time allowance in seen exams, which may cause stress as they overthink their answers during this period, and plagiarism can become an issue.
“I am able to utilise my time more efficiently and therefore can plan my essay focus any revision to ensure skills are developed further with wider reading, rather than just knowledge”.
“open book at least allows time for interpretation and demonstration of understanding of the information rather than regurgitating memorised info”.

3.2.4. Fairness of Different Methods of Assessment

The fairness of exams as a method for students to demonstrate learning, knowledge, and understanding was called into question. This appears to be another factor shaping student opinion on unseen versus seen exams, with the former perceived as the less fair option:
“Unseen exams are really hard and unfair”.
Students also expressed the opinion that seen exams are “fairer”, which could be because of the additional time allowance:
“More fair—gives students a chance to prepare and reduces stress levels massively”.
“More fair at judging knowledge and effort”.
If students feel unfairly assessed, they report lower interest in the course, lower motivation to learn, and even elevated levels of hostility towards the instructor [48,49,50]. Classroom justice theory identifies three aspects in which assessment can be perceived as unfair [51]:
  • Informational injustice: insufficient information is provided about the assessment and grading criteria.
  • Procedural injustice: ill-defined or non-transparent grading procedures.
  • Distributional injustice: imbalance between effort and resulting grade.
Clearly explaining the structure and purpose of unseen exams could address the perception of unfairness. The results presented here can be used to inform future actions to better incorporate and communicate the use of exams in a mixed assessment programme. The Chair of the BoE stated that the success of an exam depends on how a question is set, and that an exam should provide an opportunity for a student to tell staff what they know about something. Exams should be set with this in mind, to provide students with an opportunity to demonstrate knowledge. Students also need to be informed that this is the approach staff take to setting exams; no one is trying to catch them out.

3.2.5. Authenticity of Unseen Exams as an Assessment

Authentic assessment is a form of evaluation that mirrors real-world tasks, requiring students to apply knowledge in meaningful contexts. Authentic assessment is important at the university level, as it promotes the development of practical and transferable skills. This method of assessment forms one of the pillars of the University of Liverpool curriculum framework and should underpin assessment design across the institution. Some students’ opinions highlighted that unseen exams are rarely an authentic form of assessment, which supports their negative view of unseen exams:
“Not a big fan—just reminds me of school—not the real world”.
“after uni, in work situations there aren’t really many times you’ll do an exam”.
Some students may feel that unseen exams do not adequately assess their ability to apply theoretical knowledge to real-world scenarios. Students highlight that they prefer assessment methods that involve more practical components, such as case studies, projects, or presentations:
“I liked the social media summaries assessment—it was more engaging than writing an essay and the setting of a paper each week to write a summary on meant I worked on it each week rather than leaving it to the last minute like every other assignment. Please implement more assignments such as these!”

3.2.6. Mental Health Implications: How Unseen Exams Make Students Feel

The high-stakes and time-limited nature of unseen exams can lead to increased stress, anxiety, and pressure among students [52]. The uncertainty surrounding the content and format of the exam can intensify these negative emotions [53]. Additionally, the heavy reliance on memory and recall in unseen exams may heighten anxiety about forgetting important information. The majority of responses about unseen exams had a negative emotion attached to them, with students using words including “stress/stressful/anxiety/anxious/trigger/mental health” when discussing unseen exams:
“They can trigger a lot of stress and anxiety and require time spent of memorising citations and exam practice rather than research”.
“Highly stressful and only suit certain people”.
A negative mindset around unseen exams could be detrimental to students’ performance [54], and it has been shown that test anxiety can impact exam performance [55]. This raises the question of whether the mental health implications of exams should be considered more when discussing the future and longevity of unseen exams.

3.3. Impact of the COVID-19 Pandemic on Academic Journey

The clear display of negative emotions in response to questions about the COVID-19 pandemic highlights the toll the pandemic has taken on students’ emotional well-being and wider university experience.

3.3.1. Sense of Loss and Missed Opportunities

All respondents felt the COVID-19 pandemic had changed the way they perform in assessments. Words such as resent, regret, anxiety, and frustration were commonly used. Negative emotions can indicate the perceived disruption to students’ academic progress due to the pandemic. Regret may reflect the loss of anticipated experiences, such as in-person classes, field classes, and laboratory practicals, indicating the broad impact the pandemic had on students’ overall learning:
“Out of anyone’s control but the pandemic took away the in-person start to my university experience which I would change in a heartbeat”.
“Have more field class/practicel teaching in first and second year”.
Some students acknowledge that they may have received better grades with a move to coursework or seen exams, but they also feel they learned less and missed out on opportunities.
“Despite there being more chance of achieving a better grade through open book exams, I believe I have learnt a lot less than a normal year would so I’m at a disadvantage with my knowledge”.
Further to this, several respondents expressed very strong opinions about the impact of the COVID-19 pandemic on their view of assessment and unseen exams, despite not having actually sat an exam. These answers referred to the experience of friends or family sitting unseen exams as they were reintroduced, and the challenges that they faced. The group mentality and very similar opinions around the reintroduction of exams highlight the impact the pandemic had across a whole generation of students:
“I’ve never had to do one before at uni, and my cohort was the one that didn’t have to do A-levels either, so I think many of us lack confidence in them because of this”.

3.3.2. Implications of Missing A-Levels

The COVID-19 pandemic caused the cancellation of GCSE and A-Level exams, meaning students missed the opportunity to sit the exams, demonstrate their skills and knowledge, and potentially achieve the grades they had wanted and needed [56,57]. Missing the opportunity to sit A-Levels and GCSEs was referred to by over half of respondents, who highlighted the impact this had on their confidence and exam technique, and many cited it as an important contributor to their opinions on unseen exams and to how their performance in assessments may have changed during the pandemic. There is a sense that not participating in GCSEs and A-Levels left students on the back foot and hindered their academic progress.
“have only done coursework at Uni. Not done exams since GCSEs”.
“No. Pandemic majorly affected and put myself under more pressure especially having no experience since 2018”.
“I missed sitting my A-levels so lack exam experience”.
It is clear that missing GCSEs and A-Levels also meant students lost the experience of taking an exam under timed conditions, which should not be underestimated. Exam technique, which can include question selection, planning, concise writing, answering in priority order (starting with the questions you feel most confident about), and proofreading, comprises skills that students do not gain at any time other than in exams. Exam technique equips students with a range of skills to effectively tackle exams, manage time, and demonstrate knowledge; it can enhance their ability to perform well, showcase their understanding, and optimise their chances of success. These experiences need to be prioritised to ensure students are able to perform well.
“Yes—used to be conditioned to exams through A-levels and GCSEs”.
“As everything was online, exam technique was non-existent with our learning. Still able to apply knowledge”.

3.3.3. Reintroduction of Unseen Exams

The decision to reintroduce exams following the pandemic is subjective, and opinions vary. The reintroduction of exams for the third year (in 2022–2023) was a challenging decision. The Chair of the BoE highlighted the unprecedented nature of discussions at faculty level on how best to reintroduce exams after the pandemic. The lack of exam experience was considered, in addition to the need for additional space and resources to accommodate social distancing. The majority of staff decided to reintroduce exams in some form as a tool for assessment. Despite this, 70% of students responded that exams should not have been reintroduced.
“No because since the pandemic new ways to be assessed have been found and they work just as well”.
It may have been the case that alternative assessment methods emphasising continuous evaluation, project-based work, or open-book exams could better accommodate the unique circumstances and provide a more comprehensive and equitable evaluation of students’ knowledge and skills post-pandemic; however, the time and resources required to develop these one-off assessments make them an unrealistic option. Some students also specifically avoided modules that involved an unseen exam:
“All in person examination within a module influenced my choice in modules as I knew I wouldn’t perform well despite being more interested in the topic”.
Some answers provided more positive reflections:
“Yes but maybe not to years that we’re effected by the pandemic e.g., start re-introducing them to student who did take their A Levels”.
These comments highlight that the university’s strategy to reintroduce unseen exams could have been staggered to account for missed exam experience; such an approach should be considered a priority in the future to promote students’ wellbeing and academic performance.

3.4. Importance of Student Support

3.4.1. Post-Pandemic Support

Student support for exams at university is crucial, as it plays a vital role in promoting academic success and overall well-being. Support systems, such as study resources, workshops, and guidance from faculty or advisors, can help students develop effective exam preparation strategies, enhance their understanding of the assessment format, and alleviate anxiety or stress. Many responses identified challenges around returning to university following the COVID-19 pandemic, including taking unseen exams again. Responses indicated that many of these services were not tailored to address the challenges of the COVID-19 pandemic, and that support did not sufficiently address the lack of exam experience.
“I would want more support/feedback/seminars on how to approach questions”.
The Chair of the BoE believes the key to exam success is in the revision, but revision sessions and mock exams come with their own challenges, including space, workload, and EDI considerations. Further to this, there is nothing to guarantee the consistency and quality of revision sessions. Additional methods of providing students with support ahead of exams, including “supervisor support/feedback/meetings/seminars/tutorials”, are viewed positively among the respondents. Scheduled support sessions could be used to discuss exams and to manage students’ expectations and emotions towards them.

3.4.2. Individual Learning Styles, Needs, and Challenges

University assessments aim to employ a range of formats to account for individual needs and learning styles, promoting inclusive and effective evaluation. Recognising that students have diverse learning preferences and abilities, assessments tailored to their individual needs can enhance engagement, motivation, and performance. Responses focused on whether unseen exams favoured the respondent’s personal learning style, but rarely considered other students, demonstrating a lack of understanding of the role of exams in a broader assessment strategy. Students also rarely commented on which assessment method did fit their personal learning style.
“My personal learning style works better with revision and in person, while others may be the opposite”.
“I perform well in exams and going from mostly exams at A-Level to mostly coursework at degree level was a shock as I didn’t know how to write a good essay”.
“My method of work goes exactly against exam style”.
Several students identified diagnosed conditions, including ADHD (attention deficit hyperactivity disorder), anxiety, and depression, as contributing to a negative experience with unseen exams. Responses focused on the struggles associated with unseen exams, but offered little insight into how student support plans or support systems can offer help. Awareness of individual student support plans by staff is crucial for offering appropriate levels of support to those who need it [58]. Raising awareness of the unseen challenges students face related to mental health [59] and encouraging students to reflect on their individual study journeys using techniques such as scaffolding [60] can help give students control over their studies.
“I’m dyslexic so don’t fit with my academic style”.
“For people with disorders such as ADHD revising and concentrating for a set amount of time can be hard thus again poorly reflecting their skill set and knowledge”.

4. Recommendations and Conclusions

Based on the insights gained from the questionnaire responses and the discussions with the Chair of BoE, this research demonstrated the following:
  • There is a divide between how staff and students view the purpose of unseen exams, and students lack clarity on what skills they can gain from unseen exams;
  • The disruption of the COVID-19 pandemic is likely to have long-term emotional and practical impacts on this year’s group of students;
  • Personal learning styles, interests, and past experiences are additional factors that influence student perceptions of unseen exams.
These results highlight the need for clearer communication about unseen exams’ objectives, acknowledge the pandemic’s lasting effects on students, and recognize the influence of individual learning styles. They contribute to the field by emphasizing the importance of aligning assessment practices with diverse student needs and improving educational support.
On the basis of these results and questions, the following recommendations are made to support the delivery of unseen exams in a higher education setting:
  • There is a need for a balanced and blended assessment strategy, which assesses diverse knowledge and provides students with different learning styles and experiences with an equal chance to succeed. Student-led teaching and assessment can provide the autonomy that students desire over their academic journey.
  • Student expectations of unseen exams need to be better managed. Students need to be told more explicitly what the purpose of an unseen exam is, what it aims to achieve, and why we use them. The use of anecdotes and data (e.g., if you achieve X% in coursework, you will receive X% in the exam for a 2:1 classification) can support the delivery of this key message. Students need to be reminded of this throughout a module that contains an unseen exam, for example, in weeks 1, 6, and 12.
  • Tutorials may be used as a space to discuss the purpose of exams within an assessment strategy. Students could be asked to talk for a few minutes about something they liked in a lecture recently, which can then be used to show they are able to recall information and communicate it.
  • Unseen exams could be reframed as an authentic assessment. They could be redesigned to better simulate real-world tasks and challenges, enhancing the relevance and applicability of students’ learning experiences. The inauthentic components of unseen exams are handwriting and the memorisation of citations. An alternative exam could require students to apply their knowledge to a specific, computer-based task under timed conditions; a virtual, computer-based simulation of a real-world problem could be useful for assessing a wide range of skills, knowledge, and competencies. However, the practical challenges of sealed rooms and managed Windows systems need to be considered, and students’ feedback should be used to continuously refine the assessment process and address technical challenges.
This study can help universities to understand student concerns and experiences during a global crisis such as a pandemic and to use this information to inform resilience programmes, study support resources, and academic advising services should a similar situation occur in the future. In a wider setting:
Educational policy should plan for future pandemics by developing robust contingency plans for remote learning and assessment. Policymakers should invest in digital infrastructure, ensure access to technology, and establish flexible, adaptable teaching methods, as well as prioritise mental health support and create guidelines for maintaining academic continuity, to minimise disruption and support student success.
Further to this, policies should encourage the development of a balanced and blended assessment approach that caters to different learning preferences and experiences, ensuring that all students have an equal opportunity to succeed. This could involve incorporating a mix of assessment types, including coursework, projects, and exams, to evaluate a broader range of skills and knowledge.
Future research could explore how unseen exams can be adapted to accommodate post-pandemic students by investigating flexible exam formats that consider varying practical challenges. This includes evaluating computer-based assessments and integrating real-world scenarios to ensure assessments reflect true skills and competencies. Student performance could be tracked, and questionnaires or surveys could be used to compare student perceptions of novel approaches to assessment. Research could also examine how to incorporate diverse learning styles and address technological barriers, ensuring that the adaptations effectively support all students’ needs and improve their overall learning experience. Future work could also explore student perceptions across a range of STEM degree programmes and across different years of study.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/higheredu3030046/s1, Questionnaire S1: Understanding the exam experience.

Funding

This research received no external funding.

Institutional Review Board Statement

The research undertaken in this paper received collective ethical approval from the University of Liverpool (ADEV702 Scholarly Investigation of Practice; Date of Approval 3 February 2023).

Informed Consent Statement

Participants were provided an information sheet, and consent was waived as no specific participant information form was collected when the questionnaire was completed in person or online.

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the author on request.

Acknowledgments

The author would like to acknowledge the support of colleagues Alexander Nurse and Neil Macdonald. Furthermore, the author would like to acknowledge the support of the Postgraduate Certificate in Academic Practice (PGCAP) team at The University of Liverpool for their invaluable support in the design, implementation and completion of the research in this paper. The author would also like to thank the students who completed the questionnaire and shared their experiences and opinions.

Conflicts of Interest

The author declares no conflicts of interest.

References

  1. Hylton, J.B.; Diefes-Dux, H.A. A Standards-based Assessment Strategy for Written Exams. In Proceedings of the 2016 ASEE Annual Conference & Exposition, New Orleans, LA, USA, 26–29 June 2016. [Google Scholar]
  2. Lawrie, G.; Marquis, E.; Fuller, E.; Newman, T.; Qiu, M.; Nomikoudis, M.; Roelofs, F.; Van Dam, L. Moving towards Inclusive Learning and Teaching: A Synthesis of Recent Literature. Teach. Learn. Inq. 2017, 5, 9–21. [Google Scholar]
  3. Durning, S.J.; Dong, T.; Ratcliffe, T.; Schuwirth, L.; Artino, A.R., Jr.; Boulet, J.R.; Eva, K. Comparing Open-Book and Closed-Book Examinations. Acad. Med. 2016, 91, 583–599. [Google Scholar] [CrossRef] [PubMed]
  4. Theophilides, C.; Koutselini, M. Study Behavior in the Closed-Book and the Open-Book Examination: A Comparative Analysis. Educ. Res. Eval. 2000, 6, 379–393. [Google Scholar] [CrossRef]
  5. Karagiannopoulou, E.; Milienos, F.S. Exploring the relationship between experienced students’ preference for open- and closed-book examinations, approaches to learning and achievement. Educ. Res. Eval. 2013, 19, 271–296. [Google Scholar] [CrossRef]
  6. Van Bergen, P.; Lane, R. Exams Might Be Stressful, But They Improve Learning. 2014. Available online: https://theconversation.com/exams-might-be-stressful-but-they-improve-learning-35614 (accessed on 16 August 2023).
  7. Smyth, K. The benefits of students learning about critical evaluation rather than being summatively judged. Assess. Eval. High. Educ. 2004, 29, 370–378. [Google Scholar] [CrossRef]
  8. Gibbs, G.; Simpson, C. Conditions under Which Assessment Supports Students’ Learning. Learn. Teach. High. Educ. (LATHE) 2004, 1, 3–31. [Google Scholar]
  9. Entwistle, N.J.; Entwistle, A. Contrasting forms of understanding for degree examinations: The student experience and its implications. High. Educ. 1991, 22, 205–227. [Google Scholar] [CrossRef]
  10. Entwistle, N.J. Contrasting perspectives on learning. In The Experience of Learning. Implications for Teaching and Studying in Higher Education, 2nd ed.; Marton, F., Hounsell, D., Entwistle, N.J., Eds.; Scottish Academic Press: Edinburgh, UK, 1997; pp. 3–22. [Google Scholar]
  11. Entwistle, N.J. Teaching for Understanding at University: Deep Approaches and Distinctive Ways of Thinking; Palgrave Macmillan: Basingstoke, UK, 2009. [Google Scholar]
  12. Marton, F.; Säljö, R. Approaches to learning. In The Experience of Learning: Implications for Teaching and Studying in Higher Education, 3rd ed.; Marton, F., Hounsell, D., Entwistle, N., Eds.; University of Edinburgh, Centre for Teaching, Learning and Assessment: Edinburgh, UK, 2005; pp. 106–125. [Google Scholar]
  13. Kwon, S.J.; Kim, Y.; Kwak, Y. Difficulties faced by university students with self-reported symptoms of attention-deficit hyperactivity disorder: A qualitative study. Child Adolesc. Psychiatry Ment. Health 2018, 12, 12. [Google Scholar] [CrossRef]
  14. Jones, E.; Priestley, M.; Brewster, L.; Wilbraham, S.J.; Hughes, G.; Spanner, L. Student wellbeing and assessment in higher education: The balancing act. Assess. Eval. High. Educ. 2020, 46, 438–450. [Google Scholar] [CrossRef]
  15. Hsu, J.L.; Goldsmith, G.R. Instructor Strategies to Alleviate Stress and Anxiety among College and University STEM Students. CBE—Life Sci. Educ. 2021, 20, es1. [Google Scholar] [CrossRef]
  16. Villarroel, V.; Boud, D.; Bloxham, S.; Bruna, D.; Bruna, C. Using principles of authentic assessment to redesign written examinations and tests. Innov. Educ. Teach. Int. 2020, 57, 38–49. [Google Scholar] [CrossRef]
  17. Ramsden, P. Learning to Teach in Higher Education; Routledge Falmer: London, UK, 2003. [Google Scholar]
  18. Trigwell, K.; Prosser, M. Improving the quality of student learning: The influence of learning context and student approaches to learning on learning outcomes. High. Educ. 1991, 22, 251–266. [Google Scholar] [CrossRef]
  19. Fawzia, S.; Karim, A. Exploring the connection between deep learning and learning assessments: A cross-disciplinary engineering education perspective. Humanit. Soc. Sci. Commun. 2024, 11, 29. [Google Scholar] [CrossRef]
  20. Hanrahan, M. The effect of learning environment factors on students’ motivation and learning. Int. J. Sci. Educ. 1998, 20, 737–753. [Google Scholar] [CrossRef]
  21. Richardson, V. Constructivist teaching and teacher education: Theory and practice. In Constructivist Teacher Education; Routledge: London, UK, 2005; pp. 13–24. [Google Scholar]
  22. Hein, G.E. Constructivist Learning Theory. In Proceedings of the CECA (International Committee of Museum Educators) Conference, Jerusalem, Israel, 15–22 October 1991; pp. 1–10. [Google Scholar]
  23. Davis, B.; Sumara, D. Constructivist discourses and the field of education: Problems and possibilities. Educ. Theory 2002, 52, 409. [Google Scholar] [CrossRef]
  24. Hansen, S. A constructivist approach to project assessment. Eur. J. Eng. Educ. 2004, 29, 211–220. [Google Scholar] [CrossRef]
  25. Hendry, G.D.; Frommer, M.; Walker, R.A. Constructivism and problem-based learning. J. Furth. High. Educ. 1999, 23, 369–371. [Google Scholar] [CrossRef]
  26. Corno, L.; Mandinach, E.B. The role of cognitive engagement in classroom learning and motivation. Educ. Psychol. 1983, 18, 88–108. [Google Scholar] [CrossRef]
  27. Nicchiotti, B.; Spagnolo, C. Gender differences in relation to perceived difficulty of a mathematical task. In Proceedings of the 47th Conference of the International Group for the Psychology of Mathematics Education, PME, Auckland, New Zealand, 17–21 July 2024; Volume 4. [Google Scholar]
  28. Delgado, Á.H.D.A.; Almeida, J.P.R.; Mendes, L.S.B.; Oliveira, I.N.D.; Ezequiel, O.D.S.; Lucchetti, A.L.G.; Lucchetti, G. Are surface and deep learning approaches associated with study patterns and choices among medical students? A cross-sectional study. Sao Paulo Med. J. 2018, 136, 414–420. [Google Scholar] [CrossRef]
  29. House of Commons, UK. The Impact of COVID-19 on University Students. 2020. Available online: https://committees.parliament.uk/publications/1851/documents/18140/default/ (accessed on 16 May 2023).
  30. Bashir, A.; Bashir, S.; Rana, K.; Lambert, P.; Vernallis, A. Post-COVID-19 Adaptations; the Shifts Towards Online Learning, Hybrid Course Delivery and the Implications for Biosciences Courses in the Higher Education Setting. Front. Educ. 2021, 6, 711619. [Google Scholar] [CrossRef]
  31. Filho, W.; Wall, T.; Rayman-Bacchus, L.; Mifsud, M.; Pritchard, D.J.; Lovren, V.O.; Farinha, C.; Petrovic, D.S.; Balogun, A.-L. Impacts of COVID-19 and social isolation on academic staff and students at universities: A cross-sectional study. BMC Public Health 2021, 21, 1213. [Google Scholar]
  32. Babbar, M.; Gupta, T. Response of educational institutions to COVID-19 pandemic: An inter-country comparison. Policy Futures Educ. 2021, 20, 469–491. [Google Scholar] [CrossRef]
  33. Gov UK. The Impact of the Coronavirus Outbreak on Exams around the World. 2020. Available online: https://ofqual.blog.gov.uk/2020/05/22/the-impact-of-the-coronavirus-outbreak-on-exams-around-the-world/ (accessed on 16 May 2023).
  34. Royal Geographical Society. Geography Programme Accreditation Handbook; Royal Geographical Society: London, UK, 2017. [Google Scholar] [CrossRef]
  35. University of Liverpool. Our Strategy 2026. 2023. Available online: https://www.liverpool.ac.uk/media/our-strategy-2026-university-of-liverpool.pdf (accessed on 16 May 2023).
  36. University of Liverpool. Curriculum 2021: A Curriculum Framework and Design Model for Programme Teams at the University of Liverpool. 2023. Available online: https://www.liverpool.ac.uk/media/livacuk/centre-for-innovation-in-education/liverpool-curriculum-framework/liverpool-curriculum-framework-booklet.pdf (accessed on 16 May 2023).
  37. Denscombe, M. The Good Research Guide: For Small-Scale Research Projects, 5th ed.; McGraw Hill Education: New York, NY, USA, 2014. [Google Scholar]
  38. Nowell, L.S.; Norris, J.M.; White, D.E.; Moules, N.J. Thematic Analysis. Int. J. Qual. Methods 2017, 16, 160940691773384. [Google Scholar] [CrossRef]
  39. Braun, V.; Clarke, V. Using thematic analysis in psychology. Qual. Res. Psychol. 2006, 3, 77–101. [Google Scholar] [CrossRef]
  40. Galletta, A. Mastering the Semi-Structured Interview and Beyond: From Research Design to Analysis and Publication; New York University Press: New York, NY, USA, 2012. [Google Scholar]
  41. Kallio, H.; Pietilä, A.-M.; Johnson, M.; Kangasniemi, M. Systematic methodological review: Developing a framework for a qualitative semi-structured interview guide. J. Adv. Nurs. 2016, 72, 2954–2965. [Google Scholar] [CrossRef]
  42. Dearnley, C. A reflection on the use of semi-structured interviews. Nurse Res. 2005, 13, 19–28. [Google Scholar] [CrossRef]
  43. Rubin, H.; Rubin, I. Qualitative Interviewing: The Art of Hearing Data, 2nd ed.; SAGE: Thousand Oaks, CA, USA, 2005. [Google Scholar]
  44. Saleh, A.; Bista, K. Examining Factors Impacting Online Survey Response Rates in Educational Research: Perceptions of Graduate Students. J. Multidiscip. Eval. 2017, 13, 63–74. [Google Scholar] [CrossRef]
  45. OECD. Assessment for Learning Formative Assessment. In Proceedings of the OECD/CERI International Conference “Learning in the 21st Century: Research, Innovation and Policy”, Paris, France, 15–16 May 2008. Available online: https://www.oecd.org/site/educeri21st/40600533.pdf (accessed on 16 May 2023).
  46. Sambell, K.; McDowell, L.; Brown, S. “But is it fair?”: An exploratory study of student perceptions of the consequential validity of assessment. Stud. Educ. Eval. 1997, 23, 349–371. [Google Scholar] [CrossRef]
  47. Müller-Kuhn, D.; Zala-Mezö, E.; Häbig, J.; Strauss, N.; Herzig, P. Five Contexts and Three Characteristics of Student Participation and Student Voice—A Literature Review. Int. J. Stud. Voice 2021, 6, 1–30. [Google Scholar]
  48. Chory-Assad, R.M. Classroom justice: Perceptions of fairness as a predictor of student motivation, learning, and aggression. Commun. Q. 2002, 50, 58–77. [Google Scholar] [CrossRef]
  49. Struyven, K.; Dochy, F.; Janssens, S. Students’ perceptions about evaluation and assessment in higher education: A review. Assess. Eval. High. Educ. 2005, 30, 325–341. [Google Scholar] [CrossRef]
  50. Whipp, P. University assessment practices at level 1: Exploring student perceptions of fairness, transparency and authenticity. In Proceedings of the Meeting the Challenges: Proceedings of the Australian Technology Network (ATN) Assessment Conference, Perth, Australia, 20–21 October 2011; pp. 161–169. [Google Scholar]
  51. Chory-Assad, R.M.; Paulsel, M.L. Classroom justice: Student aggression and resistance as reactions to perceived unfairness. Commun. Educ. 2004, 53, 253–273. [Google Scholar] [CrossRef]
  52. Jerrim, J. Test anxiety: Is it associated with performance in high-stakes examinations? Oxf. Rev. Educ. 2022, 49, 321–341. [Google Scholar] [CrossRef]
  53. Folkman, S.; Lazarus, R.S. If it changes it must be a process: Study of emotion and coping during three stages of a college examination. J. Personal. Soc. Psychol. 1985, 48, 150–170. [Google Scholar] [CrossRef]
  54. McCaldin, T.; Brown, K.; Greenwood, J. What Is It like to Experience Exam Stress? A Student Perspective. 2019. Available online: https://ofqual.blog.gov.uk/2019/03/08/what-is-it-like-to-experience-exam-stress-a-student-perspective/ (accessed on 16 June 2023).
  55. Hunsley, J. Test anxiety, academic performance, and cognitive appraisals. J. Educ. Psychol. 1985, 77, 678–682. [Google Scholar] [CrossRef]
  56. House of Commons, Education Committee. Getting the Grades They’ve Earned: COVID-19: The Cancellation of Exams and ‘Calculated’ Grades: Response to the Committee’s First Report. 2020. Available online: https://committees.parliament.uk/publications/2700/documents/26711/default/ (accessed on 16 May 2023).
  57. Kippin, S.; Cairney, P. The COVID-19 exams fiasco across the UK: Four nations and two windows of opportunity. Br. Politics 2021, 17, 1–23. [Google Scholar] [CrossRef]
  58. Feuerborn, L.; Chinn, D. Teacher Perceptions of Student Needs and Implications for Positive Behavior Supports. Behav. Disord. 2012, 37, 219–231. [Google Scholar] [CrossRef]
  59. Coughlan, T.; Lister, K.; Lucassen, M. Representing the Unseen with “Our Journey”: A Platform to Capture Affective Experiences and Support Emotional Awareness in University-Level Study. J. Form. Des. Learn. 2021, 5, 39–52. [Google Scholar] [CrossRef]
  60. Ryan, M. Conceptualising and teaching discursive and performative reflection in higher education. Stud. Contin. Educ. 2012, 34, 207–223. [Google Scholar] [CrossRef]
Figure 1. Breakdown of modules from 33 questionnaire responses. * indicates an exam.
Figure 2. Students’ perception of the exam experience.
Figure 3. Response to questions asking students to identify 2 out of 4 skills that are (i) assessed by exams; and (ii) developed during a degree programme.
Table 1. Research themes identified from the self-completion questionnaire at the end of stage 4.

Theme: Perception of Exams as a Method of Assessment. Subthemes:
  • Purpose of unseen exams
  • Student autonomy in assessment
  • Time allowance for assessment
  • Fairness of different methods of assessment
  • Authenticity of unseen exams as an assessment
  • Mental health implications: How unseen exams make students feel

Theme: Impact of COVID-19 Pandemic on Academic Journey. Subthemes:
  • Sense of loss and missed opportunities
  • Implications of missing A-Levels
  • Reintroduction of unseen exams

Theme: Importance of Student Support. Subthemes:
  • Post-pandemic support
  • Individual learning styles, needs, and challenges
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
