Reframing Unseen Exams in Post-Pandemic Pedagogy Based on Student Perceptions
Abstract
1. Introduction and Theoretical Framework
1.1. COVID-19 Impact on Students' Learning
1.2. Research Aim
- What are students’ perceptions of unseen exams?
- Investigating students’ perception of unseen exams is essential to understand how these assessments are viewed in terms of fairness, stress, and educational value. This insight can reveal whether traditional exams effectively measure students’ knowledge and skills, or if they are seen as outdated and ineffective in today’s learning environment.
- Has the COVID-19 pandemic influenced students’ attitudes towards unseen exams?
- The rapid shift to online exams during the COVID-19 pandemic may have altered students’ preferences and expectations, impacting attitudes toward traditional in-person exams. Exploring this impact can provide valuable insights into how experiences with online exams have reshaped expectations, preferences, and the perceived effectiveness of unseen exams, potentially influencing future exam formats.
- What other factors may influence students’ perceptions of unseen exams?
- Beyond the pandemic, various factors, such as exam preparation, stress levels, academic support, and personal learning styles may influence students’ perceptions of unseen exams. Understanding these influences is crucial for identifying the broader context in which students form opinions about traditional exams, allowing for more informed improvements in assessment design.
- What does the future look like for unseen exams?
- This question addresses whether traditional in-person exams will remain relevant or if alternative assessment methods, informed by recent experiences and technological advancements, will become more prominent in higher education.
1.3. Institutional Setting
2. Research Methods
2.1. Self-Completion Questionnaire
Questionnaire Analysis
- Become familiar with the data;
- Generate initial codes;
- Search for themes;
- Review themes;
- Define themes;
- Write up.
2.2. Semi-Structured Interview with Chair of the Board of Examiners for the School of Environmental Sciences
3. Results and Discussion
3.1. Closed Question Responses
3.2. Perception of Exams as a Method of Assessment
3.2.1. Purpose of Unseen Exams
“I like them as it encourages me to memorise what I have learnt within a module, remember it more long-term and build upon it”.
“if refering to closed book in person exams—I think they don’t truly reflect a student’s understanding and knowledge but rather ability to memorise”.
“They are only effective at testing memory and not necessarily knowledge, application of knowledge and most importantly, critical thinking”.
3.2.2. Student Autonomy in Assessment
“I think [seen exams] are better because they assess knowledge and its application, as well as critical thinking, are less anxiety inducing and are a better reflection of my own ability because my train of thought is focused exclusively on drawing from my knowledge and not from having to create an answer that might not fully represent my ability in a rush or panic”.
“[Seen] allow people to prepare more so reflects their knowledge better but quick thinking and problem solving skills aren’t tested as much”.
“Liked pair group work, not big groups like 6 etc, and to choose who”.
“A piece of coursework that you get to pick a topic of”.
3.2.3. Time for Assessment
“I’m a person not good under time conditions and pressure”.
“Not all students perform well under stricter time constraints”.
“I am able to utilise my time more efficiently and therefore can plan my essay focus any revision to ensure skills are developed further with wider reading, rather than just knowledge”.
“open book at least allows time for interpretation and demonstration of understanding of the information rather than regurgitating memorised info”.
3.2.4. Fairness of Different Methods of Assessment
“Unseen exams are really hard and unfair”.
“More fair—gives students a chance to prepare and reduces stress levels massively”.
“More fair at judging knowledge and effort”.
- Informational injustice: insufficient information is provided about the assessment and grading criteria.
- Procedural injustice: ill-defined or non-transparent grading procedures.
- Distributional injustice: imbalance between effort and resulting grade.
3.2.5. Authenticity of Unseen Exams as an Assessment
“Not a big fan—just reminds me of school—not the real world”.
“after uni, in work situations there aren’t really many times you’ll do an exam”.
“I liked the social media summaries assessment—it was more engaging than writing an essay and the setting of a paper each week to write a summary on meant I worked on it each week rather than leaving it to the last minute like every other assignment. Please implement more assignments such as these!”
3.2.6. Mental Health Implications: How Unseen Exams Make Students Feel
“They can trigger a lot of stress and anxiety and require time spent of memorising citations and exam practice rather than research”.
“Highly stressful and only suit certain people”.
3.3. Impact of the COVID-19 Pandemic on Academic Journey
3.3.1. Sense of Loss and Missed Opportunities
“Out of anyone’s control but the pandemic took away the in-person start to my university experience which I would change in a heartbeat”.
“Have more field class/practicel teaching in first and second year”.
“Despite there being more chance of achieving a better grade through open book exams, I believe I have learnt a lot less than a normal year would so I’m at a disadvantage with my knowledge”.
“I’ve never had to do one before at uni, and my cohort was the one that didn’t have to do A-levels either, so I think many of us lack confidence in them because of this”.
3.3.2. Implications of Missing A-Levels
“have only done coursework at Uni. Not done exams since GCSEs”.
“No. Pandemic majorly affected and put myself under more pressure especially having no experience since 2018”.
“I missed sitting my A-levels so lack exam experience”.
“Yes—used to be conditioned to exams through A-levels and GCSEs”.
“As everything was online, exam technique was non-existent with our learning. Still able to apply knowledge”.
3.3.3. Reintroduction of Unseen Exams
“No because since the pandemic new ways to be assessed have been found and they work just as well”.
“All in person examination within a module influenced my choice in modules as I knew I wouldn’t perform well despite being more interested in the topic”.
“Yes but maybe not to years that we’re effected by the pandemic e.g., start re-introducing them to student who did take their A Levels”.
3.4. Importance of Student Support
3.4.1. Post-Pandemic Support
“I would want more support/feedback/seminars on how to approach questions”.
3.4.2. Individual Learning Styles, Needs, and Challenges
“My personal learning style works better with revision and in person, while others may be the opposite”.
“I perform well in exams and going from mostly exams at A-Level to mostly coursework at degree level was a shock as I didn’t know how to write a good essay”.
“My method of work goes exactly against exam style”.
“I’m dyslexic so don’t fit with my academic style”.
“For people with disorders such as ADHD revising and concentrating for a set amount of time can be hard thus again poorly reflecting their skill set and knowledge”.
4. Recommendations and Conclusions
- There is a divide between how staff and students view the purpose of unseen exams, and students lack clarity on what skills they can gain from unseen exams;
- The disruption of the COVID-19 pandemic is likely to have long-term emotional and practical impacts on this year’s group of students;
- Personal learning styles, interests, and past experiences are additional factors that influence student perceptions of unseen exams.
- There is a need for a balanced and blended assessment strategy, one that assesses diverse knowledge and gives students with different learning styles and experiences an equal chance to succeed. Student-led teaching and assessment can provide the autonomy that students desire over their academic journey.
- Student expectations of unseen exams need to be better managed. Students should be told more explicitly what the purpose of an unseen exam is, what it aims to achieve, and why we use them. The use of anecdotes and data (e.g., if you achieve X% in coursework, you will receive X% in the exam for a 2:1 classification) can support the delivery of this key message. Students should be reminded of it throughout a module that contains an unseen exam, for example, in weeks 1, 6, and 12.
- Tutorials may be used as a space to discuss the purpose of exams within an assessment strategy. Students could be asked to talk for a few minutes about something they liked in a lecture recently, which can then be used to show they are able to recall information and communicate it.
- Unseen exams could be reframed as an authentic assessment. They could be redesigned to better simulate real-world tasks and challenges, enhancing the relevance and applicability of students' learning experiences. The inauthentic components of unseen exams are handwriting and the memorisation of citations. An alternative exam could require students to apply their knowledge to a specific, computer-based task under timed conditions. A virtual, computer-based simulation of a real-world problem could be useful for assessing a wide range of skills, knowledge, and competencies. However, the practical challenges of sealed rooms and managed Windows systems need to be considered, and student feedback should be used to continuously refine the assessment process and address technical challenges.
Supplementary Materials
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Theme: Perception of Exams as a Method of Assessment | Theme: Impact of the COVID-19 Pandemic on Academic Journey | Theme: Importance of Student Support
---|---|---
Subthemes: purpose of unseen exams; student autonomy in assessment; time for assessment; fairness of different methods of assessment; authenticity of unseen exams as an assessment; mental health implications | Subthemes: sense of loss and missed opportunities; implications of missing A-levels; reintroduction of unseen exams | Subthemes: post-pandemic support; individual learning styles, needs, and challenges
© 2024 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Lyddon, C.E. Reframing Unseen Exams in Post-Pandemic Pedagogy Based on Student Perceptions. Trends High. Educ. 2024, 3, 812-826. https://doi.org/10.3390/higheredu3030046