Replacing Exams with Project-Based Assessment: Analysis of Students’ Performance and Experience
Abstract
1. Introduction
2. Exams vs. Projects: Advantages and Shortcomings
- Struyven et al. (2003) [27] argued that exams may not be effective at promoting knowledge retention. According to [28], many students study only the content that will be covered in the exam, and some memorize it without understanding it. Although students admit that they understand the material better after the test, they also quickly forget it once the assessment has passed.
- Exams may not be very practical, and they may lack the capacity to assess students’ soft and practical skills [24]. With respect to students’ industry careers, it seems unlikely that practising engineers would ever be required to sit a two-hour exam on the job.
- Students deal with real-life problems, and this helps them better understand how theories can be applied to solve practical problems [38]. This results in long-term retention of knowledge [39] and can also lead to better learning experiences [40] because students learn valuable lessons from working on a real-life project.
- Projects are commonly referred to as formative assessment or assessment for learning because they assess student performance during learning and provide feedback for improvement [43]. Such feedback can give students a better understanding of the subject [11] and may improve students’ performance as well [44,45]. However, this feedback should be timely for it to be perceived as useful and actionable. Gipps (2003) [46] noted that timely feedback helps low-performing students improve their performance.
- Despite these advantages, there are some issues that need to be addressed:
- Providing timely and personal feedback on projects can amount to a heavier workload for teachers compared to traditional assessments [49].
- Unlike examinations, project assignments are usually unsupervised by the teacher, which means that students may provide assistance to other students, either voluntarily or for a fee [50].
- Compared to traditional exams, do project-based assignments lead to better academic performance?
- Does project-based learning provide students with a better learning experience?
3. Soil Mechanics Course
“I liked the assessment of the course. We were given assignments about certain topics that we were learning at the time, and I find that helpful in learning.” “The assignments questions were very good in making one understand the problems”.
“The labs being instantly assessed helped me learn much more than if the questions had to be answered by myself at home. It eliminated a lot of ambiguity and clarified confusing points especially around theoretical expectations for lab results and interpretation versus ‘real life’ soil behaviour. Stuff that would have taken ages to understand and days of deliberation was addressed.”
4. Methods
5. Student Description
6. Results and Discussion
6.1. Student Academic Performance
6.2. Student Experience with Project-Based Learning
“It seems like a really good decision to make it assignment-based and no final. The assignments have had me go through and be tested on every aspect of the course and I have learnt so much. Probably more than if I had to cram for a final and then forget everything the next week.”
“I feel like in many courses, especially soil mechanics, a large project rather than an exam is more beneficial, as more content can be assessed, plus the question-solving is more realistic (in real life you would have more than 2 hours to solve a problem!).”
“The project-based work enabled me to better see how the theoretical principles we have learned in class actually translate into the real world.”
“Having worked in the engineering industry prior to uni, I find the projects more relevant to what is expected on the job.”
6.3. Feedback to Students
6.4. Student and Teacher Workloads
7. Concluding Remarks
- Compared to exams, the project-based assignments seem to provide students with a better learning experience, which in turn leads to better academic performance. The project work gives students opportunities to learn about the practical value of the course and its relevance to their industry careers.
- The better student performance on the project-based assignments, compared to the traditional exams, may be related to a few factors, including (a) the extra time that students had to complete them and (b) access to learning resources (which is typically not allowed during an in-person exam). However, it was also found that during and after COVID (2020–2022), the average marks for the project-based assignments decreased significantly. More research to clarify this finding is recommended.
- From a teacher’s point of view, preparing different variations of each project (to discourage cheating) and marking them can considerably increase the teaching workload.
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
1. Gibbs, G.; Simpson, C. Conditions under which assessment supports students’ learning. Learn. Teach. High. Educ. 2005, 1, 3–31.
2. Van de Watering, G.; Gijbels, D.; Dochy, F.; Van der Rijt, J. Students’ assessment preferences, perceptions of assessment and their relationships to study results. High. Educ. 2008, 56, 645.
3. Wiliam, D. What is assessment for learning? Stud. Educ. Eval. 2011, 37, 3–14.
4. Stefani, L. Planning teaching and learning: Curriculum design and development. In Handbook for Teaching and Learning in Higher Education; Taylor & Francis Group: Abingdon, UK, 2009; pp. 40–57.
5. Biggs, J.B.; Tang, C. Teaching according to how students learn. In Teaching for Quality Learning at University; Open University Press: Maidenhead, UK, 2007; Chapter 2; pp. 15–30.
6. Bloom, B.S. Handbook on Formative and Summative Evaluation of Student Learning; McGraw-Hill Book Company: New York, NY, USA, 1971.
7. Chickering, A.W.; Gamson, Z.F. Seven principles for good practice in undergraduate education. AAHE Bull. 1987, 3, 7.
8. Carless, D. Learning-oriented assessment: Conceptual bases and practical implications. Innov. Educ. Teach. Int. 2007, 44, 57–66.
9. Hattie, J.; Timperley, H. The power of feedback. Rev. Educ. Res. 2007, 77, 81–112.
10. Lizzio, A.; Wilson, K. Feedback on assessment: Students’ perceptions of quality and effectiveness. Assess. Eval. High. Educ. 2008, 33, 263–275.
11. Sadler, D.R. Formative assessment: Revisiting the territory. Assess. Educ. Princ. Policy Pract. 1998, 5, 77–84.
12. Light, G.; Cox, R. Assessing: Student assessment. In Learning and Teaching in Higher Education: The Reflective Practitioner; Paul Chapman Publishing: London, UK, 2001.
13. Gibbs, G.; Lucas, L. Coursework assessment, class size and student performance: 1984–94. J. Furth. High. Educ. 1997, 21, 183–192.
14. Price, M.; O’Donovan, B.; Rust, C.; Carroll, J. Assessment standards: A manifesto for change. Brookes Ejournal Learn. Teach. 2008, 2, 1–2.
15. Williams, P. Squaring the circle: A new alternative to alternative-assessment. Teach. High. Educ. 2014, 19, 565–577.
16. Mills, J.E.; Treagust, D.F. Engineering education—Is problem-based or project-based learning the answer? Australas. J. Eng. Educ. 2003, 3, 2–16.
17. Palmer, S.; Hall, W. An evaluation of a project-based learning initiative in engineering education. Eur. J. Eng. Educ. 2011, 36, 357–365.
18. Traub, R.E.; MacRury, K.A. Multiple-Choice vs. Free-Response in the Testing of Scholastic Achievement; Ontario Institute for Studies in Education: Toronto, ON, Canada, 1990.
19. Ben-Chaim, D.; Zoller, U. Examination-type preferences of secondary school students and their teachers in the science disciplines. Instr. Sci. 1997, 25, 347–367.
20. Tuckman, B.W. Using tests as an incentive to motivate procrastinators to study. J. Exp. Educ. 1998, 66, 141–147.
21. Goorts, K. Replacing final exams with open-ended course projects in engineering education. Teach. Innov. Proj. 2020, 9, 1–18.
22. Biggs, J. What the student does: Teaching for enhanced learning. High. Educ. Res. Dev. 1999, 18, 57–75.
23. Wilkinson, J. Staff and student perceptions of plagiarism and cheating. Int. J. Teach. Learn. High. Educ. 2009, 20, 98–105.
24. Flores, M.A.; Veiga Simão, A.M.; Barros, A.; Pereira, D. Perceptions of effectiveness, fairness and feedback of assessment methods: A study in higher education. Stud. High. Educ. 2015, 40, 1523–1534.
25. Kandlbinder, P. Writing about practice for future learning. In Rethinking Assessment in Higher Education; Routledge: Abingdon, UK, 2007; pp. 169–176.
26. Gratchev, I.; Jeng, D.S. Introducing a project-based assignment in a traditionally taught engineering course. Eur. J. Eng. Educ. 2018, 43, 788–799.
27. Struyven, K.; Dochy, F.; Janssens, S. Students’ perceptions about new modes of assessment in higher education: A review. In Optimising New Modes of Assessment: In Search of Qualities and Standards; Springer: Dordrecht, The Netherlands, 2003; pp. 171–223.
28. Hattingh, T.; Dison, L.; Woollacott, L. Student learning behaviours around assessments. Australas. J. Eng. Educ. 2019, 24, 14–24.
29. Case, J.; Marshall, D. Between deep and surface: Procedural approaches to learning in engineering education contexts. Stud. High. Educ. 2004, 29, 605–615.
30. Villarroel, V.; Bloxham, S.; Bruna, D.; Bruna, C.; Herrera-Seda, C. Authentic assessment: Creating a blueprint for course design. Assess. Eval. High. Educ. 2018, 43, 840–854.
31. Winstone, N.E.; Boud, D. The need to disentangle assessment and feedback in higher education. Stud. High. Educ. 2022, 47, 656–667.
32. Knight, P.T. Summative assessment in higher education: Practices in disarray. Stud. High. Educ. 2002, 27, 275–286.
33. Sendziuk, P. Sink or swim? Improving student learning through feedback and self-assessment. Int. J. Teach. Learn. High. Educ. 2010, 22, 320–330.
34. Carless, D.; Salter, D.; Yang, M.; Lam, J. Developing sustainable feedback practices. Stud. High. Educ. 2011, 36, 395–407.
35. Chamberlain, S.; Daly, A.L.; Spalding, V. The fear factor: Students’ experiences of test anxiety when taking A-level examinations. Pastor. Care Educ. 2011, 29, 193–205.
36. Sung, Y.T.; Chao, T.Y.; Tseng, F.L. Reexamining the relationship between test anxiety and learning achievement: An individual-differences perspective. Contemp. Educ. Psychol. 2016, 46, 241–252.
37. Frank, M.; Barzilai, A. Integrating alternative assessment in a project-based learning course for pre-service science and technology teachers. Assess. Eval. High. Educ. 2004, 29, 41–61.
38. Lehmann, M. Problem-oriented and project-based learning (POPBL) as an innovative learning strategy for sustainable development in engineering education. Eur. J. Eng. Educ. 2008, 33, 283–295.
39. Gulbahar, Y.; Tinmaz, H. Implementing project-based learning and e-portfolio assessment in an undergraduate course. J. Res. Technol. Educ. 2006, 38, 309–327.
40. Lopez-Querol, S.; Sanchez-Cambronero, S.; Rivas, A.; Garmendia, M. Improving civil engineering education: Transportation geotechnics taught through project-based learning methodologies. J. Prof. Issues Eng. Educ. Pract. 2014, 141, 1–7.
41. de Graaff, E.; Kolmos, A. History of problem-based and project-based learning. In Management of Change: Implementation of Problem-Based and Project-Based Learning in Engineering; de Graaff, E., Kolmos, A., Eds.; Sense Publishers: Rotterdam, The Netherlands, 2007; pp. 1–8.
42. Sotiriadou, P.; Logan, D.; Daly, A.; Guest, R. The role of authentic assessment to preserve academic integrity and promote skill development and employability. Stud. High. Educ. 2020, 45, 2132–2148.
43. Wiggins, G. Educative Assessment: Designing Assessments to Inform and Improve Student Performance; Jossey-Bass Publishers: San Francisco, CA, USA, 1998.
44. Sly, L. Practice tests as formative assessment improve student performance on computer-managed learning assessments. Assess. Eval. High. Educ. 1999, 24, 339–343.
45. Collett, P.; Gyles, N.; Hrasky, S. Optional formative assessment and class attendance: Their impact on student performance. Glob. Perspect. Account. Educ. 2007, 4, 41.
46. Gipps, C.V. Educational accountability in England: The role of assessment. In Proceedings of the Annual Meeting of the American Educational Research Association, Chicago, IL, USA, 21–25 April 2003; pp. 1–15.
47. Shekar, A. Project-based learning in engineering design education: Sharing best practices. In Proceedings of the 121st ASEE Annual Conference & Exposition, Indianapolis, IN, USA, 15–18 June 2014.
48. Carless, D. Student Feedback: Can Do Better–Here’s How; Times Higher Education (THE): London, UK, 2015.
49. Li, T.; Greenberg, B.A.; Nicholls, J.A.F. Teaching experiential learning: Adoption of an innovative course in an MBA marketing curriculum. J. Mark. Educ. 2007, 29, 25–33.
50. Parsons, D. Encouraging learning through external engineering assessment. Australas. J. Eng. Educ. 2007, 13, 21–30.
51. Gratchev, I.; Balasubramaniam, A. Developing assessment tasks to improve the performance of engineering students. In Proceedings of the 23rd Annual Conference of the Australasian Association for Engineering Education, Melbourne, Australia, 3–5 December 2012.
52. Gratchev, I.; Gunalan, S. Replacing laboratory work with online activities: Does it work? In Advancing Engineering Education Beyond COVID; CRC Press: Boca Raton, FL, USA, 2022; pp. 91–100.
53. Fry, H.; Ketteridge, S.; Marshall, S. Understanding student learning. In A Handbook for Teaching and Learning in Higher Education; Kogan: London, UK, 2003; pp. 9–25.
54. Carless, D.; Boud, D. The development of student feedback literacy: Enabling uptake of feedback. Assess. Eval. High. Educ. 2018, 43, 1315–1325.
55. Ruiz-Gallardo, J.R.; González-Geraldo, J.L.; Castaño, S. What are our students doing? Workload, time allocation and time management in PBL instruction: A case study in science education. Teach. Teach. Educ. 2016, 53, 51–62.
56. Heckendorn, R. Building a Beowulf: Leveraging research and department needs for student enrichment via project based learning. Comput. Sci. Educ. 2002, 12, 255–273.
| Exam-Based Assessment Plan (2015–2017) | Project-Based Assessment Plan (2018–2022) |
|---|---|
| Lab work (10%) | Lab work (10%) |
| Two assignments (10%) | Two online quizzes (20%) |
| Mid-semester exam (25%) | Project 1 (25%) |
| Final exam (55%) | Project 2 (35%) |
|  | Site visit or industry guest lecture reflection (10%) |
| Type of Exam (2015–2017) | Project-Based Assignments (2018–2022) |
|---|---|
| Mid-semester exam | Project 1 |
| Content: textbook-like problems on soil classification, soil constituents, soil compaction, stresses, and water seepage. Duration: 2 h exam in class. Submission: exam paper handed to the invigilator at the end of the exam. Feedback: solutions were uploaded; additional feedback from the teacher was provided on student demand. | Content: students were given real data from a site investigation (borehole logs) and lab tests. They were required to draw a cross-section, discuss the geology, classify the soil, estimate stresses including the effect of upward seepage, and analyse the data from compaction tests. Duration: 10 days. Submission: individual report submitted online. Feedback: general feedback via email, short personal feedback via the assessment rubric, and additional feedback from the teacher on student demand. |
| Final exam | Project 2 |
| Content: textbook problems on water flow, flow nets, soil deformation, consolidation, and shear strength. Duration: 3 h exam in class. Submission: exam paper handed to the invigilator at the end of the exam. Feedback: provided on student demand. | Content: students were given real data from site and lab investigations. They were required to draw a flow net and estimate the stresses, estimate the time and amount of consolidation of soft soil under embankment loads, and obtain shear strength parameters for slope stability analysis. Duration: 10 days. Submission: individual report submitted online. Feedback: short personal feedback via the assessment rubric; more detailed feedback from the teacher on student demand. |
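As context for the Project 1 stress task above, a representative calculation of the kind students perform (the numbers here are illustrative, not data from the course): the vertical effective stress at depth $z$ under an upward hydraulic gradient $i$ follows the standard relation

$$\sigma'_v \;=\; \sigma_v - u \;=\; \gamma_{\mathrm{sat}}\,z - \gamma_w\,(z + h) \;=\; (\gamma_{\mathrm{sat}} - \gamma_w)\,z - i\,\gamma_w\,z, \qquad i = h/z .$$

For instance, with $\gamma_{\mathrm{sat}} = 20\ \mathrm{kN/m^3}$, $\gamma_w = 9.81\ \mathrm{kN/m^3}$, $z = 3\ \mathrm{m}$, and $i = 0.5$: $\sigma_v = 60\ \mathrm{kPa}$ and $u = 9.81 \times 4.5 \approx 44.1\ \mathrm{kPa}$, so $\sigma'_v \approx 15.9\ \mathrm{kPa}$, roughly half the no-seepage value of about $30.6\ \mathrm{kPa}$. Quantifying this kind of seepage effect on real borehole data is what distinguishes the project task from a textbook exam problem.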
| Years | 2015 | 2016 | 2017 | 2018 | 2019 | 2020 | 2021 | 2022 |
|---|---|---|---|---|---|---|---|---|
| Number of students | 144 | 128 | 112 | 97 | 95 | 96 | 78 | 74 |
| Male/Female (%) | 85/15 | 86/14 | 89/11 | 87/13 | 92/8 | 83/17 | 87/13 | 87/13 |
| Domestic/International (%) | 65/35 | 62/38 | 66/34 | 63/37 | 77/23 | 75/25 | 87/13 | 85/15 |
| GPA Range (% of Students) | 2015 | 2016 | 2017 | 2018 | 2019 | 2020 | 2021 | 2022 |
|---|---|---|---|---|---|---|---|---|
| <4 | 15.5 | 18.6 | 22.4 | 13.2 | 10.5 | 7.5 | 20.0 | 14.8 |
| 4 to <5 | 29.6 | 32.6 | 22.4 | 32.7 | 35.8 | 28.8 | 29.1 | 37.0 |
| 5 to <6 | 30.3 | 24.8 | 19.8 | 28.8 | 24.2 | 25.0 | 34.5 | 31.5 |
| 6 to 7 | 13.4 | 8.5 | 13.8 | 14.8 | 11.6 | 20.0 | 14.5 | 13.0 |
| No GPA | 11.3 | 15.5 | 21.6 | 10.6 | 17.9 | 18.8 | 1.8 | 3.7 |
| Years | 2015 | 2016 | 2017 | 2018 | 2019 | 2020 | 2021 | 2022 |
|---|---|---|---|---|---|---|---|---|
| Assessment type | Exam-based | Exam-based | Exam-based | Project-based | Project-based | Project-based | Project-based | Project-based |
| Mid-semester exam / Project 1: average | 69.9 | 67.2 | 70.8 | 75.1 | 76.9 | 66.3 | 66.5 | 62.3 |
| Mid-semester exam / Project 1: standard deviation | 17.4 | 19.3 | 19.5 | 14.0 | 10.3 | 17.2 | 16.1 | 21.8 |
| Final exam / Project 2: average | 65.2 | 63.8 | 61.2 | 76.4 | 74.7 | 69.0 | 65.8 | 67.0 |
| Final exam / Project 2: standard deviation | 15.8 | 19.5 | 15.5 | 12.0 | 12.2 | 13.1 | 15.2 | 15.3 |
| Failure rate (%) | 1.3 | 8.3 | 4.3 | 5.2 | 3.2 | 9.9 | 16.2 | 18.0 |
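The statistical procedure behind comparisons such as the COVID-era drop described as significant in the concluding remarks is not reproduced in this extract. As a minimal, hypothetical sketch, Welch's t-test computed from the published summary statistics above (means, standard deviations, and cohort sizes, with cohort size assumed to equal the number of marks) could be used to check two of the reported contrasts:

```python
# Hedged sketch, NOT the paper's actual method: Welch's t-test from the
# summary statistics in the table above. Cohort sizes are taken from the
# student-numbers table and assumed to equal the number of marks.
from scipy.stats import ttest_ind_from_stats

# Last exam year vs first project year (mid-semester exam vs Project 1).
t, p = ttest_ind_from_stats(
    mean1=70.8, std1=19.5, nobs1=112,  # 2017 mid-semester exam
    mean2=75.1, std2=14.0, nobs2=97,   # 2018 Project 1
    equal_var=False,                   # Welch's test (unequal variances)
)
print(f"2017 exam vs 2018 Project 1: t = {t:.2f}, p = {p:.4f}")

# Pre-COVID vs COVID-era Project 1 marks (the drop noted in Section 7).
t, p = ttest_ind_from_stats(
    mean1=76.9, std1=10.3, nobs1=95,   # 2019 Project 1
    mean2=62.3, std2=21.8, nobs2=74,   # 2022 Project 1
    equal_var=False,
)
print(f"2019 vs 2022 Project 1: t = {t:.2f}, p = {p:.4f}")
```

`ttest_ind_from_stats` is convenient here because it works directly from published summary statistics; with access to the raw marks, a non-parametric test such as Mann–Whitney U would avoid the normality assumption.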