Article

Replacing Exams with Project-Based Assessment: Analysis of Students’ Performance and Experience

School of Engineering and Built Environment, Griffith University, Southport, QLD 4222, Australia
Educ. Sci. 2023, 13(4), 408; https://doi.org/10.3390/educsci13040408
Submission received: 13 March 2023 / Revised: 3 April 2023 / Accepted: 14 April 2023 / Published: 17 April 2023
(This article belongs to the Special Issue Project Based Learning and Engineering Education)

Abstract

This study investigates whether project-based assignments can lead to better student performance and a better learning experience compared to traditional examinations. In an engineering course on soil mechanics, the traditional mid-semester and final exams were replaced by project work based on a real-life site investigation. Student performance was evaluated on the basis of student marks, whilst student feedback was analysed to understand the student experience with project-based assignments. The results indicated that the student average mark for the projects was greater than the average mark for the exams. In addition, the student learning experience improved after the exams were replaced with the project-based assignments because students were able to see practical applications of the course content. However, a few issues were also identified, including feedback being delivered to students only at the end of the term, an increased teacher workload, and the effect of COVID.

1. Introduction

Assessment is one of the most influential factors in student learning [1,2,3]. According to [4], students tend to think about assessments first, and this shapes their whole approach to the course. Biggs and Tang (2007) [5] noted that in order to better engage students in learning, assessment should be devised in a constructive manner to support the sequence of learning and to allow students to achieve all learning objectives progressively. Another important role of assessment is to provide students with feedback on their progress. This feedback is essential in student learning [6,7,8,9] because it bridges the gap between the actual level of performance and the desired learning goal [10] while giving students a better understanding of the subject [11,12].
In engineering courses, it has been common practice to use traditional high-stakes exams to assess student knowledge and skills. The non-exam portion of the assessment plan typically includes low-stakes assignments and laboratories, which are seen as an opportunity for students to better prepare for the final examination [13]. However, there is a growing concern that exams may not be effective in assessing some student skills and understanding [14,15], and thus changes in engineering education towards producing graduates with practical skills and industry experience are required [16].
In recent years, experiential learning has become a popular method of teaching, especially in first-year courses where students can develop the skills and attributes expected of an engineer [17]. In such cases, traditional exams may be replaced with project-based work, which is perceived as more engaging and industry-related. However, this prompts the question of how effective, in comparison with traditional exams, such project-based assignments can be in assisting students in their learning journey. This work investigates the effect of exams and project-based assignments on student performance and learning experience, using a case study of a Soil Mechanics engineering course where both types of assessment were used over the past several years.

2. Exams vs. Projects: Advantages and Shortcomings

Exams have long been used in engineering education to assess student knowledge and skills. As a result, students have a good understanding of what they need to produce to pass this assessment [2,18,19]. As exams have a large weighting, students prioritize them over other assessments [20,21], and even if students do not engage in course activities, they still study for the exam to successfully pass the course [22].
Many teachers consider invigilated (or in-class) exams as fair evaluations of student performance [23] because they limit the opportunity for students to cheat or plagiarize. Additionally, examinations appear to be effective assessments for large classes [24], as they are less time-consuming and labour-intensive to implement compared to alternative assessments such as projects and/or oral presentations [25,26]. However, there are some concerns regarding the use of exams in current education:
  • Struyven et al. (2003) [27] argued that exams may not be effective in helping students retain knowledge. According to [28], many students only study the content which will be covered in the exam, while some students memorize it without understanding it. Although students admit that they have a better understanding of the material after the test, they also quickly forget it once the assessment has passed.
  • Case and Marshall (2004) [29] and Villarroel et al. (2018) [30] noted that when approaching the examination, a significant portion of students adopt a surface approach to learning. Students consider exams as an end to the learning process and mostly care about the number grade [31].
  • Examinations are generally seen as summative assessments because they do not provide formative feedback [32,33]. However, even if feedback is provided, it is usually too late in the course and students do not have time to process it [15,34].
  • Students tend to have a very high level of stress and anxiety before or during the exam, which may significantly affect their wellbeing and performance [35,36].
  • Exams may not be very practical or may not have the capacity to assess students’ soft and practical skills [24]. In relation to students’ industrial careers, it seems unlikely that engineers would ever be required to sit a 2-h exam in industry.
Frank and Barzilai (2004) [37] noted that traditional exams may not be the best framework by which to assess the achievement of learning goals for students who are involved in experiential learning activities. Therefore, alternative assessments such as projects may be used instead. In contrast to exams, project-based assignments have several advantages, which can be summarized as follows:
  • Students deal with real-life problems, and this helps them better understand how theories can be applied to solve practical problems [38]. This results in long-term retention of knowledge [39] and can also lead to better learning experiences [40] because students learn valuable lessons from working on a real-life project.
  • Students become more motivated when assessments are perceived as authentic, which makes the learning process more meaningful [41]. Project-based learning may also strengthen students’ professional identity and awareness, communication skills, and employability prospects [42].
  • Projects are commonly referred to as formative assessment or assessment for learning because they assess student performance during learning and provide feedback for improvement [43]. Such feedback can give students a better understanding of the subject [11] and may improve students’ performance as well [44,45]. However, this feedback should be timely for it to be perceived as useful and actionable. Gipps (2003) [46] noted that timely feedback helps low-performing students improve their performance.
Despite these advantages, there are some issues that need to be addressed:
  • Designing an authentic assessment takes more time and resources. Developing effective feedback also requires time and careful thought, which may be rather difficult in large classes where teachers face multiple demands [47,48].
  • Providing timely and personal feedback on projects can amount to a heavier workload for teachers compared to traditional assessments [49].
  • Unlike examinations, project assignments are usually unsupervised by the teacher, which means that students may provide assistance to other students, either voluntarily or for a fee [50].
It is evident from the above review that both types of assessment have benefits and shortcomings, and each can be effective in assessing students’ knowledge and skills, albeit at different levels. This study seeks to clarify the effect of project-based learning on student performance and learning experience in comparison to traditional exams. The main research questions are as follows:
  • Compared to traditional exams, do project-based assignments lead to better academic performance?
  • Does project-based learning provide students with a better learning experience?

3. Soil Mechanics Course

Soil Mechanics (SM) is a second-year course of the four-year civil engineering program at Griffith University. This course is designed to provide students with knowledge of the fundamentals of soil mechanics [26]. SM consists of weekly lectures, tutorials, and laboratory sessions. The lectures and tutorials cover the theoretical and practical aspects of soil behaviour. The laboratories are arranged in a way that provides students with hands-on experience and reinforces theoretical knowledge through practical work.
The assessment plan historically included mid-semester and final exams, and the coursework consisted of labs and assignments. The low-stakes assignments and laboratory work were used during the semester to help students engage in learning and prepare them for the exams. An assignment consisting of several textbook problems was given to students prior to each exam so that students could practice and develop problem-solving skills [51]. Students enjoyed the opportunity to practice before each exam and provided positive comments:
“I liked the assessment of the course. We were given assignments about certain topics that we were learning at the time, and I find that helpful in learning.”
“The assignments questions were very good in making one understand the problems.”
The hands-on laboratory work was essential in developing practical skills. The assessment was performed at the end of each laboratory so that students could receive instant feedback, which allowed them to see where they were in their learning and what was needed to improve their performance. Students provided this feedback:
“The labs being instantly assessed helped me learn much more than if the questions had to be answered by myself at home. It eliminated a lot of ambiguity and clarified confusing points especially around theoretical expectations for lab results and interpretation versus ‘real life’ soil behaviour. Stuff that would have taken ages to understand and days of deliberation was addressed.”
The exams were the major assessment items with the largest weighting (Table 1) in 2015–2017. They were used to test student knowledge, understanding of the subject, and problem-solving skills. The mid-semester exam was offered in Week 7, and it covered the content from the first part of the course (Table 2). The final exam was given at the end of the term, and it mostly dealt with the material covered in the second half of the course. In the final exam, students were required to perform more complex soil mechanics tasks compared to the mid-semester exam.
Feedback to students. After the mid-semester exam, an email that described common mistakes in the exam was sent to all students while the solution to exam problems was uploaded on the course webpage. Students were also welcome to discuss their exam papers with the lecturer and receive more detailed feedback on their performance. If students wished to receive feedback on their final exam, they were required to contact the lecturer and arrange a time to discuss their work, as it was already after the end of the semester.
In 2018, the course assessment plan was revised towards project-based learning, and the exams were replaced with two project-based assignments. The rationale for this major change was that more practical, project-based work should motivate students to learn more because they would be able to see practical aspects of what they studied in the lectures. Project 1 replaced the mid-semester exam, while Project 2 replaced the final exam (Table 1). Both projects were given to students 10 days before the submission deadline. Project 1 covered the same topics as the mid-semester exam. However, unlike the exam, the project content was based on a real-life site investigation. In Project 1, students were required to perform a sequence of tasks similar to what engineers would do in their daily practice. Each small task was related to certain theoretical aspects of the course, the tasks built progressively on one another, and together they were connected in one large engineering project. Project 2 covered the same topics as the final exam but, similar to Project 1, it was designed on the basis of a real-life geotechnical investigation. In Project 2, students were required to interpret and analyse this data while solving practical engineering problems (Table 2).
Each project had 20 variations to minimize plagiarism, and it was allocated to students based on their student number. Although students could work on their projects together, each student was required to produce an individual report. Students were advised about plagiarism and were not allowed to copy from each other.
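The paper does not spell out the exact allocation rule, only that variations were assigned based on student numbers. A minimal Python sketch of one plausible scheme, assuming a simple modulo mapping; the function name and the sample student numbers are illustrative only, not the author’s actual method:

```python
# A minimal sketch of how project variations could be allocated by student
# number. ASSUMPTION: the paper only states that variations were "allocated
# to students based on their student number"; the modulo rule and the sample
# student numbers below are hypothetical.

NUM_VARIATIONS = 20  # each project had 20 variations (stated in the paper)

def assign_variation(student_number: int) -> int:
    """Map a student number to a variation ID in the range 1..NUM_VARIATIONS."""
    return (student_number % NUM_VARIATIONS) + 1

if __name__ == "__main__":
    for sn in (5123987, 5123988, 5124007):  # hypothetical student numbers
        print(f"student {sn} -> variation {assign_variation(sn)}")
```

A deterministic mapping of this kind has the practical advantage that students can be told their variation without any manual bookkeeping by the teacher.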
Feedback to students. The marking was performed according to a marking rubric, which was provided to all students along with the project description. The assessment rubric and short feedback that identified each student’s mistakes were uploaded online. An email that summarized common mistakes was sent to all students, and solutions to similar problems were uploaded on the course website as well.

4. Methods

The methods used to assess the impact of exams and project-based assignments on student learning experience included analysis of student academic performance and student feedback. The information about the students’ gender, their status (domestic/international), and socioeconomic background, as well as the students’ GPA prior to enrolling in Soil Mechanics, was collected from the Griffith University Planning and Statistics Portal. It is noted that at Griffith University, GPA is considered a measure of student academic performance: the highest value is 7 (High Distinction) and the pass value is 4. If students fail one or more courses, their GPA can fall below 4. The obtained data is summarized in Table 3 (demographics) and Table 4 (GPA).
Average marks for the mid-semester and final exams (2015–2017) and Projects 1 and 2 (2018–2022), as well as the course failure rate, were collected (Table 5), analysed, and compared. One-way analysis of variance (ANOVA) was used to determine whether the differences between the means of the mid-semester exam and Project 1 marks, and between the means of the final exam and Project 2 marks, were statistically significant. The following major hypothesis tests were performed: (a) the means of the mid-semester exam marks over three years (2015–2017) were equal; (b) the means of the final exam marks over three years (2015–2017) were equal; (c) the means of Project 1 marks obtained from 2018 to 2022 were equal; (d) the means of Project 2 marks obtained from 2018 to 2022 were equal; (e) the means of the mid-semester exam marks (2015–2017) and Project 1 marks (2018–2022) were equal; and (f) the means of the final exam marks (2015–2017) and Project 2 marks (2018–2022) were equal. For each hypothesis test, the p-value was calculated, and when this p-value was less than the significance level of 0.05, the null hypothesis was rejected.
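To illustrate the procedure, a minimal sketch of how a one-way ANOVA such as hypothesis test (e) could be run in Python with SciPy. The study’s per-student marks are not published, so the two arrays below are placeholders that only demonstrate the mechanics of the test:

```python
# Illustrative one-way ANOVA for hypothesis test (e): comparing mid-semester
# exam marks (2015–2017) with Project 1 marks (2018–2022).
# ASSUMPTION: the arrays below are placeholder samples, not the study's data.
from scipy import stats

exam_marks = [62.0, 71.5, 68.0, 74.5, 66.0, 70.0]     # placeholder sample
project_marks = [78.0, 72.5, 80.0, 69.5, 75.0, 77.5]  # placeholder sample

f_stat, p_value = stats.f_oneway(exam_marks, project_marks)

ALPHA = 0.05  # significance level used in the paper
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
if p_value < ALPHA:
    print("Reject the null hypothesis: the group means differ significantly.")
else:
    print("Fail to reject the null hypothesis: no significant difference.")
```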
Formal feedback from student evaluation of course (SEC) surveys conducted by Griffith University at the end of the course, and informal feedback such as student–teacher in-person communication and emails during the course, were collected and qualitatively analysed to learn about the students’ experience and their satisfaction with the project-based assessment.

5. Student Description

The following observations can be made regarding the demographics of students in the Soil Mechanics course in 2015–2022. The majority of each cohort comprised male students, while the proportion of female students varied from a low of 8% (2019) to a high of 17% (2020). The highest percentage (about 35%) of international students was observed in 2015–2018; however, this number gradually declined in 2019–2020, followed by a sharp decrease to about 14% in 2021–2022, mostly due to the COVID-related restrictions. Most students were 17–24 years old and came from a medium socioeconomic background.
The GPA data (Table 4) indicates students’ academic performance prior to Soil Mechanics. This data is presented as the range of GPA values against the percentage of students in each year. Although there are some variations in the numbers, it can be inferred that, for almost all student cohorts, the largest percentage of students had GPAs in the range of 4 to <5, followed by the range of 5 to <6. It is noted that there were students (including international students) who joined the civil engineering program at Griffith University from year 2, and Soil Mechanics was one of their first courses to complete. For this reason, these students did not have a GPA prior to enrolling in Soil Mechanics.

6. Results and Discussion

The following section discusses the main findings of this work, including student performance in the exams vs. the project-based assignments (Table 5), student feedback on their experience with project-based learning, and issues related to the use of project-based assignments in this course.

6.1. Student Academic Performance

Exam-based assessment plan (2015–2017). Analysis of student performance in the mid-semester and final exams over these three years reveals that the average marks of the different student cohorts were very similar, with only minor variations observed. The ANOVA indicated no statistically significant difference in the means of the mid-semester exam marks over these three years (p = 0.293); the means of the final exam marks were not statistically different either (p = 0.196).
It is evident that students performed better in the mid-semester exam (average mark of 69.3%) than in the final exam (average mark of 63.4%); both figures are the means of the 2015–2017 yearly averages in Table 5. This can be attributed to the final exam having a greater level of difficulty and, possibly, a higher level of student anxiety and stress. The failure rate was relatively low for such a course, with an average value of 4.6%. It is believed that students generally spent sufficient time on exam preparation, as both exams had a significant weighting towards the final grade. Thus, even low-achieving students and those who did not engage in the learning activities during the semester could still produce results which were satisfactory enough to pass the course.
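These three-year averages can be checked directly against the yearly values in Table 5; a short arithmetic verification in Python:

```python
# Verification of the 2015–2017 averages quoted above, using the yearly
# values from Table 5 (no new data; a simple arithmetic check).
mid_semester = [69.9, 67.2, 70.8]   # 2015, 2016, 2017
final_exam   = [65.2, 63.8, 61.2]
failure_rate = [1.3, 8.3, 4.3]

print(round(sum(mid_semester) / 3, 1))  # 69.3
print(round(sum(final_exam) / 3, 1))    # 63.4
print(round(sum(failure_rate) / 3, 1))  # 4.6
```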
Project-based assessment plan. The student performance in the project-based assignments can be divided into two periods: before COVID (2018–2019) and during/after COVID (2020–2022). This division is supported by the ANOVA analyses, which revealed that the mean marks for both Project 1 and Project 2 obtained in 2018–2019 were statistically different from the marks in 2020–2022, with the p-values for both cases being less than 0.05. The ANOVA also indicated no statistically significant difference in the average marks for either Project 1 (p = 0.319) or Project 2 (p = 0.324) between 2018 and 2019 (before COVID). The statistical analysis of the average marks in 2020–2022 likewise showed no significant difference, with p-values of 0.278 (Project 1) and 0.346 (Project 2).
Further analysis reveals that, on average, the student mark for Project 1 in 2018–2019 was greater than the student mark for the mid-semester exam in 2015–2017 (Table 5). Similarly, the average mark for Project 2 was greater than the average mark for the final exam. The literature suggests that students generally perform better in project-based assignments because they have more time to work on them, which also reduces student stress and anxiety. In this course, students could use notes or other resources which were not allowed during the exams. In general, students were engaged in the project work, and there were a few students who even began to consider geotechnical engineering as their future career path after graduation.
In 2020, due to the COVID pandemic, all classes were changed to online delivery, including the hands-on laboratory sessions. Although all lab work was pre-recorded and discussed with students during the online laboratory sessions, the student–teacher interaction became limited, which also decreased student motivation [52]. The pandemic affected student performance as well, because some students could not fully engage in learning for either technical or personal reasons. In 2021–2022, a blended delivery mode was used, in which online lectures and tutorials were combined with face-to-face labs and tutorials to improve the student–teacher interaction. However, the failure rate remained high, as many students did not submit or attempt at least one project assignment, which lowered their final score below the pass level.
The effect of COVID on student performance in SM is outside the scope of this study; however, it is noted that relatively high failure rates were recorded for the same student cohorts in other courses that they undertook at the same time as SM.

6.2. Student Experience with Project-Based Learning

To evaluate student learning experience, their feedback was collected and analysed. Although students were not required to comment on the change from the exams to the project-based assignments, some students voluntarily provided positive feedback with a few examples given below:
“It seems like a really good decision to make it assignment-based and no final. The assignments have had me go through and be tested on every aspect of the course and I have learnt so much. Probably more than if I had to cram for a final and then forget everything the next week.”
“I feel like in many courses, especially soil mechanics, a large project rather than an exam is more beneficial, as more content can be assessed, plus the question-solving is more realistic (in real life you would have more than 2 hours to solve a problem!).”
The students also noted that the project-based assignments showed them practical aspects of the course, which made it more relevant to their engineering degree.
“The project-based work enabled me to better see how the theoretical principles we have learned in class actually translate into the real world.”
“Having worked in the engineering industry prior to uni, I find the projects more relevant to what is expected on the job.”

6.3. Feedback to Students

The literature suggests that students should be regularly assessed to receive timely feedback on their performance [15]. Frequent assignments can more effectively distribute student effort across the whole course and provide students with regular opportunities to reflect on what they have learnt, and what they still have to learn [7]. However, a relatively large number of assessment items may also overload students, forcing them to become ‘strategic’ learners and to adopt a ‘surface’ approach to learning [53].
Both the exams (2015–2017) and the two projects (2018–2022) were the major assessment items, and they were scheduled at almost the same points in the semester; that is, in the middle and at the end of the term. This enables a comparison of the feedback to students for each assessment type. The literature suggests that project-based assignments tend to provide formative feedback, whereas examinations are generally perceived as summative assessment. However, it is also important to recognize students’ feedback literacy [54] and their willingness to receive feedback. Although students value feedback as a means to improve their performance [28], only a few of them would seek it from the teacher. It was observed in this study that about the same proportion of students (10–15%) every year sought additional feedback from the lecturer, either after the mid-semester exam (2015–2017) or after Project 1 (2018–2022). Almost no student approached the lecturer for formative feedback after the final exam (2015–2017) or Project 2 (2018–2022). This can be attributed to the following factors: (1) the final exam and Project 2 were scheduled at the end of the term, and many students did not seem interested in receiving feedback once they were satisfied with their final mark. Carless et al. (2011) [34] and Williams (2014) [15] noted that when assessment is given at the end of term, there is limited scope for students to apply insights from the teacher’s comments. (2) It is possible that some students felt intimidated about approaching the lecturer after the end of the term to review their final exam or Project 2 work, or did not want to bother their lecturer.

6.4. Student and Teacher Workloads

During student–teacher interactions in class and/or after class (consultation time), a few students noted that they spent more time working on the project than they would have spent on a traditional exam. It is assumed that this extra time should help students better engage with the course content. However, when overloaded, students may not have sufficient time to complete each assignment to a satisfactory level because they may also be required to complete assignments from other courses [55]. Heckendorn (2002) [56] noted that project-based assignments take longer to complete, which may leave students feeling they have an excessive load [39].
There was a significant increase in the teacher’s workload as well (at least two-fold) compared to the time the teacher had spent on exam preparation and marking. This increase was related to (1) developing and updating the variations of each project; (2) marking relatively long reports across 20 variations; and (3) preparing and uploading short feedback for each student.

7. Concluding Remarks

This paper discusses the advantages and shortcomings of exams and project-based assignments and their effect on student performance and experience in the engineering course of Soil Mechanics. Based on the obtained results, the following conclusions can be drawn:
  • Compared to the exams, the project-based assignments seem to provide students with a better learning experience, which also leads to better academic performance. The project work provides students with opportunities to learn about the practical value of the course and its relevance to their industry careers.
  • Compared to the traditional exams, the better student performance in the project-based assignments may be related to a few factors, including (a) the extra time that students had to complete them, and (b) access to learning resources (which is typically not allowed during an in-person exam). However, it was also found that during and after COVID (2020–2022), the average marks for the project-based assignments decreased significantly. More research to clarify these findings is recommended.
  • From a teacher’s point of view, preparing different variations of each project (to avoid cheating) and marking them can considerably increase the teaching load.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data is contained within this article.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Gibbs, G.; Simpson, C. Conditions under which assessment supports students’ learning. Learn. Teach. High. Educ. 2005, 1, 3–31.
  2. Van de Watering, G.; Gijbels, D.; Dochy, F.; Van der Rijt, J. Students’ assessment preferences, perceptions of assessment and their relationships to study results. High. Educ. 2008, 56, 645.
  3. Wiliam, D. What is assessment for learning? Stud. Educ. Eval. 2011, 37, 3–14.
  4. Stefani, L. Planning teaching and learning: Curriculum design and development. In Handbook for Teaching and Learning in Higher Education; Taylor & Francis Group: Abingdon, UK, 2009; pp. 40–57.
  5. Biggs, J.B.; Tang, C. Teaching according to How Students Learn. In Teaching for Quality Learning at University; Open University Press: Maidenhead, UK, 2007; Chapter 2; pp. 15–30.
  6. Bloom, B.S. Handbook on Formative and Summative Evaluation of Student Learning; McGraw-Hill Book Company: New York, NY, USA, 1971.
  7. Chickering, A.W.; Gamson, Z.F. Seven principles for good practice in undergraduate education. AAHE Bull. 1987, 3, 7.
  8. Carless, D. Learning-oriented assessment: Conceptual bases and practical implications. Innov. Educ. Teach. Int. 2007, 44, 57–66.
  9. Hattie, J.; Timperley, H. The power of feedback. Rev. Educ. Res. 2007, 77, 81–112.
  10. Lizzio, A.; Wilson, K. Feedback on assessment: Students’ perceptions of quality and effectiveness. Assess. Eval. High. Educ. 2008, 33, 263–275.
  11. Sadler, D.R. Formative assessment: Revisiting the territory. Assess. Educ. Princ. Policy Pract. 1998, 5, 77–84.
  12. Light, G.; Cox, R. Assessing: Student assessment. In Learning and Teaching in Higher Education: The Reflective Practitioner; Paul Chapman Publishing: London, UK, 2001.
  13. Gibbs, G.; Lucas, L. Coursework assessment, class size and student performance: 1984–94. J. Furth. High. Educ. 1997, 21, 183–192.
  14. Price, M.; O’Donovan, B.; Rust, C.; Carroll, J. Assessment standards: A manifesto for change. Brookes Ejournal Learn. Teach. 2008, 2, 1–2.
  15. Williams, P. Squaring the circle: A new alternative to alternative-assessment. Teach. High. Educ. 2014, 19, 565–577.
  16. Mills, J.E.; Treagust, D.F. Engineering education—Is problem-based or project-based learning the answer. Australas. J. Eng. Educ. 2003, 3, 2–16.
  17. Palmer, S.; Hall, W. An evaluation of a project-based learning initiative in engineering education. Eur. J. Eng. Educ. 2011, 36, 357–365.
  18. Traub, R.E.; MacRury, K.A. Multiple-Choice vs. Free-Response in the Testing of Scholastic Achievement; Ontario Institute for Studies in Education: Toronto, ON, Canada, 1990.
  19. Ben-Chaim, D.; Zoller, U. Examination-type preferences of secondary school students and their teachers in the science disciplines. Instr. Sci. 1997, 25, 347–367.
  20. Tuckman, B.W. Using tests as an incentive to motivate procrastinators to study. J. Exp. Educ. 1998, 66, 141–147.
  21. Goorts, K. Replacing Final Exams with Open-Ended Course Projects in Engineering Education. Teach. Innov. Proj. 2020, 9, 1–18.
  22. Biggs, J. What the student does: Teaching for enhanced learning. High. Educ. Res. Dev. 1999, 18, 57–75.
  23. Wilkinson, J. Staff and student perceptions of plagiarism and cheating. Int. J. Teach. Learn. High. Educ. 2009, 20, 98–105.
  24. Flores, M.A.; Veiga Simão, A.M.; Barros, A.; Pereira, D. Perceptions of effectiveness, fairness and feedback of assessment methods: A study in higher education. Stud. High. Educ. 2015, 40, 1523–1534.
  25. Kandlbinder, P. Writing about practice for future learning. In Rethinking Assessment in Higher Education; Routledge: Abingdon, UK, 2007; pp. 169–176.
  26. Gratchev, I.; Jeng, D.S. Introducing a project-based assignment in a traditionally taught engineering course. Eur. J. Eng. Educ. 2018, 43, 788–799.
  27. Struyven, K.; Dochy, F.; Janssens, S. Students’ perceptions about new modes of assessment in higher education: A review. In Optimising New Modes of Assessment: In Search of Qualities and Standards; Springer: Dordrecht, The Netherlands, 2003; pp. 171–223.
  28. Hattingh, T.; Dison, L.; Woollacott, L. Student learning behaviours around assessments. Australas. J. Eng. Educ. 2019, 24, 14–24.
  29. Case, J.; Marshall, D. Between deep and surface: Procedural approaches to learning in engineering education contexts. Stud. High. Educ. 2004, 29, 605–615.
  30. Villarroel, V.; Bloxham, S.; Bruna, D.; Bruna, C.; Herrera-Seda, C. Authentic assessment: Creating a blueprint for course design. Assess. Eval. High. Educ. 2018, 43, 840–854.
  31. Winstone, N.E.; Boud, D. The need to disentangle assessment and feedback in higher education. Stud. High. Educ. 2022, 47, 656–667.
  32. Knight, P.T. Summative assessment in higher education: Practices in disarray. Stud. High. Educ. 2002, 27, 275–286.
  33. Sendziuk, P. Sink or Swim? Improving Student Learning through Feedback and Self-Assessment. Int. J. Teach. Learn. High. Educ. 2010, 22, 320–330.
  34. Carless, D.; Salter, D.; Yang, M.; Lam, J. Developing sustainable feedback practices. Stud. High. Educ. 2011, 36, 395–407.
  35. Chamberlain, S.; Daly, A.L.; Spalding, V. The fear factor: Students’ experiences of test anxiety when taking A-level examinations. Pastor. Care Educ. 2011, 29, 193–205.
  36. Sung, Y.T.; Chao, T.Y.; Tseng, F.L. Reexamining the relationship between test anxiety and learning achievement: An individual-differences perspective. Contemp. Educ. Psychol. 2016, 46, 241–252.
  37. Frank, M.; Barzilai, A. Integrating alternative assessment in a project-based learning course for pre-service science and technology teachers. Assess. Eval. High. Educ. 2004, 29, 41–61.
  38. Lehmann, M. Problem-Oriented and Project-Based Learning (POPBL) as an Innovative Learning Strategy for Sustainable Development in Engineering Education. Eur. J. Eng. Educ. 2008, 33, 283–295.
  39. Gulbahar, Y.; Tinmaz, H. Implementing Project-Based Learning and e-Portfolio Assessment in an Undergraduate Course. J. Res. Technol. Educ. 2006, 38, 309–327.
  40. Lopez-Querol, S.; Sanchez-Cambronero, S.; Rivas, A.; Garmendia, M. Improving Civil Engineering Education: Transportation Geotechnics Taught Through Project-Based Learning Methodologies. J. Prof. Issues Eng. Educ. Pract. 2014, 141, 1–7.
  41. de Graaff, E.; Kolmos, A. History of Problem-Based and Project-Based Learning. In Management of Change: Implementation of Problem-Based and Project-Based Learning in Engineering; de Graaff, E., Kolmos, A., Eds.; Sense Publishers: Rotterdam, The Netherlands, 2007; pp. 1–8.
  42. Sotiriadou, P.; Logan, D.; Daly, A.; Guest, R. The role of authentic assessment to preserve academic integrity and promote skill development and employability. Stud. High. Educ. 2020, 45, 2132–2148.
  43. Wiggins, G. Educative Assessment. Designing Assessments To Inform and Improve Student Performance; Jossey-Bass Publishers: San Francisco, CA, USA, 1998.
  44. Sly, L. Practice tests as formative assessment improve student performance on computer-managed learning assessments. Assess. Eval. High. Educ. 1999, 24, 339–343.
  45. Collett, P.; Gyles, N.; Hrasky, S. Optional formative assessment and class attendance: Their impact on student performance. Glob. Perspect. Account. Educ. 2007, 4, 41.
  46. Gipps, C.V. Educational Accountability in England: The Role of Assessment. In Proceedings of the Annual Meeting of the American Educational Research Association, Chicago, IL, USA, 21–25 April 2003; pp. 1–15.
  47. Shekar, A. Project Based Learning in Engineering Design Education: Sharing Best Practices. In Proceedings of the 121st ASEE Annual Conference & Exposition, Indianapolis, IN, USA, 15–18 June 2014.
  48. Carless, D. Student Feedback: Can do Better–Here’s How; Times Higher Education (THE): London, UK, 2015.
  49. Li, T.; Greenberg, B.A.; Nicholls, J.A.F. Teaching experiential learning: Adoption of an innovative course in an MBA marketing curriculum. J. Mark. Educ. 2007, 29, 25–33.
  50. Parsons, D. Encouraging learning through external engineering assessment. Australas. J. Eng. Educ. 2007, 13, 21–30.
  51. Gratchev, I.; Balasubramaniam, A. Developing assessment tasks to improve the performance of engineering students. In Proceedings of the 23rd Annual Conference of the Australasian Association for Engineering Education, Melbourne, Australia, 3–5 December 2012.
  52. Gratchev, I.; Gunalan, S. Replacing Laboratory Work with Online Activities: Does It Work? In Advancing Engineering Education Beyond COVID; CRC Press: Boca Raton, FL, USA, 2022; pp. 91–100.
  53. Fry, H.; Ketteridge, S.; Marshall, S. Understanding student learning. In A Handbook for Teaching and Learning in Higher Education; Kogan: London, UK, 2003; pp. 9–25.
  54. Carless, D.; Boud, D. The development of student feedback literacy: Enabling uptake of feedback. Assess. Eval. High. Educ. 2018, 43, 1315–1325.
  55. Ruiz-Gallardo, J.R.; González-Geraldo, J.L.; Castaño, S. What are our students doing? Workload, time allocation and time management in PBL instruction. A case study in Science Education. Teach. Teach. Educ. 2016, 53, 51–62.
  56. Heckendorn, R. Building a Beowulf: Leveraging Research and Department Needs for Student Enrichment via Project Based Learning. Comput. Sci. Educ. 2002, 12, 255–273.
Table 1. Assessment plans used in Soil Mechanics.

Exam-Based Assessment Plan (2015–2017) | Project-Based Assessment Plan (2018–2022)
Lab work (10%) | Lab work (10%)
Two assignments (10%) | Two online quizzes (20%)
Mid-semester exam (25%) | Project 1 (25%)
Final exam (55%) | Project 2 (35%)
– | Site visit or industry guest lecture reflection (10%)
Table 2. The content of the exams and project-based assignments.

Mid-semester exam (2015–2017)
Content: Textbook-like problems on soil classification, soil constituents, soil compaction, stresses, and water seepage.
Duration: 2-h exam in class.
Submission: Exam paper was submitted to the invigilator at the end of the exam.
Feedback: Solutions were uploaded; additional feedback from the teacher was provided on student demand.

Project 1 (2018–2022)
Content: Students were given real data from a site investigation (borehole logs) and lab tests. Students were required to draw a cross-section, discuss the geology, classify soil, estimate stresses (including the effect of upward seepage), and analyse the data from compaction tests.
Duration: 10 days.
Submission: Individual report was submitted online.
Feedback: General feedback via email to students; short personal feedback via the assessment rubric; additional feedback from the teacher was provided on student demand.

Final exam (2015–2017)
Content: Textbook problems on water flow, flow nets, soil deformation, consolidation, and shear strength.
Duration: 3-h exam in class.
Submission: Exam paper was submitted to the invigilator at the end of the exam.
Feedback: Feedback was provided on student demand.

Project 2 (2018–2022)
Content: Students were given real data from site and lab investigations. They were required to draw a flow net and estimate the stresses, estimate the time and amount of soft soil consolidation due to embankment loads, and obtain shear strength parameters for slope stability analysis.
Duration: 10 days.
Submission: Individual report was submitted online.
Feedback: Short personal feedback via the assessment rubric; more detailed feedback from the teacher was provided on student demand.
Table 3. Students’ demographics.

Year | 2015 | 2016 | 2017 | 2018 | 2019 | 2020 | 2021 | 2022
Number of students | 144 | 128 | 112 | 97 | 95 | 96 | 78 | 74
Male/Female (%) | 85/15 | 86/14 | 89/11 | 87/13 | 92/8 | 83/17 | 87/13 | 87/13
Domestic/International (%) | 65/35 | 62/38 | 66/34 | 63/37 | 77/23 | 75/25 | 87/13 | 85/15
Table 4. Percentage of students in each GPA range over the past eight years.

GPA Range | 2015 | 2016 | 2017 | 2018 | 2019 | 2020 | 2021 | 2022
<4 | 15.5 | 18.6 | 22.4 | 13.2 | 10.5 | 7.5 | 20.0 | 14.8
4 to <5 | 29.6 | 32.6 | 22.4 | 32.7 | 35.8 | 28.8 | 29.1 | 37.0
5 to <6 | 30.3 | 24.8 | 19.8 | 28.8 | 24.2 | 25.0 | 34.5 | 31.5
6 to 7 | 13.4 | 8.5 | 13.8 | 14.8 | 11.6 | 20.0 | 14.5 | 13.0
No GPA | 11.3 | 15.5 | 21.6 | 10.6 | 17.9 | 18.8 | 1.8 | 3.7
Table 5. Students’ academic performance. The average mark for each assessment is out of 100 points. Assessment was exam-based in 2015–2017 and project-based in 2018–2022.

Year | 2015 | 2016 | 2017 | 2018 | 2019 | 2020 | 2021 | 2022
Mid-semester exam/Project 1 average | 69.9 | 67.2 | 70.8 | 75.1 | 76.9 | 66.3 | 66.5 | 62.3
Mid-semester exam/Project 1 standard deviation | 17.4 | 19.3 | 19.5 | 14.0 | 10.3 | 17.2 | 16.1 | 21.8
Final exam/Project 2 average | 65.2 | 63.8 | 61.2 | 76.4 | 74.7 | 69.0 | 65.8 | 67.0
Final exam/Project 2 standard deviation | 15.8 | 19.5 | 15.5 | 12.0 | 12.2 | 13.1 | 15.2 | 15.3
Failure rate (%) | 1.3 | 8.3 | 4.3 | 5.2 | 3.2 | 9.9 | 16.2 | 18.0