Presentation and Evaluation of a New Graduate Unit of Study in Engineering Product Development

Abstract: Engineering education has a key role to play in equipping engineers with the design skills that they need to contribute to national competitiveness. Product design has been described as "the convergence point for engineering and design thinking and practices", and courses in which students design, build, and test a product are becoming increasingly popular. A sound understanding of product development and the implications associated with developing a product have been strongly linked to sustainability outcomes. This paper presents an evaluation of a new Master-level engineering unit offered at Deakin University in product development technology. The unit allowed the students an opportunity to engage with the entire product development cycle, from the initial idea to prototyping and testing, through strategic assessment, which drove the unit content and student learning. Within this, students were also afforded an opportunity to explore resource usage and its subsequent minimisation. Student evaluation surveys over two successive years found that students were responsive to this type of learning and appreciated the opportunity to do hands-on work. Improved student effort and engagement indicate that the students likely had better learning outcomes, as compared to traditionally taught units.


Introduction
For many years, it has been recognised internationally that engineering design can be a key contributor to, and differentiator of, competitiveness at both organisational and national levels [1,2]. Similarly, it is not a new idea that engineering education has a key role to play in equipping engineers with the design skills that they need to contribute to national competitiveness [3]. Owing to the importance of sustainable development in society and business [4,5], it is also clear that quality engineering education in product development has a key role to play. In addition to fundamental knowledge in maths, science, and engineering, experience of the engineering design process and of realistic engineering design problems is an important aspect of engineering education. Nicolai [3] proposed that engineering students need to understand and apply the tools of design, including drawing/sketching/descriptive geometry, communication skills, kinematics, statistics, computer-aided design (CAD)/computer-aided manufacturing (CAM), materials and processes/manufacturing, and economics (pp. 11-12, [3]). Staniškis and Katiliūtė [6] argue that contextual awareness is critical for students to gain an understanding of the broader implications of engineering decisions on a range of outcomes, including legal and environmental considerations. Product design has been described as "... the convergence point for engineering and design thinking and practices" (p. 33, [7]). Courses in which students design, build, and test a device or product are becoming increasingly popular in engineering education, as they can improve student motivation and bring context to engineering studies, as well as develop improved engineering design skills by combining CAD modelling and numerical simulation with prototyping and physical modelling [21]. Fu [16] describes students numerically simulating the performance of a centrifugal clutch.
Having the opportunity to also conduct a corresponding experiment on physical hardware, the students discovered a difference between the results of the experiment and the simulation. Stimulated by this, the students investigated more deeply, and in pursuing the reasons for the discrepancy, gained further knowledge of the relationship between design inputs and system performance. Szydlowski [22] describes an engineering course on machine design which incorporates hand calculations, computer modelling of components and numerical simulation of mechanisms, and the rapid prototyping of physical models. It was observed that the physical models revealed characteristics of the components that students did not appreciate from 2D representations on a screen or in a sketch, and provided insights into the practical manufacturability of designs. As well as providing an important link between theory and practice in design education, building prototypes was seen as engaging and motivating for students.
Given that the decisions made during product development have a significant impact on the quality, cost, life cycle cost and sustainability of the final product [23], design and build units play a very important role in equipping students with first-hand experience seeing the effects of design choices on the products they develop. This experience also allows students to see and quantify the consequences of design choices they have made through the evaluation of their work formally as part of the assessment and during the product test phase. The quantification of the consequences of design choices is important to ensure that the end product meets the functional and user requirements, including life cycle and sustainability requirements [24].
Projects in which students are actively engaged are also instrumental for students to learn and appreciate the importance of sustainable design. This has been shown through studies linking students' perception of their own ability to participation and collaboration [25,26]. Moreover, problem-solving and collaboration skills acquired through real-world learning opportunities have been shown to enhance student competence, particularly as it relates to sustainable thinking [27]. Developing a sense of sustainable innovation is considered to be a key component of engineering education [28].
The design-and-build style unit fits with the Product Oriented Design Based Learning (PODBL) pedagogy employed at Deakin University's School of Engineering [29]. This pedagogy focuses on facilitating learning through the design and delivery of a report or a product. The process encourages innovation and creativity while fostering independence as professionals and as learners [30]. Learning pedagogies such as PODBL and problem-based learning (PBL) have also been shown to be effective at teaching students about sustainability [31]. While this approach to teaching has been well established in the undergraduate setting, it has not been evaluated in the postgraduate setting.
The evaluation of the effectiveness of new teaching developments is important; however, there are only limited evaluations of such initiatives in engineering product development in the literature [14,18,32]. This paper presents details of the learning design for, and an evaluation of, the initial offerings of a new approach to the teaching of a Master-level coursework unit entitled "SEM711 Product Development Technologies" at Deakin University in Australia. Here, we present a method for such an evaluation, the findings of the evaluation, and a discussion of the implications of those findings. The research presented here is a contribution to the literature relating to engineering product development education.

Context
The unit Product Development Technologies (SEM711) is situated in the Master of Engineering (Professional) course at Deakin University, for students who are specialising in Mechanical Engineering Design. Mechanical Engineering Design is one of four specialisations; the others are Engineering Management, Electronics Engineering, and Electrical and Renewable Energy Engineering. Within the Mechanical Engineering Design specialisation, the students also complete units on CAE and Finite Element Analysis (SEM712), Product Development (SEM721), and Advanced Manufacturing Technology (SEM722). The 16-credit-point course itself is built from eight compulsory units (shared among the specialisations), four elective units, and four specialisation units. Four of the eight credit points in the compulsory units are made up of two project units (Project Scoping and Planning SEN719 and Project Implementation and Evaluation SEN720), which the students complete in series on a large research-focused project.
Whilst it is a prerequisite to have already completed a four-year bachelor degree in engineering (or equivalent), the students who come to Deakin University to study the Master of Engineering program are from a diverse range of backgrounds. The vast majority of students are international, and their educational journeys prior to commencing their postgraduate studies can vary widely in content and quality. This makes it difficult to assume any particular prior knowledge when designing a unit, and it presents a unique challenge: balancing a unit in which all of the students can learn, that meets a high enough standard, and that ensures that all students, regardless of their backgrounds, have an equal opportunity to achieve.

Overview of SEM711
SEM711 aims to give the students the complete product development experience, from idea to manufacturing to product verification. The generic product development process, as defined by Ulrich and Eppinger [33], comprises six phases: Planning (Phase 0); Concept Development (Phase 1); System-Level Design (Phase 2); Detail Design (Phase 3); Testing and Refinement (Phase 4); and Production Ramp-Up (Phase 5). The assessment in this unit gives the students hands-on experience working through Phases 0 to 4. This approach has been shown to link well with industry and is in line with what graduates can expect to follow if they pursue a career in product development [34].
Learning in this unit is driven by the assessment, primarily through two large assessment items, Assessment Item 1 and Assessment Item 3. Assessment Item 2 is a collection of three quizzes, each worth 10 per cent of the students' final grades for the unit. The quizzes ensure that the students engage with the lecture content in the unit. In design education (as generally in education), assessment is a key strategic driver of the way in which students engage with their learning [8]. The assessment for SEM711 was deliberately structured to direct students to engage with both discipline knowledge and design practice, in order to complete a theoretical design and practical implementation, and to reflect on the connection between design theory and practice.

Assessment Item 1: Design of a Kinematic Device
In this assessment item, students design a kinematic device that they can later manufacture using technology available at Deakin University [32] (for example, laser cutting or 3D printing). In order to allow the students to work within their current skill level and produce a wide range of designs, the design restrictions were minimal: the design must have a rotating part that causes periodic motion elsewhere in the device (ideally, this motion will not be symmetric); the assembled device must be no more than 250 × 250 × 250 mm in size; and the design must allow for subsequent testing once it is manufactured (i.e., the design must account for the use of sensors). The minimal design restrictions also opened the opportunity for students to be creative. Creativity is an important, but under-represented, skill in engineering [35]; moreover, creativity and engineering design are linked to sustainability [36,37].
The deliverable for this assessment item was a report which covers the design of the device (hand sketches and professional computer drawings on software such as Inventor or SolidWorks), a discussion on the dynamics (the physics) of the device, and evidence that the device meets the design criteria. Successful completion of this assessment item required students to work through Phases 0 to 3 of the generic product development process.
In order to help students get started, they were shown examples of kinetic devices that meet the design criteria, such as Theo Jansen walkers and piston/cylinder devices. During the design process the students were actively encouraged to think-forward about the subsequent assessment (Assessment Item 3) and how their design would help them achieve the assessment outcomes and marking criteria. For this assessment item, the marking criteria were: Idea Sketches, 2D Assembly Plan, 3D Virtual Mock-up, Intended Outcomes, and Communication. Assessment Item 1 was worth 30 per cent of the students' final grades for the unit.

Assessment Item 3: Production and Testing of a Kinematic Device
For this assessment item, students were expected to refine their designs based on the formal feedback they received from Assessment Item 1. This ensured that the intended learning outcomes from Assessment Item 1 were met by giving the students the opportunity to not only reflect on their feedback, but to also improve their work. Once their designs were finalised, the students then manufactured a prototype of their device. Once the prototype was constructed, the students moved into Phase 4 of the generic product development process: testing.
The testing phase of the generic product development process contains many learning opportunities. Students were afforded an opportunity to consider how they could test their device practically, gain experience collecting real data, consider and implement strategies to utilise the data, perform meaningful analysis of the data and then disseminate their findings. Outside of the breadth of practical learning, the students also gained insight into other considerations of the development of a product, namely the effect of manufacturing tolerances on the final operation of a product.
As with Assessment Item 1, the students were given a lot of freedom to choose how and what they test on their device. Autonomy has been linked to self-determination and knowledge and is, therefore, seen as a critical aspect of the assessment [38]. Despite the freedom given to the students in deciding how they test their devices, it was mandated that there must be a test in which they hand turn their device and measure the subsequent periodic movement (one of the design criteria). Mandating that the device be hand turned (rather than driven by a motor) was to ensure that the data were non-ideal.
The students were allowed to use any of the equipment available to them for testing their device, provided that it could deliver insight into their devices' operation. All of the students, however, elected to use the equipment available at the university. In the mechatronics/physics laboratory are sensors made by PASCO capable of measuring, among other variables: rotation, displacement, and force.
Following testing, the analysis had to demonstrate the use of both simple and advanced analysis techniques. Examples of simple techniques included finding means and standard deviations, examining ranges, and the extraction of individual data points. Suggested more advanced analysis techniques included cycle averaging and empirical or physical modelling for feature extraction. Students were also expected to convert their data from being time dependent to being angle (from the rotation) dependent. There is a strong emphasis in the marking scheme on the interpretation of these results, rather than just their basic calculation.
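The time-to-angle conversion and cycle averaging described above can be sketched as follows. This is a minimal illustration only, not part of the unit materials: the function, the angle grid, and the synthetic hand-turned signal are our own assumptions about how such an analysis might be implemented.

```python
import numpy as np

def to_angle_domain(t, angle, signal, grid_deg=np.arange(0.0, 360.0, 1.0)):
    """Re-express a time-dependent signal as a function of rotation angle,
    averaging over complete cycles.

    t        -- sample times (s)
    angle    -- accumulated rotation at each sample (degrees, monotonic)
    signal   -- the periodic measurement taken at the same instants
    grid_deg -- common 0-360 degree grid used for cycle averaging
    """
    # Split the accumulated rotation into individual 0-360 degree cycles.
    cycle = np.floor(angle / 360.0).astype(int)
    angle_in_cycle = angle % 360.0

    # Interpolate each complete cycle onto the common grid, then average.
    per_cycle = []
    for c in range(cycle.min(), cycle.max() + 1):
        mask = cycle == c
        if mask.sum() < 2:          # skip cycles with too few samples
            continue
        per_cycle.append(np.interp(grid_deg, angle_in_cycle[mask], signal[mask]))
    return grid_deg, np.mean(per_cycle, axis=0)

# Synthetic example: a hand-turned (non-uniform speed) rotation driving an
# asymmetric periodic displacement, loosely mimicking the student devices.
t = np.linspace(0.0, 10.0, 2000)
angle = 3 * 360.0 * (t / 10.0) ** 1.2          # three cycles, uneven speed
disp = np.sin(np.radians(angle)) + 0.3 * np.sin(np.radians(2 * angle))
grid, mean_cycle = to_angle_domain(t, angle, disp)
```

Working in the angle domain removes the dependence on how steadily the device was turned, which is precisely why hand turning (non-ideal data) makes this conversion a meaningful exercise.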
Once the analysis was completed, the students finalised their assessment by disseminating the product development process in the context of the generic product development process through a report and an oral presentation. Combined, this portion of the assessment was worth the remaining 40 per cent. Of this 40 per cent the oral presentation was worth 10 per cent and was there primarily to ensure that the students had done adequate testing, analysis, and interpretation. This immediate feedback gave them the opportunity to strengthen the work they had done prior to the submission of the report component. The feedback process here not only gave the students a greater opportunity to achieve in this unit, it also facilitated the learning process (by ensuring that all of the intended learning activities had been completed well) and gave the students experience implementing direct practical and critical feedback of their work in a manner they will likely encounter in the work force.

Method
Deakin University employs an online student evaluation of teaching (SET) survey known as eVALUate [39]. The eVALUate SET instrument is designed for higher education, has previously been validated, and has been employed in multiple countries [40] and in multiple discipline areas, including in undergraduate and postgraduate studies in engineering [41]. The eVALUate instrument contains eleven scale items and two open-ended response items for students to provide feedback on aspects of the unit learning design and teaching. The eVALUate survey is administered online for students to voluntarily complete towards the end of the teaching period in all units taught at Deakin University. Students are made aware that the results of the survey are anonymous and not available to the teaching staff until after unit results are released to students. This is to ensure that students can complete the survey without fear of retribution from the teaching staff. Each scale item is presented as a statement, to which students indicate their level of agreement on a four-point scale of the form: Strongly Disagree; Disagree; Agree; and, Strongly Agree. Students can also select an "Unable to Judge" response. The eVALUate scale items are:

1. The learning outcomes in this unit are clearly identified.
2. The learning experiences in this unit help me achieve the learning outcomes.
3. The learning resources in this unit help me achieve the learning outcomes.
4. The assessment tasks in this unit evaluate my achievement of the learning outcomes.
5. Feedback on my work in this unit helps me to achieve the learning outcomes.
6. The workload in this unit is appropriate to the achievement of the learning outcomes.
7. The quality of teaching in this unit helps me to achieve the learning outcomes.
8. I am motivated to achieve the learning outcomes in this unit.
9. I make best use of the learning experiences in this unit.
10. I think about how I can learn more effectively in this unit.
11. Overall, I am satisfied with this unit.
The two open-ended response items are presented as questions to which students can provide a text response. They are:

1. What are the most helpful aspects of this unit?
2. How do you think this unit might be improved?
The implementation of this survey, and its scale and open-ended items, are robust and the result of significant iteration over a number of years of development at Curtin University of Technology prior to being employed at Deakin University [39,42]. The eVALUate survey has featured in multiple research studies as a sound metric for gauging student engagement and learning [43][44][45], including other design based learning units [46]. As such, the eVALUate survey can be considered a robust metric of student satisfaction and engagement and as free from bias as a student survey can be.
Following the completion of the eVALUate survey period, summary and detailed numerical reports for the scale item results are available from a publicly accessible website (https://www.deakin.edu.au/evaluate/results/general.php), and the open-ended text responses from students are provided to the unit chair. The reports contain no information capable of identifying any student respondent. The summary report includes the following data: total unit enrolment and eVALUate response rate; the number of responses received for each scale item; the percentage agreement (Agree plus Strongly Agree) for each scale item; the percentage disagreement (Disagree plus Strongly Disagree) for each scale item; and the percentage "Unable to Judge" for each scale item. Additionally, the detailed report contains the raw counts and percentages of students indicating each of the response categories of: Strongly Disagree, Disagree, Agree, Strongly Agree, and Unable to Judge. The response scales used by students in completing the eVALUate survey are fundamentally ordinal in nature. The use of ordinal data as an interval variable in many parametric statistical procedures, while commonplace, is not universally accepted as valid [47]. However, there is a significant body of research that has demonstrated the practical utility of analysis of ordinal data, based on the robustness of many parametric methods to significant departures from assumptions about the underlying data, including departures from normality and "intervalness" that might be present in ordinal scale data [48,49].
Using the publicly available eVALUate numerical reports for SEM711 from 2016 and 2017, we assigned the following interval values to the response scale categories: Strongly Disagree = 1; Disagree = 2; Agree = 3; and Strongly Agree = 4. A mean response rating (as a score out of 4.0) and an associated 95 per cent confidence interval were computed for each eVALUate item for SEM711. Using the eVALUate reporting website, it is possible to obtain corresponding aggregated data for all units offered by the School of Engineering that are included in the eVALUate system for the same academic session as SEM711. The same process was followed, and a mean response rating and 95 per cent confidence interval were computed for each eVALUate item for the aggregated set of units offered by the School of Engineering. Because the respective population sizes were known (overall unit enrolment and aggregate School enrolled student load), the confidence intervals for the mean response ratings were calculated using the finite population correction, which yields smaller, more accurate confidence intervals [50].
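The calculation described above can be sketched as follows. This is a minimal illustration under stated assumptions: the response counts are hypothetical (not the paper's data), and the standard normal-approximation confidence interval with the finite population correction is one plausible implementation of the method described.

```python
import math

def mean_rating_ci(counts, population, z=1.96):
    """Mean response rating and 95% confidence interval with the finite
    population correction (FPC).

    counts     -- raw counts per category, e.g. {"SD": 1, "D": 2, "A": 15, "SA": 10}
                  ("Unable to Judge" responses are assumed excluded beforehand)
    population -- total unit enrolment (the finite population size)
    """
    values = {"SD": 1, "D": 2, "A": 3, "SA": 4}   # interval coding used in the paper
    n = sum(counts.values())
    ratings = [values[k] for k, c in counts.items() for _ in range(c)]
    mean = sum(ratings) / n

    # Sample variance and standard error of the mean.
    var = sum((r - mean) ** 2 for r in ratings) / (n - 1)
    se = math.sqrt(var / n)

    # The FPC shrinks the interval when the sample is a non-trivial
    # fraction of the enrolment.
    fpc = math.sqrt((population - n) / (population - 1))
    half_width = z * se * fpc
    return mean, (mean - half_width, mean + half_width)

# Hypothetical counts for one scale item from a unit with 98 enrolled students:
mean, ci = mean_rating_ci({"SD": 1, "D": 2, "A": 15, "SA": 10}, population=98)
```

With 28 responses out of an enrolment of 98, the correction factor is roughly 0.85, so the reported intervals are noticeably tighter than an uncorrected calculation would give.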
Generally, in SET, systematic differences in evaluation ratings are observed between discipline areas, with science and technology fields, including engineering, reporting significantly lower ratings compared to many other areas [51,52]. A suggested strategy for the meaningful interpretation of SET data is to, where possible, compare like-with-like ratings, including on the basis of discipline area [53]. Here we are not specifically concerned with the absolute eVALUate ratings and use the aggregated School of Engineering results to provide a context-appropriate benchmark. The relatively coarse, four-point division of the response scales used in eVALUate means that the distribution of student ratings is unlikely to be normal. However, confidence intervals are relatively robust to departures from normality, especially for the larger aggregated School of Engineering data set. The comparisons between the SEM711 and the aggregated School of Engineering eVALUate mean ratings here are for illustrative purposes rather than parametric testing. The results obtained and a discussion of the observed results are presented for two offerings of SEM711 (2016 and 2017).

Results and Discussion
In 2016, the enrolment for SEM711 was 98, and 28 valid eVALUate responses were received, giving a response rate of approximately 29 per cent. In 2017, the enrolment for SEM711 was 155, and 45 valid eVALUate responses were received, giving the same approximate response rate as 2016 of 29 per cent. Although the response rates obtained are comparatively low, they are not unexpected for an online voluntary survey [54], and are within the range of response rates reported in the research literature for online SET surveys [55]. The response rates obtained compare favourably to the overall university eVALUate response rate (which was 24 per cent in both years). The eVALUate survey is voluntary and reported data are anonymous, so it is not possible to further test the representativeness of the respondent group against the overall unit enrolment. Figure 1 presents the mean response ratings and computed 95 per cent confidence intervals for the 11 eVALUate survey items, for both SEM711 and the aggregated School of Engineering responses. Overall, Figure 1 clearly demonstrates that students were receptive to this approach to learning. In both offerings of the unit, the mean response ratings were higher for each item than the corresponding School mean values, with the exception of Item 10 in the 2017 survey, which was nonetheless comparable to the All of School data for that period. The spread of some items was larger than others, showing that the students were not uniform in their opinions. There are some insights to be gained from the evaluation of these data.
The response to Item 2 demonstrates that building the learning around the assessment helps the students to see the point of doing assessments and engages them better. Additionally, combined with the response to Item 5, it can be seen that the use of feedback as a mechanism to help the students improve their existing work (in Assessment Item 1), as part of the step into Assessment Item 3, was not only useful as a teaching tool but also something that the students greatly valued. This concept has been explored in the literature [56]. Correctly applied, feedback can be used to support students in a productive way.
The strong response to Item 5 could also be attributable to the manner in which the practical sessions were run. Students were allowed to complete the assessment in their own time; however, an academic was always available to give immediate feedback on what they were doing (Would their design work? Was it going to meet the design criteria? Would their experiment yield quality data? Had they sufficiently evaluated their data and drawn conclusions?). Quality support is valuable not only because of the feedback the students receive, but also in making students feel confident. Confidence facilitates innovation and creativity, ultimately leading to enhanced learning outcomes.
The consistently high result in Item 3 is likely owing in part to the hands-on nature of the unit. Allowing the students to actually build their designs and test them provides an excellent opportunity for them to physically see the impacts of their design decisions and greatly facilitates their learning. This interpretation also ties in well with Item 2 and would explain why both items have similar mean responses and confidence intervals. Items 9 and 10 indicate that this hands-on, student-driven approach to learning engages students better than other, more traditional approaches, a connection that is well reported in the literature [57,58]. These responses also provide evidence of how the students were motivated to do better than the minimum required of them. When students are given the flexibility to drive their own work and learning, they may out-perform expectations [59].
Relative to the mean eVALUate ratings for the whole of the School of Engineering, the items which did not perform as well as the others were Items 4, 6, and 8. This result could be traced back to a single root cause. The higher mean response rating for Item 4 shows that most of the students were happy to engage in a more challenging and time-consuming assessment, as it facilitated greater learning outcomes. That said, some students are more interested in completing the unit easily, with only the minimum amount of work and effort. Moreover, the strong practical component of this unit had a positive effect on ensuring student attendance on campus. Whilst this helps with student learning and engagement, some students were likely to resent the need to take the time to travel to campus (many of the students in this cohort live in the state capital city, travel from which can take significantly more than an hour in each direction). Surveys of students in other studies which also had activities that, by design, required engagement have had similar outcomes [60,61].
Student written responses to the two open-ended eVALUate response items included:

"Applying theory to practical work, the one thing which I was expecting throughout the Master's programme but only was able to do that in this unit. So thumbs up for that."

"[The most helpful aspect of the unit was the] physical building of the model"

"Using new technology, like 3D printing and laser cutting and using the labs to innovatively construct ideas [and] models was very helpful"

"The oral presentation feedbacks are very useful"

"Teaching staff and assignment. Teaching assistance, practical experience rather [than] theoretical knowledge"

Students also mentioned that the unit should include reference books and have more examples, including practical examples, during the lecture time. The qualitative feedback seems to be supportive of the quantitative feedback. For example, students associate the quality of teaching they receive somewhat with the experience and enjoyment they had studying; as such, the significant number of responses indicating enjoyment using technology or doing practical work should be indicative of the strong quantitative result in Items 2, 7, 9, and 11. Moreover, the links between these experiences and the assessment itself are likely drivers for the strong quantitative performance of Item 4. Strongly linking engaging experiences to assessment also, in turn, motivates students, which has a positive influence on Item 8.

Improvements to the Unit Assessment in 2017
Whilst the unit as it was run in 2016 achieved the desired learning outcomes and was well received by the students, there are always opportunities for improvement. A vital part of developing a real product is the cost of manufacture [3,62,63]. This aspect was overlooked in the initial design of the assessment in 2016. As such, in 2016, most of the products produced by the students were not cost efficient in their construction. For example, it was not uncommon to see 3D printed parts which could have been replaced with cheap, readily available alternatives (for example, short metal rods). Although optimal placement of parts for laser cutting was a topic in the formal delivery of this unit, many of the designs used an excessive amount of acrylic sheet. On the same theme, much material was also wasted in the refinement stage. Numerous students wound up refining their devices during the manufacturing stage of the work, rather than beforehand as was intended. This led to some products having up to five iterations and substantial resource wastage (physical waste and technician time). This observation of students not necessarily considering resources is not unique and has been shown by other studies in the literature [64]. Given this, resource awareness is clearly an important aspect of engineering education if students are to contribute to society in a sustainable manner.
In the 2017 offering of this unit, the students were required to show in Assessment Item 1 that their design did not waste resources and to tabulate how each part would be manufactured, with justification that this was the most cost-effective means of producing it. This idea was also followed through in Assessment Item 3, where the students had to show the true cost of developing the prototype, including any iterations, against the proposed cost given in Assessment Item 1. Outside of the significant savings in unit running costs, an added benefit for the students was the experience of designing a device that could be manufactured within a budget.
An added benefit of having a resource saving component of the assessment was also exposing the students to sustainable thinking. While a key learning outcome from the assessment is gaining an appreciation for the difference between expectation and reality in product manufacturing and design, students also had the opportunity to consider the resource implications of their design decisions. The benefit here is two-fold, namely the impact of different materials/manufacturing processes on device functionality and, also, the cost implications of materials/manufacturing selection.
Another change between the 2016 and 2017 offerings concerned Assessment Item 2; discussion here is limited, as this topic falls outside of the scope of this paper, and it is included only to give the overall picture of the unit. The quizzes in Assessment Item 2 were completed by the students online in 2016. Technology issues and evidence of potential student collusion limited the value of this assessment; therefore, the quizzes were readapted into in-class quizzes. Making these quizzes in-class not only encouraged better student attendance at the lectures, but the inability to collude with their peers also ensured that the students kept up-to-date with the content throughout the teaching period.

Conclusions
This paper presents an evaluation of a new Master-level engineering unit offered at Deakin University in product development technology. The unit provided the students an opportunity to engage with the entire product development cycle from initial idea to prototyping and testing through strategic assessment which drove the unit content and student learning. This assessment was designed in such a way to give the students hands-on experience in design and basic manufacturing. Some key outcomes from the assessment and unit design were an appreciation of the key difference between expectation and reality when a design is manufactured and an appreciation for resource usage and sustainability.
The evaluation shown in this paper utilised the student survey, eVALUate, to quantify student engagement with the unit. The survey itself is a robust survey with a long history of development and implementation at two respected higher education institutions. In order to derive clear conclusions, the results of the survey were compared (taking into account sampling error) against aggregate survey data across the School of Engineering.
What was evident was that students were responsive to this type of learning and appreciated the opportunity to do hands-on work. Moreover, the student responses to the eVALUate survey indicate that units taught in this style are more likely to engage students in a manner which encourages additional effort. Improved student effort and engagement indicate that the students likely had better learning outcomes compared to traditionally taught units. Successful enhancements to the unit assessment included the integration of consideration of the cost of the manufactured product by students in their design submissions and the move from online quizzes to in-class quizzes. A key recommendation from this work is that students engage and learn well in units that have a significant hands-on component. Units of this style also tend to enjoy greater student participation and attendance, which significantly improves learning outcomes and student satisfaction. Where possible, engineering education should be providing students with hands-on experience. Greater engagement gives more learning opportunities and, therefore, creates more aware and knowledgeable graduates.