Article

Unlocking Student Choices: Assessing Student Preferences in Courses in Engineering Education

by Patricia Mares-Nasarre *, Niels van Boldrik, Elske Bakker, Robert Lanzafame and Oswaldo Morales-Nápoles
Faculty of Civil Engineering and Geosciences, Delft University of Technology, 2628 CN Delft, The Netherlands
* Author to whom correspondence should be addressed.
Educ. Sci. 2025, 15(7), 859; https://doi.org/10.3390/educsci15070859
Submission received: 13 February 2025 / Revised: 31 May 2025 / Accepted: 24 June 2025 / Published: 4 July 2025
(This article belongs to the Section Higher Education)

Abstract

Effective resource planning in higher education requires anticipating student demand for courses, especially when dealing with elective programs. Monitoring student preference is a recurring topic in the literature; however, to the authors’ knowledge, no simple methods for estimating student preferences when choosing courses in higher education have been proposed. This study develops and explores the use of a simple questionnaire to capture patterns in student course preferences within a university context. The research is developed in the context of the nine Cross-Over modules offered as part of the curriculum of the master’s programs (MSc) of the Faculty of Civil Engineering and Geosciences of Delft University of Technology (The Netherlands). Students are not required to register for these courses in advance, making an accurate estimation of student numbers critical for the planning and allocation of educational resources. The developed questionnaire was applied three times, covering the students’ choice of Cross-Over modules in two different academic years: in 2021, with 225 responses out of 339 students; in 2022, with 159 responses out of 365 students; and in 2024, with 94 responses out of 272 students. Student enrollment in the academic year 2023/2024 is used to assess the performance of the questionnaire. The questionnaire captures the general preferences of the students, providing fair estimates of the number of students per course; larger differences are observed in courses with a lower number of students. In addition, some patterns were identified in student preferences: there is a relationship between the first and second choices, and students usually choose modules closer to their own disciplines. The developed questionnaire provides a reasonable first estimation of the expected number of students in courses, allowing for better planning and allocation of educational resources beforehand.

1. Introduction

Effective resource planning is essential to balance institutional efficiency with the delivery of high-quality education in universities. In a context of changing enrollment numbers and evolving academic demands, universities face the complex task of efficiently allocating resources, such as lecture halls and teaching staff, in a way that supports pedagogical goals while meeting operational constraints (Aupperle, 2024). Inefficient planning can lead to uneven workloads, scheduling conflicts, and underutilized or overcrowded spaces, ultimately diminishing the student experience and straining institutional capacity.
Accurately anticipating student preference becomes a critical aspect for effective resource planning, especially when dealing with elective courses. Here, student preference refers to the expressed or inferred choices of students regarding which courses they would like to enroll in. Without a reliable forecast of enrollment patterns, institutions risk assigning inadequate room sizes or misallocating teaching staff, which can hinder the educational experience of the student (Gu & Lu, 2023). Previous research points out that students usually consider a limited pool of courses when planning their studies, so early preferences can support the prediction of their academic trajectories (Chaturapruek et al., 2021). Factors such as course difficulty, perceived relevance, and personal interest also influence elective choices (Yao et al., 2023), emphasizing the complexity of the student’s decision-making process and thus the need for data-informed forecasting.
In some cases, institutions may decide to limit the number of students allowed to enroll in certain courses as a way to manage space and staffing constraints. While this approach can ease logistical pressures, it also restricts students’ freedom to shape their academic pathways. In addition, such limitations raise additional questions about how to fairly allocate the available seats, adding complexity to the planning process (Bissias et al., 2025; Xue et al., 2024), which may create dissatisfaction or perceived inequities between students.
Several studies can be found in the literature focused on examining student preferences and perceptions (e.g., Alhumaid et al., 2020; Kooptiwoot et al., 2024). Especially after the COVID-19 pandemic, student perceptions grew in relevance to address topics such as online teaching (e.g., Alhumaid et al., 2020; Kooptiwoot et al., 2024; Larson et al., 2023; Stanley & Mitra, 2024) or mental health (e.g., Chawla & Saha, 2024; Galadima et al., 2024). Before that, monitoring of students’ preferences was applied as a tool to assess the effectiveness of teaching techniques, such as the inclusion of interprofessional education in curricula (Lin et al., 2024), teachers’ questioning during lectures (Zhang & Chen, 2024), or specific teaching activities such as live coding (Masegosa et al., 2024). Moreover, student preferences have also been used to obtain further insight into the factors that lead students to choose a specialty within a discipline (Leutritz et al., 2024) or the career to follow after their academic studies (Sim-Sim et al., 2022).
Studies in the literature employ different methodological tools and theoretical frameworks. As methodological tools to retrieve empirical data, these studies usually applied interviews (e.g., Leydens et al., 2021; Potvin et al., 2024), questionnaires (e.g., Kooptiwoot et al., 2024; Mares-Nasarre et al., 2023), a combination of both (e.g., Anani et al., 2025) or clickstream logs from a course exploration platform (Chaturapruek et al., 2021). Interviews are mainly qualitative and help to better understand subjects’ opinions, although the sample of subjects remains smaller. Questionnaires have the benefit of (potentially) increased sample sizes and being able to provide quantitative measures. Clickstream logs are not simple to obtain and interpret, and their availability may depend on the platform that the university is using. Regarding the theoretical frameworks, studies focused on identifying factors that drive the student decision-making process usually adopted psychological and behavioral theories such as the Theory of Planned Behavior (TPB) (Ajzen, 1991) or Expectancy-Value Theory (EVT) (Eccles et al., 1983). Other approaches used in quantitative studies focused on analyzing enrollment trends, including data-driven methods (Chaturapruek et al., 2021).
TPB outlines a comprehensive model that explains the predictors of human behavior in the context of decision making. Ajzen (1991) theorized that beliefs lead to attitudes, and attitudes lead to intentions and behavior, in that order. The theory proposes that human behavior is driven by three factors: behavioral beliefs, normative beliefs, and control beliefs (Ajzen, 1991, 2002). For instance, Dahl et al. (2024) applied TPB to explain how undergraduate students decided to register for newly offered courses; positive attitudes toward the course content, social influences, and students’ confidence in their ability to succeed significantly impacted enrollment intentions.
EVT theorizes that achievement-related choices are driven by the subjective value that the individual assigns to the task and the individual’s expectations for success (Eccles et al., 1983). According to this model, these two factors are shaped by an individual’s characteristics, such as their abilities and previous experiences, as well as environmental influences. For example, Breetzke and Bohndick (2024) investigate how students’ expectancies and subjective study values influence their intentions to drop out of higher education programs; high study values can, to some extent, compensate for low expectancies.
Despite the wide variety of studies on student preference and perceptions, universities often face difficulties when translating these insights into straightforward tools for operational planning. Many of the models and theories used require detailed and often longitudinal data and complex analytical frameworks that are not easily implemented in day-to-day administrative workflows. As a result, institutions frequently rely on historical enrollment numbers to anticipate student demand. These approaches, while simple and convenient, can lead to significant mismatches, especially when curriculum changes or student cohorts differ significantly from previous years. The lack of simple methods for estimating student preferences at scale remains a key limitation in improving resource allocation and supporting student-centered course planning. Therefore, in this research, a methodology to estimate the distribution of students in a set of courses based on a simple questionnaire is proposed.
This research is performed in the context of the Faculty of Civil Engineering and Geosciences of Delft University of Technology (The Netherlands). During the 2022/2023 academic year, a new curriculum for the three master’s programs (MSc) of the aforementioned faculty ran for the first time. Two MSc programs were redesigned, namely the MSc in Applied Earth Sciences and the MSc in Civil Engineering (Delft University of Technology, n.d.-b), and a new MSc program was developed in Environmental Engineering. Each MSc program is further subdivided into fields of study (four, six and three for AES, CE and ENVM, respectively). The focus of this redesign was to boost the collaboration skills of students, not only within their discipline but also between the different disciplines in the faculty. To achieve this, two modules were introduced that are shared by all three MSc programs: (1) Modelling, Uncertainty and Data for Engineers (MUDE), and (2) the Cross-Over modules. MUDE is the first module that all the MSc students in the faculty take and covers fundamental mathematical methods and programming skills required for the MSc programs (Delft University of Technology, n.d.-a, n.d.-b, n.d.-c). During the second year of the MSc programs and just before the MSc thesis, the second group of shared cross-faculty modules takes place: the Cross-Over modules. These modules of 10 ECTS (i.e., on the order of 300 h of work) span the technical domain of at least two of the three MSc programs so that students can expand their current expertise, learn from other programs and learn to function properly in an interdisciplinary environment. Within the MSc in Civil Engineering and the MSc in Applied Earth Sciences, each student must choose one Cross-Over to complete their curriculum (Delft University of Technology, n.d.-a, n.d.-b).
In the MSc in Environmental Engineering, a student can choose a Cross-Over module as one of several options to complete the 25 ECTS of elective space before the MSc thesis, along with electives, internships and/or a joint interdisciplinary or multidisciplinary project as alternatives (Delft University of Technology, n.d.-c). During the 2023/2024 academic year, eight Cross-Over modules were offered (Delft University of Technology, n.d.-d):
  • Engineering for Global Development (Global)
  • Subsurface Storage for Energy, Water and Climate Applications (Storage)
  • Data Science and Artificial Intelligence for Engineers (DSAIE)
  • Noise and Vibration: Generation, Propagation, and Effect on Humans and Environment (Noise)
  • Probabilistic Modelling of Real-World Phenomena through Observations and Elicitation (MORE)
  • From Sediments and Sludges to Solids and Soils (MUD)
  • Monitoring of Structural Health and Geohazards (MoSH)
  • Sustainable Cities: Ecoengineering Solutions for Climate Resilient and Healthy Cities (Cities)
During the academic year 2024/2025, an additional Cross-Over module is offered: Resilient Deltas (Deltas). A summary of each Cross-Over module can be found in Appendix A.
As mentioned, most of the students in an MSc program of the faculty choose a Cross-Over module. However, no prior registration is implemented for any courses within the faculty: students attend lectures (non-compulsory), participate in various in-class activities, and eventually register for the exam at least 14 days before it takes place. This scheme gives students the freedom to attend the first lectures of several modules and make a decision later. The academic quarter system includes eight weeks of lectures and two weeks of exams; the Cross-Over modules occur in the second quarter, which includes a two-week holiday break at the end of the calendar year. As actual enrollment is not known until exam registration is complete, the true student enrollment in a module is uncertain until eight weeks after the first lecture. This setting poses an obvious challenge for the planning and allocation of educational resources, especially when a module is offered to all the students of a given cohort in the faculty (on the order of 300, in this case). Thus, a methodology to monitor student preferences is needed to estimate the number of students that will choose each module, based only on their study plans within the MSc programs, in order to better plan the lectures and activities that take place within each module.
In this research, a methodology to estimate the distribution of students per Cross-Over module based on the use of a questionnaire is proposed. It should be noted that the simplicity of the proposed methodology allows its implementation in any other context where students in higher education need to choose between different courses (e.g., electives in most bachelor’s and master’s programs). The developed questionnaire is analyzed here using (un)conditional probabilities, allowing for the estimation of the distribution of students per Cross-Over and the analysis of student preferences based on their choices and background. The effectiveness of the survey in predicting enrollment is also validated using the actual distribution of students for one academic year. The remainder of the paper is organized as follows. First, in Section 2, the performed surveys are described, paying special attention to the design of the questionnaire (Section 2.1) and the subsequent probabilistic analysis (Section 2.2). Second, in Section 3, the questionnaire is assessed as a tool to monitor student preferences (Section 3.1). Afterwards, it is used to analyze the influence of a student’s background on their choices and the relationship between the first, second and third choices (Section 3.2), as well as to estimate the distribution of students per Cross-Over (Section 3.3). In Section 4, the findings of this research are discussed in the context of the existing literature. Finally, in Section 5, conclusions are drawn.

2. Materials and Methods

As previously introduced, different methodologies can be applied to obtain information about the students’ preferences or beliefs, such as interviews (e.g., Leydens et al., 2021; Potvin et al., 2024), questionnaires (e.g., Kooptiwoot et al., 2024; Mares-Nasarre et al., 2023), a combination of both (e.g., Anani et al., 2025) or clickstream logs from a course exploration platform (Chaturapruek et al., 2021). Since the main objective of this study is to propose a simple tool to monitor preferences and not fully explain them, a questionnaire is applied following the recommendations in Cruz et al. (2020).
An online questionnaire was shared with the potential students in the faculty who were going to take the Cross-Over modules (see the full questionnaire in Appendix A). This questionnaire was shared three times: (1) between 31 May and 18 June 2021, (2) between 14 November and 27 November 2022, and (3) between 13 May and 18 June 2024. The surveys were completed approximately 30, 12 and 6 months, respectively, prior to the first lecture in a Cross-Over, which typically occurs in early November. Surveys 1 and 2 were completed prior to the first offer of the Cross-Overs during the 2023/2024 academic year, whereas Survey 3 was completed prior to the second offer during the 2024/2025 academic year. Survey 1 was sent to students in the last year of the bachelor’s degree and all the students in the master’s degrees in Applied Earth Sciences and Civil Engineering, receiving 339 responses, of which 225 came from MSc students. Nearly all of the 339 students were currently enrolled in, or would soon enroll in, the prior MSc programs of the faculty and therefore would not take a Cross-Over. As the Cross-Over modules take place in the second year of the new MSc curriculum, they were offered for the first time during the 2023/2024 academic year; thus, only a small number of students in Survey 1 were eventually able to take a Cross-Over, for example, if a 1-year break was taken between BSc and MSc enrollment (the exact number is unknown, but expected to be small). The purpose of this survey was to assess the attractiveness of courses to students and, thus, provide a first estimation for resource allocation, as well as to determine whether certain offers required additional promotional efforts.
The second and third surveys were shared with two different cohorts of students, each of which was in the first year of the MSc programs prior to the Cross-Over modules being offered (cohorts starting in September 2022 and 2023; Cross-Overs taking place during the academic years 2023/2024 and 2024/2025). In the second survey (Survey 2), 159 students responded out of the 365 students enrolled in the three MSc programs, whereas Survey 3 received 94 responses out of 272 enrolled students. As the students who completed the second survey took a Cross-Over module 12 months later, during the 2023/2024 academic year, the actual distribution of students per Cross-Over for that survey is known, allowing for an assessment of the questionnaire’s effectiveness.
The results of the three questionnaires are analyzed to assess whether the answers to the different questionnaires are consistent with each other and with the actual realizations and, thus, whether the method can be deemed an efficient tool to provide a first estimation of the number of students per Cross-Over. In addition, the results of the questionnaire are analyzed in terms of conditional probabilities to assess the distribution of students if a Cross-Over has to limit its number of students given the availability of lecturers. In the following sections, a description of the questionnaire is given and the methods used to analyze the responses from the students are outlined.

2.1. Questionnaire Design

Here, further details of the design of the survey are provided, which can be found in Appendix A. The questionnaire was implemented in Qualtrics (2005). All the performed surveys were identical and presented the following parts:
  • Welcome message with information such as an approximation of the time that it takes to fill out the survey.
  • Questions about the study background, where the student indicates their MSc program and the path within that MSc program. For instance, the MSc in Environmental Engineering has three paths: (1) resource and waste engineering, (2) atmospheric environmental engineering, and (3) water resource engineering. The list of the paths in each MSc program can be found in Appendix B.
  • Brief description of each of the Cross-Overs.
  • Selection of the preferred five Cross-Overs.
  • Ranking of the preferred five Cross-Overs.
  • Ranking of the remaining Cross-Overs.
Parts 4, 5, and 6 of the questionnaire may seem redundant, as they could be merged into a single question. However, it has been observed that when a general ranking of modules is requested, students do not pay much attention to the rankings beyond the first or second option. Therefore, they are required to explicitly select their first five options and then rank them, in an attempt to reduce the randomness in the ranking of the top options.

2.2. Probabilistic Analysis

With the information obtained from the questionnaire described in Section 2.1, we approximate probabilities of interest through their relative frequencies. The empirical probability of an event (e.g., a Cross-Over being selected in a survey as a first choice) can be calculated as
$$\hat{P}(A) = \frac{N_A}{N_t},$$
where $\hat{P}(A)$ is the estimated probability of the event $A$ happening (e.g., the probability that the first choice corresponds to a given Cross-Over), $N_A$ is the number of times that the event $A$ occurs in the survey, and $N_t$ is the total number of responses in the survey.
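For illustration, this relative-frequency estimate can be sketched in a few lines of Python; the course names and answers below are hypothetical placeholders, not the survey data:

```python
# Minimal sketch: estimating P(A) as a relative frequency from first-choice
# answers. Responses below are illustrative, not actual survey results.
from collections import Counter

first_choices = ["DSAIE", "Cities", "DSAIE", "Global", "DSAIE", "Cities"]

counts = Counter(first_choices)   # N_A for each course
n_total = len(first_choices)      # N_t, total number of responses

p_hat = {course: n / n_total for course, n in counts.items()}
print(p_hat["DSAIE"])  # 0.5
```

In practice, the same counting is applied per survey and per question (first choice, second choice, and so on).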
The confidence interval of those probabilities or proportions can be computed using the Central Limit Theorem as
$$p = \hat{p} \pm Z_{\alpha/2}\sqrt{\frac{\hat{p}(1-\hat{p})}{N_t}},$$
where $p = 100 \times P(A)$ and $\hat{p} = 100 \times \hat{P}(A)$ are the actual and estimated proportions of an event $A$ happening, $P(A)$ is the actual probability of the event happening, and $Z_{\alpha/2}$ is the value of a standard normal variable evaluated at a non-exceedance probability of $1-\alpha/2$.
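As a numerical sketch of this interval (the counts are hypothetical and $\alpha = 0.05$, so $Z_{\alpha/2} \approx 1.96$):

```python
# Normal-approximation confidence interval for a proportion, following the
# formula above. The input counts are hypothetical examples.
import math

def proportion_ci(p_hat: float, n_total: int, z: float = 1.96):
    """Return the (lower, upper) bounds of the ~95% confidence interval."""
    half_width = z * math.sqrt(p_hat * (1.0 - p_hat) / n_total)
    return p_hat - half_width, p_hat + half_width

# e.g., 40 of 159 respondents select a given Cross-Over as first choice
low, high = proportion_ci(40 / 159, 159)
print(f"[{low:.3f}, {high:.3f}]")
```

The interval width shrinks with the square root of the number of responses, which is why courses with few respondents show the widest uncertainty bands.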
Conditional probabilities are of great interest in this research, as they allow one to analyze, for instance, the probability of a Cross-Over being the second choice given the first choice. Similarly, one can compute the probability of a Cross-Over being the first choice given the MSc program studied by the student. The definition of conditional probability is given by
$$P(A \mid B) = \frac{P(A \cap B)}{P(B)},$$
where $P(A \mid B)$ is the conditional probability of the event $A$ given that the event $B$ has occurred, and $P(A \cap B)$ is the joint probability of $A$ and $B$ occurring.
The aforementioned probabilities can be used to compute the (conditional or unconditional) expected number of students in a given Cross-Over by multiplying them by the final number of students taking all Cross-Overs.
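Combining the last two steps, a sketch with made-up joint counts (not the survey results) could look as follows:

```python
# Sketch: conditional probability P(second = B | first = A) and the expected
# number of students, from tabulated answer counts. Counts are illustrative.
joint_counts = {                 # number of respondents per (first, second) pair
    ("Cities", "Global"): 12,
    ("Cities", "DSAIE"): 6,
    ("Global", "Cities"): 10,
}
n_total = 50                     # total survey responses N_t
n_students = 272                 # final number of students taking Cross-Overs

def p_first(a):
    # marginal probability that course a is the first choice
    return sum(n for (f, _), n in joint_counts.items() if f == a) / n_total

def p_second_given_first(b, a):
    # P(A and B) / P(B), conditioning on the first choice being a
    return (joint_counts.get((a, b), 0) / n_total) / p_first(a)

# Unconditional expected number of students with Cities as first choice
expected_cities_first = p_first("Cities") * n_students
print(round(expected_cities_first, 1))
```

The same multiplication by the final number of students is applied to the conditional probabilities when a path- or program-specific breakdown is needed.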

3. Results

In this section, the results of the survey described in Section 2.1 are analyzed using the methods described in Section 2.2. Those results are first used to assess the utility of a survey to monitor students’ preferences. Afterwards, the preferences of the students are further analyzed, and the last survey is used to infer the preferences of students for the next academic year.

3.1. Assessing the Survey as a Tool to Monitor Students’ Preferences

As mentioned previously, 339, 159, and 94 responses were received in Surveys 1, 2 and 3, respectively. Table 1 shows the number of students who chose a Cross-Over as their first choice in each survey. The most popular Cross-Over was DSAIE in Surveys 1 and 3, and Cities in Survey 2.
In the following sections, the preferences expressed by the subset of students who completed the surveys are translated into expected student numbers.

3.1.1. Number of Students per Cross-Over During the Academic Year 2023/2024

Figure 1 presents the actual number of participants per Cross-Over during the 2023/2024 academic year compared to the expected number of participants according to the first two surveys described in Section 2.1 and the confidence intervals of those estimations, computed as described in Section 2.2.
In Figure 1, two aspects can be analyzed: (1) the consistency of students’ preferences over time (despite being different cohorts), and (2) the reliability of the survey in estimating the number of students per Cross-Over. Note that the students who completed Survey 2 took a Cross-Over 12 months later, during the 2023/2024 academic year, as part of the second year of their MSc program.
Regarding the consistency of students’ choices over time, relevant differences, defined here as the number of students varying more than 30% between Survey 1 and Survey 2, are observed in three of the Cross-Overs. As the population of students in both surveys should be comparable, these differences could be caused by factors such as differences in the composition of students between surveys (see Section 2); information sessions carried out by academic staff to present each Cross-Over to inform students about their decision; or the fact that the two surveys are more than a year apart (see Section 2). However, it is noted that, in most cases, the confidence intervals of both surveys overlap, showing consistency of the results.
With regard to the estimation of the expected number of students per Cross-Over, results from the surveys are, in general, consistent with the realization: the actual number of students is typically within the confidence intervals of the estimations from both surveys. When comparing results from Survey 2 to the actual selection, three Cross-Overs have a difference between estimated and actual students of more than 30%. The highest relative difference is observed for MUD, followed by MORE and Noise. These high relative differences are caused by the lower number of students selecting those specific Cross-Overs (approximately 5 to 18 total responses for first choice). If absolute numbers are compared, the largest differences are obtained for DSAIE and MUD, with a difference of more than 20 students.
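The 30% screening used in this comparison can be sketched as follows; here the relative difference is taken with respect to the actual numbers (an assumption) and all figures are placeholders, not the reported results:

```python
# Sketch: flag Cross-Overs whose estimated and actual student numbers
# differ by more than 30% (relative to the actual count, by assumption).
# All numbers below are illustrative placeholders.
estimated = {"DSAIE": 60, "MUD": 14, "Cities": 45}
actual = {"DSAIE": 82, "MUD": 6, "Cities": 41}

flagged = {
    course: (est, actual[course])
    for course, est in estimated.items()
    if abs(est - actual[course]) / actual[course] > 0.30
}
print(flagged)
```

With these placeholder numbers only MUD is flagged, mirroring the observation that small courses produce large relative deviations even when the absolute error is modest.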
Overall, the surveys provide a fair first estimate of the number of students per module for most of the Cross-Overs. Therefore, they can be used with confidence in early planning stages to determine the resources needed by the teaching team.

3.1.2. Distribution of Students for Each Path per Cross-Over, Academic Year 2023/2024

Figure 2 presents the distribution of students per path in each Cross-Over based on the results of Survey 2 and the actual distribution observed during the academic year 2023/2024.
In general, the estimated and observed distributions of students per path are consistent for all the Cross-Overs. On the one hand, the results from Survey 2 correctly identify the paths to which most of the students in a Cross-Over belong. For instance, based on Survey 2, Global is expected to present the highest number of students coming from HOS, HE and SE; accordingly, most of the students during the academic year 2023/2024 came from those three paths. On the other hand, the paths with the lowest number of students in a Cross-Over are usually well captured. That is, when no students from a path choose a Cross-Over in Survey 2, the same generally holds during the academic year under analysis. However, sometimes a low number of students from that path does choose the Cross-Over; for instance, although no students were expected to stem from CME in the Cross-Over DSAIE, a small number of them ultimately took it. This is caused by the intrinsic limitations of surveys: not all the students respond, so minorities might not be fully represented.
In summary, the survey is able to capture the distribution of students from each path in a Cross-Over, although some deviations are observed for cases with low occurrences.

3.2. Analyzing the Students’ Preferences

Once the use of surveys to monitor student preferences was validated, a further analysis was performed here to obtain some insights into the selections made by the students.

3.2.1. Relationship Between First, Second, and Third Choices

As shown in Section 2.1, the students were asked to rank the modules. Thus, the relationship between the first, second, and third choices of the students could be analyzed. Figure 3 shows the probability of selecting a module as a second choice given the first choice for Surveys 2 and 3.
Figure 3(left) shows some relationships between the choices made by the students. Global is often a second choice if Cities is a first choice, and vice versa. This suggests that students with an interest in the social and political aspects of engineering (Global) are also interested in sustainability and climate resilience (Cities), both impactful topics for society. Moreover, Cities is often a second choice across all Cross-Overs, showing the exposure of the new generations to the current problems of engineering practice (namely climate change and aging infrastructure), which leads to their interest in these topics (Mares-Nasarre et al., 2023).
Similar to Survey 2 from the previous academic year, in Survey 3 (Figure 3(right)), Global and Cities are often subsequent choices if one of them is the first choice. However, Cities no longer seems to be a common second choice, regardless of the first choice. In Survey 3, a relationship similar to that of Global and Cities is observed for the pairs DSAIE and MORE, and Noise and MoSH. DSAIE and MORE are the two methodological Cross-Overs, focusing on mathematical methods, and thus seem to attract similar students. Similar reasoning applies to Noise and MoSH, as they share some components related to structures and vibrations.
In both surveys, third choices seem to be mainly uncorrelated with the first choices (see Figure A1 in Appendix C).

3.2.2. Influence of Students’ Background on Their Preferences

Figure 4 presents the conditional probabilities of a student choosing a Cross-Over given their master’s program and path based on Survey 3.
It can be observed in Figure 4 that students from the MSc in Applied Earth Sciences (MSc AES) have a clear preference for DSAIE. That preference is shared across the four paths of MSc AES. It is also noteworthy that the Cross-Over Deltas shares popularity with DSAIE in the Climate and Weather (CW) path. The second most popular Cross-Over among the students from MSc AES is Cities, with a similar degree of popularity across the Earth Observations (EO), Geo-Energy (GE) and Geo-Resources (GR) paths.
The students from the MSc in Civil Engineering (MSc CIE) are spread fairly evenly across most of the Cross-Overs. Storage and MUD present very low popularity. However, while Storage is unpopular for all the paths of MSc CIE, MUD has some popularity within the Geotechnical Engineering (GE) path. This seems reasonable considering that the topics covered in MUD are related to sediments and soil properties. Global and MoSH present a relevant degree of popularity in almost all the paths, being especially attractive for students following the Construction Materials Engineering (CME) path. Students from the Hydraulic and Offshore Structures (HOS) and Hydraulic Engineering (HE) paths present a similar behavior; their most popular Cross-Over is Deltas, followed by MORE and Global. Delta development and resilience (covered in Deltas) is directly linked to HOS and HE, while probabilistic analysis (covered in MORE) is also key in the design of hydraulic interventions. Thus, given the relevance of these topics for the aforementioned paths, it is not surprising that the students feel attracted to those Cross-Overs. Finally, students in the Traffic and Transport Engineering (TTE) path present a clear preference for Cities, which can again be identified as the topic closest to the path.
The students from the MSc in Environmental Engineering (MSc EE) present a reasonable spread across most of the Cross-Overs, although Storage, Noise and MoSH are clearly unpopular. The most popular Cross-Over for the students in MSc EE is Cities, which also holds for the Atmospheric Environmental Engineering (AE) path. It is also the second most popular Cross-Over for the Resource and Waste Engineering (RW) path. However, students from Water Resources Engineering (WR) have a clear preference for the methodological Cross-Overs, namely DSAIE and MORE.
The results on students’ preferences relative to their paths show that most students tend to choose the Cross-Over closest to the topics or tools that are covered in, or clearly relevant to, their path. That is, most students do not take the chance to explore topics further from their expertise, although some still do.

3.3. Estimating Students’ Preferences

In this section, Survey 3 is used to obtain a first estimate of the distribution of students across the Cross-Overs for the next academic year. These results are also used to compare the distribution of students across the Cross-Overs in the two academic years.

3.3.1. Expected Number of Students per Cross-Over for the Coming Year

Here, the expected number of students per Cross-Over is computed for two values of the total number of students. First, it is assumed that all students in the second year of one of the Faculty’s master’s programs take a Cross-Over, namely 272 students. However, some students may complete a multidisciplinary project instead of a Cross-Over, which reduces the total. During the 2023/2024 academic year, 77.5% of the students took a Cross-Over. Therefore, the expected number of students per Cross-Over was also calculated assuming a total of 0.775 × 272 ≈ 211 students. Figure 5 presents the expected number of students per Cross-Over computed for totals of 272 and 211 students, together with their confidence intervals.
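The text does not state the exact interval construction behind Figure 5. A minimal sketch of the point estimates, with one common way to attach approximate confidence intervals (a normal approximation to each binomial proportion), is given below; the first-choice counts are illustrative values consistent with Survey 3 (N = 94), not the authors' computation:

```python
import math

# Illustrative first-choice counts from a survey of n respondents
# (values consistent with Survey 3; see Table 1).
counts = {"Global": 12, "Storage": 1, "DSAIE": 20, "Noise": 9, "MORE": 8,
          "MUD": 4, "MoSH": 11, "Cities": 18, "Delta": 11}
n = sum(counts.values())  # survey sample size (94 here)
N_total = 211             # assumed number of students taking a Cross-Over
z = 1.96                  # ~95% confidence level, normal approximation

for module, k in counts.items():
    p = k / n                        # estimated share of first choices
    se = math.sqrt(p * (1 - p) / n)  # standard error of the proportion
    expected = p * N_total
    lo = max(0.0, p - z * se) * N_total  # clip lower bound at zero
    hi = (p + z * se) * N_total
    print(f"{module:8s} expected {expected:5.1f} students, "
          f"~95% CI [{lo:5.1f}, {hi:5.1f}]")
```

With these numbers, DSAIE and Cities land above or near 40 expected students while Storage and MUD fall near or below 10, mirroring the pattern discussed around Figure 5.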
As shown in Figure 5, five of the Cross-Overs have an expected number of students between 20 and 40 for both assumed totals. Two Cross-Overs, DSAIE and Cities, have an expected number of students above 40, which can be a challenge for the teaching team and for the allocated resources. Using the results of the survey, the teaching team can therefore be warned and consulted and, if needed, the number of students in a Cross-Over can be limited. Two Cross-Overs have an expected number of students close to or below 10. This result can be used to communicate the need to further promote the course to the teaching team and to discuss the minimum number of students required for a module to run.
If Delta is removed from the results of Survey 3 for the next academic year, the expected percentage of students per Cross-Over can be compared with the actual percentage of students per Cross-Over from the previous academic year. Such a comparison assesses the variation in the popularity of the Cross-Overs between the two academic years. Typically, students ask their peers in higher years for advice when selecting electives and Cross-Overs, so the popularity of a Cross-Over in a given academic year can be affected by the opinions of students from previous years. The comparison is displayed in Figure 6.
As shown in Figure 6, there is, in general, no dramatic change in the percentage of students in most of the Cross-Overs: the actual percentage of students in the 2023/2024 academic year typically falls within the confidence intervals of the prediction for the 2024/2025 academic year. The exceptions are Global, Storage and MoSH: while Global and Storage lose a significant share of students, MoSH increases in popularity.

3.3.2. Expected Distribution of Students of Each Path for the Coming Year

Based on Survey 3, the expected distribution of students from the different paths in each Cross-Over is computed and shown in Figure 7. Figure 7 also displays the distribution of students from the previous academic year (2023/2024).
Based on Survey 3, Global, DSAIE, MoSH and Cities are expected to attract students from at least five different paths in the next academic year. The other Cross-Overs draw on a narrower range of student backgrounds. This is especially noteworthy for Storage, which was selected by students of only one path. However, it should be noted that the survey has limitations in estimating the contribution of paths with few students. Therefore, a small number of students from paths with no contribution in Figure 7 can still be expected to take a given Cross-Over.
Regarding the differences between the two academic years (Survey 3 versus the actual numbers from 2023/2024), the distributions are, in general, consistent. The paths with the highest representation in a Cross-Over are usually the same across the years. For instance, MORE is especially popular among students from HOS and HE.

3.3.3. Limiting the Number of Students in a Given Cross-Over

Having an estimate of the number of students who will follow a course allows resources and the teaching team to be planned. If the number of students exceeds the capacity of the team, a procedure can be established to limit the maximum number of students. Here, the expected number of students per Cross-Over when limiting the maximum to 40 is investigated using the results of Survey 3.
To estimate the distribution of students when limiting the maximum number of students per Cross-Over, a random assignment of the expected 211 students is repeated 1000 times, so that the mean number of students per Cross-Over and its variability can be assessed. The assignment works as follows: for each of the 211 students, a first choice is sampled from the probabilities of each Cross-Over being the first choice. The student is assigned to that first choice if there is room. If that Cross-Over has already reached the maximum of 40 students, the conditional probabilities of the second choice given the first are used to sample the student’s second choice, and the student is assigned to it. If the second choice is also full, the process is repeated with the third choice. Figure 8 presents the results of this random assignment together with the uncapped distribution of students.
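The assignment procedure above can be sketched as follows. This is a minimal illustration, not the authors' code: the first-choice probabilities are illustrative values consistent with Survey 3, and, for brevity, lower-ranked choices are drawn uniformly at random rather than from the conditional second- and third-choice probabilities estimated in the study.

```python
import random
from collections import Counter

random.seed(42)

MODULES = ["Global", "Storage", "DSAIE", "Noise", "MORE",
           "MUD", "MoSH", "Cities", "Delta"]
CAP = 40          # maximum number of students per Cross-Over
N_STUDENTS = 211  # expected number of students taking a Cross-Over
N_RUNS = 1000     # number of Monte Carlo repetitions

# Illustrative first-choice probabilities (the study uses those from Survey 3).
first_choice_p = [12/94, 1/94, 20/94, 9/94, 8/94, 4/94, 11/94, 18/94, 11/94]

def sample_preference_list():
    """Draw a full preference ranking for one student.

    Simplification: lower-ranked choices are sampled uniformly from the
    remaining modules; the study instead samples the second (and third)
    choice from its conditional probabilities given the first choice.
    """
    first = random.choices(MODULES, weights=first_choice_p, k=1)[0]
    rest = [m for m in MODULES if m != first]
    random.shuffle(rest)
    return [first] + rest

totals = Counter()
for _ in range(N_RUNS):
    enrolled = Counter()
    for _ in range(N_STUDENTS):
        # Assign the student to the highest-ranked module with room left.
        for module in sample_preference_list():
            if enrolled[module] < CAP:
                enrolled[module] += 1
                break
    totals.update(enrolled)

for module in MODULES:
    print(f"{module:8s} mean capped enrollment: {totals[module] / N_RUNS:5.1f}")
```

Because total capacity (9 × 40 = 360) exceeds 211, every student is placed in each run; the cap only redistributes the overflow from modules whose demand exceeds 40, which matches the small capped-versus-uncapped differences reported for Figure 8.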
Only small differences are observed between the uncapped and capped distributions in Figure 8, as only two Cross-Overs exceeded 40 students. The largest differences were observed in DSAIE, the module with the highest number of students; its reassigned students were mainly distributed among Global, MORE and MoSH.

4. Discussion

In this study, a questionnaire was used to gather data on student preferences when choosing courses. While questionnaires provide valuable quantitative information and enable the identification of some patterns (e.g., Kooptiwoot et al., 2024; Mares-Nasarre et al., 2023), they do not allow for an in-depth analysis of the underlying factors that drive these preferences. As such, future research should aim to explore these contributing factors more thoroughly, potentially using qualitative methods such as interviews (Leydens et al., 2021; Potvin et al., 2024) or mixed methods combining quantitative and qualitative approaches (Anani et al., 2025), in order to gain a deeper understanding of the motivations and decision-making processes behind student course selections.
The estimated number of students per course was compared with the actual choices to assess the performance of the questionnaire. The results of the questionnaire capture the general preferences of the students. Therefore, the early preferences of students can be used as a first assessment for planning tasks, as proposed in previous studies (Chaturapruek et al., 2021). Larger relative differences are observed when computing the number of students in courses or paths with low numbers, due to the intrinsic limitations of performing a survey: not all students fill in questionnaires, so the sample for smaller target groups is very small and their preferences might not be captured in detail. If a high level of detail regarding the number of students per course is needed, a registration system prior to the beginning of lectures is recommended.
When comparing the preferences of the target group of students over time, some significant changes are observed. This can be partly caused by changes in the set of surveyed students: not all students who filled in the first survey also filled in the second, and vice versa, despite belonging to the same target group. In addition, the preferences of smaller target groups might not be captured due to the reduced sample size. Moreover, the decision-making process is complex and depends on many factors (Godbole et al., 2024; Leutritz et al., 2024), which can also make preferences change over time.
The results show that three courses consistently rank among the most popular choices. One of these, DSAIE, deals with the application of artificial intelligence and machine learning to engineering problems. This subject has gained significant attention in recent years as a promising tool in a wide variety of fields, making it especially attractive to students (Anani et al., 2025; Chan & Hu, 2023; Coffey, 2024). The other two popular courses, Global and Cities, focus on sustainability and the broader societal impact of engineering. This reflects a growing interest among students in addressing global environmental and social challenges related to their future profession, as found in previous research (Mares-Nasarre et al., 2023; Students Organising for Sustainability United Kingdom, n.d.).

5. Conclusions and Recommendations

This research proposes a questionnaire as a simple tool to monitor student preferences when choosing courses. The questionnaire captures the general preferences of the students; however, larger relative differences are observed when computing the number of students in courses or paths with low numbers due to the intrinsic limitations of performing a survey. If a high level of detail in the number of students per course is needed, a registration system prior to the beginning of lectures is recommended.
Some patterns are observed in student preferences. A relationship between the first and second choices can be established: students who choose a methodological module as their first choice, namely DSAIE or MORE, also choose a methodological module as their second. The same holds for modules focused on social impact (Cities and Global) or on vibrations and structural response (Noise and MoSH). However, no clear relationship is observed for the third choices. This might be because students know they will probably get their first choice and thus feel no real pressure to define their complete ranking. Therefore, for future research or further applications of the proposed questionnaire, its length could potentially be reduced.
When analyzing the relationship between students’ backgrounds and their preferences, it can be concluded that most students do not explore far beyond their discipline: they generally choose modules closer or more clearly relevant to it. Therefore, if it is deemed desirable that more students take the opportunity to expand their horizons beyond their background, future efforts could focus on identifying in more detail the factors that drive student choices. This would make it possible to design strategies to motivate students to step out of their comfort zone (Kahvo et al., 2023; Leutritz et al., 2024).
Finally, the questionnaire validated in this study was applied to estimate the distribution of students for the coming academic year. The distribution of students across the different modules is relatively even, despite the lower numbers for two modules. This estimate was also compared with the preferences of the students during the previous academic year, identifying modules which gained or lost popularity. Regarding the background of students in each module, no significant differences were found between the two academic years. In addition, the number of students per Cross-Over was simulated assuming a maximum of 40 students per module, representing the situation where a maximum is set according to the availability of the teaching team.
In conclusion, in this study, a questionnaire was developed and used to monitor student preferences when choosing higher education courses. This research validates its performance and illustrates its possible applications.

Author Contributions

Conceptualization, P.M.-N. and O.M.-N.; methodology, P.M.-N., N.v.B., E.B. and O.M.-N.; formal analysis, P.M.-N. and N.v.B.; data curation, N.v.B.; writing—original draft preparation, P.M.-N.; writing—review and editing, E.B., R.L. and O.M.-N. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable. Ethical approval was not required as the anonymity of the participants was ensured.

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the authors upon request.

Conflicts of Interest

The authors declare no conflicts of interest. The funders had no role in the design of this study; in the collection, analyses, or interpretation of data; in the writing of this manuscript; or in the decision to publish the results.

Appendix A. Survey

(The full survey is reproduced as images in the published version and is not included here.)

Appendix B. List of Paths per Master’s Program

The Faculty of Civil Engineering and Geosciences offers three MSc programs: MSc in Applied Earth Sciences, MSc in Civil Engineering, and MSc in Environmental Engineering. Their specialization paths are listed below.
MSc in Applied Earth Sciences:
  • Climate & Weather (CW)
  • Earth Observation (EO)
  • Geo-Energy (GE)
  • Geo-Resources (GR)
MSc in Civil Engineering:
  • Construction Materials Engineering (CME)
  • Geotechnical Engineering (GEn)
  • Hydraulic & Offshore Structures (HOS)
  • Hydraulic Engineering (HE)
  • Structural Engineering (SE)
  • Traffic & Transport Engineering (TTE)
MSc in Environmental Engineering:
  • Atmospheric Environmental Engineering (AE)
  • Resource and Waste Engineering (RW)
  • Water Resources Engineering (WR)

Appendix C. Relationship Between First and Third Choices

Figure A1. Students’ third choice (x-axis) conditioned on their first choice (y-axis) for (left) Survey 2 and (right) Survey 3. For example, based on Survey 2, the probability of Global Development being the third choice given MORE being the first choice is 0.26. Surveys 2 and 3 took place 12 and 6 months prior to the first lecture in two different academic years (see Section 2).

References

  1. Ajzen, I. (1991). The theory of planned behavior. Organizational Behavior and Human Decision Processes, 50(2), 179–211. [Google Scholar] [CrossRef]
  2. Ajzen, I. (2002). Perceived behavioral control, self-efficacy, locus of control, and the theory of planned behavior. Journal of Applied Social Psychology, 32(4), 665–683. [Google Scholar] [CrossRef]
  3. Alhumaid, K., Anbreen Waheed, S. A., Zahid, E., & Habes, M. (2020). COVID-19 & e-learning: Perceptions & attitudes of teachers towards e-learning acceptance in the developing countries. Multicultural Education, 6(2), 100–115. [Google Scholar]
  4. Anani, G. E., Nyamekye, E., & Bafour-Koduah, D. (2025). Using artificial intelligence for academic writing in higher education: The perspectives of university students in Ghana. Discover Education, 4, 46. [Google Scholar] [CrossRef]
  5. Aupperle, K. (2024). Texas A&M University will pause undergraduate enrollment growth. Available online: https://www.kbtx.com/2025/01/24/texas-am-university-will-pause-undergraduate-enrollment-growth/ (accessed on 14 May 2025).
  6. Bissias, G., Cousins, C., Diaz, P. N., & Zick, Y. (2025). Deploying fair and efficient course allocation mechanisms. arXiv, arXiv:2502.10592. [Google Scholar] [CrossRef]
  7. Breetzke, J., & Bohndick, C. (2024). Expectancy-value interactions and dropout intentions in higher education: Can study values compensate for low expectancies? Motivation and Emotion, 48, 700–713. [Google Scholar] [CrossRef]
  8. Chan, C. K. Y., & Hu, W. (2023). Students’ voices on generative AI: Perceptions, benefits, and challenges in higher education. International Journal of Educational Technology in Higher Education, 20, 43. [Google Scholar] [CrossRef]
  9. Chaturapruek, S., Dalberg, T., Thompson, M. E., Giebel, S., Harrison, M. H., Johari, R., Stevens, M. L., & Kizilcec, R. F. (2021). Studying undergraduate course consideration at scale. AERA Open, 7, 2332858421991148. [Google Scholar] [CrossRef]
  10. Chawla, S., & Saha, S. (2024). Exploring perceptions of psychology students in Delhi-NCR Region towards using mental health apps to promote resilience: A qualitative study. BMC Public Health, 24, 2000. [Google Scholar] [CrossRef]
  11. Coffey, L. (2024). Majority of grads wish they’d been taught AI in college. Available online: https://www.insidehighered.com/news/tech-innovation/artificial-intelligence/2024/07/23/new-report-finds-recent-grads-want-ai-be (accessed on 14 May 2025).
  12. Cruz, M. L., Saunders-Smits, G. N., & Groen, P. (2020). Evaluation of competency methods in engineering education: A systematic review. European Journal of Engineering Education, 45(5), 729–757. [Google Scholar] [CrossRef]
  13. Dahl, W., Alford, K., Rivero-Mendoza, D., Moreno, M., Emmanuel, S., & Gorwitz, G. (2024). Factors influencing undergraduate students toward choosing a new course. NACTA Journal, 68, 35–43. [Google Scholar] [CrossRef]
  14. Delft University of Technology. (n.d.-a). Msc applied earth sciences programme. Available online: https://www.tudelft.nl/en/education/programmes/masters/ect/msc-earth-climate-and-technology/programme (accessed on 28 October 2024).
  15. Delft University of Technology. (n.d.-b). Msc civil engineering programme. Introduction to the curriculum. Available online: https://www.tudelft.nl/onderwijs/opleidingen/masters/ce/msc-civil-engineering/programme#:~:text=Faculty%20Wide%20Module%20(MUDE),and%20Environmental%20Engineering%20master%20students (accessed on 28 October 2024).
  16. Delft University of Technology. (n.d.-c). Msc environmental engineering programme. Introduction to the curriculum. Available online: https://www.tudelft.nl/onderwijs/opleidingen/masters/enve/msc-environmental-engineering/programme (accessed on 28 October 2024).
  17. Delft University of Technology. (n.d.-d). MSc programmes at CEG. Cross-over modules information booklet. Available online: https://filelist.tudelft.nl/TUDelft/Onderwijs/Opleidingen/Master/MSc_Civil_Engineering/Brochure%20Cross-Overs%20Final%20v2.pdf (accessed on 28 October 2024).
  18. Eccles, J. S., Adler, T., Futterman, R., Goff, S. B., Kaczala, C. M., & Meece, J. (1983). Expectancies, values and academic behaviours. In J. T. Spence (Ed.), Achievement and achievement motivation (pp. 75–146). W. H. Freeman. [Google Scholar]
  19. Galadima, H., Dumadag, A., & Tonn, C. (2024). Navigating new normals: Student perceptions, experiences, and mental health service utilization in post-pandemic academia. Education Sciences, 14(2), 125. [Google Scholar] [CrossRef]
  20. Godbole, A. A., Oka, G. A., Ketkar, M. N., Solanki, R. S., Desai, D. T., Bangale, S. V., & Rele, A. S. (2024). Specialty preferences of undergraduate medical students: What do they choose and why? Medical Journal Armed Forces India, 81, 66–71. [Google Scholar] [CrossRef]
  21. Gu, Q., & Lu, G. (2023). Factors influencing the satisfaction level of college students in China: Literature analysis based on grounded theory. Frontiers in Psychology, 13, 1023420. [Google Scholar] [CrossRef]
  22. Kahvo, M., Whelan, R., & Vallabhaneni, P. (2023). Why choose paediatrics? A scoping review of factors affecting the choice of paediatrics as a career. European Journal of Pediatrics, 182(1), 9–23. [Google Scholar] [CrossRef]
  23. Kooptiwoot, S., Kooptiwoot, S., & Javadi, B. (2024). Application of regression decision tree and machine learning algorithms to examine students’ online learning preferences during COVID-19 pandemic. International Journal of Education and Practice, 12(1), 82–94. [Google Scholar] [CrossRef]
  24. Larson, M., Davies, R., Steadman, A., & Cheng, W. M. (2023). Student’s choice: In-person, online, or on demand? A comparison of instructional modality preference and effectiveness. Education Sciences, 13(9), 877. [Google Scholar] [CrossRef]
  25. Leutritz, T., Krauthausen, M., Simmenroth, A., & König, S. (2024). Factors associated with medical students’ career choice in different specialties: A multiple cross-sectional questionnaire study at a German medical school. BMC Medical Education, 24, 798. [Google Scholar] [CrossRef] [PubMed]
  26. Leydens, J. A., Johnson, K. E., & Moskal, B. M. (2021). Engineering student perceptions of social justice in a feedback control systems course. Journal of Engineering Education, 110(3), 718–749. [Google Scholar] [CrossRef]
  27. Lin, G. S. S., Ng, Y. S., Hashim, H., Foong, C. C., Yahya, N. A., Halil, M. H. M., & Ahmad, M. S. (2024). Shaping tomorrow’s dentists: A multi-institutional survey of undergraduate dental students’ perceptions towards interprofessional education. BMC Oral Health, 24, 762. [Google Scholar] [CrossRef]
  28. Mares-Nasarre, P., Martínez-Ibáñez, V., & Sanz-Benlloch, A. (2023). Analyzing sustainability awareness and professional ethics of civil engineering Bachelor’s degree students. Sustainability, 15(7), 6263. [Google Scholar] [CrossRef]
  29. Masegosa, A. R., Cabañas, R., Maldonado, A. D., & Morales, M. (2024). Learning styles impact students’ perceptions on active learning methodologies: A case study on the use of live coding and short programming exercises. Education Sciences, 14(3), 250. [Google Scholar] [CrossRef]
  30. Potvin, M., Morales, A., West, E., Kalimi, M., & Coviello, J. (2024). Occupational therapy students’ perceptions of their experience in a role-emerging Level II fieldwork within higher education student services. BMC Medical Education, 24, 384. [Google Scholar] [CrossRef]
  31. Qualtrics. (2005). Qualtrics (Provo, Utah, USA) [Computer software manual]. Available online: https://www.qualtrics.com/ (accessed on 14 May 2025).
  32. Sim-Sim, M., Zangão, O., Barros, M., Frias, A., Dias, H., Santos, A., & Aaberg, V. (2022). Midwifery now: Narratives about motivations for career choice. Education Sciences, 12(4), 243. [Google Scholar] [CrossRef]
  33. Stanley, D., & Mitra, S. (2024). The impact of the COVID-19 pandemic on business students’ future preference for online courses. Decision Sciences Journal of Innovative Education, 22(3), 180–196. [Google Scholar] [CrossRef]
  34. Students Organising for Sustainability United Kingdom. (n.d.). Two thirds of students want to learn more about sustainability. Available online: https://www.sos-uk.org/post/two-thirds-of-students-want-to-learn-more-about-sustainability (accessed on 14 May 2025).
  35. Xue, G., Offodile, O. F., Razavi, R., Kwak, D.-H., & Benitez, J. (2024). Addressing staffing challenges through improved planning: Demand-driven course schedule planning and instructor assignment in higher education. Decision Support Systems, 187, 114345. [Google Scholar] [CrossRef]
  36. Yao, Z. M., Vongchalitkul, B., Rungrueng, P., So-in, R., & Chen, W.-K. (2023). The influencing factors of college students’ intention in selecting elective courses. International Journal of Development Administration Research, 6(1), 16–30. [Google Scholar]
  37. Zhang, Z., & Chen, X. (2024). An analysis of students’ perceptions of teachers’ questioning in secondary biology classrooms. Disciplinary and Interdisciplinary Science Education Research, 6, 5. [Google Scholar] [CrossRef]
Figure 1. Comparison of expected number of students per Cross-Over based on surveys and the actual selection by the students of academic year 2023/2024, including confidence intervals for each. Surveys 1 (N = 339) and 2 (N = 159) took place 30 and 12 months prior to the first lecture (see Section 2).
Figure 2. Comparison of the distribution of students of each path per Cross-Over based on Survey 2 (12 months prior to the first lecture; see Section 2) and the actual selection by the students of 2023/2024. Note that orange and blue bars are somewhat transparent, so overlapping parts of the bars are shown in a maroon color.
Figure 3. Students’ second choice (x-axis) conditioned on their first choice (y-axis) for (left) Survey 2 and (right) Survey 3. For example, based on Survey 2, the probability of Global Development being the second choice given MORE being the first choice is 0.5. Surveys 2 and 3 took place 12 and 6 months prior to the first lecture in two different academic years (see Section 2).
Figure 4. Probability of selecting a Cross-Over conditioned on the student background based on Survey 3. For example, the probability of DSAIE being a first choice given a student is from AES is 0.59.
Figure 5. Expected number of students per Cross-Over for the 2024/2025 academic year based on Survey 3 (N = 94), completed 6 months prior to the first lecture.
Figure 6. Comparison of the percentage of students per Cross-Over between the academic years 2023/2024 (actual numbers) and 2024/2025 (Survey 3), disregarding the inclusion of the new module (Delta).
Figure 7. Expected distribution of students from the different paths in each Cross-Over based on Survey 3 and the actual distribution from the academic year 2023/2024. Note that orange and green bars are transparent, so overlapping parts of the bars are shown in a darker orange color.
Figure 8. Expected number of students per Cross-Over with and without a maximum of 40 students per module, based on Survey 3.
Table 1. Number of students selecting a given Cross-Over as a first choice in each survey.

Survey   Global  Storage  DSAIE  Noise  MORE  MUD  MoSH  Cities  Delta
#1         57      27      96     14     18    15    31     81      -
#2         32      19      30      5      6    18    11     38      -
#3         12       1      20      9      8     4    11     18     11
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

