Article

The Malleability of Higher Education Study Environment Factors and Their Influence on Humanities Student Dropout—Validating an Instrument

Department of Design, Media, and Educational Science, University of Southern Denmark, DK-5230 Odense, Denmark
*
Author to whom correspondence should be addressed.
Educ. Sci. 2024, 14(8), 904; https://doi.org/10.3390/educsci14080904
Submission received: 10 June 2024 / Revised: 3 August 2024 / Accepted: 14 August 2024 / Published: 19 August 2024

Abstract

In this article, we investigate how tertiary humanities students’ perceptions of the study environment, dropout considerations, and background variables, respectively, explain variations in dropout. Based on Tinto’s Institutional Departure Model and a systematic review of the dropout literature, the study environment was operationalized as comprising an academic system, a social system, and teaching. Multivariate statistical analyses in the form of exploratory factor analyses and binary logistic regressions were applied to half-yearly register and survey data from all humanities students at a Danish university (the University of Southern Denmark) matriculated in 2017–2019. We found that students’ perceptions of their study environment explained between 15.8% and 36.9% of dropout, whereas dropout considerations and background parameters explained only between 0% and 9.1% and between 7.9% and 21.4% of dropout, respectively. We present and discuss the results obtained during different terms. The discussion revolves around the proposed research instrument and the longitudinal research methodology, as well as around what this study can teach us about being a humanities student and about study environments that could help increase the number of graduates.

1. Introduction

Student dropout has been a matter of concern for educational policymakers and politicians since the establishment of formal education. Studies have succeeded in determining correlations between dropout and students’ and institutions’ background characteristics, while pedagogical factors have been less researched. Therefore, we know significantly more about the impact of “given”, “endogenous”, or simply “prior” conditions or factors such as student demographics, individual student characteristics (previous educational results, admission criteria, housing situation, etc.), and teacher experience than about the impact of factors related to educational intervention programs, policies, and practices [1]. A recent review of research on dropouts [2] concludes that the literature on dropouts “focuses on retention and well-being based on individual conditions, rather than institutional structures and potential teaching strategies” (p. 25, our translation). Also, Tinto [3,4], in his latest publications, concludes that institutional factors are both theoretically and empirically under-researched. The shortage of research on pedagogical factors is remarkable, considering that Tinto’s institutional departure model was introduced in 1975. This model introduces a correlation between dropouts and students’ institutional integration within social and academic systems, and the 1997 edition places classroom activities (classrooms, labs, studios) as straddling both the social system and the academic system, suggesting that they overlap the two systems and are key to improving students’ integration in, or identification with, their institution. Given its focus on institutional conditions and activities, the model was conceptualized as a counter-response to dropout models focusing solely on students’ psychological factors and individual characteristics [5,6], and it was suggested to indicate a change in paradigm in the field of dropout research [7].
Today, it is still one of the most used and cited models in the field of higher education student dropout [8]. The shortage of research on pedagogical factors is unfortunate, as the strongest basis for understanding and strengthening the quality of teaching is established when changeable factors, on which one can actually intervene [9], are given particular awareness [10,11]. The reasons for the shortage of research on the correlation between dropout and pedagogical factors are most likely manifold, but we consider two to be dominant. Firstly, to avoid recall bias rooted in participants’ lack of ability to recall their experiences of the social environment over long periods of time [12], it is necessary to collect data on students’ experiences of the study environment longitudinally and wait until a sufficient number of students have dropped out to have enough data to carry out solid analyses of the correlation between pedagogical factors and actual dropout. This is a demanding and time-consuming research strategy, and, therefore, considerations regarding or intentions to drop out are often used as an early alert proxy for dropout [13,14], without us knowing if it is actually a good proxy. Secondly, it is suggested that pedagogical factors are often theoretically well described but not clearly identifiable empirically [15], and it has been difficult to reliably identify specific factors with a reasonable effect size [10,16,17]. Because these factors are crucial to strengthening the quality of teaching, a number of studies have sought to understand why it is difficult to determine changeable factors [15]. Scheerens [11] and Muijs and Brookman [18] point to the lack of instruments as a weakness. Based on a review of 645 studies, Cheung and Slavin conclude that “researcher-made tests are associated with much higher effect sizes than are standardized tests” ([19], p. 286).
Scheerens suggests that this may be caused by the fact that changeable factors vary among educational institutions, study programs, subjects, and educational levels or different terms, as they are malleable with reference to context and time [11]. Due to the malleability of these factors, Scheerens further suggests that “researcher-made instruments are more tailored for the treated group” ([11], p. 253). This is substantiated in other reviews and studies that find considerable differences in the effect sizes when using nonstandard as opposed to standardized tests [20,21,22,23]. In a previous study, we found that empirically identifiable study environment factors differ widely across Nordic countries [24], and that they also differ across two faculties—humanities and natural sciences—at the same university, but no studies to date have examined the malleability of study environment factors over time within a faculty.
In this article, we present a research instrument and a research methodology developed to determine study environment factors (changeable factors) in higher education with regards to their malleability. Furthermore, we investigate the malleability of study environment factors over time in humanities study programs and whether study environment factors explain tertiary humanities students’ dropout better than their known background parameters and dropout considerations. Our research questions are the following:
  • To what extent are humanistic higher education study environment factors malleable across different terms?
  • Can study environment factors, students’ background parameters, or their dropout considerations best explain dropout from tertiary humanities education across different terms?
Based on our previous studies referenced above, it is our hypothesis that the study environment factors are malleable across different terms. Furthermore, based on the studies referenced above, it is our hypothesis that study environment factors, students’ background parameters, and their dropout considerations all explain dropout. As we have a particular interest in factors which we can intervene on, we will focus on which study environment factors explain persistence at different times during the course of study and on opportunities to support students’ persistence. Thus, we can add the following two research questions to the ones above:
  • Which study environment factors explain students’ dropout across different terms?
  • With a focus on the study environment, how is it possible to support student persistence?
The article is based on data from the project “Data-driven Quality Development and Dropout Minimization”. In this project, longitudinal survey and register data from all students of humanities at the University of Southern Denmark, matriculating in 2017–2019, were collected half-yearly until they were no longer enrolled (i.e., until 2020–2022, when the students had either graduated or dropped out).

2. Tinto’s Institutional Departure Model and Study Environment Factors

Tinto’s institutional departure model describes dropout as a longitudinal process of interactions between the individual (including background variables such as skills and abilities, family background, and prior schooling), an academic system with subcategories such as interactions with personnel and academic achievements, and a social system with subcategories such as extracurricular activities and interactions with fellow students of the institution. The systems continually modify the student’s academic and social integration, leading either to persistence (the choice to continue studying) or departure (the choice to drop out). This focus and the variables intentions and goal commitment, which, in the model, are proposed to mediate between the institutional meeting and persistence/departure, leave space for individuality and for dropout to be understood as the student’s active decision, being procedurally influenced by their attitude towards persistence and dropout (often operationalized as dropout considerations) [25]. This focus on attitudes and decisions is different from other often-referenced dropout theories such as, for instance, Astin’s Student Involvement Theory, where student persistence is described as also involving actions: “the quantity and quality of the physical and psychological energy that students invest in the college experience” ([26] p. 307).
The 1997 model places classroom activities (classroom, labs, studios) as straddling both the social system and the academic system, suggesting that they overlap the two systems and are key to improving student integration or identification (see Figure 1). This article focuses on humanities study programs, which involve no lab activities and, in many programs, no studio activities either. In order to have a term that could still signal our wish to direct our attention towards a wide range of activities (not only traditional classes), we chose the term “Teaching” as a category that included both classes and other activities such as reading groups, lectures, exercises, etc.
The project’s operationalization of the social system, the academic system, and teaching builds on a literature review of international articles on institutional factors and dropout in higher education based on a phenomenological review approach intending to identify all possible study environment factors that could have an impact on higher education dropout [8]. The literature review identified sixty-five studies, which, together, made it possible to specify twenty-nine study environment factors associated with dropout. These factors were categorized according to whether they belonged to the academic system, the social system, or teaching. This categorization led to a specification of the academic system into eight categories, a specification of the social system into another eight categories, and a specification of teaching into thirteen categories, as shown in Table 1.

3. Materials and Methods

3.1. Register Data

Data for the identified categories of “Demographic composition of the student body (age and sex)/ethnicity”, “GPA and learning”, and “Prior results and experiences of and disappointment in exams” (cf. Table 1) were collected as register data alongside other background information on the students: students’ hometown (we evaluated whether the campus town was equal to their hometown or not), their lower secondary educational type, number of gap years, the priority of their program when applying for tertiary education, and the admission method. We did not collect information on institutional, fixed categories (i.e., size of the institution), as we only surveyed students from one institution, meaning that these did not vary across the respondents. Students’ study status (i.e., enrolled, leave of absence, graduated, or dropped out) and dropout/completion dates were additionally provided by the university each term, along with mail addresses for all the students.

3.2. Survey Construction and Test

We formulated statements corresponding to each of the categories (cf. Table 1) not addressed by register data (demographics, GPA) or related to the induction program (and, thus, not relevant across terms) and added supplementary items based on more general expectations of what could be possible study environment factors at this specific university. This resulted in 28 items for the social system (see Appendix A), 22 items for the academic system (see Appendix B), and 64 items for the teaching system (see Appendix C). Additionally, three items addressing students’ dropout considerations were included (see Appendix D), giving a total of 117 items for this particular study.
Each statement was answered using a five-point Likert scale. However, we assume an interval level of responses in the following analysis. The participants were provided the option to respond “I don’t know/I do not wish to answer” for all the statements.
The survey was pilot-tested in two ways. First, it was tested by qualitatively interviewing first-term students while they filled in the survey. This led to some minor adjustments in wording and the inclusion of statements that allowed the respondents to agree/disagree regarding whether they perceived differences between their teachers (items 53, 68, 73, 83, and 114 in Appendix C). Secondly, we tested the survey quantitatively in two iterations (first- and second-term students matriculated in 2017) and made corrections to the items with a high proportion of students answering, “I don’t know/I do not wish to answer” (e.g., we changed the items addressing workload from requiring students to fill in the hours spent each week on several specified tasks to the current items 45–47 in Appendix B). After these two pilot-testing iterations, no further adjustments to the survey were needed. As the first two rounds of the survey served as pilot tests, these responses will not be included in the following analysis. This gives us a total of 2014 responses from the first term (only students matriculating in 2018 and 2019) and 1060 responses from the second term (see bracketed numbers in Table 2).

3.3. Data Collection and Response Rates

The survey was distributed electronically to the study emails of all humanities students at the University of Southern Denmark matriculated in 2017–2019 (N = 2781), half-yearly, until they were no longer enrolled (i.e., until 2020–2022, when the students had either graduated or dropped out). To minimize the number of non-respondents, teachers were encouraged to set aside time in class for students to fill out the survey. This resulted in a very high response rate (91% in the first term). The number of responses dropped from term to term, both because fewer students remained enrolled (second column in Table 2) and because of lower response rates (fourth column in Table 2). The dramatic drop in response rates might be partly due to the COVID-19 shutdown, during which the students were taught from home, meaning that the teachers were not able to set aside time for the survey during (physical) classes.
Some characteristics (demographics) are presented in Appendix E. Additional descriptive data can be retrieved by contacting the authors.

4. Analysis

The data were transferred to SPSS, and all statistical analyses were conducted in SPSS v.28 and Excel v.2406. “I don’t know/I do not wish to answer” responses were treated as missing values in this analysis. All negatively phrased questions were inverted (i.e., item 1 in Appendix A), and the data were screened for impermissible values (no values were outside the 1–5 range) and for respondents answering the questionnaire twice during the same term. This resulted in the response numbers presented in Table 2. We anticipate that a potential non-response bias might be caused by those likely to drop out of the study also being less likely to respond to the survey. This is also the reason why we have not used population data weights to compensate for our reduced response rates across terms, as such weights might lead to an over- or under-representation of certain groups, skewing the results despite (or because of) weighting. Because of the low response rates in the fifth and sixth term, the analysis will only be shown for the first-to-fourth terms.
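The authors carried out this screening in SPSS. As a minimal illustrative sketch in Python, the same steps could look as follows; note that the column names (`item_1`, `term`, `student_id`) and the numeric code 99 for the “I don’t know/I do not wish to answer” option are hypothetical, as the article does not specify the coding:

```python
import numpy as np
import pandas as pd

def screen_responses(df, negative_items, id_col="student_id"):
    """Screen Likert survey data: recode "don't know" as missing, invert
    negatively phrased items, drop impermissible values and duplicate
    responses within the same term. Column names are hypothetical."""
    df = df.copy()
    item_cols = [c for c in df.columns if c.startswith("item_")]
    # Treat the "I don't know / I do not wish to answer" option (coded 99 here)
    # as a missing value, as in the article's analysis
    df[item_cols] = df[item_cols].replace(99, np.nan)
    # Invert negatively phrased items (e.g., "I feel lonely...") so that
    # 5 is always the positive pole: 1 -> 5, 2 -> 4, etc.
    for col in negative_items:
        df[col] = 6 - df[col]
    # Screen for impermissible values outside the 1-5 Likert range
    out_of_range = ((df[item_cols] < 1) | (df[item_cols] > 5)).any(axis=1)
    df = df[~out_of_range]
    # Keep only one response per student per term (first submission)
    return df.drop_duplicates(subset=[id_col, "term"], keep="first")
```

The inversion step means that higher scores consistently indicate a more positive perception, which simplifies the interpretation of factor loadings later on.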

4.1. Exploratory Factor Analysis

In order to investigate the malleability across terms and, thus, answer research question one, we performed exploratory factor analyses for each term on the survey data. The dataset’s suitability for factor analysis was evaluated using the Kaiser–Meyer–Olkin measure of sampling adequacy (KMO > 0.5) and Bartlett’s test of sphericity. Exploratory factor analysis (EFA) was used to openly investigate the underlying factor structure within the three systems. Principal axis factoring was used since the data did not meet multivariate normality assumptions (common for Likert scale items) and a Promax (oblique) rotation was chosen, as the factors were expected to correlate. Kaiser’s criterion (eigenvalues ≥ 1) was used to determine the number of factors, and items with loadings above 0.3 were included [28]. The total variance explained ranged across terms from 51 to 55% for the social system, from 43 to 49% for the academic system, and from 48 to 56% for teaching. The factor reliability was tested with Cronbach’s Alpha (we used standardized α), where α > 0.6 was chosen to indicate reliability [29]. This resulted in the reliable factors presented in Table 3, Table 4, Table 5 and Table 6.
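The analyses above were run in SPSS. As a hedged numpy illustration of two of the criteria mentioned (Kaiser’s eigenvalue criterion and standardized Cronbach’s alpha), the following sketch shows the underlying computations; it does not reproduce the principal axis factoring or Promax rotation steps themselves:

```python
import numpy as np

def kaiser_n_factors(data):
    """Number of factors to retain by Kaiser's criterion: count the
    eigenvalues of the item correlation matrix that are >= 1."""
    corr = np.corrcoef(data, rowvar=False)
    eigenvalues = np.linalg.eigvalsh(corr)
    return int(np.sum(eigenvalues >= 1.0))

def standardized_alpha(items):
    """Standardized Cronbach's alpha: k * r_bar / (1 + (k - 1) * r_bar),
    where k is the number of items and r_bar the mean inter-item
    correlation."""
    corr = np.corrcoef(items, rowvar=False)
    k = corr.shape[0]
    r_bar = corr[~np.eye(k, dtype=bool)].mean()
    return k * r_bar / (1 + (k - 1) * r_bar)
```

With the article’s threshold of α > 0.6, a factor whose items yield a standardized alpha below 0.6 would be excluded from further analysis.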

4.2. Binary Logistic Regression Analysis

In order to answer research questions two and three, we performed a binary logistic regression to model the relationship between the binary outcome (dropped out = 1, not dropped out = 0) and a number of different predictor variables. “Not dropped out” (0) included students still active in their program as well as students who had graduated. A first regression analysis (for each term) was conducted using students’ dropout considerations (Appendix D) as the predictor variable. A second regression analysis (for each term) was conducted using register data background information (students’ age, gender, native country, home language, hometown (equal to the campus town or not), lower secondary educational type, lower secondary GPA, number of gap years, the priority of their program when applying for tertiary education, and the admission method) as the predictor variables. All these variables except the GPA and the number of gap years were treated as categorical variables. Finally, a third regression was conducted (for each term) using the identified exploratory study environment factors (Appendix A, Appendix B and Appendix C) as predictors.
The models’ goodness of fit was assessed with Omnibus tests of model coefficients and Hosmer and Lemeshow tests (p > 0.05). We assessed each model using the pseudo-R2 (Nagelkerke R2) value as an indication of the variability in dropout accounted for by the model. The pseudo-R2 does not represent the proportion of variance explained by the model (as in linear regression), but it indicates the model’s improvement over a null model. The logistic regression model offers interpretable results in terms of the impact of individual predictors on dropout (odds ratios). We present factors with a significant odds ratio (OR) (p < 0.05) with a 95% confidence interval.
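The regressions were run in SPSS. As a self-contained numpy sketch of the same machinery, the following fits a binary logistic regression by Newton–Raphson and computes the Nagelkerke pseudo-R2 from the fitted and null log-likelihoods; it is an illustration of the method, not the authors’ code:

```python
import numpy as np

def fit_logistic(X, y, n_iter=25):
    """Fit a binary logistic regression by Newton-Raphson.
    X: predictor matrix without an intercept column; y: 0/1 outcome."""
    Xd = np.column_stack([np.ones(len(y)), X])  # prepend intercept
    beta = np.zeros(Xd.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-Xd @ beta))
        grad = Xd.T @ (y - p)
        hess = Xd.T @ (Xd * (p * (1 - p))[:, None])
        beta += np.linalg.solve(hess, grad)
    return beta

def _log_lik(Xd, y, beta):
    p = 1.0 / (1.0 + np.exp(-Xd @ beta))
    return np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

def nagelkerke_r2(X, y, beta):
    """Nagelkerke pseudo-R2: Cox-Snell R2 rescaled so its maximum is 1.
    It indicates improvement over the intercept-only (null) model,
    not a proportion of explained variance."""
    n = len(y)
    Xd = np.column_stack([np.ones(n), X])
    ll_full = _log_lik(Xd, y, beta)
    p0 = y.mean()  # the null model predicts the overall dropout rate
    ll_null = n * (p0 * np.log(p0) + (1 - p0) * np.log(1 - p0))
    r2_cs = 1 - np.exp(2 * (ll_null - ll_full) / n)
    return r2_cs / (1 - np.exp(2 * ll_null / n))
```

Odds ratios for the predictors are then `np.exp(beta[1:])`; an OR above 1 means the predictor increases the odds of dropping out, and one below 1 means it decreases them.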

5. Results

With respect to the research questions presented in the Introduction, we will first present the results of the exploratory factor analysis, which was used to investigate the malleability across terms and, thus, answer research question one, after which we will present the results of the binary logistic regression analysis, which we used to model the relationship between dropout and background parameters and the factors identified in the factor analysis, thus answering research questions two and three. We will address research question four in the Discussion.

5.1. Empirical Factors’ Malleability across Terms

The factors identified for each term based on the exploratory factor analyses are presented in Table 3, Table 4 and Table 5 and compared with the dimensions identified from the literature review (Table 1). As illustrated in the tables, there are noticeable differences between the theoretical categories and the empirically identified factors, while the empirical factors are relatively stable over time.
Table 3. Theoretical categories (rows) together with the empirical factors (columns) for the social system domain during terms 1–4. Loadings above 0.3 are marked with blue. Factors with poor reliability (Cronbach’s alpha < 0.6) are marked in gray and will not be included in a further analysis.
Social system

Empirical factors by term (columns of Table 3):
  • 1st term: Experienced social relations with fellow students; Expected social relations with fellow students; Social relations with teachers; Extracurricular activities (communities); Extracurricular activities (internship and study-relevant job); Institutional integrity; Experienced participation in activities
  • 2nd term: Experienced social relations with fellow students; Expected social relations with fellow students; Social relations with teachers; Extracurricular activities (communities); Extracurricular activities (internship and study-relevant job); Institutional integrity; Experienced level of social activity; Experienced participation in activities
  • 3rd term: Experienced social relations with fellow students; Expected social relations with fellow students; Extracurricular activities (communities); Extracurricular activities (internship and study-relevant job); Institutional integrity; Experienced level of social activity; Experienced participation in activities
  • 4th term: Experienced social relations with fellow students; Expected social relations with fellow students; Extracurricular activities (communities); Extracurricular activities (internship and study-relevant job); Institutional integrity; Experienced level of social activity; Experienced participation in activities

Items by theoretical category (rows of Table 3):
Social integration
  1. I feel lonely in my study program
  2. I have a good relationship with my fellow students
  3. I feel like part of a community in my study program
  4. I continue to study in this study program because we have a good sense of community in my class
  5. The level of social activity is high
  6. My relationship with my fellow students is close
  7. I participate in academic activities
  8. I participate in social activities organized by either the university or my class
  9. I expected the level of social activity to be high
  10. I expected the relationship with my fellow students to be close
  11. I expected the relationship with my lecturers to be close
  12. My relationship with my lecturers is close
Extracurricular activities
  13. Participation in social study organizations
  14. Participation in academic communities
  15. Do academic work (paid or unpaid) as a mentor/study advisor, teaching assistant, or student employee
  16. Go on exchange as part of the study program
  17. Do an internship as part of the study program
  18. Have a student job within a field where graduates with the same education often find employment
Institutional integrity
  19. The university lives up to my expectations
  20. My study program lives up to my expectations
  21. The classes live up to my perception of the values of the university
  22. The classes live up to my perception of the values of my study program
  23. My personal values are well aligned with my perception of the values of the university
  24. My personal values are well aligned with my perception of the values of my study program
Social infrastructure
  25. The other students participate in academic activities
  26. The other students participate in social activities organized by either the university or my class
  27. There is a satisfactory selection of social activities at the university
  28. It has been easy to interact with people from different academic years

Cronbach’s alpha per empirical factor (in the factor order listed above for each term; “–” indicates a value not reported):
  • 1st term: 0.884; 0.753; 0.422; 0.753; 0.601; 0.915; 0.641
  • 2nd term: 0.898; 0.741; –; 0.771; 0.531; 0.906; 0.659; 0.713
  • 3rd term: 0.876; 0.723; 0.752; 0.491; 0.915; 0.748; –
  • 4th term: 0.803; 0.619; 0.764; 0.547; 0.904; 0.709; 0.182
As illustrated in Table 3, we identified seven empirical factors in the social system. The social integration category was quite stable across terms; however, it split into the following parts: experienced integration and relations with fellow students; expected social relations with fellow students; and, finally (although not reliable), relations with teachers. We named the factors to capture these characteristics. The theoretical category extracurricular activities split into two factors across all terms: extracurricular activities related to study environment communities, and internships and study-relevant jobs. In terms two and four, the latter factor included an item on traveling abroad. The factor on internships and study-relevant jobs was, however, only reliable during the first term. The institutional integrity factor was stable and aligned with the theoretical category across all terms. Finally, the social infrastructure category split into two empirical factors: fellow students’ participation in activities and the range of activities offered, i.e., an institutional perspective.
Table 4. Theoretical categories together (rows) with the empirical factors (columns) for the academic system domain during terms 1–4. Loadings above 0.3 are marked in blue. Factors with poor reliability (Cronbach’s alpha < 0.6) are marked in gray and will not be included in a further analysis.
Empirical factors by term (columns of Table 4; identical across terms 1–4):
  • Academic integration and identification; Perception of learning community; Support from faculty; Lecturer availability; Workload; Prior exam results, experiences, and disappointments

Items by theoretical category (rows of Table 4):
Academic integration and identification
  29. I like being a student
  30. My study is an essential part of my life
  31. My study is a significant part of who I am
  32. It is important to me to learn as much as possible in my courses
  33. Experiences of interesting academic content of study program
Perception of learning community
  34. I seek help from my fellow students if I experience challenges in my studies
  35. I share notes/literature/ideas for assignments with my fellow students
  36. My fellow students share notes/literature/ideas for assignments with me
  37. Our lecturers have introduced us to ways to work well in a group, e.g., how you align your expectations, handle potential conflicts, etc.
Relation to, interaction with, and support from faculty
  38. I have worked with a lecturer or another employee at the university, for example on an article, as a student assistant, on a board, etc.
  39. I have discussed study-related topics, ideas, or academic terms with a lecturer or another employee at the university outside of class
  40. I have discussed my academic abilities with a lecturer or another employee at the university outside of class
  41. At my campus, you often see the lecturers outside of the classroom (for instance, in hallways and common areas)
  42. Most lecturers are easy to get in contact with
  43. The lecturers that I have had contact with generally seem interested in the students
  44. I have reached out to a study counsellor, a study secretary, a mentor, the legal department, or another employee at the university to discuss something concerning my studies
Workload
  45. Experiences of workload of study program
  46. How many hours a week do you spend doing exercises that the lecturer has asked you to solve before class? (e.g., work questions for texts, small problems, blogs, wikis, etc.)
  47. How many hours a week do you spend preparing for class and/or post-processing lectures? (alone or in groups)
Prior exam results, experiences, and disappointments
  48. If I experience disappointment in exams, I am good at shaking it off
  49. My exams have gone as I expected
  50. I have felt disappointment or a sense of failure in exams

Cronbach’s alpha per empirical factor (in the factor order listed above):
  • 1st term: 0.784; 0.849; 0.772; 0.733; 0.466; 0.515
  • 2nd term: 0.764; 0.855; 0.725; 0.757; 0.498; 0.562
  • 3rd term: 0.766; 0.851; 0.707; 0.717; 0.515; 0.546
  • 4th term: 0.761; 0.860; 0.695; 0.756; 0.397; 0.527
As shown in Table 4, we identified six empirical factors referring to Tinto’s academic system domain. These factors were very close to the theoretically assumed categories; however, one theoretical category (relation to, interaction with, and support from faculty) split into two factors. Thus, we obtained a factor regarding students’ identification with being a student and wanting to learn the academic content (academic integration and identification), a factor for their perception of the learning community, and two factors covering collaboration with and support from faculty and lecturer availability. Finally, we obtained one factor each for workload and for prior results and experiences of and disappointment in exams; these two factors were, however, not reliable.
Table 5. Theoretical categories together with the empirical factors for the teaching domain during terms 1–4. Loadings above 0.3 are marked with blue. Factors with poor reliability (Cronbach’s alpha < 0.6) are marked with gray text and will not be included in a further analysis.
Initial factorItemEmpirical factors (1st term)Empirical factors (2nd term)Empirical factors (3rd term)Empirical factors (4th term)
1st term: Experienced teaching quality, Study groups, Alignment, Exam clarity, Support between classes, Instructional clarity, Feedback (quantitative), Feedback (qualitative), Active learning, Higher order thinking, Cooperative learning, Introductory courses (reading), Perceived lack of study skill knowledge, Student research programs, Participation, Perceived teacher differences; 2nd term: Experienced teaching quality, Study groups, Alignment, Recommended reading materials, Support between classes, Instructional clarity, Feedback (quantitative), Feedback (qualitative), Feedback (satisfaction), Active learning, Higher order thinking, Cooperative learning, Introductory courses (reading), Perceived lack of study skill knowledge, Participation, Perceived teacher differences; 3rd term: Alignment, Study groups, Recommended reading materials, Support between classes, Instructional clarity, Feedback (quantitative), Feedback (qualitative), Feedback (satisfaction), Active learning, Higher order thinking, Cooperative learning, Introductory courses (reading), Perceived lack of study skill knowledge, Student researcher program, Perception of difficulties, Participation, Perceived teacher differences; 4th term: Study groups, Alignment, Recommended reading materials, Support between classes, Instructional clarity, Feedback (quantitative), Feedback (qualitative), Feedback (satisfaction), Active learning, Higher order thinking, Cooperative learning, Introductory courses (reading), Perceived lack of study skill knowledge, Student researcher program, Perception of difficulties, Debates, Participation, Perceived teacher differences
Teaching quality51Experiences of good teaching methods of study program
52All in all I’m satisfied with the quality of teaching in my study program
53There is a big difference in the quality of teaching I receive
Study groups54I am a part of a study group or have a study buddy that I meet up with regularly throughout the semester
55I am a part of a study group or have a study buddy that I meet up with in the exam periods
Alignment in the teaching56I am aware of what I am supposed to learn in the different courses
57I see the connection between the things we do in class and what I am supposed to learn
58I see the connection between the things we are asked to do outside of class and what I’m supposed to learn
59My lecturers are clear in their explanation of what is expected in exams
60The continuity in my study program as a whole is clear to me
61The academic progression in my study program as a whole is clear to me
62I see the connection between the teaching methods and activities in my studies, and what we’re being assessed on in the exams
63My lecturers support my work between classes by assigning reading materials
64My lecturers support my work between classes by recommending further reading materials
65My lecturers support my work between classes by providing questions for reading materials
66My lecturers support my work between classes by giving obligatory assignments (online, individual or for study groups)
67My lecturers do not ask us to do work between classes
68There is a big difference between my lecturers regarding how they support my work between classes
Instructional clarity69My lecturers are good at explaining the material so that I understand it
70The lecturers’ instructions for group work and homework assignments are clear
71My lecturers often use examples to explain the material
72When the lecturers go through new material in class they often connect it to materials we have learned in other courses
73There is a big difference between my lecturers in regard to how clear they are, and how often they use examples and/or connect new material to things we’ve learned in other courses
Feedback74How often do you receive written feedback from your lecturer (for instance on assignments)?
75How often do you receive oral feedback in class (from both lecturers and fellow students)?
76How often do you receive oral feedback in the form of an individual conversation with a lecturer?
77How often do you engage in group conversations where you get feedback from your lecturer?
78How often do you engage in group conversations where you get feedback from your fellow students (peer feedback)?
79The feedback I receive on my assignments makes it clear what I haven’t quite understood
80The feedback I receive on my academic work during the semester improves the way in which I learn and work
81I receive sufficient feedback on my academic work during the semester
82There are good opportunities to receive feedback on my performance in exams
83There is a big difference between my lecturers in regard to how much feedback I receive in their courses
Active learning84I have thought about questions I would like answered when I arrive at class
85I always come to class prepared
86I try to academically defend my thoughts and ideas either in class, with my study group or with other students/friends/family
87I participate actively in class
88After class I reflect on the content we have worked with
89After a class I revisit the readings or my notes on them
Higher order thinking90My current study program has provided me with abilities to write academically, clearly and precisely
91My current study program has provided me with abilities to converse academically, clearly and precisely
92My current study program has provided me with abilities to think critically
93My current study program has provided me with abilities to think analytically
94My current study program has provided me with abilities to analyze numeric and statistical information
95My current study program has provided me with abilities to collaborate
Cooperative learning96My lecturers often use quizzes, clickers, etc. in class
97My lecturers often use mind maps, brainstorms, etc. in class
98We regularly work on developing our knowledge and ideas through wikis, discussion boards, etc.
Courses on study technique and introductory courses99I have been taught how to point out the main arguments of a text
100I have been taught how to deduce the implicit assumptions in the texts I read
101I have been taught how to critically evaluate the arguments and conclusions of a text
102I lack knowledge about text reading
103I have been taught study techniques/study planning as a part of my study
104I have sought out courses on study techniques/study planning myself
105I lack knowledge about study techniques/study planning
Student research programs106We/I have done research-like activities as a part of our/my class work
107We/I have done research together with a researcher at the university
Perception of difficulty108Experiences of academic level of study program
109It is my experience that I meet the academic expectations in my study
Coherence110I find it easy to connect what I’m learning to what I already know
Participation in class111My lecturers facilitate discussions/debates either with the entire class or in smaller groups
112My lecturers involve the students in the teaching through student presentations
113My lecturers make us work with the academic content through case- and problem-based teaching
114There is a big difference between my lecturers in regard to how active a role they give the students in class
Cronbach’s alpha: 1st term: 0.610, 0.851, 0.885, -, 0.524, 0.797, 0.742, 0.766, 0.791, 0.828, 0.754, 0.881, 0.448, 0.638, 0.663, 0.694; 2nd term: 0.652, 0.857, 0.879, 0.537, 0.575, 0.769, 0.756, 0.833, 0.686, 0.686, 0.826, 0.758, 0.863, 0.540, 0.682, 0.709; 3rd term: 0.895, 0.883, 0.535, 0.609, 0.780, 0.768, 0.835, 0.671, 0.785, 0.840, 0.754, 0.868, 0.599, 0.583, -, 0.645, 0.693; 4th term: 0.861, 0.883, 0.522, 0.665, 0.804, 0.734, 0.887, 0.662, 0.781, 0.834, 0.705, 0.881, 0.459, 0.490, -, -, 0.724, 0.686
Finally, as illustrated in Table 5, we identified 18 factors within Tinto’s system of teaching. As with the academic system, the empirically identified factors were very similar to the initial theoretical categories; however, three categories (alignment in the teaching, feedback, and courses on study technique and introductory courses) split into two or more factors. Overall, there was one factor regarding the students’ experience of the teaching quality and the academic level of their education and one on their participation in study groups. Furthermore, there was one factor on alignment (whether the study program was clear and whether the students were aware of what they were supposed to learn and saw a connection between the things they were asked to carry out and needed to learn), one on exam clarity (whether what was expected in exams was clear), one on support for the work they had to complete between classes, and one on instructional clarity, that is, whether the lecturers explained the material and group work well. There were three factors on feedback: one on the frequency of feedback, one on the perceived quality of feedback, and one on students’ satisfaction with the feedback offered. Looking at the activities during the students’ study program that helped develop their knowledge, we identified five empirical factors.
There was one factor on active learning, covering how the students prepared for their classes (for instance, whether they had thought about questions that they wanted to have answered), how they participated, and how they post-processed the teaching content after the lectures; one on higher-order thinking (whether the study program had provided the students with the ability to write and converse academically, think analytically and critically, and collaborate); one on cooperative learning, that is, whether the students worked on developing their knowledge and ideas using quizzes, clickers, mind maps, brainstorming, wikis, and discussion boards in class, and on the lecturers’ facilitation of discussions/debates, presentations, and case- and problem-based teaching; and one on study technique courses on how to read texts. Related to these, we identified a self-evaluating factor concerning the perceived level of expertise regarding study skills. Furthermore, we referenced Tinto’s teaching system to identify one empirical factor on student research programs, concerning whether the students had participated in research activities or research-like activities as part of their class work. Last, but not least, we identified a factor that was not part of the theoretical categories: perceived teacher differences. This factor emerged from items spread throughout the survey, addressing quite different topics (teaching quality, support of work between classes, clarity, feedback, and how active a role students play during class), all of them having perceived teacher differences in common. These items (53, 68, 73, 83, and 114) were the ones added as part of the qualitative pilot study, in which students expressed a need to indicate that their teachers and their teaching methods were quite different.
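The factor structures above were identified through exploratory factor analysis, retaining loadings above 0.3 (cf. the Table 5 caption). The following is a small sketch of that style of analysis on synthetic item responses; the varimax rotation and all data here are illustrative assumptions, not the article's actual procedure or dataset:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Hypothetical item responses: two latent constructs, three items each
# (synthetic data, NOT the article's survey responses).
rng = np.random.default_rng(1)
n = 300
f1 = rng.normal(size=n)
f2 = rng.normal(size=n)
X = np.column_stack(
    [f1 + 0.4 * rng.normal(size=n) for _ in range(3)]    # items 1-3: construct 1
    + [f2 + 0.4 * rng.normal(size=n) for _ in range(3)]  # items 4-6: construct 2
)

# Fit a two-factor model; rotation choice ("varimax") is an assumption.
fa = FactorAnalysis(n_components=2, rotation="varimax", random_state=0)
fa.fit(X)

loadings = fa.components_.T        # shape: (n_items, n_factors)
salient = np.abs(loadings) > 0.3   # the 0.3 loading threshold from Table 5
print(salient.astype(int))         # 1 marks a salient item-factor loading
```

Items that load saliently on the same factor would then be grouped and checked for reliability before entering the regression models.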
As described in the Introduction, in previous publications, we found that empirically identifiable study environment factors differed widely across Nordic countries and Danish university faculties. With regard to temporal variation—i.e., whether the factors vary within a faculty across different terms, as we examine in this article—the picture is different. Altogether, Table 3, Table 4 and Table 5 make it clear that the study environment factors are quite stable across terms. Thus, while the previous analyses, focused on different country and faculty contexts, confirmed Scheerens’ suggestion that pedagogical factors are malleable and vary across contexts [11], we cannot, on the basis of the analyses in this article, confirm that this is also the case across time.

5.2. Factors Determining Dropout

Table 6 presents the binary logistic regression models relating dropout to dropout considerations, background parameters, and the study environment factors identified in the factor analyses.
Table 6. Binary regression models for each term, model fit data, and odds ratios for individual predictors of dropout (p < 0.05) with a 95% confidence interval.
Model Inputs | Model Fit Data | Independent Variables
Omnibus Tests (<0.05) | Hosmer and Lemeshow Test (>0.05) | Nagelkerke R² | Significant Factors | Odds Ratio (CI: 95%) | Sig. (<0.05)
1st term | Dropout considerations | <0.001 | 0.713 | 9.1% | Dropout considerations | 0.523 | <0.001
Background parameters | 0.834 | 0.970 | 13.1% | Campus is in the same municipality as respondents’ hometown (1st term) | 0.372 | 0.004
Study environment factors | 0.003 | 0.441 | 15.8% | Academic integration and identification (A) | 0.407 | 0.002
Introductory courses (reading) (T) | 1.913 | 0.007
2nd term | Dropout considerations | <0.001 | 0.800 | 8.8% | Campus is in the same municipality as respondents’ hometown (1st term) | 0.583 | <0.001
Background parameters | 0.328 | 0.214 | 8.2%
Study environment factors | 0.005 | 0.491 | 15.8% | Experienced social relations with fellow students (S) | 0.406 | 0.000
Expected social relations with fellow students (S) | 1.892 | 0.016
Experienced participation in activities (S) | 1.564 | 0.048
Participation (T) | 0.629 | 0.046
3rd term | Dropout considerations | 0.462 | 0.567 | 0.9%
Background parameters | <0.001 | 0.588 | 21.4% | Identified gender = Male | 0.488 | 0.008
Number of gap years | 0.785 | 0.017
Study environment factors | 0.084 | 0.226 | 54.2% | Participation (T) | 0.022 | 0.030
4th term | Dropout considerations | 0.762 | 0.442 | 0.0%
Background parameters | 0.987 | 0.596 | 7.9% | Upper secondary program = The Higher General Examination Program (stx) | 2.672 | 0.027
Study environment factors | <0.001 | 0.633 | 36.9% | Support from faculty (A) | 1.744 | 0.015
Feedback (satisfaction) (T) | 0.371 | 0.001
Active learning (T) | 1.888 | 0.005
Students’ dropout considerations were significant (Omnibus tests, sig. < 0.05) in the first and second terms and explained 9.1% and 8.8% of dropouts, respectively. Dropout considerations are, thus, not a particularly good indicator of actual dropout.
Students’ background parameters explained between 7.9 and 21.4% of dropouts; however, this was only significant (Omnibus tests, sig. < 0.05) in the third term (explaining 21.4%). Here, the significant background factors (odds ratio, sig. < 0.05) preventing dropout were (1) being male and (2) having several gap years before entering tertiary education.
Looking at the study environment factors, the regression models were significant (Omnibus tests, sig. < 0.05) in three out of four terms and included variables from all three systems: the academic system (A), the social system (S), and teaching (T). In the first term, study environment factors explained 15.8% of dropouts and were, thus, better predictors than dropout considerations and background parameters. In this term, the model comprised the factor “academic integration and identification” from the academic system and the factor “introductory courses (reading)” from the teaching system. In the second term, factors from the social system dominated the model. Three factors from this system—“experienced social relations with fellow students”, “expected social relations with fellow students”, and “experienced participation in activities”—and one factor (“participation”) from the teaching system explained 15.8% of dropouts. In the third term, when the background parameters explained 21.4% of dropouts, the study environment was not a significant predictor of dropouts (Omnibus test, sig. > 0.05). The share of dropouts explained by the study environment factors increased markedly in the fourth term, where one factor from the academic system (“support from faculty”) and two factors from the teaching system (“feedback (satisfaction)” and “active learning”) explained 36.9% of dropouts.
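The model-fit and effect measures in Table 6 (Nagelkerke R² and odds ratios) both derive from a fitted binary logistic regression. The following is a minimal sketch of how they are computed, using hypothetical data (one simulated factor score and dropout outcomes, not the article's register data); an odds ratio below 1 indicates a protective factor, as for "academic integration and identification" in the first-term model:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def nagelkerke_r2(y: np.ndarray, p_model: np.ndarray) -> float:
    """Nagelkerke's R-squared from fitted event probabilities.

    R2_CS = 1 - exp((2/n) * (LL_null - LL_model))
    R2_N  = R2_CS / (1 - exp((2/n) * LL_null))
    """
    n = len(y)
    p_null = np.full(n, y.mean())  # intercept-only model
    ll_null = np.sum(y * np.log(p_null) + (1 - y) * np.log(1 - p_null))
    ll_model = np.sum(y * np.log(p_model) + (1 - y) * np.log(1 - p_model))
    r2_cs = 1 - np.exp((2 / n) * (ll_null - ll_model))
    return r2_cs / (1 - np.exp((2 / n) * ll_null))

# Hypothetical data: one study-environment factor score vs. dropout (0/1).
rng = np.random.default_rng(2)
factor_score = rng.normal(size=500)
p_true = 1 / (1 + np.exp(-(-1.5 - 0.9 * factor_score)))  # protective factor
dropout = rng.binomial(1, p_true)

model = LogisticRegression().fit(factor_score.reshape(-1, 1), dropout)
odds_ratio = np.exp(model.coef_[0][0])  # OR < 1: factor protects against dropout
r2 = nagelkerke_r2(dropout, model.predict_proba(factor_score.reshape(-1, 1))[:, 1])
print(round(odds_ratio, 2), round(r2, 3))
```

The odds ratio is the exponentiated regression coefficient, and Nagelkerke's R² rescales the Cox–Snell statistic so that its maximum is 1, which is why it can be read as the share of dropout "explained" by the model inputs.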

6. Discussion

In this article, we first set out to investigate to what extent study environment factors were malleable across different terms. In our investigation, we considered four factor structures, identified in the exploratory factor analyses. We found that only a few of the empirical structures matched the structure anticipated by the theoretical categories, but that the empirically identified factors were relatively stable across terms. In the Introduction, we formulated the hypothesis that the study environment factors would be malleable across different terms. Our analyses showed that the study environment factors were malleable across contexts, but not across terms; thus, this hypothesis was not confirmed. Although the idea of pedagogical factors’ malleability is well described in epistemological theories on education and teaching [15], the empirical contributions on this topic are very scarce. In these epistemological theories, it is suggested that the malleability is due to a variety of empirical conditions, including both traditions and understandings of schooling sedimented in different school systems [30] and students’ conceptions or beliefs [31]. We could not conclude anything about the exact origin of the malleability of the study environment factors in our study. We had only one country and one faculty represented in this study, but, based on our previous studies on malleability across Nordic countries and university faculties (cf. Introduction), there is reason to assume that the factor structures would look different in other contexts. Meanwhile, students have been predicted to play an important role in the formation of study environment factors [32,33,34,35]. We examined the same students over time, despite large variations in the response rates, and the stability of the empirically identified factors across the terms could indicate that what were recognized as study environment factors remained fairly stable within the same group of students (here, Danish humanities students).
In addition to investigating the malleability of the study environment factors, the article set out to investigate whether study environment factors, students’ background parameters, or their dropout considerations best explain dropout from tertiary humanities education during different terms, and which study environment factors explained students’ dropout over different terms. Our study confirmed our hypothesis that study environment factors, students’ background parameters, and dropout considerations all explain dropout. It is, however, interesting that the study environment factors were the best explanations for dropout in the first, second, and fourth terms. Thus, across the terms, the study environment factors were the best predictors of dropout. However, the factors that explained dropout varied considerably over time. In the first term, “academic integration and identification” and “introductory courses (reading)” explained dropout, while, in the second term, these were “experienced social relations with fellow students”, “expected social relations with fellow students”, “experienced participation in activities”, and “participation”. In the fourth term, the factors were “support from faculty”, “feedback (satisfaction)”, and “active learning”. This indicated a maturation as the students progressed through their studies: looking at significant study environment factors, there was an evolution from study skills to participation in classes and active learning. This maturation in educational contexts is essential as a criterion for talking about what educational quality is. It has previously been discussed whether students’ perceptions of educational quality might be biased [36], but, based on an empirical review, Follman [37] concluded that students’ perceptions of the study environment are valid indicators of quality.
Thus, it seems reasonable to think of the total number of factors as a framework for study environment quality and take their variations over time into account when working on strengthening the quality of the study environment.
The discussion of the study environment’s quality leads us to the article’s fourth and final research question: namely, how student persistence can be supported. As described in the Introduction, the lack of opportunity to take malleability into account in effect studies has been described as a major reason for the lack of progression in studies of teaching quality. This article’s focus on the study environment is justified by a desire to focus on the so-called “changeable” [9] factors, such as intervention programs, policies, and practices hypothesized (or believed) to enhance educational performance [11]; these can be intervened on and are, therefore, the strongest basis for strengthening the quality of education [10] and, thus, hopefully retaining students in humanities programs. When it comes to the question of what can be implemented to strengthen the quality of the study environment at different times during the students’ educational trajectory, it is interesting that the regression models include variables from all three systems—the academic system, the social system, and teaching—changing so that the academic and teaching factors dominate the first-term model, the social factors dominate the second-term model, and the teaching factors dominate the fourth-term model. In the first semester, it is crucial that the students feel academically integrated and able to identify with the academic characteristics of their education, while, in the second semester, social aspects are important. Perhaps students first must find out whether they have chosen the right education academically, and only then do they have the capacity to focus and reflect on social relations and activities. In the third term, background parameters (gap years before entering tertiary education and being male) were the only factors identified that explained dropout.
This might be because the third semester is a difficult, transitional one: the first phase, when everything is new and exciting, is over, and the end phase, in which students are close to finishing their journey, has not yet begun. Such a transition phase requires persistence and resilience, and it is crucially important for students to believe in themselves when dealing with this phase. It is also well documented that male students have higher degrees of academic self-efficacy than female students [38] and that academic self-efficacy varies with age [39]. In our sample, 28.7% of respondents identified as male students and 71.4% as female students (see Appendix E). This is interesting because Ryder et al. note that, when girls belong to the minority gender, they often try to fit in and “be like the guys” in male-dominated tertiary study programs, whereas, when boys are the minority gender in a female-dominated program, they will seek to stand out from the girls [40], which might explain why the boys in our study were more cushioned against dropping out. In the fourth term, students are expected to have integrated into their study programs by having assimilated and adapted to their new environment [41]. They focus on completing and succeeding in their education, which might be why teaching aspects such as support from faculty and feedback are the focus. We suggest that students who seek a lot of support from faculty might not be as integrated as students who do not seek this kind of support, or, at least, students who receive support from faculty during the fourth term are particularly likely to quit their program.
The results presented above cannot necessarily be transferred to other contexts. In Denmark, about 25% of each youth cohort attends university. Programs are tax-financed and free of charge, with students receiving financial support from the government. Despite these benefits, 19% of university students (with large program variations) drop out within their first year. Students must, in no more than three attempts, pass specific exams explicitly selected for their programs before the end of the first year. We hypothesize that these financial and regulatory factors might have influenced the importance of certain study environment factors over others in our particular context. Despite the shortcomings of our findings in terms of transferability, it is an important point across contexts that what constitutes relevant quality assurance initiatives varies over time. In order to launch targeted quality assurance initiatives, we encourage colleagues to replicate our study across different faculties and universities. The time dependency of retention initiatives can theoretically be reflected using the work of Tinto, who, in 2016, described his understanding of dropout as “a journey that occurs over time in which students are becoming capable and educated upon graduation. As such we need to be sensitive to the impact of time on the way we address the issue of student retention” [42]. A straightforward explanation of the time dependence might be that the objectives and assignments in the final phase of a study (clearly exemplified by internships and theses, among others) require more academic skills and executive functions. This explains many of our findings on support and the differences between male and female students. Less straightforward explanations can perhaps be found in van Gennep’s theory on transition rituals in tribal societies [41], in which Tinto found a reference for the procedural understanding of dropouts.
van Gennep’s theory proposes that transitions take place through three phases—separation, transition, and incorporation [43]. Tinto, therefore, describes the relation between the procedural understanding of dropout and an institution in the following way: “Decisions to withdraw are more a function of what occurs after entry than what precedes it” [43]. He, thus, suggests that “The problem of becoming a new member of a community that concerned Van Gennep is conceptually similar to that of becoming a student in a college. It follows that we may also conceive of the process of institutional persistence as conceived of three major stages or passages—separation, transition, and incorporation—through which students typically must pass in order to complete their degree programs” [43]. As students proceed through their course of study, we can see how the parameters associated with retention change. During the first term, when students enter the liminal phase, academic integration and identification play a protective role. In the second term, social aspects are more crucial for students’ persistence. Turner (1969) describes how students in the liminal phase might experience a sense of disorientation, uncertainty, and/or anticipation and excitement, and how they tend to bond socially through these feelings and their liminal experience [43]. He describes this societal mode in the liminal phase as unstructured and relatively undifferentiated (unlike non-liminal societies’ structural, hierarchical modes), which makes it easy for one to bond with co-liminal individuals. We note that students who experienced social relationships with fellow students were more likely to persist, whereas students who expected social relationships with fellow students were more likely to drop out in the second term.
This article’s contributions can be summarized under different categories. First, this article contributes a number of empirical findings regarding different study environment factors and their malleability, the extent to which study environment factors explain dropout, and the possibility of strengthening study quality in order to support students’ persistence. These findings contribute knowledge that can support early-alert interventions. Studies have shown that identifying at-risk students early on in their courses and intervening accordingly has a positive impact on student success [44,45], but studies have also shown that addressing students who are at risk can be counterproductive and, therefore, ethically indefensible. The approach suggested in this paper allows us to shift the focus from students at risk to the study environment and link early-alert interventions to the overall study environment’s quality development.
In addition to the empirical findings, this article provides a number of theoretical contributions. The differentiation between Tinto’s three systems—the academic system, the social system, and teaching—and, thus, the revision of his model are significant contributions to future studies in contexts other than the one examined in this article. Similarly, the framework for educational quality in which the results can be summarized is another significant contribution.
Finally—and, probably, most importantly—this article makes significant methodological contributions through the development of an approach and the validation of an instrument to investigate study environment factors. What distinguishes our approach from previous attempts is that, inspired by the works of Cheung and Slavin and of Scheerens [11,19], we took the malleability of the study environment factors into account. There is reason to assume that this was decisive for the study’s success, and we will investigate this assumption in later works. Regarding the context of this article, we will continue to work on strengthening the categories that we did not succeed in surveying, and we hereby invite colleagues in the field to do the same in different contexts.
This article’s contributions are relevant in and applicable to educational and research contexts as well as political contexts, where they can be used as a basis for guidelines related to educational quality in higher education. It is, however, important to take into account the limitations of this study due to its limited context and not yet fully developed instrument. More studies in other contexts are needed to strengthen the method and instrument presented in this paper, and we hereby invite colleagues to collaborate on this.

Author Contributions

The authors have worked together on conceptualization, methodology, validation, formal analysis, investigation, data curation, writing and editing, and project administration. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by the Institutional Review Board of University of Southern Denmark (protocol number 17/57237) on 6 October 2017.

Informed Consent Statement

Informed consent was obtained from all the subjects involved in this study.

Data Availability Statement

Data can be obtained by contacting the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A

Table A1. Items addressing categories in the social system.
CategoriesStatement
Social integration1I feel lonely in my study program.
2I have a good relationship with my fellow students.
3 I feel like part of a community in my study program.
4 I continue to study in this study program because we have a good sense of community in my class.
5 The level of social activity is high.
6My relationship with my fellow students is close.
7I participate in academic activities.
8I participate in social activities organized by either the university or my class.
9I expected the level of social activity to be high.
10I expected the relationship with my fellow students to be close.
11I expected the relationship with my lecturers to be close.
12My relationship with my lecturers is close.
Extracurricular activities13Participation in social study organizations.
14Participation in academic communities.
15Carry out academic work (paid or unpaid) as a mentor/study advisor, teaching assistant, or student employee.
16Go on an exchange as a part of the study program.
17Complete an internship as a part of the study program.
18Have a student job within a field where graduates with the same education often find employment.
Institutional integrity19The university lives up to my expectations.
20My study program lives up to my expectations.
21The classes live up to my perception of the values of the university.
22The classes live up to my perception of the values of my study program.
23My personal values are well aligned with my perception of the values of the university.
24My personal values are well aligned with my perception of the values of my study program.
Social infrastructure25The other students participate in academic activities.
26The other students participate in social activities organized by either the university or my class.
27There is a satisfactory selection of social activities at the university.
28It has been easy to interact with people from different academic years.
Social aspects of the induction program Not relevant across terms
Demographic composition of the student body (age and sex) Addressed through register data
Demographic composition of the student body (ethnicity) Addressed through register data
Size of the institution Not relevant for a one-institution study

Appendix B

Table A2. Items addressing categories in the academic system.

Academic integration and identification
29. I like being a student.
30. My study is an essential part of my life.
31. My study is a significant part of who I am.
32. It is important to me to learn as much as possible in my courses.
33. Experiences of interesting academic content of the study program.

Perception of learning community
34. I seek help from my fellow students if I experience challenges in my studies.
35. I share notes/literature/ideas for assignments with my fellow students.
36. My fellow students share notes/literature/ideas for assignments with me.
37. Our lecturers have introduced us to ways to work well in a group, e.g., how you align your expectations, handle potential conflict, etc.

Relation to, interaction with, and support from faculty
38. I have worked with a lecturer or another employee at the university, for example, on an article, as a student assistant, on a board, etc.
39. I have discussed study-related topics, ideas, or academic terms with a lecturer or another employee at the university outside of class.
40. I have discussed my academic abilities with a lecturer or another employee at the university outside of class.
41. At my campus, you often see the lecturers outside of the classroom (for instance, in hallways and common areas).
42. Most lecturers are easy to get in touch with.
43. The lecturers that I have had contact with generally seem interested in the students.
44. I have reached out to a study counselor, a study secretary, a mentor, the legal department, or another employee at the university to discuss something concerning my studies.

Workload
45. Experiences of workload of the study program.
46. How many hours a week do you spend completing exercises that the lecturer has asked you to solve before class? (e.g., work or questions for texts, small problems, blogs, wikis, etc.)
47. How many hours a week do you spend preparing for class and/or post-processing lectures? (alone or in groups)

GPA and learning: addressed through register data.

Prior exam results, experiences, and disappointments
48. If I experience disappointment in exams, I am good at shaking it off.
49. My exams have gone as I expected.
50. I have felt disappointment or a sense of failure in exams.

Academic aspects of the induction program: not relevant across terms.
Support and guidance: see items under the initial factor "relation to, interaction with, and support from faculty".

Appendix C

Table A3. Items addressing categories in the teaching system.

Teaching quality
51. Experiences of good teaching methods of the study program.
52. All in all, I am satisfied with the quality of teaching in my study program.
53. There is a big difference in the quality of teaching I receive.

Study groups
54. I am part of a study group or have a study buddy that I meet up with regularly throughout the semester.
55. I am part of a study group or have a study buddy that I meet up with in the exam periods.

Alignment in the teaching
56. I am aware of what I am supposed to learn in the different courses.
57. I see the connection between the things we accomplish in class and what I am supposed to learn.
58. I see the connection between the things we are asked to complete outside of class and what I am supposed to learn.
59. My lecturers are clear in their explanation of what is expected in exams.
60. The continuity in my study program as a whole is clear to me.
61. The academic progression in my study program as a whole is clear to me.
62. I see the connection between the teaching methods and activities in my studies, and what we are being assessed on in the exams.
63. My lecturers support my work between classes by assigning reading materials.
64. My lecturers support my work between classes by recommending further reading materials.
65. My lecturers support my work between classes by providing questions for reading materials.
66. My lecturers support my work between classes by giving obligatory assignments (online, individual, or for study groups).
67. My lecturers do not ask us to complete work between classes.
68. There is a big difference between my lecturers regarding how they support my work between classes.

Instructional clarity
69. My lecturers are good at explaining the material so that I understand it.
70. The lecturers' instructions for group work and homework assignments are clear.
71. My lecturers often use examples to explain the material.
72. When the lecturers go through new material in class, they often connect it to materials we have learned in other courses.
73. There is a big difference between my lecturers in regard to how clear they are and how often they use examples and/or connect new material to things we have learned in other courses.

Feedback
74. How often do you receive written feedback from your lecturer (for instance, on assignments)?
75. How often do you receive oral feedback in class (from both lecturers and fellow students)?
76. How often do you receive oral feedback in the form of an individual conversation with a lecturer?
77. How often do you engage in group conversations where you receive feedback from your lecturer?
78. How often do you engage in group conversations where you receive feedback from your fellow students (peer feedback)?
79. The feedback I receive on my assignments makes it clear what I have not quite understood.
80. The feedback I receive on my academic work during the semester improves the way in which I learn and work.
81. I receive sufficient feedback on my academic work during the semester.
82. There are good opportunities to receive feedback on my performance in exams.
83. There is a big difference between my lecturers in regard to how much feedback I receive in their courses.

Active learning
84. I have thought about questions I would like answered when I arrive to class.
85. I always come to class prepared.
86. I try to academically defend my thoughts and ideas in class, with my study group, or with other students/friends/family.
87. I participate actively in class.
88. After class, I reflect on the content we have worked with.
89. After a class, I revisit the readings or my notes on them.

Higher-order thinking
90. My current study program has provided me with the abilities to write academically, clearly, and precisely.
91. My current study program has provided me with the abilities to converse academically, clearly, and precisely.
92. My current study program has provided me with the ability to think critically.
93. My current study program has provided me with the ability to think analytically.
94. My current study program has provided me with the abilities needed to be able to analyze numeric and statistical information.
95. My current study program has provided me with the ability to collaborate.

Cooperative learning
96. My lecturers often use quizzes, clickers, etc., in class.
97. My lecturers often use mind maps, brainstorming, etc., in class.
98. We regularly work on developing our knowledge and ideas through wikis, discussion boards, etc.

Courses on study technique and introductory courses
99. I have been taught how to point out the main arguments of a text.
100. I have been taught how to deduce the implicit assumptions in the texts I read.
101. I have been taught how to critically evaluate the arguments and conclusions of a text.
102. I lack knowledge about text reading.
103. I have been taught study techniques/study planning as part of my study.
104. I have sought out courses on study techniques/study planning myself.
105. I lack knowledge about study techniques/study planning.

Student research programs
106. We/I have carried out research-like activities as part of our/my class work.
107. We/I have carried out research together with a researcher at the university.

Perception of difficulty
108. Experiences of the academic level of the study program.
109. It is my experience that I meet the academic expectations in my study program.

Coherence
110. I find it easy to connect what I am learning to what I already know.

Participation in class
111. My lecturers facilitate discussions/debates either with the entire class or in smaller groups.
112. My lecturers involve the students in the teaching through student presentations.
113. My lecturers make us work with the academic content through case- and problem-based teaching.
114. There is a big difference between my lecturers in regard to how active a role they give the students in class.

Appendix D

Table A4. Items addressing dropout considerations.

Dropout consideration
115. During the last six months, I have doubted whether I should continue in the study program.
116. During the last six months, I have considered dropping out.
117. During the last six months, I have talked with friends and/or family about dropping out.

Appendix E

Table A5. Student characteristics.

Gender: male 798 (28.7%); female 1985 (71.4%).
Age when matriculating: 20 or younger 550 (19.8%); 21–25 1601 (57.6%); 26–30 378 (13.6%); 31 or older 233 (8.4%).

Additional descriptive data can be retrieved by contacting the authors.

References

  1. Scheerens, J.; Marks, G.N. Malleability in Educational Effectiveness: What Are Realistic Expectations about Effect Sizes? Introduction to the Special Issue; Taylor & Francis: Abingdon, UK, 2017; Volume 23, pp. 143–147. [Google Scholar]
  2. Felby, L.C.; Kristiansen, B. Førsteårsdidaktik—Hvad og Hvordan? Center for Undervisningsudvikling og digitale Medier, Aarhus Universitet: Aarhus, Denmark, 2020. [Google Scholar]
  3. Tinto, V. Research and practice of student retention: What next? J. Coll. Stud. Retent. Res. Theory Pract. 2006, 8, 1–19. [Google Scholar] [CrossRef]
  4. Tinto, V. Enhancing student success: Taking the classroom success seriously. Int. J. First Year High. Educ. 2012, 3, 1. [Google Scholar] [CrossRef]
  5. Aljohani, O. A Comprehensive Review of the Major Studies and Theoretical Models of Student Retention in Higher Education. High. Educ. Stud. 2016, 6, 1–18. [Google Scholar] [CrossRef]
  6. Gore, P.; Metz, A.J. First Year Experience Programs, Promoting Successful Student Transition. In Encyclopedia of International Higher Education Systems and Institutions; Shin, J.C., Teixeira, P., Eds.; Springer: Dordrecht, The Netherlands, 2017; pp. 1–6. [Google Scholar] [CrossRef]
  7. Braxton, J.; Hirschy, A.S. Reconceptualizing antecedents of social integration in student departure. In Retention and Student Success in Higher Education; Yorke, M., Longden, B., Eds.; McGraw-Hill Education: London, UK, 2004; pp. 89–103. [Google Scholar]
  8. Qvortrup, A.; Lykkegaard, E. Study environment factors associated with retention in higher education. High. Educ. Pedagog. 2022, 7, 37–64. [Google Scholar] [CrossRef]
  9. Yik, B.J.; Raker, J.R.; Apkarian, N.; Stains, M.; Henderson, C.; Dancy, M.H.; Johnson, E. Evaluating the impact of malleable factors on percent time lecturing in gateway chemistry, mathematics, and physics courses. Int. J. STEM Educ. 2022, 9, 15. [Google Scholar] [CrossRef]
  10. Hanushek, E.A. The economic value of higher teacher quality. Econ. Educ. Rev. 2011, 30, 466–479. [Google Scholar] [CrossRef]
  11. Scheerens, J. The perspective of “limited malleability” in educational effectiveness: Treatment effects in schooling. Educ. Res. Eval. 2017, 23, 247–266. [Google Scholar] [CrossRef]
  12. Bell, A.; Ward, P.; Tamal, M.E.H.; Killilea, M. Assessing recall bias and measurement error in high-frequency social data collection for human-environment research. Popul. Environ. 2019, 40, 325–345. [Google Scholar] [CrossRef]
  13. Eicher, V.; Staerklé, C.; Clémence, A. I want to quit education: A longitudinal study of stress and optimism as predictors of school dropout intention. J. Adolesc. 2014, 37, 1021–1030. [Google Scholar] [CrossRef]
  14. Truta, C.; Parv, L.; Topala, I. Academic engagement and intention to drop out: Levers for sustainability in higher education. Sustainability 2018, 10, 4637. [Google Scholar] [CrossRef]
  15. Qvortrup, A.; Lykkegaard, E. Malleable Factors in Teaching: Why and How to Address Them from a Constructivist Perspective. Scand. J. Educ. Res. 2024, 68, 355–370. [Google Scholar] [CrossRef]
  16. Rivkin, S.G.; Hanushek, E.A.; Kain, J.F. Teachers, schools, and academic achievement. Econometrica 2005, 73, 417–458. [Google Scholar] [CrossRef]
  17. Rockoff, J.E.; Jacob, B.A.; Kane, T.J.; Staiger, D.O. Can you recognize an effective teacher when you recruit one? Educ. Financ. Policy 2011, 6, 43–74. [Google Scholar] [CrossRef]
  18. Muijs, D.; Brookman, A. Quantitative methods. In The Routledge International Handbook of Educational Effectiveness and Improvement; Chapman, D., Muijs, D., Reynolds, D., Sammons, P., Teddlie, C., Eds.; Routledge: London, UK, 2016; pp. 173–201. [Google Scholar]
  19. Cheung, A.C.; Slavin, R.E. How methodological features affect effect sizes in education. Educ. Res. 2016, 45, 283–292. [Google Scholar] [CrossRef]
  20. Slavin, R.; Madden, N.A. Measures inherent to treatments in program effectiveness reviews. J. Res. Educ. Eff. 2011, 4, 370–380. [Google Scholar] [CrossRef]
  21. Scammacca, N.; Roberts, G.; Vaughn, S.; Edmonds, M.; Wexler, J.; Reutebuch, C.K.; Torgesen, J.K. Interventions for Adolescent Struggling Readers: A Meta-Analysis with Implications for Practice; Center on Instruction: Portsmouth, NH, USA, 2007. [Google Scholar]
  22. de Boer, H.; Donker, A.S.; van der Werf, M.P. Effects of the attributes of educational interventions on students’ academic performance: A meta-analysis. Rev. Educ. Res. 2014, 84, 509–545. [Google Scholar] [CrossRef]
  23. Edmonds, M.S.; Vaughn, S.; Wexler, J.; Reutebuch, C.; Cable, A.; Tackett, K.K.; Schnakenberg, J.W. A synthesis of reading interventions and effects on reading comprehension outcomes for older struggling readers. Rev. Educ. Res. 2009, 79, 262–300. [Google Scholar] [CrossRef]
  24. Lykkegaard, E.; Qvortrup, A. The malleability of students’ perception of key teaching factors within and across Nordic contexts: Validating the 7Cs framework. Educ. North 2024, 31, 79–105. [Google Scholar]
  25. Lykkegaard, E.; Qvortrup, A. Longitudinal mixed-methods analysis of tertiary students’ dropout consideration. Front. Educ. 2024, 9, 1399714. [Google Scholar] [CrossRef]
  26. Astin, A.W. Student involvement: A developmental theory for higher education. J. Coll. Stud. Pers. 1984, 25, 297–308. [Google Scholar]
  27. Tinto, V. Classrooms as communities: Exploring the educational character of student persistence. J. High. Educ. 1997, 68, 599–623. [Google Scholar] [CrossRef]
  28. Field, A. Discovering Statistics Using IBM SPSS Statistics; Sage: London, UK, 2013. [Google Scholar]
  29. Ursachi, G.; Horodnic, I.A.; Zait, A. How reliable are measurement scales? External factors with indirect influence on reliability estimators. Procedia Econ. Financ. 2015, 20, 679–686. [Google Scholar] [CrossRef]
  30. Hopmann, S. ‘Didaktik meets Curriculum’ revisited: Historical encounters, systematic experience, empirical limits. Nord. J. Stud. Educ. Policy 2015, 2015, 27007. [Google Scholar] [CrossRef]
  31. Olteanu, C.; Olteanu, L. Enhancing mathematics communication using critical aspects and dimensions of variation. Int. J. Math. Educ. Sci. Technol. 2013, 44, 513–522. [Google Scholar] [CrossRef]
  32. Priestley, M.; Biesta, G.; Philippou, S.; Robinson, S. The teacher and the curriculum: Exploring teacher agency. In The SAGE Handbook of Curriculum, Pedagogy and Assessment; Wyse, D., Hayward, L., Pandya, J., Eds.; Sage: London, UK, 2015; pp. 187–201. [Google Scholar]
  33. Carlgren, I.; Klette, K. Reconstructions of Nordic Teachers: Reform policies and teachers’ work during the 1990s. Scand. J. Educ. Res. 2008, 52, 117–133. [Google Scholar] [CrossRef]
  34. Liljenberg, M. Teacher leadership modes and practices in a Swedish context—A case study. Sch. Leadersh. Manag. 2016, 36, 21–40. [Google Scholar] [CrossRef]
  35. Pietarinen, J.; Pyhältö, K.; Soini, T. Large-scale curriculum reform in Finland – exploring the interrelation between implementation strategy, the function of the reform, and curriculum coherence. Curric. J. 2017, 28, 22–40. [Google Scholar] [CrossRef]
  36. Goe, L.; Bell, C.; Little, O. Approaches to Evaluating Teacher Effectiveness: A Research Synthesis; National Comprehensive Center for Teacher Quality: Washington, DC, USA, 2008. [Google Scholar]
  37. Follman, J. Secondary school students’ ratings of teacher effectiveness. High Sch. J. 1992, 75, 168–178. [Google Scholar]
  38. Robinson, K.A.; Perez, T.; White-Levatich, A.; Linnenbrink-Garcia, L. Gender differences and roles of two science self-efficacy beliefs in predicting post-college outcomes. J. Exp. Educ. 2022, 90, 344–363. [Google Scholar] [CrossRef]
  39. Huang, C. Gender differences in academic self-efficacy: A meta-analysis. Eur. J. Psychol. Educ. 2013, 28, 1–35. [Google Scholar] [CrossRef]
  40. Ryder, J.; Ulriksen, L.; Bøe, M.V. Understanding Student Participation and Choice in Science and Technology Education: The Contribution of IRIS. In Understanding Student Participation and Choice in Science and Technology Education; Springer: Berlin/Heidelberg, Germany, 2015; pp. 351–366. [Google Scholar]
  41. van Gennep, A. The Rites of Passage (MB Vizedom & GL Caffee, Trans.); Routledge and Kegan Paul: London, UK, 1960. [Google Scholar]
  42. Tinto, V. Student success does not arise by chance. In Ten Times the First Year; Bonne, P., Nutt, D., Eds.; LannooCampus: Leuven, Belgium, 2016. [Google Scholar]
  43. Tinto, V. Leaving College: Rethinking the Causes and Cures of Student Attrition; ERIC: New York, NY, USA, 1987. [Google Scholar]
  44. Hudson-Holness, D.; Minchala, M.; Le, U. Improving Student Success in Introductory Chemistry Using Early Alert and Intervention. Int. J. Res. Educ. Sci. 2022, 8, 752–764. [Google Scholar] [CrossRef]
  45. Sage, A.J.; Cervato, C.; Genschel, U.; Ogilvie, C.A. Combining academics and social engagement: A major-specific early alert method to counter student attrition in science, technology, engineering, and mathematics. J. Coll. Stud. Retent. Res. Theory Pract. 2021, 22, 611–626. [Google Scholar] [CrossRef]
Figure 1. Modified institutional departure model [27].
Table 1. Categories belonging to the academic system, the social system, and teaching [8].

Social system: social integration; extracurricular activities; institutional integrity; social infrastructure; social aspects of the induction program; demographic composition of the student body (age and sex); demographic composition of the student body (ethnicity); size of the institution.

Academic system: academic integration and identification; perception of learning community; relation to, interaction with, and support from faculty; workload; GPA and learning; prior results and experiences of and disappointment in exams; academic aspects of the induction program; support and guidance.

Teaching: teaching quality; study groups; alignment in the teaching; instructional clarity; feedback; active learning (engaging teaching with discussions and group work); higher-order thinking; cooperative learning; courses on study technique and introductory courses; student research programs; perception of difficulty; coherence between courses in the study program; participation in class.
Table 2. Response rates and dropout rates for students matriculating in 2017–2019. The response rates in brackets exclude students who matriculated in 2017 (these surveys served as pilot tests).

Term     | Students matriculated in 2017–2019 | Survey responses per term | Dropout per term
1st term | 2781                               | 2532 [2014] (91%)         | 27 (1%)
2nd term | 2592                               | 1300 [1060] (50%)         | 189 (7%)
3rd term | 2458                               | 937 (38%)                 | 134 (5%)
4th term | 2382                               | 542 (23%)                 | 76 (3%)
5th term | 2325                               | 320 (14%)                 | 57 (2%)
6th term | 2287                               | 185 (8%)                  | 38 (2%)
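The per-term percentages in Table 2 follow directly from the raw counts (e.g., 2532 responses out of 2781 first-term students is roughly 91%). The short Python sketch below recomputes them as a sanity check; rounding to the nearest whole percent is our assumption about the authors' reporting convention, not something the article states.

```python
# Recompute the response and dropout rates in Table 2 from the reported counts.
# Tuples: (term, students matriculated, survey responses, dropouts).
rows = [
    (1, 2781, 2532, 27),
    (2, 2592, 1300, 189),
    (3, 2458, 937, 134),
    (4, 2382, 542, 76),
    (5, 2325, 320, 57),
    (6, 2287, 185, 38),
]

for term, students, responses, dropouts in rows:
    response_rate = round(100 * responses / students)  # e.g., 91 for term 1
    dropout_rate = round(100 * dropouts / students)    # e.g., 1 for term 1
    print(f"Term {term}: response {response_rate}%, dropout {dropout_rate}%")
```

Running this reproduces every percentage in the table, which suggests the rates were computed against the full matriculated cohort per term rather than against survey respondents only.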

Qvortrup, A.; Lykkegaard, E. The Malleability of Higher Education Study Environment Factors and Their Influence on Humanities Student Dropout—Validating an Instrument. Educ. Sci. 2024, 14, 904. https://doi.org/10.3390/educsci14080904