Article

Conceptualising Higher-Education Teacher Excellence for More Inclusive and Sustainable Evaluation: An Exploratory Sequential Mixed-Methods Study

by Nina Kristl 1,*, Gregor Sočan 2 and Danijela Makovec Radovan 1
1 Department of Educational Sciences, Faculty of Arts, University of Ljubljana, 1000 Ljubljana, Slovenia
2 Department of Psychology, Faculty of Arts, University of Ljubljana, 1000 Ljubljana, Slovenia
* Author to whom correspondence should be addressed.
Sustainability 2026, 18(10), 4858; https://doi.org/10.3390/su18104858
Submission received: 17 April 2026 / Revised: 7 May 2026 / Accepted: 10 May 2026 / Published: 13 May 2026

Abstract

Higher-education teacher excellence is widely regarded as a key dimension of educational quality, yet the ways it is commonly evaluated are often only weakly aligned with its multidimensional nature. A necessary step towards fair, inclusive, and sustainable quality assurance is therefore to specify what counts as excellence before attempting to measure it. In this context, the present study examined how university teachers conceptualise excellence and what these accounts imply for evaluation design. Using an exploratory sequential mixed-methods design, Study 1 elicited teachers’ accounts of excellence through a focus group with pedagogy and didactics experts (n = 7) and interviews with experienced university teachers from other disciplinary backgrounds (n = 5), yielding eight conceptual domains and a 95-item pool. In Study 2, a self-selected sample of 261 university teachers from a comprehensive university across disciplinary fields rated the importance of each item. Dimensionality analysis yielded an interpretable six-dimensional role-based structure of higher-education teacher excellence. The findings suggest that higher-education teacher excellence is unlikely to be adequately represented by student-only ratings or other narrow indicators. Rather, the results indicate the need for a multisource approach to evaluating higher-education teacher excellence, in which sources of evidence are aligned with the facets of excellence that different raters can legitimately observe, with implications for more inclusive and sustainable quality assurance in higher education.

1. Introduction

Universities cite teacher excellence as an indicator of educational quality, yet what constitutes excellence often remains conceptually unsettled and is operationalised inconsistently. The term appears in strategic plans, promotion criteria, quality assurance systems, and teaching awards [1,2,3,4], but the criteria it signifies are not always explicit or coherent. This is not merely a measurement issue. In the context of Sustainable Development Goal 4 (hereinafter SDG 4), where educational quality is defined in terms of equity, inclusion, and effective learning environments, it is also a quality assurance issue [5,6].
If universities are to evaluate teacher excellence in fair, inclusive, and sustainable ways, they must first define what constitutes teacher excellence before basing evaluations on specific indicators.
The connection between teacher excellence and sustainable quality assurance can be further clarified through the lens of Education for Sustainable Development (ESD). Within UNESCO’s ESD for 2030 framework [5], sustainability in education extends beyond curricular content and is linked to the transformation of learning environments and educators’ capacities. Recent work in higher education develops this further by highlighting the importance of teachers’ competences for student-centred sustainable learning and emphasising the need for relevant and valid indicators when embedding ESD in higher-education systems [7,8]. From this perspective, the way universities define and evaluate teacher excellence is part of how they sustain educational quality over time. At the institutional level, excellence is often translated into accountability requirements and demonstrable performance indicators [9,10]. At the level of educational practice, however, excellence is also experienced as an ethical, relational, and professional commitment to students’ learning and development, enacted through pedagogical decisions and professional judgement [11]. This tension between top-down representations of excellence and the lived reality of academic work creates a central evaluative challenge: what exactly should be assessed when higher-education institutions claim to evaluate teacher excellence, and how can such assessment be anchored in a defensible conceptual model?
One reason for this difficulty is that excellence is inherently normative. Unlike effectiveness, understood narrowly as the attainment of predefined outcomes [12], excellence involves value judgements about what is desirable in teaching and the academic role [13,14,15]. Readings [16] argued that “excellence” often functions as a flexible and seemingly self-evident signifier that can be mobilised across contexts without requiring shared substantive content. In higher education, the shift from discourses of quality to discourses of excellence has frequently been linked to marketisation and competitive positioning, whereby excellence becomes both a promise to students and a resource for institutional differentiation [17,18,19]. Yet if excellence is left conceptually vague, it risks becoming a catch-all label that legitimises evaluation practices without clarifying what is actually being valued [11,16].
The problem is intensified by the fact that different stakeholder groups may endorse excellence while emphasising different criteria. Student-facing perspectives, as reflected in research on student evaluations of teaching, often privilege clarity, organisation, fairness, and a supportive learning environment [20]. Professional and disciplinary perspectives tend to emphasise content knowledge, general pedagogical knowledge, and pedagogical content knowledge [21]. Institutional perspectives, meanwhile, often privilege compliance with formal frameworks and demonstrable performance within evaluative systems [9]. These perspectives are not mutually exclusive, but neither are they interchangeable. When conflated, the result may be an evaluation system in which claims about excellence are inferred from narrow and only partially relevant indicators.
If higher education is to contribute to inclusive, equitable, and high-quality learning, then the ways in which universities define and evaluate teacher excellence matter. Evaluation systems shape what is recognised and rewarded within academic work. They may encourage relationally responsive and inclusive teaching, or they may privilege only those aspects of practice that are easiest to quantify. They may support professional development, or they may reduce excellence to a limited set of administratively convenient proxies. From this perspective, fair and sustainable quality assurance requires conceptual clarity about what excellent higher-education teaching entails and which facets of that excellence are visible to which evaluators.

1.1. Conceptualising Higher-Education Teacher Excellence as a Role-Based, Multidimensional Construct

Higher-education teacher excellence is unlikely to be adequately captured by a single dimension and instead spans multiple, partially overlapping domains, including pedagogical competence, relational and ethical aspects of teaching, and the knowledge base and professional responsibilities associated with academic work [11,14,21]. Excellence is frequently associated with the capacity to make disciplinary knowledge accessible without oversimplifying it, to design learning environments that support student learning, and to cultivate students’ active participation [22,23]. Importantly, these principles are not limited to technique; they also involve pedagogical judgement about what counts as meaningful learning in a given disciplinary context [21,24].
Higher-education teaching is an interactional process in which students’ motivation, sense of belonging, and willingness to take intellectual risks are shaped by the learning environment [25,26,27]. Research on student engagement and belonging suggests that students’ sense of connection with teachers and peers, as well as experiences of respect and inclusion, are associated with persistence and learning outcomes [26,27]. In pedagogical terms, a student-centred orientation implies attunement to students’ needs and characteristics, responsiveness to diversity, and the deliberate cultivation of supportive and psychologically safe learning contexts in which students can participate, ask questions, and articulate their views [28,29]. In the context of SDG 4, such features are central to understanding educational quality as inclusive and equitable rather than merely efficient.
Accounts of professional excellence also emphasise that teaching is a moral and professional practice, guided by ethical commitments and enacted through professional conduct [14,30]. The reflective practice tradition [31,32] conceptualises excellence as the capacity to examine one’s own assumptions, actions, and emotional responses, and to learn from experience in a way that improves practice [33].
In higher education, excellence is also discussed as a matter of professional responsibility in how courses are organised, standards are enacted, and judgements are made, particularly regarding transparency, fairness, and integrity in assessment and in responding to challenging situations [34]. This situates excellence within a broader frame of professional ethos and accountability rather than treating it as a purely instructional matter [14]. Additionally, excellence may involve helping students engage with the ethical and societal implications of disciplinary knowledge and practice, thereby linking teaching to wider educational purposes beyond the immediate curriculum [13].
Furthermore, excellence in higher education increasingly involves competence in engaging with pedagogical and technological innovation. Digitalisation has expanded the range of teaching approaches and heightened expectations that teachers integrate educational technologies in meaningful ways [35,36,37]. In higher education specifically, however, the role extends beyond teaching practice alone. Higher-education teachers work in contexts where disciplinary expertise, research engagement, and broader academic or societal contributions are institutionally significant [38,39,40,41,42].
Higher-education teacher excellence is also closely connected to knowledge and research work. Subject-matter expertise and pedagogical content knowledge [21] provide a foundation for a teacher’s credibility and for the ability to respond flexibly to students’ questions and misunderstandings. Beyond foundational knowledge, excellence in higher education also relates to the teacher’s relationship with the discipline as a living field of inquiry. This includes engagement with developments in the scientific field and contribution to knowledge production, as well as the integration of research and teaching [38]. In higher-education contexts, students’ learning is explicitly framed as participation in disciplinary ways of thinking; accordingly, teachers’ engagement in scholarship shapes the intellectual ambition and authenticity of the learning experience [43].
Finally, broader engagement with the scientific and professional community (e.g., industry) and with the public has been conceptualised as part of the contemporary academic role [39,40,42]. The “third mission” [39] of universities emphasises knowledge transfer, collaboration with external partners, and public engagement with science [41]. While such activities are sometimes categorised as service or outreach in academic evaluation frameworks [44], they can also be understood as forms of engaged scholarship and activity that facilitate knowledge exchange and extend the societal relevance of academic work [39,41].
The present study therefore focuses on higher-education teacher excellence as a multidimensional construct that may include, but is not limited to, excellence in teaching practice. Conceptualising the construct at the level of the higher-education teacher role is especially relevant if universities seek evaluation practices that are educationally meaningful and aligned with broader commitments to inclusion, equity, and sustainable quality.

1.2. Limitations of Prevailing Evaluation Practices

Even with increasingly refined conceptualisations, measuring excellence in higher-education teaching remains challenging. Many institutions rely heavily on student evaluations of teaching (SETs), which can provide important information about students’ experiences but are limited as sole indicators of excellence [45,46]. Student ratings may reflect factors unrelated to teaching quality (e.g., course difficulty, grading practices, or student expectations) and are vulnerable to systematic biases [47,48,49,50]. Moreover, student evaluations typically capture students’ perceptions of the teaching encounter but do not fully represent broader domains of the higher-education teacher role (e.g., professional conduct, reflective practice, course management, and broader scholarly engagement). Other approaches, such as peer reviews of teaching, can broaden the evidence base [51], but they may lack standardisation and therefore offer limited interpretability when the criteria for excellence are not explicitly defined. As a result, there is a need for empirically grounded measurement approaches that clarify the dimensional structure of higher-education teacher excellence and provide a coherent item pool that can support both research and formative professional development.
An additional limitation of existing measures is that they are often developed top-down, reflecting policy frameworks or expert-derived competency models, rather than grounded in the language and meaning structures of higher-education teachers themselves [11]. Yet teacher sense-making matters: instruments that align with teachers’ own conceptions of excellence may have greater face validity and may be more useful for professional reflection, potentially narrowing the gap between accountability-driven definitions and practice-oriented understandings [52].

1.3. The Present Study

Against this background, the present study examined how university teachers conceptualise higher-education teacher excellence and what these accounts imply for evaluation design. The study was guided by the premise that a necessary step towards fairer, more inclusive, and more sustainable quality assurance is to specify what counts as excellence before attempting to measure it. To achieve this, we used an exploratory sequential mixed-methods design in which, through an initial qualitative phase, we developed a grounded conceptualisation of the construct; then, a quantitative phase was used to examine the structure of the resulting indicators [53]. This design was appropriate because the study first sought to elicit and organise teachers’ accounts of excellence and only then to examine the dimensionality of the resulting item pool in a larger sample. Accordingly, Study 1 used a focus group and individual interviews and translated the resulting qualitative category system into a structured item pool. In Study 2, we asked a larger sample of university teachers to rate the importance of the resulting items and examined their dimensional structure.
The study had two primary aims. First, it sought to develop a teacher-grounded conceptualisation of higher-education teacher excellence at the level of the academic role. Second, it aimed to examine whether the resulting indicators formed empirically distinguishable yet related dimensions. Accordingly, the study was guided by two research questions: (1) How do university teachers conceptualise higher-education teacher excellence? (2) What dimensional structure emerges from a teacher-grounded item pool representing higher-education teacher excellence? The implications of the resulting conceptualisation and dimensional structure for evaluation design were then considered at the interpretive level.
Although the study generated an item pool and assessed its dimensionality, its primary contribution is conceptual. It clarifies the multidimensional content of higher-education teacher excellence and demonstrates why evaluation systems based solely on student ratings or other narrow indicators are unlikely to capture the construct adequately. By clarifying the content domain in this way, the study contributes to current debates on how higher-education institutions can evaluate teacher excellence in ways that better reflect the multidimensional nature of the construct and support more sustainable approaches to quality assurance in higher education [54]. Accordingly, the resulting 95-item set should be understood as a provisional, teacher-grounded pool of candidate indicators for exploratory dimensionality analysis. Given that the study foregrounded teachers’ perspectives and did not test a multisource evaluation model in practice, the findings are best interpreted as an initial step towards a more inclusive evaluation framework rather than as a validated instrument for immediate evaluative use.

2. Materials and Methods

2.1. Study Context

Higher-education teachers from the University of Ljubljana, Slovenia, participated in the study. Focusing on one comprehensive university at this stage allowed the measurement instrument to be evaluated in a heterogeneous disciplinary context without introducing additional variability associated with differences between institutions. The study protocol and materials were reviewed and approved by the Ethics Committee of the Faculty of Arts, University of Ljubljana (Protocol No. 441-2025, issued in March 2025). Participation was voluntary and based on informed consent, and procedures were implemented to ensure the anonymity of participants.

2.2. Study 1: Conceptualisation and Operationalisation of Higher-Education Teacher Excellence

2.2.1. Participants and Procedure

Study 1 included 12 higher-education teachers. A focus group with seven experts in pedagogy and didactics was conducted, and five individual semi-structured interviews were carried out with higher-education teachers whose primary educational background was outside pedagogy. All participants had at least 10 years of teaching experience in higher education (up to 40 years) and held a range of academic ranks (e.g., assistant professor, associate professor, full professor). Sampling was purposive for the focus group and convenience-based for the individual interviews. Data were collected using two methods: a moderated online focus group discussion and individual interviews. The online focus group lasted 55 min, and the individual interviews lasted between 20 and 30 min. With participants’ consent, the focus group was audio-recorded and subsequently transcribed for analysis. The individual interviews were not audio-recorded; instead, detailed written notes were taken during the interviews, capturing participants’ responses as closely to verbatim as possible. In both data collection formats, participants responded to the same guiding question regarding the characteristics that define an excellent higher-education teacher, while follow-up questions were used to encourage clarification and elaboration of participants’ responses. Prior to participation, all participants were informed about the purpose of the study, the voluntary nature of participation, and anonymity. All participants provided informed consent.

2.2.2. Data Analysis

The participants’ responses were analysed using a stepwise procedure aligned with recommendations for qualitative analysis by Miles et al. [55]. Extraction of self-contained meaning units was followed by assigning initial codes, with explicit effort to keep codes as close as possible to participants’ language, even when codes summarised longer statements. Codes were then reviewed, compared, and iteratively clustered into first-order categories (subcategories) based on conceptual similarity. Subcategories were further organised into second-order categories, which were treated as broader dimensions of the higher-education teacher excellence construct. The coding and categorisation process resulted in eight second-order categories defined by 65 first-order subcategories. Both codes and subcategories were subsequently used to develop a provisional item pool intended to operationalise candidate indicators of higher-education teacher excellence. Item development followed a combined inductive–deductive approach: the results of qualitative analysis defined the conceptual framework and terminology, while an existing questionnaire developed for measuring teacher roles in secondary education [56] served as a reference for item wording and operationalisation where conceptually applicable.
During item operationalisation, a coverage principle was applied wherever subcategories were sufficiently specific and observable; each such subcategory was represented by at least one item, and conceptually complex subcategories were represented by multiple items to avoid overly narrow coverage. Some subcategories were not operationalised because they were formulated at a highly abstract level (e.g., broad overarching characteristics that are latent and difficult to translate into a single observable indicator) or were insufficiently specific to higher-education teaching (particularly transversal competencies).

2.3. Study 2: Dimensionality of Higher-Education Teacher Excellence

2.3.1. Participants

A total of 261 higher-education teachers participated in this study, the majority of whom were female (61.0%). The participants’ teaching experience in higher education ranged from less than one year to 45 years (M = 18.31; SD = 10.82). Participants represented various academic ranks; the largest group was full professors (28.0%), followed by assistant professors (22.6%), assistants (18.8%), and associate professors (11.9%), with smaller proportions in other ranks (lector, senior instructor). Disciplinary representation in the sample was consistent with the university’s overall disciplinary profile: humanities and social sciences (39.8%), natural sciences (30.3%), engineering and architecture (13.0%), arts (8.0%), manufacturing technologies and construction (7.7%), and health and medicine (6.5%).

2.3.2. Instrument

The questionnaire comprised 95 positively worded items and was designed as a provisional item pool for exploratory dimensionality analysis. Of these items, 39 were adapted from the questionnaire developed by Makovec [56], while the remaining items were newly developed based on the results of Study 1. Items were organised into the eight domains, with the following distribution: professional ethos (30 items), relationality (7), didactical competencies (31), values-oriented guidance (7), professional knowledge (6), discipline development and knowledge creation (5), engagement with the wider community (6), and transversal competencies (3). For transparency, Table A1 (Appendix A) shows the domain assignment of each item, together with its dimension membership as assigned by exploratory graph analysis (EGA). Participants rated how important it is, from their perspective, for a higher-education teacher to possess each characteristic described by the items. Ratings were provided on a 10-point scale ranging from 1 (not important at all) to 10 (extremely important).

2.3.3. Procedure

Data were collected using an online questionnaire administered via the 1KA web survey platform, version 25.03.21 [57]. In April 2025, invitations containing the survey link were distributed to employees of the University of Ljubljana through the management of its member faculties. Before participation, respondents were informed about the study aims, the estimated completion time, and the voluntary and anonymous nature of participation. The survey was self-administered, and participants could discontinue at any point. To reduce respondent burden, participants could choose between a full version of the questionnaire (95 items) and a shortened version comprising a random subset of 48 items. The mean completion time was 13.2 min. Only participants who provided informed consent were able to proceed. A non-probability self-selected sampling approach was employed. Since participants could choose between a full and a shortened version, a specific type of missing data occurred (in 24.5% of cases)—structurally missing responses resulting from the randomised item presentation in the shortened version (treated as missing completely at random, MCAR). Although fatigue effects cannot be entirely excluded, the low level of attrition-related missingness in the full version (1.98%) suggests that the response burden was manageable in the present sample. Missing values were imputed using a random forest algorithm implemented in the missForest package, version 1.6.1 [58]. The out-of-bag imputation error estimate indicated acceptable performance (NRMSE = 0.625) for the present dataset (n = 261 respondents; 95 items). Given the pronounced skewness and restricted variance of several items, mean absolute error indices were additionally examined for individual items and suggested satisfactory imputation accuracy for the vast majority of variables.
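The imputation logic described above can be illustrated with a small, hypothetical Python sketch. The study itself used the R package missForest; the sketch below substitutes scikit-learn’s IterativeImputer with a random-forest estimator as a rough analogue, masks about 25% of entries in synthetic rating data completely at random (mimicking the structurally missing responses from the shortened version), and computes a missForest-style NRMSE on the masked entries. All data, sample sizes, and settings are illustrative, not those of the study.

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Synthetic importance ratings: 60 respondents x 6 items on a 1-10 scale,
# generated from one latent factor so that items are correlated.
latent = rng.normal(size=(60, 1))
X_true = np.clip(np.round(6 + 2 * latent + rng.normal(scale=1.0, size=(60, 6))), 1, 10)

# Mask ~25% of entries completely at random (MCAR), mimicking the
# structurally missing responses from the shortened questionnaire version.
mask = rng.random(X_true.shape) < 0.25
X_obs = X_true.copy()
X_obs[mask] = np.nan

# Random-forest-based iterative imputation, a rough analogue of missForest.
imputer = IterativeImputer(
    estimator=RandomForestRegressor(n_estimators=30, random_state=0),
    max_iter=5, random_state=0,
)
X_imp = imputer.fit_transform(X_obs)

# NRMSE in the spirit of missForest: RMSE of imputed vs. true values,
# normalised by the variance of the true values at the masked positions.
nrmse = np.sqrt(np.mean((X_true[mask] - X_imp[mask]) ** 2) / np.var(X_true[mask]))
print(f"NRMSE = {nrmse:.3f}")
```

A lower NRMSE indicates better recovery of the masked values; values near 1 correspond to performance no better than mean imputation.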

2.3.4. Data Analysis

The dimensionality assessment was conducted using exploratory graph analysis (EGA). EGA is a network psychometric method in which dimensionality is inferred from the structure of (regularised) partial correlations among items [59]. The procedure involves (1) estimating an appropriate item correlation matrix, (2) estimating a partial-correlation network, and (3) identifying communities of items within the network. The number of detected communities is interpreted as the estimated number of dimensions, and item community membership provides an empirical assignment of an item to a dimension [59]. A Gaussian graphical model (GGM) was used, in which nodes represent items, and edges represent regularised partial correlations, interpreted as conditional dependencies between item pairs given all other items in the network [60]. To improve interpretability and reduce spurious associations, the network was estimated with regularisation, which shrinks weak edges towards zero and yields a sparse network [59,61]. Estimation was performed using the graphical lasso (GLASSO) [62]. Model selection relied on the extended Bayesian information criterion (EBIC) [63], with the EBIC hyperparameter fixed at γ = 0.5 (default). The regularisation parameter λ was selected automatically by EBIC minimisation over a path of 100 candidate λ values, yielding λ = 0.116 as the optimal value for the estimated network. Communities (dimensions) were identified using the Walktrap community-detection algorithm with four steps, which detects densely connected subgraphs via random walks [64,65].
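The three-step EGA procedure (correlation matrix, regularised partial-correlation network, community detection) can be sketched in Python under stated assumptions. The analysis reported here was run in R with EGAnet; this illustration substitutes scikit-learn’s GraphicalLasso with a fixed penalty (rather than EBIC-based λ selection) and networkx’s greedy modularity communities (a stand-in for Walktrap, which is available in igraph rather than networkx), applied to synthetic data with a known three-block structure.

```python
import numpy as np
import networkx as nx
from sklearn.covariance import GraphicalLasso
from networkx.algorithms.community import greedy_modularity_communities

rng = np.random.default_rng(1)

# Synthetic data: 200 respondents x 9 items forming three blocks of three
# correlated items each (a small stand-in for the 95-item pool).
n, items_per_block, blocks = 200, 3, 3
X = np.hstack([
    rng.normal(size=(n, 1)) + 0.6 * rng.normal(size=(n, items_per_block))
    for _ in range(blocks)
])

# Steps 1-2: regularised partial-correlation network via the graphical lasso.
# (GraphicalLasso takes a fixed alpha; the study selects lambda by EBIC
# minimisation, which sklearn does not implement.)
gl = GraphicalLasso(alpha=0.05).fit(X)
P = gl.precision_
d = np.sqrt(np.diag(P))
pcor = -P / np.outer(d, d)           # partial correlations from the precision matrix
np.fill_diagonal(pcor, 0.0)

# Step 3: community detection on the network of retained (non-zero) edges.
G = nx.Graph()
G.add_nodes_from(range(pcor.shape[0]))
for i in range(pcor.shape[0]):
    for j in range(i + 1, pcor.shape[1]):
        if abs(pcor[i, j]) > 1e-4:
            G.add_edge(i, j, weight=abs(pcor[i, j]))

communities = greedy_modularity_communities(G, weight="weight")
print(f"Estimated number of dimensions: {len(communities)}")
```

Each detected community is interpreted as one dimension, and an item’s community membership is its empirical dimension assignment.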
Because EGA is exploratory, the stability of the dimensionality solution was evaluated using bootstrap exploratory graph analysis (bootEGA) [66]. A total of 500 parametric bootstrap samples were generated under an assumed multivariate normal distribution; for each sample, the network was re-estimated and communities were re-identified. Stability was summarised by (1) the distribution of the number of dimensions across bootstrap replications, (2) item stability (the proportion of replications in which an item was assigned to the same dimension as in the original EGA solution), and (3) structural consistency (the proportion of replications in which an entire dimension was reproduced with an identical item set) [66]. To support the interpretation of dimensions, network loadings were computed using the net.loads function in EGAnet, version 2.4.0 [67]. Network loadings quantify each item’s relative contribution to the identified dimensions based on within- and between-community connections and were used together with item stability to distinguish anchors from more peripheral indicators [68]. The EGA and bootEGA were performed using the EGAnet package [67] in R, version 4.5.2 [69].
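The logic of the bootEGA stability check can likewise be sketched. The placeholder estimator below (a thresholded correlation graph whose connected components are counted as dimensions) is not EGA; it merely stands in for the re-estimation step so that the parametric bootstrap and the summary of the dimension-count distribution can be shown end to end. All data and thresholds are illustrative.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(2)

def estimate_dims(X, threshold=0.3):
    """Placeholder dimensionality estimator: nodes are items, edges join
    item pairs whose absolute correlation exceeds a threshold, and each
    connected component counts as one dimension.  bootEGA instead re-runs
    the full EGA pipeline (GLASSO network + Walktrap) on every sample."""
    R = np.corrcoef(X, rowvar=False)
    G = nx.Graph()
    G.add_nodes_from(range(R.shape[0]))
    for i in range(R.shape[0]):
        for j in range(i + 1, R.shape[1]):
            if abs(R[i, j]) > threshold:
                G.add_edge(i, j)
    return nx.number_connected_components(G)

# Synthetic data with a known three-block structure.
n = 261
X = np.hstack([
    rng.normal(size=(n, 1)) + 0.7 * rng.normal(size=(n, 3)) for _ in range(3)
])

# Parametric bootstrap: draw samples from a multivariate normal distribution
# with the observed mean vector and covariance matrix, then re-estimate the
# number of dimensions in each replication.
mu, Sigma = X.mean(axis=0), np.cov(X, rowvar=False)
dims = np.array([
    estimate_dims(rng.multivariate_normal(mu, Sigma, size=n))
    for _ in range(500)
])

values, counts = np.unique(dims, return_counts=True)
print("dimension counts:", dict(zip(values.tolist(), counts.tolist())))
print("median:", np.median(dims),
      "95% interval:", np.percentile(dims, [2.5, 97.5]))
```

The distribution of dimension counts across replications, together with its median and percentile interval, corresponds to the first stability summary described above; item stability and structural consistency additionally require matching communities across replications.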

3. Results

3.1. Conceptualisation of Higher-Education Teacher Excellence (Study 1)

The main finding from the qualitative analysis of focus group and interview responses is that higher-education teacher excellence is best understood as a multidimensional construct comprising conceptually distinct yet interrelated dimensions.
The conceptual structure of higher-education teacher excellence consists of eight second-order categories (Table 1). First, professional ethos refers to the teacher’s professional and personal orientation, including integrity, commitment to the discipline and profession, openness, reflexivity, adaptability, and ongoing professional development. Second, teacher–student relationality reflects how the teacher’s professional ethos is enacted in relationships and interactions with students through the creation of a safe learning environment, connectedness with and among students, and responsiveness to students’ needs and perspectives. Third, didactical competencies represent what the teacher does in the planning and implementation of the pedagogical process and in supporting students’ learning and development, including motivating students, adapting to students’ needs, providing relevant feedback, and supporting students’ active participation. Mentorship emerged as a distinct subcategory of this domain. Fourth, values-oriented guidance captures the teacher’s role in the pedagogical process related to norm-setting and the regulation of the learning context, particularly by establishing boundaries for acceptable student behaviour. Fifth, professional knowledge refers to what the teacher knows as a foundation for pedagogical and disciplinary practice, including didactical, pedagogical, and subject-specific knowledge. Sixth, discipline development and knowledge creation position the excellent higher-education teacher as an active contributor to knowledge creation and the advancement of the discipline. Seventh, engagement with the wider community captures outward-facing activity that connects teachers’ work with professional, scientific, and societal contexts. Eighth, transversal competencies describe supportive competencies that enable the effective implementation of core academic roles across domains, such as teamwork, communication, organisation, self-management, and balancing teaching and research responsibilities.
Study 1 provided an empirically grounded conceptual map of higher-education teacher excellence (Table 1), which served as a substantive starting point for item generation. In the next step, we empirically tested the dimensional structure implied by this framework by applying exploratory graph analysis (EGA) to the resulting item pool. The EGA results (Section 3.2) therefore provide a psychometric counterpart to the qualitative conceptualisation and inform subsequent decisions on scale development.

3.2. Dimensionality of Higher-Education Teacher Excellence

Exploratory graph analysis (EGA) indicated that the initial 95-item set is multidimensional. The Louvain unidimensionality test did not support a single-dimensional solution, and the Walktrap community-detection algorithm identified six communities, suggesting a six-dimensional structure. The estimated partial-correlation network comprised 95 nodes (items) and 836 non-zero edges, corresponding to a network density of 0.187 (i.e., approximately 19% of all possible edges were retained after regularisation). Regularised partial correlations were generally small (M = 0.055, SD = 0.059), ranging from −0.049 to 0.440. Negative edges were rare and weak, whereas the strongest positive edges were of moderate magnitude (up to 0.44). The resulting network and item communities are visualised in Figure 1, with nodes coloured by their EGA-assigned dimension.
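As a quick arithmetic check, the reported density follows directly from the node and edge counts:

```python
nodes, edges = 95, 836
possible = nodes * (nodes - 1) // 2   # 4465 possible undirected edges
density = edges / possible
print(f"{density:.3f}")  # prints 0.187
```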
To evaluate the robustness of the dimensionality estimate, bootstrapped EGA (bootEGA) was used to examine the distribution of the number of dimensions across 500 bootstrap replications. The number of detected dimensions ranged from four to eleven (Table 2), with the six- and seven-dimensional solutions occurring most frequently (23.8% and 23.6%, respectively). The median number of dimensions was six (95% CI [3.03; 8.97]). As the six-dimensional solution was among the most frequently recovered and yielded interpretable item groupings, it was retained for substantive interpretation.
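The bootstrap summary (frequency table, median, interval) can be reproduced from any vector of per-replication dimension counts. The counts below are purely illustrative, not the study's replications; note also that the non-integer bounds reported above suggest a normal-approximation interval, whereas this sketch uses simple percentiles.

```python
import numpy as np
from collections import Counter

# Illustrative dimension counts for 500 hypothetical bootstrap replications
boot_dims = np.array(
    [4] * 30 + [5] * 100 + [6] * 125 + [7] * 120
    + [8] * 75 + [9] * 30 + [10] * 15 + [11] * 5
)
freq = Counter(boot_dims.tolist())                      # frequency of each solution
prop = {k: v / len(boot_dims) for k, v in freq.items()} # relative frequencies
median_dims = float(np.median(boot_dims))               # typical dimensionality
ci = np.percentile(boot_dims, [2.5, 97.5])              # percentile interval
```

The most frequent and the median solution need not coincide in general, which is one reason interpretability of the item groupings is used as an additional retention criterion.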
The network plot suggested that the six dimensions were reasonably distinguishable, although some inter-community connections were evident (Figure 1). Certain communities appeared topologically more compact (most notably Dimension 6), consistent with a denser pattern of within-community conditional dependencies, whereas other communities (particularly Dimension 3) were more diffuse and showed more cross-community connections. As node placement in the figure reflects a layout algorithm optimised for readability [60], interpretation was guided primarily by community membership and patterns of connections, rather than absolute distances between nodes.
Below, the six dimensions are described based on item content and network loadings. Anchor indicators were identified primarily by high within-dimension network loadings (i.e., the strongest within-dimension connections); detailed item assignments and network loadings are provided in Table A1 (Appendix A). Item stability and item replicability profiles are reported in Section 3.3 (Appendix A, Table A1 and Table A2) to evaluate the robustness of item placements.
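A network loading can be approximated as the sum of an item's absolute edge weights to the members of a given community; EGAnet applies an additional standardisation, but the unstandardised sum below, computed on hypothetical values, conveys the idea of how anchor indicators are identified.

```python
import numpy as np

# Hypothetical symmetric partial-correlation matrix for five items
pcor = np.array([
    [0.00, 0.30, 0.25, 0.05, 0.00],
    [0.30, 0.00, 0.20, 0.00, 0.05],
    [0.25, 0.20, 0.00, 0.00, 0.00],
    [0.05, 0.00, 0.00, 0.00, 0.35],
    [0.00, 0.05, 0.00, 0.35, 0.00],
])
membership = np.array([0, 0, 0, 1, 1])   # community assignment per item

k = pcor.shape[0]
n_comm = membership.max() + 1
item_loadings = np.zeros((k, n_comm))
for c in range(n_comm):
    # sum of absolute edge weights from each item to community c
    item_loadings[:, c] = np.abs(pcor[:, membership == c]).sum(axis=1)

# Anchor of a community: the member with the highest within-community loading
anchors = {
    c: int(np.argmax(np.where(membership == c, item_loadings[:, c], -np.inf)))
    for c in range(n_comm)
}
```

In this toy example item 0 anchors the first community (within-community loading 0.55), while its small cross-community loading (0.05) illustrates the kind of weak cross-dimension association discussed for peripheral items below.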

3.2.1. Dimension 1: Foundational Pedagogical–Didactical Principles

Dimension 1 (9 items) captured a set of core pedagogical–didactical principles centred on clarity of explanation (DC13; 0.50), facilitation of dialogue (DC18; 0.48), and the fair treatment of students (PE4; 0.45). These anchor indicators collectively reflect fundamental teaching principles. Additional items in this community extended the dimension towards broader conditions that enable learning (DC10; 0.31) and students’ expression of opinions (DC17; 0.27), as well as encouraging students to reach their full potential (DC1; 0.27). Based on network loadings, the item coming to sessions well-prepared (PE11) aligned more strongly with Dimension 3 than with Dimension 1 (0.24 vs. 0.19), whereas encouraging active student participation showed weak associations with both Dimension 1 and Dimension 5 (0.16 and 0.13), suggesting partial conceptual overlap with the student-centred orientation dimension.

3.2.2. Dimension 2: Research and Scientific Excellence

Dimension 2 (15 items) was clearly defined by items describing scientific and research-related excellence, positioning the higher-education teacher as a contributor to disciplinary development and knowledge creation. The most prominent anchors were contributing to the development of the scientific field (DKC1; 0.51), demonstrating commitment to the scientific field through work (PE10; 0.48), creating new knowledge within the scientific field (DKC2; 0.44), demonstrating expert-level mastery of one’s scientific field (PK3; 0.41), and demonstrating a high level of competence in research (DKC3; 0.41). Together, these items represent a coherent core of research-oriented disciplinary excellence. Items such as demonstrating openness to recent advances in one’s scientific field (PE20; 0.45) and engaging in continuous professional development (PE30; 0.44) also aligned with this dimension, although they are conceptually less specific to research performance than the core research indicators, reflecting a broader professional ethos. The dimension further incorporated items linking research and teaching (DC30; 0.31) and updating course content with scientific progress (DC29; 0.24). Several items at the periphery of this dimension (e.g., interdisciplinary orientation and presenting scientific content to different audiences) showed less specific alignment, indicating that these characteristics may sit at the intersection of research excellence, teaching practice (Dimension 3), and engagement with the wider community (Dimension 6).

3.2.3. Dimension 3: Course Management and Professional Ethos

Dimension 3 was the largest community in the network (33 items), which is reflected in its heterogeneity, as it encompasses a broad range of characteristics describing higher-education teacher excellence. Substantively, it can be interpreted as a combination of at least two clusters: (1) planning and implementation of the pedagogical process, representing the core of didactical excellence (e.g., creating conditions in which learning is possible, mentoring, providing appropriate feedback, fostering students’ interest in the discipline, and supporting students’ professional identity development), and (2) a professional ethos, including respectful behaviour towards colleagues and students, personal integrity, pedagogical authority, and the capacity for self-reflection regarding one’s own conduct. Importantly, the EGA solution does not support a clear separation between these clusters; rather, items from both clusters converge into a shared core, indicating that course management and professional ethos are empirically intertwined in the network structure. The core of Dimension 3 was defined by the highest within-dimension network loadings (approximately 0.40 or higher), particularly reflecting on one’s own conduct (PE25; 0.45), setting clear boundaries for acceptable student behaviour (VOG1; 0.44), accepting feedback on one’s work from colleagues (PE15; 0.41), possessing foundational pedagogical knowledge (PK2; 0.41), demonstrating personal integrity (PE1; 0.41), and demonstrating knowledge of how to teach (DC24; 0.40). These anchors define an integrated core of reflective professional conduct and fundamental pedagogical competence, with personal integrity as a central underpinning.
Beyond these anchors, Dimension 3 also included treating students respectfully (PE5; 0.39), using appropriate strategies to address inappropriate behaviour and conflicts (VOG7; 0.37), and being respected by students (TSR3; 0.35), as well as engaging in emotional self-reflection in interactions with students (PE26; 0.34), showing passion for teaching (PE9; 0.34), and critically evaluating teaching practice (PE23; 0.33). Further indicators included demonstrating objectivity in assessment and grading (DC6; 0.31), demonstrating openness to colleagues’ differing opinions (PE16; 0.31), communicating articulately (DC14; 0.30), demonstrating knowledge of higher-education didactics (PK1; 0.30), and identifying the most suitable teaching approach (PE24; 0.30). Additional items with smaller loadings within Dimension 3 were demonstrating consistency (PE3; 0.28), demonstrating decisiveness (PE2; 0.27), and treating other higher-education teachers respectfully (PE18; 0.23). However, these indicators tended to contribute less than the core indicators, which is consistent with the more diffuse and less topologically compact structure of this community in the network plot, including a noticeable number of edges spanning community boundaries.
Overall, Dimension 3 is best interpreted as an integrated cluster of course management and professional ethos, reflecting how teachers combine the management of course processes (didactical competencies) with a professional stance characterised by integrity, respect, and critical reflection. This dimension was treated as integrated rather than conceptually split because the EGA did not recover separate communities corresponding to course management and professional ethos. Instead, items from both content areas converged around a shared core defined by reflective accountability, norm-setting, pedagogical knowledge, and didactical competence. In the context of higher education, this pattern is substantively plausible, since course organisation, assessment practice, pedagogical decision-making, and professional conduct are often enacted together through the teacher’s responsibilities. At the same time, given the breadth of this community and its lower structural consistency, the present interpretation should be treated as provisional. Future work should examine whether Dimension 3 is best represented as a broad integrated dimension, as two correlated subdimensions, or as a hierarchical structure.

3.2.4. Dimension 4: Innovation-Oriented Practice and Regulatory Compliance

Dimension 4 (9 items) captured a coherent set of items reflecting openness to innovation in teaching—particularly through educational technology and innovative teaching approaches—together with a smaller cluster of items reflecting regulatory and administrative competence. Anchor indicators included keeping up with developments in ICT and integrating innovations into teaching (DC27; 0.51), using innovative teaching methods (DC25; 0.50), demonstrating openness to new teaching approaches and methods (PE19; 0.49), and integrating ICT into teaching in a meaningful way (DC26; 0.39). These items represent an innovation and modernisation component within higher-education teacher excellence. This dimension also included items reflecting continual improvement and change (e.g., continuously introducing innovations into the teaching–learning process, PE29; 0.26; demonstrating willingness to change teaching practice, PE28; 0.22), as well as regulatory compliance (e.g., adherence to formal rules and familiarity with relevant regulations) and administrative capability (TC3; 0.21). The co-occurrence of innovation-oriented and regulatory and administrative content within the same community suggests that, in this sample, orientations towards teaching innovation may be closely associated with the practical capacity to operate within institutional frameworks and requirements.

3.2.5. Dimension 5: Student-Centred Relational Orientation

Dimension 5 (22 items) was highly coherent in content and captured a student-centred relational orientation characterised by attunement to students and the creation of supportive and inclusive conditions for learning and development. The strongest anchors were demonstrating insight into students’ needs (TSR7; 0.60) and demonstrating insight into students’ characteristics (TSR6; 0.53), complemented by indicators emphasising group belonging and inclusion, such as encouraging the formation of a student community (TSR4; 0.49) and encouraging students to respect diversity (VOG3; 0.42). The core of Dimension 5 further included recognising students’ potential and strengths (DC4; 0.39), developing students’ awareness of current events in the wider social context (VOG4; 0.38), encouraging students to help one another (TSR5; 0.37), and expressing confidence in students’ abilities (PE7; 0.37). Contextualising learning was also central to this dimension, as reflected in connecting course content with current events and everyday life (DC28; 0.35) and relational proximity indicators such as establishing rapport with students (TSR2; 0.34) and being attentive to students’ personal concerns (PE22; 0.33). In addition, adapting to changes (PE27; 0.33) contributed to the core.
Beyond these core indicators, Dimension 5 also incorporated items that extend student-centredness into developmental support, including encouraging students to think about current developments in the professional field (DC23; 0.29), demonstrating openness to students’ differing opinions (PE17; 0.29), encouraging students to explore and discover new things (DC19; 0.28), and adapting teaching to students (DC5; 0.28). Additional indicators included creating a safe environment in which students feel accepted (TSR1; 0.27), motivating students to engage in their learning (DC2; 0.25), being available to students for conversation (PE6; 0.25), systematically developing students’ ability to think critically and justify their viewpoints (DC21; 0.25), creating an environment that supports students’ professional and personal development (DC12; 0.23), and encouraging students to engage in activities beyond the formal curriculum (VOG5; 0.22). The student-centred relational orientation in this sample is therefore empirically intertwined with motivational and learning-facilitation practices that enable students’ participation, engagement, and growth.

3.2.6. Dimension 6: External Academic Engagement

Dimension 6 (7 items) described the teacher’s engagement with the wider scientific and professional community and the public, including scientific outreach, project leadership, international collaboration, and networking. Anchor indicators included promoting interest in science among the wider public (EWC4; 0.51), leading research projects (DKC4; 0.50), collaborating with researchers abroad (EWC1; 0.49), contributing to the dissemination of scientific findings to the general public (EWC3; 0.45), securing funding for research projects (DKC5; 0.32), and engaging in external professional networking and collaboration (EWC2; 0.27). Conceptually, this dimension corresponds closely to broader academic and societal engagement and the capacities that enable knowledge creation and research activity.
Notably, this dimension showed empirical proximity to Dimension 2 (research and scientific excellence), consistent with the idea that external academic engagement relies on a strong research base. One item—balancing teaching and research responsibilities (TC2; 0.21)—showed a weaker and less specific association with the core of this community and appeared to function as a conceptual bridge between planning and implementation of the pedagogical process (more aligned with the course management part of Dimension 3) and broader research-related engagement (Dimension 6).

3.3. Item Stability

The stability of the six-dimensional solution was evaluated using item stability and structural consistency indices derived from bootEGA. Item stability reflects the proportion of bootstrap replications in which an individual item remains assigned to the same dimension, whereas structural consistency reflects the proportion of replications in which a dimension is reproduced with exactly the same item membership as in the original solution. This is a stringent criterion, because even a single item shifting to another dimension counts as non-replication. Structural consistency (i.e., dimension stability) values for the six-dimensional solution ranged from 0.002 to 0.386 (Table 3). The highest structural consistency was observed for Dimension 6 (0.386) and Dimension 4 (0.356), whereas Dimension 2 (0.028) and particularly Dimension 3 (0.002) showed lower rates of exact item-set replication across bootstrap samples.
This pattern is consistent with the variability in the number of detected dimensions across bootstrap replications reported in Section 3.2, and with the fact that larger and conceptually broader dimensions provide more opportunities for item reallocations across replications.
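Both indices can be computed directly from the matrix of bootstrap community assignments, as in the hypothetical sketch below. It assumes that bootstrap community labels have already been matched to the original solution, a label-alignment step that bootEGA performs internally.

```python
import numpy as np

original = np.array([0, 0, 1, 1])      # original EGA assignment for four items
boot = np.array([                       # hypothetical bootstrap assignments
    [0, 0, 1, 1],                       # exact replication of the solution
    [0, 1, 1, 1],                       # item 1 migrated to dimension 1
    [0, 0, 1, 0],                       # item 3 migrated to dimension 0
    [0, 0, 1, 1],                       # exact replication
])

# Item stability: per-item proportion of replications with the same assignment
item_stability = (boot == original).mean(axis=0)

# Structural consistency: per-dimension proportion of replications reproducing
# exactly the same item membership as in the original solution
structural_consistency = {
    c: float(np.mean([np.array_equal(b == c, original == c) for b in boot]))
    for c in (0, 1)
}
```

The sketch makes the stringency of structural consistency visible: a single migrating item lowers the exact-replication rate of both the dimension it leaves and the one it joins, even when every item's own stability remains high.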
Item stability values (Appendix A, Table A1) indicate the proportion of bootEGA replications in which each item was assigned to the same dimension as in the original EGA solution. Table A2 (Appendix A) complements this information by reporting each item’s replicability profile across dimensions (i.e., the percentage of replications in which an item was assigned to each recovered dimension), thereby indicating the most common alternative placements for less stable items. Network loadings were used to identify more central indicators within each dimension (Section 3.2), whereas item stability was used to assess the robustness of these item placements across bootstrap replications.
Item stability varied substantially across the 95 items, ranging from 13% to 100%. Using a ≥65% criterion as an indicator of high stability [66], 55 items (57.9%) showed high stability, whereas 22 items (23.2%) showed low stability (<50%). Items assigned to Dimension 5 exhibited the highest item stability (M = 85.14%, range 60–100%), whereas stability was lower and more dispersed for items in Dimension 2 (M = 60.13%, range 13–91%) and Dimension 3 (M = 55.97%, range 25–83%).
Taken together, the bootEGA results suggest that the item pool contains interpretable and partly stable cores, but also porous boundaries between some communities. The six-dimensional solution should therefore be treated as a provisional structural proposal rather than a definitive representation of the construct.

4. Discussion

Two main findings stand out. First, the results support a teacher-grounded conceptualisation of higher-education teacher excellence as a multidimensional, role-based construct rather than a narrow account of classroom performance alone. Second, the resulting structure helps explain why evaluation systems based on student-only ratings or other narrow indicators are unlikely to capture the construct adequately, thereby weakening quality assurance by privileging only the most visible facets of academic work and providing a limited basis for evaluative judgement.
At a substantive level, the six dimensions identified by the EGA suggest that higher-education teacher excellence is understood by teachers as spanning pedagogical, relational, organisational, scholarly, and outward-facing aspects of academic work. Foundational pedagogical–didactical principles (D1) captured core teaching basics (clear articulation of course content, facilitation of dialogue, and establishing conditions for high-quality learning) that align with prevailing teaching quality frameworks [70,71]. Research and scientific excellence (D2) operationalised teacher excellence as disciplinary contribution and research competence, consistent with the higher-education-specific coupling of teaching with knowledge creation [70,72]. Course management and professional ethos (D3) formed the broadest and most heterogeneous community, combining norm-setting, reflective professional conduct, and pedagogical knowledge and competence. Innovation-oriented practice and regulatory compliance (D4) grouped items on educational technology and innovative teaching approaches together with items reflecting adherence to formal rules and administrative competence, suggesting that, in this sample, innovation-oriented teaching was intertwined with competencies for navigating institutional requirements.
Student-centred relational orientation (D5) emerged as a highly coherent cluster centred on insight into students, inclusion, belonging, and supportive relational conditions—consistent with work emphasising belonging and engagement as central to persistence and learning [26,73]. Its relevance extends beyond persistence and the immediate learning environment. Because this dimension concerns how students are recognised and included within the learning process, it also directly shapes the kinds of competences they can develop. Dialogic and inclusive learning environments require students to articulate and justify their views, engage seriously with different perspectives, and participate responsibly in a shared academic community. In this sense, excellence in the relational domain directly contributes to the development of critical thinking, respect for diversity, and the capacities associated with active citizenship. External academic engagement (D6) encompassed outward-facing academic activity (science communication, research projects, international collaboration, networking), aligning with “third mission” discussions and engaged scholarship frameworks [41,42,74,75]. However, the solution was not characterised by strict separability. Both the network plot (Figure 1) and the stability inspection (Section 3.3) indicate that certain domains are empirically close, reflecting that excellence is enacted as an integrated role rather than a set of independent components. This is substantively consistent with the argument that excellence is normative and contextual, and in higher education it is enacted at the intersection of pedagogy, professional judgement, institutional conditions, and scholarship [70,72,76]. Importantly, the broader dimensions identified here should not be read as narrowly delimited traits, but as role-based configurations of practices and responsibilities that are frequently enacted together in higher education. 
The coherence of these dimensions lies in the organisation of academic work itself, where pedagogical, professional, institutional, and scholarly elements often intersect rather than appearing as fully separable domains.
Study 1 yielded eight second-order categories, offering broad coverage of the semantic field of excellence as articulated by teachers (Table 1). The EGA solution, by contrast, yielded six empirical communities. This divergence is informative, as it indicates that some conceptual distinctions articulated in participants’ accounts (and formalised in the qualitative coding system) do not necessarily translate into separable empirical clusters once items are rated together and conditional dependencies are modelled. To make the relationship between the qualitative and quantitative phases more explicit, Table 4 presents a cross-phase mapping between the eight second-order categories from Study 1 and the six EGA dimensions retained for interpretation in Study 2.
As shown in Table 4, the correspondence between the qualitative and quantitative phases was not one-to-one. Some qualitative categories were largely retained at the quantitative level, most notably teacher–student relationality and engagement with the wider community, which mapped primarily onto D5 and D6, respectively. By contrast, broader categories such as professional ethos and didactical competencies were redistributed across several EGA dimensions. Conversely, some EGA dimensions—especially D3—combined indicators originating from multiple qualitative categories, suggesting that the qualitative framework preserved distinctions important for content coverage, whereas the EGA solution reflects how candidate indicators clustered empirically when rated together.
The fact that six- and seven-dimensional solutions occurred with nearly equal frequency, together with the lower structural consistency of some communities—most notably the broad and heterogeneous Dimension 3—suggests that the present structure should be interpreted as provisional. It provides an interpretable model of the construct in this sample, but it should not be treated as a final representation of its dimensionality. Differences in dimensional stability should therefore be understood primarily as differences in how clearly defined the empirical cores of the dimensions are in the present item pool, rather than as differences in their substantive importance. For future applications, this means that more stable dimensions may provide stronger starting points for subsequent refinement, whereas broader and less stable dimensions require more cautious interpretation, further item revision, and confirmatory testing before any score-based use can be justified.
Two patterns are particularly noteworthy. First, course management and professional ethos converged. In the qualitative framework, professional ethos (values, openness, reflection, integrity) is distinguished from didactical competencies and from norm-setting. In the EGA solution, however, the core of Dimension 3 was defined simultaneously by reflective accountability (e.g., reflecting on one’s conduct; openness to colleague feedback), norm-setting (setting boundaries for acceptable behaviour), foundational pedagogical knowledge, and didactical knowledge. This convergence suggests that, at least in this sample and item format, critical reflection, course management, and basic pedagogical competence function as a single interconnected cluster rather than cleanly separable strands. However, this pattern should not be interpreted as implying that these elements are conceptually indistinguishable. Rather, in the present item pool, they were sufficiently interdependent to form a single community. It can be seen as a role-based structure of excellence—when teachers describe what it means to be excellent, competence in teaching is closely linked to how one exercises authority, governs standards, and takes responsibility for one’s conduct, particularly in contexts where course design and assessment involve substantial autonomy [34,54,77]. At the same time, the breadth of this community and its lower structural consistency suggest that Dimension 3 may represent either a broad integrated role dimension or a provisional aggregation of more differentiated subdimensions. Further refinement and confirmatory approaches will be needed to determine whether these elements should remain combined or be represented separately.
Second, innovation clustered with institutional/regulatory competence. Dimension 4’s dual content (innovation-oriented practice alongside compliance and administrative indicators) suggests that innovation in teaching may be perceived (or enacted) as an institutional practice—integrating technology and new approaches is coupled with the capacity to navigate rules, documentation, and administrative demands. This aligns with broader accounts of performativity and governance in higher education, where innovation is rarely free-standing and is often shaped by evaluation environments and institutional constraints [10,76,78]. However, this linkage should not be over-interpreted, as the community structure in EGA can reflect conditional dependence in a given sample and item pool (e.g., common institutional framing, overlap in instructional practices), and the co-occurrence of innovation and compliance indicators may therefore be partly incidental. Replication in independent samples and confirmatory approaches are needed before treating this as a stable substantive coupling.
The role-based nature of the findings is equally important. Two of the dimensions identified in the present study—research and scientific excellence, and external academic engagement—extend beyond a narrow conception of teaching performance. This suggests that, in university contexts, teacher excellence may be understood as a broader form of academic professionalism in which teaching is connected to scholarship, disciplinary development, and engagement beyond the classroom. This result is especially relevant in research-intensive institutions, where the sustainable development of teaching expertise occurs under incentive structures that often privilege research productivity over pedagogical work [72].
The present findings therefore support a view of excellence as an integrated and contextually embedded construct rather than a single trait or a collection of unrelated competencies. If higher-education teacher excellence is multidimensional and role-based, then no single source of evidence can be expected to capture it fully. Recent work similarly emphasises that credible and sustainable evaluation requires explicit conceptual foundations, clear evaluation purposes, and alignment between criteria, methods, and intended uses [54,79]. This logic also resonates with participatory approaches to evaluation in higher education, where different stakeholders contribute distinct but complementary forms of evidence [80]. However, different evaluation contexts (self-evaluation, peer review, student feedback, promotion criteria) imply different observability constraints and validity threats.
From a practical perspective, the present findings suggest that student feedback is likely to be most informative for dimensions that students directly experience, particularly foundational pedagogical–didactical principles and student-centred relational orientation. This is consistent with recent work showing that student perspectives on teaching are most useful when treated as multidimensional rather than collapsed into a single overall indicator [81]. Where student feedback is used, a combination of scaled ratings and qualitative comments may offer a richer and fairer basis for interpretation than numeric ratings alone [82]. At the same time, recent work on student evaluation of teaching points to continuing interpretive challenges, including contextual influences and the need for cautious use in evaluative decision-making [83,84]. Peer review may be better suited to aspects of excellence that require disciplinary and pedagogical judgement, including course management, assessment and feedback practices, and the scholarly grounding of teaching. Self-evaluation and teaching portfolios may be relevant across all dimensions as a reflective source of evidence, but are especially valuable where excellence involves self-reflection, the articulation of teaching values, and the integration of different aspects of the academic role [85,86]. Institutional documentation and broader academic review processes may be more appropriate for aspects of excellence related to research activity and external academic engagement, and for formal responsibilities that are not equally visible to students or peers. The implications are not that every dimension must be assessed in every context, or aggregated into a single overall score. Rather, sources of evidence should be matched to the dimension of excellence under consideration and to the purpose of evaluation.
This implication is particularly relevant in the context of sustainable quality assurance. In higher education, quality assurance is increasingly viewed as a long-term, improvement-oriented process rather than a narrow exercise in compliance or episodic performance control [54]. Similarly, research on education and sustainability has linked educational quality to meaningful learning environments, inclusion, and learner-centred pedagogical conditions [8,87]. The teacher-grounded dimensions identified here highlight the mismatch between broad institutional claims about teacher excellence and the narrower indicators often used to support them. Evaluation systems do not simply record excellence; they help define what is recognised and developed within academic work. If universities rely mainly on narrow or highly visible indicators, they risk reinforcing limited conceptions of excellence and privileging those aspects of teaching that are easiest to quantify. By contrast, the multidimensional conceptualisation developed here, together with the multisource logic it implies for evaluation, supports a more development-oriented approach to evaluation. In this sense, sustainable evaluation should be understood as evaluation that supports continuous professional development, as it creates conditions for reflection and learning [88]. Its significance therefore extends beyond the evaluation of individual teachers. Evaluation designed in this way can strengthen the wider educational ecosystem and contribute to the adaptive capacity and resilience of higher-education institutions in the face of changing technological and societal demands. From this perspective, the contribution of the present study lies in clarifying the content of higher-education teacher excellence and in showing how conceptual clarity can support evaluation approaches that are development-oriented and institutionally sustainable, while remaining aligned with the long-term quality of higher education.
However, it should be emphasised that the present study does not provide a validated instrument that is ready for direct use in evaluating higher-education teachers’ work. The 95-item set should be regarded as a provisional, teacher-grounded item pool designed to map the content domain of higher-education teacher excellence and to explore its potential dimensions. Moreover, Study 2 examined teachers’ ratings of the importance of candidate indicators rather than the functioning of these items in actual teacher evaluations. The present findings therefore support construct specification and exploratory dimensionality assessment, but they do not yet justify score-based evaluative use. Further work is required to refine the item set, test alternative measurement models, establish reliability and validity across institutions and stakeholder groups, and examine how the instrument functions in concrete evaluation contexts.
Several limitations constrain interpretation and point to directions for future work. First, the study relied on a single institution in both phases (Study 1: n = 12; Study 2: n = 261), with the quantitative phase additionally based on a self-selected sample, limiting generalisability. Although the present findings should not be generalised directly beyond the study context, the broader role-based logic of the model may still be transferable to other higher-education settings where academic work is organised around similar combinations of teaching, scholarship, institutional responsibility, and external engagement. However, conceptions of excellence are likely to vary across disciplines, institutions, and national higher-education systems. Second, the study foregrounded teachers’ accounts in order to specify the construct from within the role, but it did not examine how other stakeholder groups conceptualise excellence or how they would respond to the identified indicators. Third, although the results have implications for multisource evaluation, the study did not directly test a multisource design in practice. Future research should therefore examine how different rater groups respond to those facets of excellence they can legitimately observe, and whether the dimensions identified here remain stable across evaluative contexts. Fourth, Study 2 examined teachers’ ratings of the importance of candidate indicators rather than evaluations of actual teachers. The resulting 95-item set should therefore not be treated as a validated evaluative scale; further refinement, confirmatory analyses, and validity evidence are needed before score interpretations can be justified in applied evaluation settings. Fifth, the study design involved planned missingness and subsequent imputation: the short-version design introduced structurally missing responses, and additional attrition-related missingness was imputed. Although this approach was justified and diagnostics were reported, network estimation may be sensitive to imputation choices. Sixth, the bootstrapping approach involved distributional assumptions. The bootEGA procedure assumed multivariate normality; given the skewness and restricted variance observed in some items, the stability estimates should therefore be interpreted with caution.

5. Conclusions

This study suggests that higher-education teacher excellence is best understood as a broad, role-based construct rather than as a single attribute of classroom performance. The provisional six-dimensional structure retained for interpretation indicates that, from teachers’ perspectives, excellence encompasses pedagogical practice, relationships with students, professional responsibility, innovation, scholarship, and engagement beyond the classroom. By making these dimensions explicit, the study clarifies what universities may actually mean when they refer to excellence as a marker of educational quality. However, the study should be seen as an initial step towards a more inclusive evaluation framework rather than as a model ready for immediate institutional application. The findings are based on a single university, a self-selected sample, teachers’ perspectives, and importance ratings of candidate indicators; moreover, a multisource evaluation model has not yet been tested in practice.
Nevertheless, the findings point towards a more differentiated approach to evaluation. Because these dimensions are not equally visible to all evaluators, broad judgements about teacher excellence cannot convincingly be based on student ratings alone or on other narrow indicators. Instead, the study highlights the value of aligning sources of evidence with the facets of excellence they can credibly capture. In this way, conceptual clarity is not separate from quality assurance; it is part of building evaluation practices that are more coherent, more development-oriented, and better aligned with wider commitments to inclusion and sustainability in higher education. More broadly, the findings suggest that evaluation frameworks grounded in a multidimensional understanding of teacher excellence can serve as mechanisms of professional learning rather than merely instruments of judgement. Their significance therefore lies in their capacity to support the ongoing development of teaching and, through this, to strengthen the long-term adaptability and resilience of higher-education institutions.

Author Contributions

Conceptualization, N.K.; methodology, N.K., G.S. and D.M.R.; formal analysis, N.K.; investigation, N.K.; data curation, N.K.; writing—original draft preparation, N.K.; writing—review and editing, N.K., G.S. and D.M.R.; visualization, N.K.; supervision, G.S. and D.M.R.; project administration, N.K. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Slovenian Research and Innovation Agency, grant numbers P5-0174 and P5-0062.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the Ethics Commission of the Faculty of Arts, University of Ljubljana (Protocol No. 441-2025, issued on 28 March 2025).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Acknowledgments

During the preparation of this manuscript, the authors used ChatGPT (GPT-5.2, OpenAI) for translation and to support English-language editing. The authors have reviewed and edited the output and take full responsibility for the content of this publication.

Conflicts of Interest

The authors declare no conflicts of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

Appendix A

Table A1. Item stability and network loadings of the higher-education teacher excellence scale items.
Short Name | Item Label | Item Stability * | Network Loadings
DC13 | Providing clear and understandable explanations of course content | 1 (75%) | 0.50
DC18 | Encouraging students to ask questions | 1 (68%) | 0.48
PE4 | Treating students fairly and professionally | 1 (75%) | 0.45
DC16 | Encouraging students to co-create the learning process | 1 (44%) | 0.32, 0.11
DC10 | Establishing conditions for high-quality learning | 1 (71%) | 0.31
DC17 | Enabling students to express their views | 1 (57%) | 0.27, 0.21
DC1 | Encouraging students to reach their full potential | 1 (67%) | 0.27, 0.14
PE11 | Coming to lectures, seminars, and practical sessions well-prepared | 1 (63%) | 0.19, 0.24
DC15 | Encouraging active student participation in the learning process | 1 (30%) | 0.16, 0.13
DKC1 | Contributing to the development of the scientific field | 2 (90%) | 0.51
PE10 | Demonstrating commitment to the scientific field through one’s work | 2 (90%) | 0.48
PE20 | Demonstrating openness to recent advances in one’s scientific field | 2 (59%) | 0.45
DKC2 | Creating new knowledge within the scientific field | 2 (90%) | 0.44
PE30 | Engaging in continuous professional development | 2 (45%) | 0.44
PK3 | Demonstrating expert-level mastery of one’s scientific field | 2 (91%) | 0.41
DKC3 | Demonstrating a high level of competence in research | 2 (90%) | 0.41, 0.11
PK4 | Integrating scientific disciplines | 2 (33%) | 0.35
DC30 | Integrating one’s own research with teaching | 2 (86%) | 0.31
EWC6 | Appropriately presenting and explaining scientific content to experts in one’s field | 2 (43%) | 0.30, 0.22
PE14 | Demonstrating a desire for professional development and advancement | 2 (44%) | 0.28, 0.17
DC29 | Incorporating the latest discoveries into course content | 2 (77%) | 0.24
PE21 | Demonstrating openness to interdisciplinary work | 2 (27%) | 0.18, 0.13, 0.11
PE12 | Demonstrating openness to continuous learning | 2 (13%) | 0.17, 0.19, 0.12
EWC5 | Appropriately presenting and explaining scientific content to the general public | 2 (24%) | 0.15, 0.12, 0.16
DC9 | Clearly communicating expectations (including rules and criteria) regarding students’ course requirements | 3 (80%) | 0.51
PE25 | Reflecting on one’s own conduct | 3 (68%) | 0.45
VOG1 | Setting clear boundaries for acceptable student behaviour in the course | 3 (83%) | 0.44
PE15 | Accepting feedback on one’s work from colleagues | 3 (66%) | 0.41
PK2 | Possessing foundational pedagogical knowledge | 3 (55%) | 0.41
PE1 | Demonstrating personal integrity | 3 (83%) | 0.41
DC24 | Demonstrating knowledge of how to teach | 3 (70%) | 0.11, 0.40
PE5 | Treating students respectfully | 3 (49%) | 0.10, 0.39
VOG7 | Using appropriate strategies to address inappropriate behaviour and conflicts | 3 (61%) | 0.37, 0.10
TSR3 | Being respected by students | 3 (81%) | 0.35
PE26 | Engaging in emotional self-reflection in interactions with students | 3 (58%) | 0.34, 0.14
PE9 | Showing passion for teaching | 3 (72%) | 0.34
PE23 | Critically evaluating teaching practice | 3 (43%) | 0.33
DC6 | Demonstrating objectivity in assessment/grading | 3 (58%) | 0.12, 0.11, 0.31
PE16 | Demonstrating openness to colleagues’ differing opinions | 3 (59%) | 0.31
DC14 | Communicating articulately | 3 (69%) | 0.30
DC3 | Sparking students’ interest in the professional field | 3 (37%) | 0.30, 0.11
PK1 | Demonstrating knowledge of higher-education didactics | 3 (54%) | 0.30, 0.10
PE24 | Identifying the most suitable teaching approach | 3 (67%) | 0.30
PE3 | Demonstrating consistency | 3 (66%) | 0.28
DC11 | Creating an environment in which learning is possible | 3 (57%) | 0.27, 0.14
PE2 | Demonstrating decisiveness | 3 (65%) | 0.27
DC7 | Using feedback to guide students’ learning | 3 (44%) | 0.26, 0.12
DC31 | Providing high-quality mentoring to students | 3 (38%) | 0.16, 0.26
PE18 | Treating other higher-education teachers respectfully | 3 (60%) | 0.23
DC20 | Encouraging students’ thinking | 3 (40%) | 0.13, 0.23
TC1 | Working effectively in a team | 3 (44%) | 0.22, 0.16
PE8 | Treating students as responsible individuals | 3 (29%) | 0.22, 0.17
VOG6 | Expecting moral behaviour from students | 3 (75%) | 0.21
DC22 | Encouraging students’ professional identity development | 3 (33%) | 0.19, 0.17
DC8 | Providing substantive feedback and justifying grades after assessment | 3 (25%) | 0.18, 0.24
PE13 | Acknowledging one’s own knowledge limitations | 3 (29%) | 0.14
VOG2 | Emphasising ethical dimension when addressing course content | 3 (29%) | 0.11, 0.10
DC27 | Keeping up with developments in ICT and integrating innovations into teaching | 4 (84%) | 0.51
DC25 | Using innovative teaching methods | 4 (84%) | 0.50
PE19 | Demonstrating openness to new teaching approaches and methods | 4 (83%) | 0.13, 0.49
DC26 | Integrating ICT into teaching in a meaningful way | 4 (83%) | 0.39
PK6 | Consistently adhering to formal rules (laws, regulations) in professional practice | 4 (59%) | 0.11, 0.34
PK5 | Being thoroughly familiar with legal and other formal rules that affect teaching | 4 (58%) | 0.15, 0.27, 0.12
PE29 | Continuously introducing innovations into the teaching–learning process | 4 (66%) | 0.18, 0.26
PE28 | Demonstrating willingness to change teaching practice | 4 (58%) | 0.13, 0.13, 0.22
TC3 | Demonstrating competence in performing administrative tasks | 4 (71%) | 0.21, 0.13
TSR7 | Demonstrating insight into students’ needs | 5 (97%) | 0.60
TSR6 | Demonstrating insight into students’ characteristics | 5 (100%) | 0.11, 0.53
TSR4 | Encouraging the formation of a student community | 5 (99%) | 0.49
VOG3 | Encouraging students to respect diversity | 5 (96%) | 0.42
DC4 | Recognising students’ potential and strengths | 5 (99%) | 0.39
VOG4 | Developing students’ awareness of current events in the wider social context | 5 (85%) | 0.38
TSR5 | Encouraging students to help one another | 5 (99%) | 0.11, 0.37
PE7 | Expressing confidence in students’ abilities | 5 (85%) | 0.11, 0.16, 0.37
DC28 | Connecting course content with current events and everyday life | 5 (96%) | 0.35
TSR2 | Establishing rapport with students | 5 (95%) | 0.34
PE22 | Being attentive to students’ personal concerns | 5 (100%) | 0.33
PE27 | Adapting to changes (e.g., different generations of students, new approaches) | 5 (79%) | 0.16, 0.33
DC23 | Encouraging students to think about current developments in the professional field | 5 (60%) | 0.15, 0.29
PE17 | Demonstrating openness to students’ differing opinions | 5 (75%) | 0.16, 0.17, 0.29
DC19 | Encouraging students to explore and discover new things | 5 (63%) | 0.11, 0.28
DC5 | Adapting teaching to students (based on observed changes in students) | 5 (85%) | 0.19, 0.28
TSR1 | Creating a safe environment in which students feel accepted | 5 (67%) | 0.21, 0.27
DC21 | Systematically developing students’ ability to think critically and to justify their viewpoints | 5 (62%) | 0.19, 0.25
DC2 | Motivating students to engage in their learning | 5 (84%) | 0.13, 0.25
PE6 | Being available to students for conversation | 5 (84%) | 0.25
DC12 | Creating an environment that supports students’ professional and personal development | 5 (74%) | 0.16, 0.23
VOG5 | Encouraging students to engage in activities beyond the formal curriculum | 5 (89%) | 0.22
EWC4 | Promoting interest in science among the wider public | 6 (76%) | 0.11, 0.51
DKC4 | Leading research projects | 6 (78%) | 0.11, 0.50
EWC1 | Collaborating with researchers abroad | 6 (78%) | 0.10, 0.49
EWC3 | Contributing to the dissemination of scientific findings to the general public | 6 (76%) | 0.14, 0.45
DKC5 | Securing funding for research projects | 6 (78%) | 0.13, 0.32
EWC2 | Engaging in external professional networking and collaboration (e.g., professional organisations, associations) | 6 (64%) | 0.13, 0.27
TC2 | Balancing teaching and research responsibilities | 6 (46%) | 0.19, 0.21
Note. PE = Professional Ethos; TSR = Teacher–Student Relationality; DC = Didactical Competencies; VOG = Values-Oriented Guidance; PK = Professional Knowledge; DKC = Discipline Development & Knowledge Creation; EWC = Engagement with the Wider Community; TC = Transversal Competencies. * = The integers indicate the EGA-assigned dimension for each item. The percentages represent the proportion of 500 bootstrap samples in which the item clustered within that same dimension.
Table A2. Replicability of the higher-education teacher excellence scale items.
Short Name | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11
DC13 | 75% | 0% | 8% | 0% | 2% | 3% | 8% | 2% | 1% | 0% | 0%
DC18 | 68% | 0% | 6% | 0% | 8% | 2% | 9% | 2% | 3% | 1% | 0%
PE4 | 75% | 0% | 8% | 0% | 2% | 3% | 8% | 2% | 1% | 0% | 0%
DC16 | 44% | 0% | 9% | 2% | 22% | 2% | 9% | 6% | 4% | 1% | 0%
DC10 | 71% | 0% | 8% | 0% | 5% | 2% | 8% | 2% | 3% | 0% | 0%
DC17 | 57% | 0% | 4% | 0% | 24% | 2% | 7% | 3% | 2% | 1% | 0%
DC1 | 67% | 0% | 6% | 0% | 10% | 2% | 8% | 2% | 3% | 1% | 0%
PE11 | 63% | 0% | 18% | 0% | 3% | 3% | 9% | 3% | 2% | 0% | 0%
DC15 | 30% | 0% | 12% | 0% | 30% | 2% | 11% | 7% | 6% | 1% | 0%
DKC1 | 0% | 90% | 0% | 0% | 0% | 1% | 8% | 0% | 0% | 0% | 0%
PE10 | 0% | 90% | 0% | 0% | 0% | 1% | 8% | 0% | 0% | 0% | 0%
PE20 | 0% | 59% | 9% | 0% | 0% | 4% | 23% | 4% | 1% | 0% | 0%
DKC2 | 0% | 90% | 0% | 0% | 0% | 1% | 8% | 0% | 0% | 0% | 0%
PE30 | 0% | 45% | 15% | 0% | 0% | 4% | 28% | 6% | 1% | 0% | 0%
PK3 | 0% | 91% | 0% | 0% | 0% | 1% | 8% | 0% | 0% | 0% | 0%
DKC3 | 0% | 90% | 0% | 0% | 0% | 1% | 8% | 0% | 0% | 0% | 0%
PK4 | 0% | 33% | 15% | 3% | 0% | 10% | 27% | 8% | 1% | 1% | 0%
DC30 | 0% | 86% | 2% | 0% | 0% | 2% | 9% | 1% | 0% | 0% | 0%
EWC6 | 1% | 43% | 17% | 0% | 0% | 4% | 24% | 8% | 1% | 0% | 0%
PE14 | 0% | 44% | 16% | 1% | 0% | 4% | 28% | 6% | 1% | 0% | 0%
DC29 | 0% | 77% | 5% | 0% | 0% | 3% | 12% | 2% | 0% | 0% | 0%
PE21 | 0% | 27% | 16% | 9% | 0% | 12% | 26% | 7% | 1% | 1% | 0%
PE12 | 1% | 13% | 42% | 0% | 5% | 3% | 23% | 10% | 2% | 1% | 0%
EWC5 | 0% | 24% | 6% | 5% | 0% | 43% | 15% | 5% | 1% | 1% | 0%
DC9 | 2% | 0% | 80% | 0% | 1% | 1% | 10% | 5% | 1% | 0% | 0%
PE25 | 3% | 0% | 68% | 0% | 4% | 3% | 13% | 6% | 2% | 0% | 0%
VOG1 | 0% | 0% | 83% | 1% | 0% | 1% | 9% | 4% | 1% | 0% | 0%
PE15 | 0% | 0% | 66% | 6% | 0% | 2% | 15% | 7% | 2% | 0% | 0%
PK2 | 0% | 0% | 55% | 20% | 0% | 3% | 12% | 7% | 2% | 0% | 0%
PE1 | 0% | 0% | 83% | 0% | 0% | 2% | 8% | 5% | 1% | 0% | 0%
DC24 | 3% | 0% | 70% | 3% | 0% | 3% | 11% | 7% | 2% | 0% | 0%
PE5 | 13% | 0% | 49% | 0% | 7% | 3% | 17% | 9% | 3% | 0% | 0%
VOG7 | 0% | 0% | 61% | 6% | 6% | 3% | 13% | 8% | 2% | 1% | 0%
TSR3 | 1% | 0% | 81% | 0% | 2% | 1% | 9% | 4% | 1% | 0% | 0%
PE26 | 0% | 0% | 58% | 1% | 15% | 2% | 12% | 9% | 3% | 0% | 0%
PE9 | 0% | 0% | 72% | 1% | 0% | 2% | 14% | 8% | 1% | 0% | 0%
PE23 | 3% | 0% | 43% | 1% | 19% | 3% | 16% | 11% | 3% | 1% | 0%
DC6 | 11% | 0% | 58% | 0% | 3% | 3% | 15% | 9% | 2% | 0% | 0%
PE16 | 0% | 1% | 59% | 4% | 4% | 3% | 16% | 9% | 3% | 1% | 0%
DC14 | 3% | 0% | 69% | 0% | 3% | 2% | 12% | 7% | 2% | 0% | 0%
DC3 | 1% | 2% | 37% | 0% | 32% | 2% | 12% | 10% | 3% | 1% | 0%
PK1 | 0% | 0% | 54% | 21% | 0% | 3% | 12% | 7% | 2% | 0% | 0%
PE24 | 1% | 0% | 67% | 1% | 7% | 2% | 11% | 8% | 2% | 1% | 0%
PE3 | 6% | 0% | 66% | 0% | 2% | 2% | 14% | 8% | 1% | 0% | 0%
DC11 | 0% | 0% | 57% | 1% | 14% | 2% | 15% | 9% | 2% | 0% | 0%
PE2 | 0% | 0% | 65% | 12% | 2% | 2% | 11% | 6% | 2% | 0% | 0%
DC7 | 2% | 0% | 44% | 0% | 25% | 2% | 15% | 9% | 4% | 1% | 0%
DC31 | 7% | 4% | 38% | 0% | 11% | 3% | 21% | 12% | 4% | 1% | 0%
PE18 | 1% | 0% | 60% | 10% | 1% | 3% | 14% | 8% | 2% | 1% | 0%
DC20 | 19% | 0% | 40% | 0% | 8% | 3% | 17% | 9% | 4% | 1% | 0%
TC1 | 0% | 0% | 44% | 10% | 18% | 3% | 14% | 9% | 2% | 1% | 0%
PE8 | 4% | 0% | 29% | 0% | 37% | 3% | 14% | 9% | 3% | 1% | 0%
VOG6 | 1% | 0% | 75% | 1% | 7% | 2% | 9% | 5% | 2% | 0% | 0%
DC22 | 0% | 1% | 33% | 1% | 36% | 2% | 13% | 9% | 3% | 1% | 0%
DC8 | 3% | 0% | 25% | 0% | 49% | 1% | 10% | 7% | 3% | 1% | 0%
PE13 | 10% | 1% | 29% | 0% | 25% | 3% | 17% | 10% | 4% | 1% | 0%
VOG2 | 0% | 0% | 29% | 15% | 34% | 3% | 10% | 7% | 2% | 1% | 0%
DC27 | 0% | 0% | 6% | 84% | 0% | 6% | 3% | 1% | 0% | 0% | 0%
DC25 | 0% | 0% | 8% | 84% | 0% | 2% | 4% | 2% | 0% | 0% | 0%
PE19 | 0% | 0% | 8% | 83% | 0% | 2% | 4% | 2% | 0% | 0% | 0%
DC26 | 0% | 0% | 6% | 83% | 0% | 7% | 3% | 1% | 0% | 0% | 0%
PK6 | 0% | 0% | 18% | 59% | 4% | 3% | 9% | 4% | 2% | 1% | 1%
PK5 | 0% | 0% | 19% | 58% | 4% | 3% | 9% | 4% | 2% | 1% | 1%
PE29 | 0% | 0% | 18% | 66% | 0% | 3% | 7% | 5% | 1% | 0% | 0%
PE28 | 1% | 0% | 22% | 58% | 1% | 3% | 9% | 5% | 1% | 0% | 1%
TC3 | 0% | 1% | 5% | 71% | 0% | 18% | 3% | 1% | 0% | 0% | 0%
TSR7 | 0% | 0% | 0% | 0% | 97% | 0% | 1% | 0% | 1% | 0% | 0%
TSR6 | 0% | 0% | 0% | 0% | 100% | 0% | 0% | 0% | 0% | 0% | 0%
TSR4 | 0% | 0% | 0% | 0% | 99% | 0% | 0% | 0% | 0% | 0% | 0%
VOG3 | 1% | 0% | 1% | 0% | 96% | 0% | 1% | 1% | 0% | 0% | 0%
DC4 | 0% | 0% | 0% | 0% | 99% | 0% | 0% | 0% | 0% | 0% | 0%
VOG4 | 5% | 0% | 2% | 0% | 85% | 1% | 4% | 2% | 1% | 1% | 0%
TSR5 | 0% | 0% | 0% | 0% | 99% | 0% | 0% | 0% | 0% | 0% | 0%
PE7 | 2% | 0% | 3% | 0% | 85% | 1% | 4% | 4% | 1% | 0% | 0%
DC28 | 0% | 0% | 0% | 0% | 96% | 0% | 1% | 1% | 1% | 1% | 0%
TSR2 | 0% | 0% | 1% | 0% | 95% | 0% | 1% | 1% | 1% | 0% | 0%
PE22 | 0% | 0% | 0% | 0% | 100% | 0% | 0% | 0% | 0% | 0% | 0%
PE27 | 1% | 0% | 6% | 0% | 79% | 1% | 5% | 4% | 3% | 1% | 0%
DC23 | 7% | 0% | 10% | 0% | 60% | 2% | 8% | 8% | 4% | 1% | 0%
PE17 | 3% | 0% | 5% | 0% | 75% | 1% | 6% | 6% | 3% | 1% | 0%
DC19 | 4% | 0% | 9% | 0% | 63% | 1% | 9% | 9% | 3% | 1% | 0%
DC5 | 1% | 0% | 3% | 0% | 85% | 1% | 4% | 4% | 2% | 0% | 0%
TSR1 | 2% | 0% | 12% | 0% | 67% | 2% | 7% | 6% | 3% | 1% | 0%
DC21 | 7% | 0% | 8% | 0% | 62% | 2% | 8% | 8% | 4% | 1% | 0%
DC2 | 1% | 0% | 6% | 0% | 84% | 1% | 3% | 3% | 2% | 0% | 0%
PE6 | 1% | 0% | 6% | 0% | 84% | 0% | 3% | 3% | 2% | 0% | 0%
DC12 | 0% | 0% | 13% | 0% | 74% | 1% | 6% | 3% | 2% | 1% | 0%
VOG5 | 0% | 0% | 4% | 1% | 89% | 0% | 2% | 3% | 1% | 0% | 0%
EWC4 | 0% | 15% | 0% | 6% | 0% | 76% | 1% | 1% | 0% | 1% | 0%
DKC4 | 0% | 15% | 0% | 7% | 0% | 78% | 0% | 0% | 0% | 0% | 0%
EWC1 | 0% | 15% | 0% | 7% | 0% | 78% | 0% | 0% | 0% | 0% | 0%
EWC3 | 0% | 15% | 0% | 6% | 0% | 76% | 1% | 1% | 0% | 1% | 0%
DKC5 | 0% | 15% | 0% | 7% | 0% | 78% | 0% | 0% | 0% | 0% | 0%
EWC2 | 0% | 11% | 1% | 19% | 0% | 64% | 3% | 1% | 0% | 0% | 0%
TC2 | 0% | 12% | 8% | 17% | 0% | 46% | 11% | 4% | 1% | 0% | 0%
Note. PE = Professional Ethos; TSR = Teacher–Student Relationality; DC = Didactical Competencies; VOG = Values-Oriented Guidance; PK = Professional Knowledge; DKC = Discipline Development & Knowledge Creation; EWC = Engagement with the Wider Community; TC = Transversal Competencies. Percentages indicate the proportion of bootEGA replications in which an item was assigned to each dimension; because the number of dimensions varied across replications, additional dimension labels (7–11) occurred in solutions with more than six dimensions.

References

1. Brusoni, M.; Damian, R.; Sauri, J.; Jackson, S.; Kömürcügil, H.; Malmedy, M.; Matveeva, O.; Motova, G.; Pisarz, S.; Patricia, P.; et al. The Concept of Excellence in Higher Education; ENQA: Brussels, Belgium, 2014.
2. Gunn, V.; Fisk, A. Considering Teaching Excellence in Higher Education: 2007–2013: A Literature Review since the CHERI Report 2007; Higher Education Academy: York, UK, 2014.
3. Little, B.; Locke, W. Conceptions of Excellence in Teaching and Learning and Implications for Future Policy and Practice. In Questioning Excellence in Higher Education: Policies, Experiences and Challenges in National and Comparative Perspective; Rostan, M., Vaira, M., Eds.; SensePublishers: Rotterdam, The Netherlands, 2011; pp. 119–137.
4. Miller-Young, J.; Sinclair, M.; Forgie, S. Teaching Excellence and How It Is Awarded: A Canadian Case Study. Can. J. High. Educ. 2020, 50, 40–52.
5. UNESCO. Education for Sustainable Development: A Roadmap; UNESCO: Paris, France, 2020.
6. UNESCO. Transforming Education Towards SDG4: Report of a Global Survey on Country Actions to Transform Education; UNESCO: Paris, France, 2024.
7. Veidemane, A. Education for Sustainable Development in Higher Education Rankings: Challenges and Opportunities for Developing Internationally Comparable Indicators. Sustainability 2022, 14, 5102.
8. Valantinaite, I.; Navickiene, V. The Phenomenon of Lecturer Competences as a Prerequisite for the Advancement of Sustainable Development Ideas in the Context of Student-Centred Studies. Sustainability 2024, 16, 1472.
9. Ball, S.J. The Teacher’s Soul and the Terrors of Performativity. J. Educ. Policy 2003, 18, 215–228.
10. Hillebrandt, M.; Huber, M. Editorial: Quantifying Higher Education: Governing Universities and Academics by Numbers. Polit. Gov. 2020, 8, 1–5.
11. Skelton, A. Understanding Teaching Excellence in Higher Education: Towards a Critical Approach; Routledge: London, UK, 2005.
12. Mattos, L.K.; Flach, L.; Costa, A.M.; Moré, R.P. Effectiveness and Sustainability Indicators in Higher Education Management. Sustainability 2023, 15, 298.
13. Biesta, G. Good Education in an Age of Measurement: On the Need to Reconnect with the Question of Purpose in Education. Educ. Assess. Eval. Account. 2009, 21, 33–46.
14. Fenstermacher, G.; Richardson, V. On Making Determinations of Quality in Teaching. Teach. Coll. Rec. 2005, 107, 186–213.
15. Harvey, L.; Green, D. Defining Quality. Assess. Eval. High. Educ. 1993, 18, 9–34.
16. Readings, B. The University in Ruins; Harvard University Press: Cambridge, MA, USA, 1996.
17. Hazelkorn, E. Rankings and the Reshaping of Higher Education: The Battle for World-Class Excellence; Palgrave Macmillan: New York, NY, USA, 2015.
18. Morley, L. Quality and Power in Higher Education; Open University Press: Maidenhead, UK, 2003.
19. Naidoo, R. Repositioning Higher Education as a Global Commodity: Opportunities and Challenges for Future Sociology of Education Work. Br. J. Sociol. Educ. 2003, 24, 249–259.
20. Marsh, H.; Roche, L. Making Students’ Evaluations of Teaching Effectiveness Effective: The Critical Issues of Validity, Bias, and Utility. Am. Psychol. 1997, 52, 1187–1197.
21. Shulman, L.S. Those Who Understand: Knowledge Growth in Teaching. Educ. Res. 1986, 15, 4–14.
22. Biggs, J.; Tang, C. Teaching for Quality Learning at University, 4th ed.; Society for Research into Higher Education; Open University Press: Berkshire, UK; New York, NY, USA, 2011.
23. Prince, M. Does Active Learning Work? A Review of the Research. J. Eng. Educ. 2004, 93, 223–231.
24. Ramsden, P. Learning to Teach in Higher Education, 2nd ed.; Routledge Falmer: London, UK; New York, NY, USA, 2003.
25. Kahu, E.R. Framing Student Engagement in Higher Education. Stud. High. Educ. 2013, 38, 758–773.
26. Li, J.; Xue, E. Dynamic Interaction between Student Learning Behaviour and Learning Environment: Meta-Analysis of Student Engagement and Its Influencing Factors. Behav. Sci. 2023, 13, 59.
27. Strayhorn, T.L. College Students’ Sense of Belonging: A Key to Educational Success for All Students, 2nd ed.; Routledge: London, UK, 2018.
28. Hockings, C. Inclusive Learning and Teaching in Higher Education: A Synthesis of Research; Higher Education Academy: York, UK, 2010.
29. O’Neill, G.; McMahon, T. Student-Centred Learning: What Does It Mean for Students and Lecturers? In Emerging Issues in the Practice of University Learning and Teaching; O’Neill, G., Moore, S., McMullin, B., Eds.; AISHE: Dublin, Ireland, 2005; pp. 30–39.
30. Campbell, E. The Ethics of Teaching as a Moral Profession. Curric. Inq. 2008, 38, 357–385.
31. Dewey, J. How We Think: A Restatement of the Relation of Reflective Thinking to the Educative Process; D.C. Heath & Co.: Lexington, MA, USA, 1933.
32. Schön, D.A. Educating the Reflective Practitioner: Toward a New Design for Teaching and Learning in the Professions; Jossey-Bass: San Francisco, CA, USA, 1987.
33. Kember, D.; Leung, D.; Jones, A.; Loke, A.Y.; Mckay, J.; Sinclair, K.; Tse, H.; Webb, C.; Wong, F.; Wong, M.; et al. Development of a Questionnaire to Measure the Level of Reflective Thinking. Assess. Eval. High. Educ. 2000, 25, 381–395.
34. Morris, E.J. Integrating Academic Integrity: An Educational Approach. In Second Handbook of Academic Integrity; Eaton, S.E., Ed.; Springer: Cham, Switzerland, 2024; pp. 305–324.
35. Henderson, M.; Selwyn, N.; Aston, R. What Works and Why? Student Perceptions of ‘Useful’ Digital Technology in University Teaching and Learning. Stud. High. Educ. 2017, 42, 1567–1579.
36. Kirkwood, A.; Price, L. Technology-Enhanced Learning and Teaching in Higher Education: What Is ‘Enhanced’ and How Do We Know? A Critical Literature Review. Learn. Media Technol. 2014, 39, 6–36.
37. Vaclavik, M.; Tomasek, M.; Cervenkova, I.; Baarova, B. Analysis of Quality Teaching and Learning from Perspective of University Students. Educ. Sci. 2022, 12, 820.
38. Brew, A. Research and Teaching: Beyond the Divide; Palgrave Macmillan: New York, NY, USA, 2006.
39. Laredo, P. Revisiting the Third Mission of Universities: Toward a Renewed Categorization of University Activities? High. Educ. Policy 2007, 20, 441–456.
40. Perkmann, M.; Tartari, V.; McKelvey, M.; Autio, E.; Broström, A.; D’Este, P.; Fini, R.; Geuna, A.; Grimaldi, R.; Hughes, A.; et al. Academic Engagement and Commercialisation: A Review of the Literature on University–Industry Relations. Res. Policy 2013, 42, 423–442.
41. Peters, H.P. Scientists as Public Experts: Expectations and Responsibilities. In Routledge Handbook of Public Communication of Science and Technology; Bucchi, M., Trench, B., Eds.; Routledge: London, UK, 2021; pp. 114–128.
42. Trench, B.; Bucchi, M. Global Spread of Science Communication. In Routledge Handbook of Public Communication of Science and Technology; Bucchi, M., Trench, B., Eds.; Routledge: London, UK, 2021; pp. 97–113.
43. Healey, M. Linking Research and Teaching: Exploring Disciplinary Spaces and the Role of Inquiry-Based Learning. In Reshaping the University: New Relationships between Research, Scholarship and Teaching; Barnett, R., Ed.; Society for Research into Higher Education; Open University Press: Berkshire, UK, 2005; pp. 67–78.
44. Calice, M.N.; Beets, B.; Bao, L.; Scheufele, D.A.; Freiling, I.; Brossard, D.; Feinstein, N.W.; Heisler, L.; Tangen, T.; Handelsman, J. Public Engagement: Faculty Lived Experiences and Perspectives Underscore Barriers and a Changing Culture in Academia. PLoS ONE 2022, 17, e0269949.
45. Marsh, H. Students’ Evaluations of University Teaching: Dimensionality, Reliability, Validity, Potential Biases and Usefulness. In The Scholarship of Teaching and Learning in Higher Education: An Evidence-Based Perspective; Springer: Dordrecht, The Netherlands, 2007; pp. 319–383.
46. Spooren, P.; Brockx, B.; Mortelmans, D. On the Validity of Student Evaluation of Teaching: The State of the Art. Rev. Educ. Res. 2013, 83, 598–642.
47. Boring, A. Gender Biases in Student Evaluations of Teaching. J. Public Econ. 2017, 145, 27–41.
48. Hornstein, H. Student Evaluations of Teaching Are an Inadequate Assessment Tool for Evaluating Faculty Performance. Cogent Educ. 2017, 4, 1304016.
49. MacNell, L.; Driscoll, A.; Hunt, A. What’s in a Name: Exposing Gender Bias in Student Ratings of Teaching. Innov. High. Educ. 2014, 40, 291–303.
50. Uttl, B.; White, C.; Gonzalez, D. Meta-Analysis of Faculty’s Teaching Effectiveness: Student Evaluation of Teaching Ratings and Student Learning Are Not Related. Stud. Educ. Eval. 2016, 54, 22–42.
51. Makki, A.A.; Alqahtani, A.Y.; Abdulaal, R.M.S.; Madbouly, A.I. A Novel Strategic Approach to Evaluating Higher Education Quality Standards in University Colleges Using Multi-Criteria Decision-Making. Educ. Sci. 2023, 13, 577.
52. Penny, A. Changing the Agenda for Research into Students’ Views about University Teaching: Four Shortcomings of SRT Research. Teach. High. Educ. 2003, 8, 399–411.
53. Creswell, J.W.; Plano Clark, V.L. Designing and Conducting Mixed Methods Research, 3rd ed.; SAGE Publications: Thousand Oaks, CA, USA, 2018.
54. Javed, Y.; Alenezi, M. A Case Study on Sustainable Quality Assurance in Higher Education. Sustainability 2023, 15, 8136.
55. Miles, M.B.; Huberman, A.M.; Saldaña, J. Qualitative Data Analysis: A Methods Sourcebook, 4th ed.; SAGE Publications: Thousand Oaks, CA, USA, 2020.
56. Makovec, D. The Dimensions of Teacher’s Professional Development. Sodob. Pedagog. 2018, 69, 106–125.
57. Centre for Social Informatics. 1KA, Version 25.03.21; Faculty of Social Sciences, University of Ljubljana: Ljubljana, Slovenia, 2025.
58. Stekhoven, D.J. missForest: Nonparametric Missing Value Imputation Using Random Forest, Version 1.6.1; R Core Team: Vienna, Austria, 2025.
59. Epskamp, S. Exploratory Graph Analysis: A New Approach for Estimating the Number of Dimensions in Psychological Research. PLoS ONE 2017, 12, e0174035.
60. Epskamp, S.; Fried, E.I. A Tutorial on Regularized Partial Correlation Networks. Psychol. Methods 2018, 23, 617–634.
61. Louvet, G.; Raymaekers, J.; Van Bever, G.; Wilms, I. The Influence Function of Graphical Lasso Estimators. Econom. Stat. 2023, in press.
62. Friedman, J.; Hastie, T.; Tibshirani, R. Sparse Inverse Covariance Estimation with the Graphical Lasso. Biostatistics 2008, 9, 432–441.
63. Chen, J.; Chen, Z. Extended Bayesian Information Criteria for Model Selection with Large Model Spaces. Biometrika 2008, 95, 759–771.
64. Brusco, M.; Steinley, D.; Watts, A.L. A Comparison of Spectral Clustering and the Walktrap Algorithm for Community Detection in Network Psychometrics. Psychol. Methods 2024, 29, 704–772.
65. Pons, P.; Latapy, M. Computing Communities in Large Networks Using Random Walks. In Proceedings of the Computer and Information Sciences-ISCIS 2005; Yolum, P., Güngör, T., Gürgen, F., Özturan, C., Eds.; Springer: Berlin/Heidelberg, Germany, 2005; pp. 284–293.
66. Christensen, A.P.; Golino, H. Estimating the Stability of Psychological Dimensions via Bootstrap Exploratory Graph Analysis: A Monte Carlo Simulation and Tutorial. Psych 2021, 3, 479–500.
67. Golino, H.; Christensen, A. EGAnet: Exploratory Graph Analysis—A Framework for Estimating the Number of Dimensions in Multivariate Data Using Network Psychometrics, Version 2.4.0; R Core Team: Vienna, Austria, 2025.
68. Christensen, A.P.; Golino, H. On the Equivalency of Factor and Network Loadings. Behav. Res. Methods 2021, 53, 1563–1580.
69. R Core Team. R: A Language and Environment for Statistical Computing, Version 4.5.2; R Core Team: Vienna, Austria, 2025.
70. López-Hernández, C.; Martínez-Orozco, E.; Soto-Pérez, M. Typology of Teaching Profiles: A Model for Improving the Quality of University Education in the Context of Sustainable Development Goal 4. Sustainability 2025, 17, 11066.
71. Ciuchi, O.M.; Șerbănescu, L.E.; Dobre, C.M.; Georgescu, B.G.; Țigănoaia, B.D.; Țucă, P.L. The Impact of Student Evaluation of Teaching Staff on Enhancing the Quality of Teaching in Higher Education in Romania. Sustainability 2024, 16, 10196.
72. Huang, Y.; Jiang, L.; Zhai, R. Reconciling Teaching and Research Tensions: A Sustainability Framework for Expert Teacher Development in Research Intensive Universities. Sustainability 2025, 17, 7113.
73. Robinson, J.M.; Seymour, R.; Jin, S.; Whiteman, R.S. Sense of Belonging, DFW Reduction, and Student Success: Centering Student Experience in Groups with Ethnographic Methods. Educ. Sci. 2025, 15, 523.
74. Zdonek, I.; Zdonek, D.; Król, K.; Halva, J. Sustainability-Oriented Higher Education Activities: Insights from Institutional Isomorphism Perspective. Sustainability 2025, 17, 11034.
75. Ghorbani, A.; Blankesteijn, M.L. Beyond the Ivory Tower: How Dutch Universities Convert Missions into ESG Performance. Sustainability 2026, 18, 624.
76. Wang, F.; Jo, N. Staying Without Sustainability: How Everyday Governance Reshapes Teachers’ Work in Private Higher Education in China. Sustainability 2026, 18, 1587.
77. Levy-Feldman, I. The Role of Assessment in Improving Education and Promoting Educational Equity. Educ. Sci. 2025, 15, 224.
78. Chen, X.; Wu, L.; Jia, L.; AlGerafi, M.A.M. Flow Experience and Innovative Behavior of University Teachers: Model Development and Empirical Testing. Behav. Sci. 2025, 15, 363.
79. Zhang, Y.; Sun, S.; Ji, Y.; Li, Y. The Consensus of Global Teaching Evaluation Systems under a Sustainable Development Perspective. Sustainability 2023, 15, 818.
80. Curtis, H.L.; Gabriel, L.C.; Sahakian, M.; Cattacin, S. Practice-Based Program Evaluation in Higher Education for Sustainability: A Student Participatory Approach. Sustainability 2021, 13, 10816.
81. Wongvorachan, T.; Bulut, O.; Gorgun, G.; Daniels, L. Evaluating the Validity of the Student Perspectives of Teaching Survey: A Network Psychometrics Approach. Trends High. Educ. 2025, 4, 74.
82. Zipser, N.; Kurochkin, D.; Yu, K.W.; Mincieli, L.A. Exploring Relationships Between Qualitative Student Evaluation Comments and Quantitative Instructor Ratings: A Structural Topic Modeling Framework. Educ. Sci. 2025, 15, 1011.
83. Papadogiannis, I.; Vassilakis, C.; Wallace, M.; Katsis, A. On the Quality and Validity of Course Evaluation Questionnaires Used in Tertiary Education in Greece. Trends High. Educ. 2024, 3, 221–234.
84. Bart, W.M.; Abulela, M.A.A.; Khalaf, M.A. Investigating Course Level Effects on Student Evaluations of Teaching in Higher Education. Educ. Sci. 2026, 16, 94.
85. Blake, J. Aligning Teaching Philosophy Statements with Practice: An Evidence-Based Approach Using Retrospective Think-Aloud Protocols. Educ. Sci. 2024, 14, 795.
86. Beckett, R.D.; Sheehan, A.H.; Isaacs, A.N.; Ramsey, D.; Sprunger, T. Development and Assessment of a Rubric for Evaluating Teaching Portfolios Developed by Teaching and Learning Curriculum (TLC) Program Participants. Am. J. Pharm. Educ. 2024, 88, 101262.
  87. Sánchez-Santamaría, J.; Boroel-Cervantes, B.I.; López-Garrido, F.-M.; Hortigüela-Alcalá, D. Motivation and Evaluation in Education from the Sustainability Perspective: A Review of the Scientific Literature. Sustainability 2021, 13, 4047. [Google Scholar] [CrossRef]
  88. Lim, S.H.; Lim, L.; Lye, C.Y.; Lim, W.Y. Personalised Professional Development in Teaching and Learning in Higher Education. Trends High. Educ. 2025, 4, 16. [Google Scholar] [CrossRef]
Figure 1. Dimensionality of higher-education teacher excellence. Note. PE = professional ethos; TSR = teacher–student relationality; DC = didactical competencies; VOG = values-oriented guidance; PK = professional knowledge; DKC = discipline development & knowledge creation; EWC = engagement with the wider community; TC = transversal competencies. Green edges represent positive regularised partial correlations, while red edges represent negative regularised partial correlations.
Table 1. Conceptual structure of higher-education teacher excellence (first- and second-order categories).
Categories (second order) and their subcategories (first order):

professional ethos: personal engagement, pedagogical eros, commitment to teaching, personal integrity, attitude toward students, aptitude for working with students, desire for teaching, commitment to the discipline and profession, coming to sessions well-prepared, responsibility and accountability in the teaching role, openness to learning, intellectual humility, curiosity and desire to discover new things, openness to others’ perspectives, openness to diverse didactical approaches, openness to disciplinary and scientific developments, interdisciplinary orientation, empathy, reflection on teaching practice, behavioural self-reflection, emotional self-reflection, adaptability/flexibility, willingness to modify teaching practice, commitment to professional development

teacher–student relationality: creating a safe learning environment, teacher–student relatedness, peer-relatedness facilitation, student insight and understanding

didactical competencies: motivational support, student-centred teaching, fostering students’ enthusiasm for course content, fostering students’ enthusiasm for the profession, recognising students’ potential and strengths, adapting to students’ needs, providing relevant feedback, establishing conditions for high-quality learning, instructional explanation, active participation facilitation, fostering general education, theory–practice integration, instructional competencies, method selection and application, ICT-supported instruction, integration of research and teaching, mentorship

values-oriented guidance: setting boundaries for acceptable student behaviour

professional knowledge: higher-education didactics knowledge, foundational pedagogical knowledge, subject-matter expertise, general knowledge, interdisciplinary engagement, regulatory and institutional framework knowledge

discipline development and knowledge creation: development of the scientific field/discipline, project leadership and management

engagement with the wider community: embeddedness in the wider community, promotion of science

transversal competencies: teamwork competencies, communication skills, organisational skills, self-management, balancing teaching and research responsibilities, administrative skills
Note. Codes were generated inductively from the data and subsequently organised into first-order subcategories (overarching concepts). The subcategories were then clustered into second-order categories (higher-order themes).
Table 2. Frequency distribution of the number of dimensions recovered across 500 replications using bootstrap exploratory graph analysis (bootEGA).
Number of dimensions    4      5      6      7      8      9      10     11
Frequency (%)           7.4    19.4   23.8   23.6   15.2   7.6    2.0    1.0
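The percentages in Table 2 are a simple tally of how many communities (dimensions) each bootstrap replication recovered. A minimal sketch of that tally, using a hypothetical vector of per-replication dimension counts (the study itself obtained these from 500 bootEGA replications in R; the ten-element `counts` list below is purely illustrative):

```python
from collections import Counter

def dimension_frequency_table(dim_counts):
    """Percentage frequency of each recovered dimensionality
    across bootstrap replications, as displayed in Table 2."""
    n = len(dim_counts)
    tally = Counter(dim_counts)
    return {k: round(100 * v / n, 1) for k, v in sorted(tally.items())}

# Hypothetical per-replication dimension counts from 10 replications
counts = [5, 6, 6, 7, 7, 7, 5, 8, 6, 4]
print(dimension_frequency_table(counts))
# → {4: 10.0, 5: 20.0, 6: 30.0, 7: 30.0, 8: 10.0}
```

With the study's actual 500-replication output, the same tally yields the row of percentages reported in Table 2.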
Table 3. Structural consistency of dimensions.
Dimension             1st     2nd     3rd     4th     5th     6th
Dimension stability   0.206   0.028   0.002   0.356   0.282   0.386
Table 4. Cross-phase mapping between categories from Study 1 and EGA dimensions from Study 2.
Category | D1 | D2 | D3 | D4 | D5 | D6 | Total | Main Pattern Across Phases
Professional ethos | 2 | 6 | 14 | 3 | 5 | 0 | 30 | Redistributed across several dimensions, but concentrated in D3
Teacher–student relationality | 0 | 0 | 1 | 0 | 6 | 0 | 7 | Largely retained in D5, with minor spillover to D3
Didactical competencies | 7 | 2 | 11 | 3 | 8 | 0 | 31 | Split across D1, D3, and D5, with smaller contributions to D2 and D4
Values-oriented guidance | 0 | 0 | 4 | 0 | 3 | 0 | 7 | Divided mainly between D3 and D5
Professional knowledge | 0 | 2 | 2 | 2 | 0 | 0 | 6 | Redistributed across D2, D3, and D4
Discipline development and knowledge creation | 0 | 3 | 0 | 0 | 0 | 2 | 5 | Split between D2 and D6
Engagement with the wider community | 0 | 2 | 0 | 0 | 0 | 4 | 6 | Largely retained in D6, with some overlap with D2
Transversal competencies | 0 | 0 | 1 | 1 | 0 | 1 | 3 | Dispersed; no distinct dimension recovered
Total | 9 | 15 | 33 | 9 | 22 | 7 | 95 |
Note. The first column lists the second-order qualitative categories identified in Study 1. D1–D6 denote the six dimensions retained from the EGA solution in Study 2: D1 = foundational pedagogical–didactical principles; D2 = research and scientific excellence; D3 = course management and professional ethos; D4 = innovation-oriented practice and regulatory compliance; D5 = student-centred relational orientation; D6 = external academic engagement. Cell entries indicate the number of items originally generated within each qualitative category in Study 1 that were assigned to each EGA dimension in Study 2. The final column provides an interpretive summary of the dominant cross-phase pattern.
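Each cell of Table 4 is a cross-tabulation: the number of items assigned to a given Study 1 category that landed in a given Study 2 EGA dimension. A minimal sketch of that mapping, using hypothetical item labels and assignments (the actual study crossed all 95 items against the six retained dimensions):

```python
from collections import Counter

def cross_phase_table(category_of, dimension_of):
    """Count items per (Study 1 category, Study 2 dimension) pair,
    i.e. one cell of a Table 4-style cross-tabulation per pair."""
    table = Counter()
    for item, category in category_of.items():
        table[(category, dimension_of[item])] += 1
    return dict(table)

# Hypothetical assignments for four items
category_of = {"i1": "professional ethos", "i2": "professional ethos",
               "i3": "didactical competencies", "i4": "didactical competencies"}
dimension_of = {"i1": "D3", "i2": "D3", "i3": "D1", "i4": "D5"}
print(cross_phase_table(category_of, dimension_of))
```

Row and column totals then follow by summing cells over dimensions and categories, respectively.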
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Kristl, N.; Sočan, G.; Makovec Radovan, D. Conceptualising Higher-Education Teacher Excellence for More Inclusive and Sustainable Evaluation: An Exploratory Sequential Mixed-Methods Study. Sustainability 2026, 18, 4858. https://doi.org/10.3390/su18104858