Article

Enhancing Inclusive Sustainability-Oriented Learning in Higher Education Using Adaptive Learning Platforms and Performance-Based Assessment

Department of English Language Teaching, Near East University, Mersin 10, Nicosia 99138, North Cyprus, Turkey
*
Author to whom correspondence should be addressed.
Sustainability 2026, 18(3), 1489; https://doi.org/10.3390/su18031489
Submission received: 15 January 2026 / Revised: 28 January 2026 / Accepted: 30 January 2026 / Published: 2 February 2026

Abstract

The rapid digital transformation of higher education institutions (HEIs) has created new opportunities to promote sustainability-focused teaching, learning, and assessment. At the same time, traditional assessment methods often fail to accurately measure complex skills needed for sustainability, such as systems thinking, critical reflection, and real-world problem-solving. This study examines the integration of adaptive learning platforms with performance-based assessment (PBA) as an innovative way to support inclusive, sustainability-oriented learning in higher education. Based on principles of Education for Sustainable Development (ESD), Universal Design for Learning (UDL), and constructivist learning theory, the study investigates how adaptive learning technologies tailor instruction for diverse learners while PBAs offer genuine measures of sustainability skills. Using a mixed-methods approach, data were gathered from forty-eight postgraduate students enrolled in an inclusive education course that used an adaptive learning module and PBA tasks. Learning analytics, rubric-based performance scores, and student perception surveys were analyzed to explore effects on engagement, accessibility, and skill development. The results show that this combined method enhances student inclusion, supports differentiated learning pathways, boosts engagement in sustainability tasks, and yields more complete evidence of sustainability competencies than traditional assessments. The study provides a framework for HEIs aiming to align digital transformation initiatives with sustainability objectives. It emphasizes the potential of integrating adaptive learning and PBA to promote innovative, inclusive, and sustainability-focused assessment practices. Implications for policy, curriculum design, and future digital sustainability efforts are also discussed.

1. Introduction

By providing graduates with sustainability competencies such as systems thinking, anticipatory thinking, ethical reasoning, and action competence, higher education institutions (HEIs) are expected to play a crucial role in achieving the UN Sustainable Development Goals (SDGs) [1]. These complex, interdisciplinary competencies, which must be applied in real-world situations, are difficult for traditional assessment methods to capture, as those methods are primarily summative, decontextualized, and recall-focused. To measure sustainability learning outcomes, performance-based assessment (PBA), which evaluates students through real-world assignments, projects, and portfolios, has been proposed as a more appropriate method [2].
Simultaneously, the digital revolution in higher education, propelled by AI-enabled personalization, learning analytics, and adaptive learning systems, offers significant advantages for customizing learning routes and tracking student progress in real time. Adaptive learning platforms can use learner data to provide scaffolded support, timely feedback, and differentiated materials to improve engagement and meet a range of learning needs [3,4]. By accommodating a variety of learner profiles, these technologies can democratize access to sustainability education when combined with PBA. Furthermore, this combination offers a wealth of multimodal evidence of students’ sustainability competencies through digital artifacts, analytics-informed rubrics, and e-portfolios.
This research lies in the nexus of digital transformation studies, Universal Design for Learning (UDL), and Education for Sustainable Development (ESD). ESD emphasizes the need for interdisciplinary pedagogies and evaluations that foster the desired sustainability capabilities [1]. UDL offers a framework for creating inclusive learning environments that thoughtfully address learner heterogeneity through a variety of representational, interactive, and expressive media. This strategy aligns well with the personalization features of adaptive learning platforms [5]. Finally, because of their emphasis on real-world problem-solving, reflection, and civic involvement, innovative assessment studies contend that authentic, process-oriented assessments like PBA are more aligned with the objectives of ESD than standard tests [2,6].
Empirical research that methodically combines adaptive learning platforms with PBA expressly for sustainability education remains scarce, despite mounting evidence for both adaptive learning systems and performance-based approaches. Recent studies of sustainability assessment procedures in HEIs highlight the need for strong, cross-institutional models that account for both learning outcomes and practical impact. The majority of research highlights ongoing problems with alignment between SDG-oriented learning goals and assessment procedures [6,7]. Few studies have examined how platform-generated analytics can be integrated with rubric-based PBAs to create inclusive, reliable, and scalable assessment frameworks for sustainability competencies [8], even though many report that adaptive platforms can boost engagement and personalize instruction [9,10]. Research acknowledges that these disparities make it more difficult for HEIs to operationalize digital transformation at scale for fair, sustainability-focused evaluation.
To close these gaps, this study explores how integrating performance-based assessments with adaptive learning platforms can improve inclusive, sustainability-focused learning in higher education. The study aims to develop and examine a paradigm for digital, inclusive assessment aligned with ESD and UDL principles by integrating learning analytics, rubric-based performance data, and student perspectives. This study investigates the following:
  • How does an adaptive learning platform personalize instruction for diverse learners in sustainability-related courses?
  • In what ways does performance-based assessment capture students’ sustainability competencies?
  • Does combining adaptive learning with PBA improve inclusion, engagement, and learning outcomes compared to conventional approaches?

2. Literature Review

2.1. Adaptive Learning Platforms in Higher Education

Adaptive learning routes that respond to students’ needs, preferences, and performance data are provided through adaptive learning platforms, which are now a key part of higher education’s digital transformation initiatives [11]. These systems, which continuously adjust feedback, pacing, and content difficulty using algorithms and learning analytics [12], enable more responsive and customized learning experiences. According to recent research, adaptive learning supported by technologies such as AI and augmented reality promotes better learner engagement, self-regulation, and mastery of complex competencies, especially in classrooms with a diverse student body and a wide range of digital proficiency [13,14].
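The adjustment loop such platforms run can be illustrated with a minimal sketch. The rule below is a hedged illustration only: the function name, the mastery thresholds (0.85 / 0.50), and the five-level difficulty scale are assumptions for exposition, not a description of any specific commercial system or of the platform used in this study.

```python
def next_difficulty(current_level: int, recent_scores: list[float]) -> int:
    """Toy mastery rule: raise difficulty after sustained success,
    lower it after sustained struggle, otherwise hold steady.
    Thresholds and the 1-5 level scale are illustrative assumptions."""
    if not recent_scores:
        return current_level
    mastery = sum(recent_scores) / len(recent_scores)
    if mastery >= 0.85:
        return min(current_level + 1, 5)   # cap at the hardest level
    if mastery < 0.50:
        return max(current_level - 1, 1)   # floor at the easiest level
    return current_level

# A learner showing mastery is stepped up; a struggling learner is stepped down.
print(next_difficulty(3, [0.9, 0.95, 0.88]))  # 4
print(next_difficulty(3, [0.4, 0.3, 0.55]))   # 2
```

Real platforms replace this single average with richer learner models, but the underlying pattern — score, compare against a mastery criterion, adjust the next task — is the same.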
One of adaptive learning’s main benefits is that it adheres to Universal Design for Learning (UDL) principles, which emphasize adaptability and a variety of engagement, representation, and expression channels. Adaptive platforms can reduce learning barriers and improve equal access to course content by offering tailored digital pathways, a crucial factor in teaching and evaluation focused on sustainability [15]. Adaptive systems give postgraduate students, particularly those enrolled in teacher education programs, the chance to participate in a range of instructional scenarios and reflection assignments that replicate the complexity of actual teaching [16].
Additionally, formative analytics is increasingly incorporated into adaptive learning systems, allowing teachers to track students’ progress, spot mistakes, and adjust support in real time. Such analytics-driven customization fosters deeper cognitive engagement and the growth of higher-order abilities, such as critical thinking and problem-solving, which are closely linked to sustainability education [17,18]. Adaptive platforms can scaffold learners through iterative cycles of feedback and correction when used in conjunction with performance-based assessment, enabling learners to apply sustainability ideas in real-world settings [19].
The literature also points to enduring difficulties despite these advantages. Since students who lack confidence or technological competence typically show lower engagement and slower progress through digital modules, gaps in digital literacy among learners can restrict the effectiveness of adaptive systems [20]. Large-scale implementation of adaptive learning in universities may also be hampered by institutional limitations, such as inadequate infrastructure, insufficient faculty training, and uneven governmental support [21]. These difficulties highlight the need for comprehensive, system-level strategies for digital transformation that combine technical tools with innovative teaching methods and sustainability objectives. In higher education, adaptive learning platforms offer a promising basis for creating inclusive, adaptable, and sustainable learning environments. They have the potential to enhance sustainability competencies in aspiring teachers, promote deeper learning, and increase involvement when used alongside cutting-edge evaluation approaches such as PBA.

2.2. Performance-Based Assessment (PBA) for Sustainability Competencies

In higher education, performance-based assessment (PBA) has been recognized as a crucial evaluation method, especially for courses that aim to develop sustainability-related skills. PBAs challenge students to demonstrate that they can apply knowledge, abilities, and dispositions in authentic, real-world contexts, in contrast to standard assessments that prioritize rote memorization or procedural knowledge. Due to their proven ability to foster higher-order competencies such as systems thinking, ethical and reflective reasoning, collaborative problem-solving, and action competence—all crucial elements of sustainability education—PBAs have gained increasing prominence in recent years [19,20]. In sustainability-focused programs, PBAs involve challenging, open-ended assignments that engage students in socio-environmental problem-solving.
Multidisciplinary case studies, community-based service-learning initiatives, introspective digital portfolios, multimodal presentations, and sustainability action plans are examples of these kinds of assignments. By requiring students to critically examine the environmental, social, and economic aspects of sustainability issues, these real-world assignments promote a greater comprehension and application of sustainability concepts in regional or international settings [21]. PBAs facilitate the development of participatory competencies by encouraging students to engage in collaborative inquiry. These competencies allow students to articulate a variety of viewpoints, negotiate shared meaning, and suggest workable solutions—skills that will be essential for future sustainability practitioners.
PBAs have been especially successful in developing professional identities and encouraging reflective practice in teacher education, both of which are essential for training teachers to create inclusive, equitable, and sustainable learning environments. Research demonstrates that PBAs assist pre-service and in-service educators in delving further into ideas of diversity, social justice, and inclusive pedagogy, allowing them to incorporate sustainability principles into their future instructional strategies [22]. Teacher education students encounter the integrative nature of sustainability competencies in real-world, significant ways through assignments such as creating inclusive lesson plans, developing proposals for community participation, or implementing sustainability-focused classroom interventions.
PBAs are difficult to administer and require substantial resources despite their apparent advantages. Teachers must provide strong rubrics, offer organized scaffolding, and support iterative feedback loops for effective implementation. Creating realistic, high-quality projects takes extensive knowledge and effort. Furthermore, it becomes difficult to guarantee consistency, reliability, and fairness in scoring when tasks involve open-ended responses or subjective judgment. Therefore, without sufficient institutional backing, scaling PBAs across larger cohorts remains challenging.
However, recent research indicates that PBAs’ efficacy and scalability can be significantly increased by combining them with digital technology. Multimodal performance displays, automated or real-time feedback, and analytical insights that monitor learners’ progress over time are all made possible by digital platforms [23]. Students now have more ways to demonstrate their learning in dynamic, interactive formats, thanks to tools such as e-portfolios, virtual collaborative environments, simulation platforms, and AI-supported feedback systems. By meeting learners’ diverse needs and providing individualized feedback, these technologies not only reduce instructors’ workload but also enhance the validity and inclusivity of examinations.
Performance-based assessment is positioned as a revolutionary approach to developing sustainability competencies in higher education, driven by the convergence of PBA and digital advances. PBAs can be extremely helpful in equipping graduates to handle the complexity, unpredictability, and interdisciplinarity of global sustainability concerns when adequately supported by their institutions and aligned with digital transformation programs.

2.3. Integrating Digital Transformation, Adaptive Learning, and Sustainability Education

Higher education’s pedagogy, curriculum, and assessment landscape are all changing as a result of digital transformation. This shift brings digital ecosystems into sustainability education, enabling more customized, immersive, and interactive learning opportunities. Advanced digital tools, including data-driven analytics, AI-assisted platforms, adaptive learning systems, VR-based simulations, and collaborative online environments, offer new pedagogical opportunities for students to engage with sustainability issues in dynamic, context-rich ways, according to recent studies [24,25]. In line with important sustainability education paradigms that prioritize authenticity, real-world relevance, and complexity-informed learning, these technologies support scenario-based learning, immersive simulations, and multidisciplinary cooperation.
The use of cutting-edge digital assessment techniques is a key element of digital transformation in sustainability education. Systems thinking, ethical reasoning, problem-solving, and collaborative decision-making are examples of sustainability competencies that may now be directly incorporated into digital learning assignments thanks to emerging assessment tools. Teachers can monitor students’ engagement patterns, progress in sustainability competences, and depth of conceptual understanding over time using learning analytics technologies, which provide comprehensive insights into students’ developmental paths [26]. This facilitates the early identification of students who may need more help or varied scaffolding, enabling more informed teaching decisions.
These possibilities are further expanded by adaptive learning systems, which ensure that students follow individualized paths as they navigate content with a sustainability focus. To reduce obstacles associated with digital preparedness, prior knowledge, or cognitive load, these systems can modify instructional materials, scaffold learning tasks, and provide tailored feedback based on learner data [27]. To improve accessibility, fairness, and learner agency in sustainability-focused courses that regularly address intricate, multidisciplinary topics, adaptive learning can be critical.
A highly effective model for sustainability education is provided by combining performance-based assessment (PBA) and adaptive learning. By focusing on knowledge gaps, providing individualized practice opportunities, and ensuring that all learners achieve a baseline of conceptual preparation before performing realistic assessments, adaptive learning platforms can help students become ready for challenging, real-world performance tasks. PBAs, on the other hand, offer students rich, practical settings in which they can practice sustainability competencies. Research indicates that learner engagement, motivation, and the depth of learning all increase significantly when adaptive scaffolding precedes performance challenges [28,29]. This is particularly relevant in teacher education programs, as students are required to connect professional practice, reflective assignments, inclusive pedagogical techniques, and theoretical notions of sustainability.
Despite these encouraging developments, there remains a significant lack of empirical research that deliberately integrates PBA, adaptive learning, and digital transformation into sustainability education. Instead of treating these elements as linked educational practices, existing work often treats them separately. In digitally transformed learning environments, scholars emphasize the need for integrated research frameworks that address digital equity, inclusivity, culturally responsive pedagogy, and the development of strong sustainability competencies [30]. This disparity underscores the importance of research like this, which examines the complementary impacts of PBA and adaptive learning on postgraduate students’ development of sustainability competencies, particularly in inclusive education and English language instruction.

2.4. Theoretical Framework

This paper explains how combining adaptive learning platforms with performance-based assessment (PBA) improves sustainability competencies in higher education by drawing on constructivist learning theory, Universal Design for Learning (UDL), and Education for Sustainable Development (ESD). By highlighting the necessity of pedagogies that foster systems thinking, ethical reasoning, and action competence, ESD offers the primary justification [31]. By immersing students in actual sustainability scenarios, PBA facilitates the realistic, context-rich learning experiences that these abilities demand [32].
By promoting flexible learning pathways and multimodal support for a range of learner demands, UDL influences the study’s inclusive design orientation [33]. By providing varying difficulty levels, scaffolded feedback, and individualized training, adaptive learning systems exemplify UDL concepts [34]. This is especially true in teacher education focused on sustainability, as students bring a variety of digital skills and work experience.
The use of PBA is supported by constructivist learning theory since performance tasks call for contextual problem-solving, reflective judgment, and active knowledge production [35]. PBAs encourage in-depth comprehension and the application of sustainability ideas in teaching practice through ELT-based sustainability challenges, micro-projects, and policy redesign assignments. This integration is further supported by the philosophy of digital transformation, especially learning analytics [36]. Adaptive systems enable practical mastery of challenging sustainability concepts by using analytics to improve learner paths, track progress, and provide tailored feedback [37].
PBAs and adaptive modules are positioned as complementary elements in the conceptual framework. PBAs necessitate cross-context transfer and realistic performance, whereas adaptive learning provides the fundamental framework for conceptual mastery [38]. The development of sustainability competencies is mediated by three mechanisms: reflective consolidation, authentic application, and tailored scaffolding [39]. Because students with more advanced digital skills gain more from adaptive affordances, digital readiness moderates results [40]. In accordance with SDG 4 (Quality Education) and SDG 10 (Reduced Inequalities), the framework as a whole describes how a digital, inclusive, and authentic assessment environment may promote sustainability-oriented learning outcomes.
In this study, the selection of systems thinking, ethical and equity-oriented decision-making, and action competence as focal sustainability competencies is grounded in widely recognized and standardized frameworks within Education for Sustainable Development (ESD). Internationally referenced models—including UNESCO’s ESD competencies framework, Wiek et al.’s key sustainability competencies model, and the subsequent higher education assessment literature—consistently identify these three competencies as core, integrative dimensions of sustainability learning [41]. Rather than representing isolated skills, they collectively reflect the cognitive, normative, and behavioral dimensions required for learners to understand, evaluate, and act upon complex sustainability challenges.
Systems thinking represents the cognitive dimension of sustainability competence, emphasizing learners’ ability to recognize interconnections, feedback loops, and systemic relationships across environmental, social, and educational contexts [42]. In sustainability education, systems thinking enables students to move beyond linear cause–effect reasoning and instead analyze inclusive education systems as dynamic, interdependent structures involving policy, pedagogy, culture, and community actors [43]. This competency is repeatedly identified as foundational in ESD frameworks, as it underpins informed decision-making and anticipatory understanding of sustainability challenges.
Ethical and equity-oriented decision-making reflects the normative dimension of sustainability, focusing on values, justice, responsibility, and fairness [44]. The ESD literature emphasizes that sustainability education must cultivate learners’ capacity to evaluate dilemmas involving power, inclusion, cultural diversity, and social equity. In the context of inclusive education and teacher preparation, this competency is particularly critical, as educators are required to make value-laden decisions that affect marginalized learners, access to education, and equitable learning opportunities. Ethical reasoning thus operationalizes sustainability as a moral and social commitment rather than a purely technical concern.
Action competence captures the behavioral and transformative dimension of sustainability, referring to learners’ ability to design, implement, and evaluate concrete actions that promote sustainable and inclusive practices [45]. Originating in Scandinavian ESD traditions and now widely adopted in sustainability education research, action competence emphasizes empowerment, agency, and real-world engagement. Rather than measuring passive knowledge acquisition, this competency focuses on learners’ readiness to initiate change, collaborate with stakeholders, and translate sustainability principles into professional and community-based practice.
The integration of these three competencies is explicitly aligned with SDG Target 4.7, which calls for education systems to ensure that learners acquire the knowledge, skills, values, and attitudes needed to promote sustainable development, global citizenship, equity, and social justice [46]. Systems thinking supports SDG 4.7 by enabling learners to understand sustainability as a multidimensional and interconnected phenomenon; ethical and equity-oriented decision-making aligns with the target’s emphasis on human rights, gender equality, and cultural diversity; and action competence directly reflects SDG 4.7’s focus on active citizenship and meaningful participation in sustainability-oriented change.
From an assessment perspective, ESD scholarship increasingly argues that these competencies cannot be validly captured through traditional, decontextualized testing methods. The contemporary ESD assessment literature emphasizes the need for authentic, performance-based, and process-oriented assessment approaches that evaluate learners’ reasoning, judgment, and action in realistic contexts. By focusing on systems thinking, ethical decision-making, and action competence, the present study responds to this call and aligns its assessment design with international recommendations for sustainability competency evaluation in higher education.
Accordingly, the theoretical framework positions these three competencies as analytically distinct yet pedagogically interdependent dimensions of sustainability learning. Their combined assessment through adaptive learning and performance-based tasks reflects current advances in ESD-aligned evaluation, offering a coherent and theoretically grounded approach to measuring sustainability competencies in digitally mediated higher education environments.

3. Methodology

3.1. Research Design

This study employed a mixed-methods research design, integrating quantitative learning analytics data with qualitative insights to evaluate the efficacy of a digital, adaptive performance-based assessment (PBA) model in improving sustainability skills. Mixed-methods research is particularly well-suited for intricate educational innovations, as it provides complementary evidence—quantitative performance metrics and comprehensive experiential narratives—that facilitates a thorough understanding of the impact of assessment innovations on sustainability-oriented learning outcomes. The design adhered to an explanatory sequential framework, segmented into quantitative and qualitative phases. Phase 1 comprised quantitative data obtained from the baseline survey, digital learning analytics, and performance scores before and after the intervention. Phase 2 comprised qualitative data derived from semi-structured interviews conducted to unveil, contextualize, and enhance comprehension of quantitative patterns. This strategy ensured methodological triangulation and enabled the results to show both the quantitative effects of the adaptive PBA model and participants’ subjective learning experiences.
To ensure methodological clarity and analytical transparency, the study variables were explicitly defined and categorized in accordance with the research design. The independent variable was the adaptive learning–supported performance-based assessment (PBA) intervention, which integrated personalized digital learning pathways with authentic, rubric-based sustainability performance tasks. This intervention constituted the primary instructional and assessment treatment applied during the eight-week study period.
The dependent variables were students’ sustainability competencies, operationalized through three analytically distinct but theoretically interrelated dimensions: systems thinking, ethical and equity-based decision-making, and action competence. Systems thinking captured students’ ability to analyze sustainability challenges holistically and recognize interconnections within inclusive educational systems. Ethical and equity-based decision-making reflected learners’ capacity to evaluate sustainability-related dilemmas using principles of fairness, justice, and social responsibility. Action competence represented students’ ability to design and propose feasible, sustainability-oriented actions applicable to real educational and community contexts.
Digital readiness was treated as a moderating variable, reflecting students’ prior familiarity with digital learning environments, confidence in navigating educational technologies, and capacity to engage effectively with adaptive learning systems. This variable was examined to determine whether differences in digital preparedness influenced students’ responsiveness to the adaptive PBA intervention and their sustainability competency development. Several control and contextual variables were maintained to reduce confounding effects and enhance internal validity. These included the course context (a postgraduate Inclusive Education course within an ELT program), standardized instructional content across participants, and the use of common assessment rubrics and performance criteria for all PBA tasks. By holding these contextual factors constant, the study sought to isolate the effects of the adaptive PBA intervention on the targeted sustainability competencies.
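In its simplest form, the moderating role of digital readiness amounts to comparing competency gain scores across readiness groups. The sketch below illustrates that comparison with invented sample values — the student records, group labels, and scores are hypothetical and do not reproduce the study's data.

```python
from statistics import mean

# Hypothetical (invented) pre/post rubric scores keyed by digital-readiness group.
students = [
    {"readiness": "high", "pre": 2.1, "post": 3.6},
    {"readiness": "high", "pre": 2.4, "post": 3.8},
    {"readiness": "low",  "pre": 2.0, "post": 3.0},
    {"readiness": "low",  "pre": 2.2, "post": 3.1},
]

def mean_gain(group: str) -> float:
    """Average post-minus-pre gain for one readiness group."""
    gains = [s["post"] - s["pre"] for s in students if s["readiness"] == group]
    return round(mean(gains), 2)

# If high-readiness students gain systematically more, digital readiness
# moderates the effect of the adaptive PBA intervention.
print(mean_gain("high"), mean_gain("low"))
```

A full analysis would test this group difference statistically (e.g., an interaction term in a regression model) rather than eyeballing group means, but the logic of the moderation question is the same.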
The study was designed to examine the integrated effect of adaptive learning and performance-based assessment as a unified instructional and assessment model rather than to isolate the individual impact of each component. This design choice reflects current sustainability and UDL-oriented pedagogical practices, where adaptivity and authentic assessment are typically implemented in combination to support inclusive, competency-based learning. Consequently, the research focuses on evaluating the effectiveness of the combined model in an authentic educational setting, rather than benchmarking it against non-adaptive or non-performance-based alternatives.

3.2. Participants and Context

The study was conducted within a postgraduate course on Inclusive Education offered by the English Language Teaching (ELT) Department at a public university. The course includes a dedicated sustainability component addressing inclusive pedagogy, equity, global citizenship, and socially just educational practices, aligning with SDGs 4 (Quality Education), 5 (Gender Equality), 10 (Reduced Inequalities), and 16 (Peace, Justice, and Strong Institutions). The course was delivered in a blended learning format combining face-to-face sessions with structured digital activities.
Participant recruitment followed a census-based, voluntary sampling approach, whereby all students enrolled in the course during the semester were invited to participate in the study. At the beginning of the semester, the instructor introduced the research objectives, procedures, and ethical safeguards during a scheduled class session. Students were informed that participation was entirely voluntary, that non-participation would not affect course grades or standing, and that they could withdraw at any stage without penalty.
A total of 48 postgraduate students (32 females, 16 males; mean age = 27.4 years) provided informed consent and met the study’s inclusion criteria. The sample size (n = 48) reflects the full cohort of students enrolled in the postgraduate Inclusive Education course during the study period and was therefore determined by the authentic instructional context rather than probabilistic sampling. This size is appropriate for the study’s mixed-methods, exploratory design, which emphasizes depth of analysis, triangulation of data sources, and detailed examination of learning processes within a real educational setting.
In particular, the integration of learning analytics, rubric-based performance assessments, reflective journals, and semi-structured interviews required intensive data collection and analysis at the individual level. Such designs are commonly used in sustainability and educational innovation research to validate conceptual models and examine mechanisms of change before large-scale implementation. Accordingly, the findings are intended to provide analytical rather than statistical generalization, offering transferable insights into how adaptive learning and performance-based assessment can support sustainability competency development in higher education.
All participants had prior professional experience teaching English at the primary or secondary level (ranging from 1 to 7 years). Importantly, none of the participants had previous experience with adaptive learning platforms or performance-based assessments explicitly designed to measure sustainability competencies, making the cohort appropriate for examining adoption, learning processes, and outcomes associated with the intervention. Inclusion criteria for participation were:
(a) Enrollment in the postgraduate Inclusive Education course;
(b) Provision of informed consent to participate in the study;
(c) No prior experience with adaptive learning platforms or sustainability-focused performance-based assessments;
(d) Active participation throughout the intervention period.
Exclusion criteria included:
(a) Incomplete participation in the intervention activities;
(b) Missing pre-intervention or post-intervention survey data;
(c) Withdrawal from the course or the study during the intervention period.
All enrolled students who consented completed the intervention and provided complete datasets; therefore, no participants were excluded from the final analysis. This ensured data completeness and strengthened the internal validity of the study.

3.3. Instrumentation

The study used a purpose-built digital platform called SustainLearn. The platform included adaptive modules aligned with sustainability-focused material and adjusted to each student’s performance data and learning analytics. Interactive, scenario-based assignments spanned diverse classroom settings and addressed equality, global citizenship, cultural responsiveness, and inclusiveness. An analytics dashboard allowed students to track their progress, task completion time, mastery level, and growth in understanding.
The PBA tasks were designed to assess three critical sustainability competencies: systems thinking (solving problems within an inclusive school ecosystem), ethical and equity-driven decision-making (addressing issues of exclusion or bias), and action competence (developing an action plan for inclusion and sustainability).
For each PBA activity, students had to examine a real-world ELT sustainability situation, make evidence-based judgments, justify their choices using principles of sustainability and inclusive education, and produce an artifact such as a lesson plan, a policy proposal, or a small community initiative. Three specialists in environmental education and inclusive pedagogy validated the scoring rubrics.
Two surveys were administered. The first was a pre-intervention survey on digital readiness, sustainability skills, and perceptions of innovative assessment practices. The second was a post-intervention survey on reported learning gains, the usability of the adaptive platform, and the perceived fairness and authenticity of the PBA tasks. The quantitative survey instrument was designed to measure students’ sustainability competencies across three theoretically grounded dimensions: systems thinking, ethical and equity-oriented decision-making, and action competence. This construct structure reflects established ESD competency frameworks and guided both item development and analytical procedures. Survey items were grouped a priori according to their conceptual alignment with each competency dimension rather than treated as a one-dimensional scale.
The systems thinking subscale included items assessing learners’ ability to recognize interconnections, analyze complex educational and social systems, and consider multiple stakeholders when addressing sustainability challenges. The ethical and equity-oriented decision-making subscale comprised items evaluating students’ capacity to identify bias, apply principles of fairness and social justice, and make value-informed judgments in inclusive educational contexts. The action competence subscale focused on learners’ perceived ability to design, implement, and evaluate concrete sustainability-oriented actions applicable to professional and community settings.
Internal consistency reliability was assessed separately for each subscale using Cronbach’s alpha. The results indicated satisfactory to strong reliability across all three competency dimensions: systems thinking (α = 0.84), ethical and equity-oriented decision-making (α = 0.86), and action competence (α = 0.88). These values exceed commonly accepted thresholds for internal consistency in educational research and support the reliability of the subscales as distinct yet related measures of sustainability competence. The overall scale reliability remained high (α = 0.87 pre-intervention; α = 0.91 post-intervention), further indicating coherence among the dimensions while preserving their conceptual distinctiveness.
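For readers who wish to reproduce this style of reliability analysis, Cronbach’s alpha can be computed directly from a respondents-by-items matrix using its standard formula. The sketch below is illustrative only: the function name and the synthetic Likert data are our own, and the study’s item-level responses are not reproduced here.

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a respondents-by-items matrix of Likert scores.

    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)
    """
    X = np.asarray(item_scores, dtype=float)
    k = X.shape[1]                              # number of items in the subscale
    item_var = X.var(axis=0, ddof=1).sum()      # sum of per-item variances
    total_var = X.sum(axis=1).var(ddof=1)       # variance of respondents' totals
    return k / (k - 1) * (1 - item_var / total_var)

# Hypothetical 5-point Likert responses (6 respondents x 4 items), one subscale
demo = [[4, 4, 5, 4],
        [3, 3, 3, 4],
        [5, 4, 5, 5],
        [2, 3, 2, 2],
        [4, 5, 4, 4],
        [3, 3, 4, 3]]
alpha_demo = cronbach_alpha(demo)
```

In practice, alpha would be computed separately for each of the three subscales (systems thinking, ethical/equity decision-making, action competence), as reported above.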
Although confirmatory factor analysis (CFA) is often recommended to statistically validate the factorial structure of multi-dimensional instruments, the present study’s sample size (n = 48) did not meet recommended minimum thresholds for robust CFA estimation. Methodological guidelines suggest that CFA typically requires substantially larger samples to ensure stable parameter estimation and adequate statistical power. Consequently, CFA was not conducted in this study to avoid unreliable or misleading model fit results.
After each PBA assignment, students submitted reflective journals. Semi-structured interviews were also conducted with 16 purposively selected students, including high, moderate, and low performers. Interview protocols examined attitudes toward digital transformation, assessment integrity, sustainability competencies, and inclusion.

3.4. Intervention Procedure

The intervention took place during eight weeks of the 14-week semester. Weeks 1 and 2 served as orientation and preparation: students completed the pre-intervention survey, received an introduction to SustainLearn and an orientation on sustainability skills, and completed an ungraded baseline PBA assignment that established initial competency levels. The central part of the intervention, from weeks 3 to 6, consisted of three adaptive learning cycles, each containing personalized material and a performance-based challenge. Table 1 shows these cycles.
As Table 1 shows, the platform adjusted the complexity, branching, and scaffolding after each cycle based on individual performance. Post-assessment and consolidation took place in weeks 7 and 8: students completed a capstone performance assignment that tested all of their skills, the post-intervention survey was administered, reflective journals were submitted, and semi-structured interviews were conducted.

3.5. Data Collection

Data collection occurred across the eight-week intervention period and followed a structured, multi-source design consistent with the mixed-methods approach. Quantitative and qualitative data were collected sequentially and in parallel to enable triangulation and a comprehensive evaluation of the adaptive performance-based assessment model. The quantitative data collected included:
(a) pre-intervention and post-intervention survey responses measuring sustainability competencies, digital readiness, and perceptions of assessment practices;
(b) learning analytics generated by the adaptive platform, including task completion time, mastery progression, engagement patterns, and adaptive pathway data; and
(c) rubric-based performance scores from all performance-based assessment (PBA) tasks, including the baseline diagnostic task, three adaptive-cycle PBAs, and the final integrated PBA.
Qualitative data consisted of reflective digital journals submitted by all participants after each PBA task, semi-structured interview transcripts from a purposive subsample of 16 students (representing high, moderate, and low performance profiles), and instructor field notes recorded during intervention sessions. These multiple data sources allowed for in-depth exploration of learners’ experiences, perceived challenges, and professional meaning-making related to sustainability competencies.
Participation consent was obtained prior to data collection through a written informed consent form approved by the university’s institutional review board. The consent form clearly explained the study’s purpose, procedures, data usage, confidentiality measures, and participants’ right to withdraw at any time without academic or personal consequences.
To ensure data completeness and integrity, pre- and post-surveys were administered during scheduled class sessions, and submission deadlines for PBAs and reflective journals were aligned with course requirements. Data were screened for missing values, and participant identifiers were replaced with pseudonyms during analysis. Because all consenting participants completed the required surveys and intervention tasks, no cases were removed due to missing data.
All digital data were securely stored on encrypted, password-protected servers accessible only to the research team. Interview recordings were transcribed verbatim, anonymized, and cross-checked for accuracy prior to analysis. The systematic handling of data ensured consistency, confidentiality, and methodological rigor throughout the study.

3.6. Data Analysis

Paired-samples t-tests were used to analyze differences between pre- and post-intervention sustainability competency ratings, with Cohen’s d used to estimate effect sizes. Learning analytics data were summarized using descriptive statistics to characterize engagement patterns. Regression analysis examined which factors predicted PBA performance, and cluster analysis distinguished learning-path profiles such as adaptive high-growth, steady-growth, and low-response.
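As a transparency aid, the core pre/post comparison can be reproduced in a few lines of code. The sketch below uses invented ratings, not the study’s data, and it assumes the change-score formulation of Cohen’s d for paired designs (mean difference divided by the SD of the differences); other formulations exist, and the article does not specify which was used.

```python
import numpy as np

def paired_t_and_d(pre, post):
    """Paired-samples t statistic and Cohen's d (change-score formulation).

    d = mean(diff) / sd(diff);  t = d * sqrt(n)  for n paired observations.
    """
    diff = np.asarray(post, dtype=float) - np.asarray(pre, dtype=float)
    n = diff.size
    d = diff.mean() / diff.std(ddof=1)
    t = d * np.sqrt(n)
    return t, d

# Invented 5-point competency ratings for six students (pre vs. post)
pre = [3.0, 3.2, 2.9, 3.4, 3.1, 3.3]
post = [4.1, 4.0, 3.9, 4.3, 4.2, 4.0]
t, d = paired_t_and_d(pre, post)
# With df = n - 1 = 5, t can be compared against the two-tailed critical
# value t(0.05, 5) ≈ 2.571 when a p-value routine is unavailable.
```

A statistics library (e.g., SciPy’s `ttest_rel`) would normally supply the exact p-value; the manual version above keeps the sketch dependency-light.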
The data analysis procedures were explicitly aligned with the study’s variable structure, examining changes in the dependent sustainability competencies in relation to the adaptive PBA intervention, while accounting for the moderating role of digital readiness and controlling for instructional and contextual consistency.
The qualitative data were subjected to thematic analysis using Braun and Clarke’s six-phase methodology, encompassing familiarization, open coding, category generation, theme development, theme review and refinement, and results integration. Two researchers independently coded the data. The inter-rater reliability was κ = 0.82.
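Inter-rater agreement of the kind reported above (κ = 0.82) follows Cohen’s standard formula. The function and toy codings below are illustrative stand-ins, not the study’s actual coding data.

```python
def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two raters' nominal codes: (p_o - p_e) / (1 - p_e)."""
    assert len(rater1) == len(rater2)
    n = len(rater1)
    labels = set(rater1) | set(rater2)
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n    # observed agreement
    p_e = sum((rater1.count(l) / n) * (rater2.count(l) / n)  # chance agreement
              for l in labels)
    return (p_o - p_e) / (1 - p_e)

# Toy example: two coders assigning theme labels to eight journal excerpts
coder_a = ["T1", "T1", "T2", "T3", "T2", "T4", "T1", "T3"]
coder_b = ["T1", "T1", "T2", "T3", "T2", "T4", "T2", "T3"]
kappa = cohens_kappa(coder_a, coder_b)
```

Values above roughly 0.80 are conventionally read as strong agreement, consistent with the κ = 0.82 reported for the two independent coders.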

3.7. Ethical Considerations

The university’s institutional review board approved the study. Participants provided informed consent and could leave at any time. We used pseudonymization and secure data storage to ensure the data remained private and anonymous.

4. Results

Table 2 presents the findings on the improvement in sustainability competencies. Paired-samples t-tests indicated a statistically significant increase in students’ sustainability competency levels following the 8-week adaptive PBA intervention, and the substantial effect sizes (d = 1.58–2.00) indicate that the digital adaptive PBA model had a strong practical effect on the targeted skills.
For systems thinking, the pre-test score was 3.11 and the post-test score 4.08, showing that students became much better at applying holistic, multi-perspective thinking to sustainability concerns; the effect size was substantial (d = 1.63). For ethical/equity decision-making, scores rose from 3.24 (pre-test) to 4.21 (post-test), indicating that students markedly enhanced their ability to make ethical, inclusive, and equity-oriented judgments in ELT classroom issues (d = 1.58).
Action competence rose from a pre-test score of 2.98 to a post-test score of 4.02. This was the most significant change: students felt better equipped to plan or lead programs that promote sustainability and inclusivity. Its effect size (d = 1.74) was the largest among all the competencies.
Overall sustainability competency scores rose from 3.11 (pre-test) to 4.10 (post-test). The intervention thus led to a significant improvement across all skills, indicating that combining adaptive learning with PBAs is effective.
Table 3 shows how performance improved across the adaptive cycles. Learning analytics indicated steady gains over the three cycles: the average score was 72.4% in cycle 1, 81.6% in cycle 2, and 87.9% in cycle 3. A repeated-measures ANOVA revealed a significant increasing trend across cycles, F(2, 94) = 32.18, p < 0.001, η2 = 0.41, indicating substantial learning benefits as the adaptive platform optimized each learner’s approach. Despite the tasks becoming more difficult, time-on-task continued to decrease (mean decrease = 18.7%), reflecting growing mastery and self-regulation.
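For completeness, a one-way repeated-measures F of this kind can be derived from a subjects-by-cycles score matrix via the standard partitioning of sums of squares. The sketch below uses invented scores (the per-student cycle data are not reproduced in the article), and the function name is our own.

```python
import numpy as np

def rm_anova(scores):
    """One-way repeated-measures ANOVA F and partial eta-squared.

    `scores` is an (n subjects x k conditions) matrix; SS_total is split into
    condition, subject, and error components, and F = MS_cond / MS_error.
    """
    X = np.asarray(scores, dtype=float)
    n, k = X.shape
    grand = X.mean()
    ss_cond = n * ((X.mean(axis=0) - grand) ** 2).sum()   # between-cycle SS
    ss_subj = k * ((X.mean(axis=1) - grand) ** 2).sum()   # between-subject SS
    ss_err = ((X - grand) ** 2).sum() - ss_cond - ss_subj
    df_cond, df_err = k - 1, (k - 1) * (n - 1)
    F = (ss_cond / df_cond) / (ss_err / df_err)
    eta_p2 = ss_cond / (ss_cond + ss_err)                 # partial eta-squared
    return F, eta_p2

# Invented cycle scores (%) for four students across three adaptive cycles
demo = [[70, 80, 88],
        [72, 82, 87],
        [74, 81, 89],
        [71, 83, 88]]
F, eta_p2 = rm_anova(demo)
```

Partial η² here plays the same role as the η² = 0.41 reported above: the share of non-subject variance attributable to the cycle factor.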
Table 4 presents the learning-path profiles identified through cluster analysis (k = 3), which revealed three patterns of learner response. Adaptive high-growth learners (35.4%; n = 17) made significant strides in cycles 2 and 3, completed tasks quickly, and earned high rubric ratings on PBA 3 and the final integrative work. Steady-growth learners (45.8%; n = 22) showed steady linear development, modest reductions in time-on-task, and mid-to-high performance on the summative PBA. Low-response learners (18.8%; n = 9) exhibited little adaptive response, modest improvement in mastery trajectory, and weaker performance on action-competence metrics. Cross-tabulation revealed that low-response learners had lower initial digital readiness scores (p < 0.05), indicating that digital transformation literacy influences the efficacy of adaptive assessments.
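The clustering step can be illustrated with a minimal k-means implementation; in practice a library routine (e.g., scikit-learn’s `KMeans`) would normally be used. The toy learner features below are invented stand-ins for the analytics variables, and the article does not specify which algorithm or features were actually used.

```python
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    """Minimal Lloyd's-algorithm k-means; returns labels and centroids."""
    X = np.asarray(X, dtype=float)
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]   # random init
    for _ in range(iters):
        # assign each point to its nearest centroid
        dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = dists.argmin(axis=1)
        # recompute centroids; keep the old one if a cluster empties out
        new = np.array([X[labels == j].mean(axis=0) if (labels == j).any()
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers

# Toy learner features: [mastery growth, proportional time-on-task reduction]
features = [[0.90, 0.30], [0.85, 0.28], [0.50, 0.15],
            [0.55, 0.12], [0.10, 0.02], [0.15, 0.04]]
labels, centers = kmeans(features, k=3)
```

With real data, the three resulting clusters would then be interpreted against the rubric and analytics variables, as in the high-growth/steady-growth/low-response profiles above.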
Table 5 presents the performance-based assessment outcomes. The baseline diagnostic showed that 64.1% of the students had a moderate comprehension of sustainability but lacked a deep understanding of the systems viewpoint and equitable reasoning. On PBA 1—Systems Thinking (mean score 75.8%), students improved significantly in identifying stakeholders, examining school systems, and addressing inclusion-related problems. On PBA 2—Ethical Decision-Making (mean score 83.2%), students demonstrated a stronger ability to judge prejudice, make fair judgments, and show reflective reasoning.
On PBA 3—Action Competence, students achieved a mean score of 88.7%, indicating that they developed realistic, executable ELT sustainability interventions and demonstrated maturity in planning and implementing them. The final integrated PBA score of 90.4% was the strongest performance: students applied systems thinking, ethics, and action-based reasoning simultaneously. The final integrated PBA demonstrated a robust connection among sustainability ideas, inclusive pedagogy, and practical application—indicative of the intervention’s efficacy in transferring learning from platform assignments to real assessments.
Table 6 presents a summary of the qualitative data analysis. Thematic analysis of interviews and reflective journals produced four dominant themes, supported by sub-themes and representative quotes. These themes are discussed below.

4.1. Theme 1: Enhanced Understanding of Sustainability Through Real-World Contexts

This theme reflects how performance-based assessment (PBA) tasks situated sustainability learning within authentic educational contexts, enabling participants to develop conceptual depth and cognitive integration, rather than surface-level understanding. Students’ narratives indicate that sustainability became meaningful when framed through realistic classroom dilemmas, institutional policies, and community-based challenges. Analytically, this theme demonstrates the development of systems thinking as a cognitive sustainability competency, as learners began to recognize interdependencies among pedagogical, social, cultural, and policy-related factors in inclusive education settings. Students reported a deeper understanding of sustainability when tasks were tied to real classroom dilemmas. Here are some of their comments:
“The scenarios felt like situations I actually face as an ELT teacher… It helped me understand sustainability as something practical, not abstract.”
“Designing the community micro-project made me think differently about inclusion.”
Students emphasized that traditional assessments rarely offer such relevance or authenticity, with illustrative comments such as “The scenarios felt like situations I actually face as an ELT teacher” and “It helped me understand sustainability as something practical, not abstract.” Such comments exemplify this cognitive transformation but do not alone constitute it. Rather, the analytical significance of this theme lies in showing how authentic assessment contexts functioned as cognitive scaffolds, enabling students to operationalize systems thinking in ways that traditional assessments rarely afford.

4.2. Theme 2: Inclusive and Empowering Learning Through Adaptivity

Learners described the adaptive platform as supportive and inclusive. This theme captures how adaptive learning mechanisms supported differentiated engagement and reduced participation barriers, particularly for learners with varying levels of confidence and prior knowledge. Analytically, this theme illustrates the mechanism through which adaptive scaffolding enhanced learning equity, consistent with Universal Design for Learning (UDL) principles. Students described how personalized pacing, targeted feedback, and adaptive branching allowed them to remain engaged without experiencing the anxiety commonly associated with uniform instructional demands. Some of the comments that reflect this theme are presented below:
“It met me at my level. When I struggled, the hints and scaffolds were so helpful.”
“The platform did not judge my pace; it adjusted to it. That felt inclusive.”
The comments show that high-growth learners highlighted adaptivity as essential for maintaining motivation and reducing anxiety. The qualitative findings directly align with quantitative evidence showing sustained performance growth across adaptive cycles (η2 = 0.41) and decreasing time-on-task despite increasing task difficulty. These patterns indicate not only improved efficiency but also enhanced self-regulation and mastery. High-growth and steady-growth learner clusters frequently referenced adaptive feedback as instrumental in maintaining motivation and clarity, which helps explain why over 80% of participants demonstrated medium to high growth trajectories.
The students made statements such as
“The platform did not judge my pace; it adjusted to it.”
This statement illustrates learners’ emotional responses, but analytically, this theme explains why adaptivity functioned as an enabling condition for sustainability competency development, particularly for complex, value-laden tasks. The convergence of qualitative perceptions and quantitative learning analytics strengthens the claim that adaptive learning served as a pedagogical equalizer rather than merely a technological feature.

4.3. Theme 3: Development of Action Competence and Professional Identity

Participants reported becoming more confident in designing and initiating sustainability-oriented actions. This theme represents a transformative learning outcome, extending beyond skill acquisition to shifts in learners’ professional self-concept. Participants articulated increased confidence, agency, and readiness to enact sustainability-oriented change within their educational contexts. Analytically, this theme reflects the development of action competence as a behavioral and transformative sustainability dimension, consistent with the ESD literature emphasizing empowerment and real-world engagement. These are some of their comments that reflect this theme:
“I feel prepared to make changes in my school. The action plan was not just for marks—it is something I can actually use.”
“I now see myself as someone who can advocate for inclusion and sustainability.”
Furthermore, this theme indicates transformation beyond academic achievement: the qualitative findings strongly reinforce the quantitative results, in which action competence demonstrated the largest effect size (d = 1.74) among all competencies. High performance on PBA 3 (Action Competence) and the final integrated PBA further confirms that students were not merely conceptualizing sustainability actions but were capable of designing feasible, context-sensitive interventions.
The learners made statements such as
“The action plan was not just for marks—it is something I can actually use.”
This type of statement indicates the transferability of learning, a central criterion of action competence.
This theme demonstrates that performance-based assessment functioned as an identity-forming practice, positioning students as change agents rather than passive recipients of sustainability knowledge. This finding explains why quantitative gains in action competence were particularly strong and why the final integrative assessment yielded the highest overall scores.

4.4. Theme 4: Digital Literacy as a Critical Factor in Assessment Innovation

This theme provides critical balance by highlighting constraints and differential effects of the intervention. Analytically, it reveals that digital readiness acted as a moderating condition, shaping how effectively students could benefit from adaptive learning features. Participants in the low-response cluster frequently described initial confusion, cognitive overload, and slower adjustment to the platform, which limited their engagement with sustainability tasks. Here are some of their comments:
“I needed more time to understand the platform. At first, I felt lost.”
“My digital skills held me back more than the content.”
This supports the quantitative findings regarding the moderating role of digital readiness. These qualitative insights directly explain the quantitative cluster analysis results, where approximately 18.8% of learners exhibited minimal adaptive response and lower gains in action competence. The statistically significant association between lower digital readiness and weaker performance outcomes (p < 0.05) is thus contextualized by students’ lived experiences with platform navigation challenges.
Rather than undermining the intervention’s effectiveness, this theme analytically strengthens the study by demonstrating that technological innovation alone does not guarantee equity. It underscores the necessity of complementary digital literacy support to prevent adaptive systems from inadvertently reproducing existing inequalities.

5. Discussion

This study evaluated how well a digital, adaptive, performance-based assessment (PBA) model enhanced the sustainability competencies of postgraduate students enrolled in an Inclusive Education course in an ELT department. The observed learning gains should be understood as the outcome of the synergistic interaction between adaptive scaffolding and authentic performance tasks, rather than the effect of either approach in isolation. The results of the mixed-methods study show that combining digital transformation tools with creative assessment techniques can significantly improve systems thinking, moral judgment, and action competence—essential sustainability skills required for global citizenship and fair teaching. The quantitative and qualitative findings are interpreted in this discussion in light of the goals of sustainability-oriented digital transformation and the broader literature.
According to the quantitative data, all three targeted sustainability competencies improved significantly from pre-test to post-test, with strong effect sizes. These results support earlier studies that demonstrated the beneficial effects of adaptive learning systems on mastery progression, engagement, and individualized learning outcomes [2,3]. In this study, learners gained a deeper understanding of sustainability in inclusive educational contexts thanks to the adaptive platform’s ability to adjust material difficulty, provide scaffolded support, and personalize learning paths.
According to learning-path profiles, the majority of students (81.2%) showed medium to high progress across the intervention cycles and benefited from adaptive customization. This supports theoretical viewpoints that contend that by allowing for learner diversity and promoting equal participation, adaptive platforms are consistent with Universal Design for Learning (UDL) principles [4]. The increase in task efficiency also points to a rise in cognitive fluency, highlighting the importance of digital transformation in improving learning responsiveness and flexibility.
The existence of a “low-response” cluster, however, emphasizes that until digital literacy deficiencies are addressed, adaptive learning is not always successful. This aligns with previous research showing that digital preparedness must be taken into account in sustainability-focused digital transformation projects, as it moderates the effects of technological advancements in higher education [7,9].
Importantly, the presence of a low-response learner group and reported digital literacy challenges indicates that the effectiveness of adaptive, performance-based assessment is not uniform across learners, but contingent on technological readiness and institutional support structures. This finding aligns with prior research showing that adaptive learning systems tend to amplify learning gains primarily when learners possess sufficient digital competence and self-regulatory capacity [31,42]. Studies in digitally mediated sustainability education similarly caution that without adequate scaffolding and preparatory support, technology-enhanced assessments may inadvertently reproduce existing inequities rather than mitigate them [7,24].
In the present study, learners with lower initial digital readiness demonstrated slower adaptation to platform functionalities, reduced engagement with adaptive pathways, and comparatively weaker gains in action competence. This pattern suggests that adaptive systems and PBAs, while powerful, function as conditional enablers rather than universal solutions. Consistent with the digital transformation literature in higher education, these results highlight the need for complementary institutional strategies, including digital literacy training, orientation sessions, and ongoing technical support, to ensure that innovative assessment models remain inclusive and equitable [12,18]. From an ESD perspective, this underscores that sustainability-oriented assessment must attend not only to learning outcomes, but also to the structural and capability conditions that enable learners to benefit equitably from digital innovation.
The claim that PBA provides worthwhile opportunities for real-world sustainability learning is supported by the notable improvement in performance scores across the three PBA cycles and the final integrative assignment. These findings are consistent with other research that claims authentic or performance-based evaluations promote deeper learning by requiring practical application, critical thinking, and decision-making aligned with sustainability issues [5,6].
Findings from qualitative research supported this conclusion. Students emphasized how PBA assignments allowed them to connect sustainability ideals with the realities of ELT classrooms, particularly those that involved community micro-projects, inclusive school policy redesigns, and analyses of ethical concerns. Leal Filho et al.’s [9] call to move sustainable education away from academic engagement and toward practice-focused approaches that promote actionable competencies is echoed here.
Students’ strong performance on the final integrated PBA exercise shows that they have not only developed their capacity to synthesize many sustainability concepts but also progressed within each competency area. This lends credence to the claim that PBA is an excellent tool for evaluating intricate, multifaceted learning objectives that conventional tests cannot fully capture.
One of the most noteworthy results was the improvement in action competence, which had the largest effect size and the strongest qualitative support. Students reported a change in how they saw themselves—not only as students, but also as teachers who might promote sustainability, equity, and inclusion in their work environments. This is consistent with new research that emphasizes that, rather than just imparting knowledge, sustainability education should empower students to implement transformative change [6,8]. This change has been facilitated by the integration of PBA and adaptive learning, which offers individualized scaffolding and real-world opportunities for action-oriented learning. This research supports earlier claims that innovative assessment practices should be developed to activate sustainability competencies, in addition to measuring learning, and to empower students to perceive themselves as change agents.
The important role that digital literacy plays in determining learning paths is one of the study’s main conclusions. Digital readiness is a prerequisite for successful digital transformation in higher education, as seen in the low-response group’s slower progress, poorer platform adaptation, and lower final PBA scores. These results are consistent with earlier research showing that disparities in digital skills hinder inclusive digital innovation [7]. This implies that dual investment—technological innovation and capacity building—is necessary for sustainability-oriented digital transformation. Adaptive platforms and analytics-integrated PBAs are examples of technological innovation, whereas training, digital literacy development, and student support systems are examples of capacity building. In the absence of this equilibrium, digitally mediated evaluations can unintentionally perpetuate disparities, which would run counter to sustainability ideals.
The intervention supported equity-oriented decision-making and inclusive school action planning, indirectly advancing SDGs 5, 10, and 16, and directly strengthened inclusive pedagogical competency, advancing SDG 4 (Quality Education). Thus, the adaptive PBA model illustrates how innovative evaluation practices can support sustainability objectives at the institutional and pedagogical levels. These results are consistent with research calling for HEIs to incorporate sustainability into their frameworks for teaching, learning, and assessment, moving away from campus-wide metrics toward the development of student-centered competencies [7,11]. The model tested here demonstrates a feasible way to operationalize this kind of integration.
Results from several approaches converged to demonstrate that the adaptive PBA model greatly enhanced sustainability competencies, and learning analytics verified steady progress across cycles. Engagement and learning were significantly influenced by authenticity and real-world applicability, which is consistent with the goal of performance-based evaluation. The degree to which students benefited from the adaptive system depended on their level of digital readiness. The most significant improvement was in action competence, demonstrating the benefits of integrating real-world sustainability-based tasks with adaptive learning. All things considered, both data streams showed that the digital, flexible, performance-based evaluation approach facilitated sustainability-oriented learning in the postgraduate inclusive education setting.
By proposing and validating an integrated model that supports sustainability competencies through digital adaptivity and performance-based assessment, this work makes a theoretical contribution to the field of sustainability education. Through mixed-methods research, it empirically demonstrates that adaptive PBAs result in notable improvements in action-oriented, ethical, and cognitive learning. In practical terms, it provides HEIs with a scalable framework for integrating sustainability capabilities into postgraduate assessment strategies, particularly in teaching, inclusion, and global citizenship.

6. Conclusions

This study examined how incorporating performance-based assessment (PBA) into a sustainability-oriented digital transformation (DT) framework might improve postgraduate students’ sustainability competencies in an inclusive education course within an English language teaching (ELT) program. Using a mixed-methods approach, it investigated how AI-supported PBA activities focused on creating inclusive, sustainability-oriented learning solutions shape students’ digital literacy, critical thinking, teamwork, and sustainability awareness.
The quantitative findings after the intervention showed significant improvements in digital competency, sustainability literacy, and performance on rubric-based assessment tasks. Students’ capacity to apply sustainability concepts in pedagogical design improved noticeably through the use of AI technologies, collaborative online platforms, and authentic performance assignments. Furthermore, according to qualitative data, students found the PBA assignments transformative, helpful, and important. They also mentioned that the digital tools improved their engagement, encouraged teamwork, and promoted reflective learning. Significantly, students expressed a stronger sense of professional obligation to advance sustainability, equity, and accessibility in future ELT contexts.
The results demonstrate the transformative potential of performance-based assessment when combined with digital tools that support sustainability objectives. PBAs improved students’ ability to apply their understanding of sustainability to practical teaching situations, and digital tools made learning more flexible, inclusive, and authentic. This combination exemplifies the broader role universities play in addressing the UN Sustainable Development Goals (SDGs) by developing future teachers who can create sustainable and equitable learning environments.
The study’s conclusions underscore the value of adopting innovative, sustainability-oriented assessment practices in higher education. The findings highlight the importance of incorporating real-world assignments and using digital platforms for both formative and summative evaluation. They encourage institutions to make strategic investments in professional development, digital infrastructure, and curriculum revision that prioritize sustainability competencies.
Future studies should examine differences across disciplines, compare AI-enhanced PBAs with conventional assessment techniques, and investigate long-term effects on the development of teachers’ identities and professional practices. By continuing to investigate and refine sustainability-oriented DT efforts, higher education can play a crucial role in training educators who not only navigate digital futures but also contribute meaningfully to a more sustainable society.

7. Implications

The results emphasize the need for HEIs to adopt assessment methodologies that prioritize sustainability, authenticity, and inclusivity. The findings suggest the following recommendations:
- Integrate core sustainability dimensions (systems thinking, ethical reasoning, and action competence) into evaluation criteria from the start.
- Account for a range of learner profiles, including varying levels of digital literacy, by incorporating Universal Design for Learning (UDL) principles.
- Use adaptive digital platforms to offer individualized feedback and differentiated learning pathways that help diverse learners meet sustainability goals.
- Assign performance-based tasks that require practical application, such as creating inclusive lesson plans, addressing community problems, or analyzing policies, to improve transferability.
- Align assessment rubrics with SDG-related learning outcomes so that competencies such as fairness, resilience, teamwork, and environmental responsibility are explicitly assessed.
Strategic planning and institutional support are needed to scale adaptive learning beyond individual courses. HEIs should invest in infrastructure that ensures reliable access to digital tools, AI-assisted platforms, and analytics dashboards. Faculty professional development should cover digital pedagogy, PBA design, and analytics-informed teaching. Institutional policies should embed digital and sustainability competencies in graduate programs and program-level objectives. Safeguarding student privacy also requires the ethical application of learning analytics and clear data policies. Ongoing monitoring is needed to ensure HEIs use analytics to improve adaptive pathways and address disparities, especially for digitally vulnerable students.
Innovations in digital assessment are operationalized primarily by educators. One practical suggestion is to create realistic assignments that reflect actual sustainability issues, such as inclusive school redesign and accessibility audits. It is also critical to incorporate AI-supported tools to scaffold learning, facilitate formative feedback, and help students refine sustainability-focused solutions. Teachers should support cooperative online environments where students can practice inclusive decision-making and co-design solutions, and use clear rubrics that emphasize sustainability competencies so students know what is expected of them in terms of justice, ethical reasoning, and community impact. Iterative PBA cycles (draft → feedback → revise → present) encourage in-depth learning and reflection, and reflective digital journals help students become more self-aware and develop professional identities as educators committed to sustainability.
The integration of PBA and adaptive digital assessment aligns closely with SDG 4 (Quality Education) of the UN Sustainable Development Goals: digital PBAs improve competency-based learning, inclusivity, and authentic skill development, while adaptive learning promotes fair access to high-quality learning opportunities. It also supports SDG 10 (Reduced Inequalities), because learning analytics help identify and support at-risk students, and UDL-aligned digital assessments lower learning barriers and advance equity for students with diverse needs. Future research should include longitudinal studies that follow students after the course into their professional teaching environments, examining the durability of sustainability competencies, the long-term effects of adaptive PBAs on teaching practice, and the influence of digital assessment on the development of a sustainability-oriented professional identity. Cross-institutional and cross-cultural comparisons are needed to generalize results, compare institutions with different levels of digital access, and examine cultural and contextual variation in how sustainability and PBA are interpreted.
Multi-country collaborations investigating the effectiveness of adaptive PBAs across different educational systems are also needed; such research can deepen understanding of the structural factors influencing sustainability-oriented digital assessment. Emerging technologies offer further opportunities to improve adaptive PBAs: AI tutoring agents can provide real-time guidance and individualized coaching, virtual reality (VR) simulations could immerse students in sustainability issues such as inclusive classroom design and environmental crisis scenarios, and multimodal learning analytics could enrich the evidence base for PBA. These directions can deepen comprehension and shape the next generation of sustainability-focused digital assessment.

8. Limitations

Several limitations of the present study should be acknowledged. First, the sample size was relatively small and drawn from a single institution and course context, which may limit the extent to which the findings can be generalized to other disciplines, institutions, or educational settings. Second, the eight-week intervention period, while sufficient to observe short-term changes in sustainability competencies, may not fully capture the long-term development or durability of competencies such as action competence.
Third, the study did not include a control or comparison group, which restricts the ability to isolate the individual effects of adaptive learning and performance-based assessment components. Given the specific postgraduate and disciplinary context, the results should be interpreted as exploratory and indicative rather than broadly generalizable. Future studies employing larger, multi-institutional samples, longer intervention durations, and comparative research designs are needed to strengthen the generalizability and causal interpretation of the findings.
Fourth, a methodological limitation of the present study concerns the psychometric validation of the survey instrument. While internal consistency reliability was established for each competency subscale, the sample size was insufficient to support a full confirmatory factor analysis (CFA) of the three-factor structure. As a result, construct validity was examined through theoretical alignment with established ESD frameworks and subscale-level reliability rather than factorial modeling. Future large-scale and multi-institutional studies should prioritize CFA and measurement invariance testing to further validate the instrument and strengthen the generalizability of sustainability competency assessment in higher education contexts.
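The subscale-level internal consistency check mentioned above can be illustrated with a short Cronbach’s alpha computation. This is a generic sketch, not the authors’ actual analysis script, and the Likert-style item scores below are invented purely for illustration:

```python
from statistics import variance

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a subscale.

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores)),
    using sample variances across respondents.
    """
    k = len(item_scores[0])  # number of items in the subscale
    # Variance of each item across respondents
    item_vars = [variance([row[i] for row in item_scores]) for i in range(k)]
    # Variance of each respondent's total score
    total_var = variance([sum(row) for row in item_scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Invented 5-point responses: 4 respondents x 3 items (hypothetical data)
responses = [
    [2, 3, 3],
    [4, 4, 5],
    [3, 3, 4],
    [5, 4, 5],
]
print(round(cronbach_alpha(responses), 3))  # → 0.923
```

A value near or above 0.7 is conventionally taken as acceptable internal consistency, which is the kind of evidence the study reports in lieu of a full CFA.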
Relatedly, the study did not include benchmarking conditions such as adaptive-only or PBA-only groups, so the findings should be read as evidence for the effectiveness of the integrated model rather than for the specific contribution of adaptivity or performance-based assessment alone. Future research employing experimental or quasi-experimental designs with such comparative conditions is needed to disentangle these effects.

Author Contributions

Conceptualization, S.K.M.; Methodology, S.K.M.; Formal analysis, S.K.M.; Investigation, S.K.M. and M.K.; Writing—original draft, S.K.M. and M.K. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

This study was conducted in accordance with the Declaration of Helsinki and received ethical approval from Near East University Scientific Research Ethics Committee (Approval Code: NEU/ES/2025/838, Approval Date 6 June 2025).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study. Participation in the survey was voluntary, and completion of the survey was considered implied informed consent.

Data Availability Statement

The original contributions presented in this study are included in the article. Further inquiries can be directed to the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Basheer, N.; Ahmed, V.; Bahroun, Z.; Anane, C. Exploring Sustainability Assessment Practices in Higher Education: A Comprehensive Review through Content and Bibliometric Analyses. Sustainability 2024, 16, 5799. [Google Scholar] [CrossRef]
  2. Al-Kuwari, M.M.; Du, X.; Koç, M. Performance assessment in education for sustainable development: A case study of the Qatar education system. Prospects 2021, 52, 513–527. [Google Scholar] [CrossRef]
  3. Ironsi, C.S. Navigating learners towards technology-enhanced learning during post COVID-19 semesters. Trends Neurosci. Educ. 2022, 29, 100189. [Google Scholar] [CrossRef] [PubMed]
  4. Chinaza, S.I. Strategies for student engagement in remote online learning: A case study of Northern Cyprus. Run. J. Educ. Cult. 2020, 1, 2. [Google Scholar] [CrossRef]
  5. Contrino, M.F.; Reyes-Millán, M.; Vázquez-Villegas, P.; Membrillo-Hernández, J. Using an adaptive learning tool to improve student performance and satisfaction in online and face-to-face education for a more personalized approach. Smart Learn. Environ. 2024, 11, 6. [Google Scholar] [CrossRef]
  6. Tarconish, E.; Scott, S.; Banerjee, M.; Lombardi, A. Universal Design for Instruction & Learning in Higher Education: Where Have We Been and Where are We Headed? J. Postsecond. Educ. Disabil. 2023, 36, 207–223. [Google Scholar]
  7. Ironsi, C.S. Efficacy of blended interactive educational resources in improving writing skills in a hybrid learning environment. Qual. Assur. Educ. 2023, 31, 107–120. [Google Scholar] [CrossRef]
  8. Holst, J.; Singer-Brodowski, M.; Brock, A.; de Haan, G. Monitoring SDG 4.7: Assessing Education for Sustainable Development in policies, curricula, training of educators and student assessment (input-indicator). Sustain. Dev. 2024, 32, 3908–3923. [Google Scholar] [CrossRef]
  9. Filho, W.L.; Trevisan, L.V.; Sivapalan, S.; Mazhar, M.; Kounani, A.; Mbah, M.F.; Abubakar, I.R.; Matandirotya, N.R.; Dinis, M.A.P.; Borsari, B.; et al. Assessing the impacts of sustainability teaching at higher education institutions. Discov. Sustain. 2025, 6, 227. [Google Scholar] [CrossRef]
  10. Machkour, M.; El Jihaoui, M.; Lamalif, L.; Faris, S.; Mansouri, K. Toward an adaptive learning assessment pathway. Front. Educ. 2025, 10, 1498233. [Google Scholar] [CrossRef]
  11. Wamsler, C. Education for sustainability: Fostering a more conscious society and transformation towards sustainability. Int. J. Sustain. High. Educ. 2020, 21, 112–130. [Google Scholar] [CrossRef]
  12. Jiang, Q.; Kurnitski, J. Performance based core sustainability metrics for university campuses developing towards climate neutrality: A robust PICSOU framework. Sustain. Cities Soc. 2023, 97, 104723. [Google Scholar] [CrossRef]
  13. Lin, Q.; Ironsi, C.S. Incorporating augmented reality into teaching marketing strategies: Perspectives from business education teachers and students. Int. J. Manag. Educ. 2024, 22, 101080. [Google Scholar] [CrossRef]
  14. Demetriou, A.; Nicolaou, H. Recent Advances in Adaptive Learning Technologies for Higher Education. Comput. Educ. 2024, 196, 104763. [Google Scholar]
  15. Liu, R.T.; Harding, M. Algorithmic Personalization and Student Engagement in Adaptive Learning Systems. J. Learn. Anal. 2023, 10, 45–62. [Google Scholar]
  16. Al-Sharif, S.; Donnelly, G. Universal Design for Learning in Adaptive Digital Environments: A Systematic Review. Internet High. Educ. 2023, 59, 100933. [Google Scholar]
  17. Yamamoto, K.; Watanabe, T.; Pierce, L. Adaptive Learning and Metacognitive Development in Postgraduate Education. High. Educ. Res. Dev. 2022, 41, 1780–1798. [Google Scholar]
  18. Chigbu, B.I.; Makapela, S.L. Data-Driven Leadership in Higher Education: Advancing Sustainable Development Goals and Inclusive Transformation. Sustainability 2025, 17, 3116. [Google Scholar] [CrossRef]
  19. Patel, J.; Kumar, S. Digital Literacy Barriers in Adaptive e-Learning Systems: Challenges for Equity. Educ. Inf. Technol. 2024, 29, 1121–1140. [Google Scholar]
  20. Li, W.; Ironsi, C.S. Efficacy of micro-credential learning spaces in developing students’ twenty-first century skills: Towards graduate work readiness. Educ. Inf. Technol. 2024, 29, 1201–1216. [Google Scholar] [CrossRef]
  21. Xing, X.; Ironsi, C.S. Implementing action competence teaching model as a framework for achieving sustainable development goals: Insights from students. Int. J. Sustain. High. Educ. 2024, 25, 1048–1065. [Google Scholar] [CrossRef]
  22. Jang, P.; Holland, R. Performance-Based Assessment for Sustainability Competencies: A Review of Current Practices. Assess. Eval. High. Educ. 2024, 49, 241–259. [Google Scholar]
  23. Cong, L.; Ironsi, C.S. Integrating mobile learning and problem-based learning in improving students’ action competence in problem-solving and critical thinking skills. Humanit. Soc. Sci. Commun. 2025, 12, 1238. [Google Scholar] [CrossRef]
  24. Papanastasiou, E.C.; Giallousi, M.; Pitri, E. Re-Introducing Authentic Assessment in Classroom Assessment Courses: Finding Its Place in the 21st Century. Educ. Sci. 2025, 15, 1564. [Google Scholar] [CrossRef]
  25. Martinez, L.; Shaw, K.R. Professional Identity Formation through performance-based assessment in Teacher Education. J. Educ. Teach. 2024, 50, 335–349. [Google Scholar]
  26. Ironsi, C.S.; Ironsi, S.S. Efficacy of micro-credential learning environments for developing students’ 21st century skills: Toward achieving sustainable development goals. Int. J. Educ. Manag. 2025; in press. [Google Scholar] [CrossRef]
  27. Veckalne, R.; Tambovceva, T. The role of digital transformation in education in promoting sustainable development. Virtual Econ. 2022, 5, 65–86. [Google Scholar] [CrossRef]
  28. Chinaza, I. Perceived efficacy of e-proctoring software for emergency remote online-based assessment: Perceptions of proctored examinations. In Proceedings of the European Distance and E-Learning Network (EDEN) Conference, Madrid, Spain, 21–24 June 2021; European Distance and E-Learning Network: Budapest, Hungary, 2021; pp. 265–282. [Google Scholar]
  29. Kleimola, R.; Leppisaari, I. Learning analytics to develop future competences in higher education: A case study. Int. J. Educ. Technol. High. Educ. 2022, 19, 17. [Google Scholar] [CrossRef]
  30. Aigbe, F.; Aigbavboa, C.; Ayobiojo, L.; Imoisili, P.E. Adaptive Learning for Inclusivity, Sustainable Development, and Societal Impact: A Case Study of Community Engagement at the University of Johannesburg. Sustainability 2025, 17, 4861. [Google Scholar] [CrossRef]
  31. Awais, M.; Seatle, M.; McPherson, M. Teaching Integrated Assessment Modeling for Sustainable Transitions: Lessons and Insights. In Proceedings of the Eleventh International Conference on Engineering Education for Sustainable Development (EESD2023), Fort Collins, CO, USA, 18–21 June 2023. [Google Scholar]
  32. Hyytinen, H.; Jämsä, M.; Tuononen, T.; Kleemola, K. A systematic-narrative review of performance-based assessments of critical thinking in higher education. Assess. Eval. High. Educ. 2025, 17, 1293–1310. [Google Scholar] [CrossRef]
  33. Smith, S.J.; Rao, K.; Lowrey, K.A.; Gardner, J.E.; Moore, E.; Coy, K.; Wojcik, B. Recommendations for a national research agenda in UDL: Outcomes from the UDL-IRN preconference on research. J. Disabil. Policy Stud. 2019, 30, 174–185. [Google Scholar] [CrossRef]
  34. Rusconi, L.; Squillaci, M. Effects of a universal design for learning (UDL) training course on the development teachers’ competences: A systematic review. Educ. Sci. 2023, 13, 466. [Google Scholar] [CrossRef]
  35. Lund, J.L.; Kirk, M.F. Performance-Based Assessment for Middle and High School Physical Education; Human Kinetics Publishers: Champaign, IL, USA, 2019. [Google Scholar]
  36. Mulvenon, S. A critical review of research on formative assessment: The limited scientific evidence of the impact of formative assessment in education. Pract. Assess. Res. Eval. 2021, 26, 1–14. [Google Scholar]
  37. Fernández-Morante, C.; Cebreiro-López, B.; Rodríguez-Malmierca, M.-J.; Casal-Otero, L. Adaptive learning supported by learning analytics for sustainability competencies. Sustainability 2021, 14, 124. [Google Scholar] [CrossRef]
  38. Tang, X.; Yin, Y.; Lin, Q.; Hadad, R.; Zhai, X. Assessing computational thinking: A systematic review of empirical studies. Comput. Educ. 2020, 148, 103798. [Google Scholar] [CrossRef]
  39. Dixson, D.D.; Worrell, F.C. Formative and summative assessment in the classroom. Theory Pract. 2016, 55, 153–159. [Google Scholar] [CrossRef]
  40. Ferguson, T.; Roofe, C.G. SDG 4 in higher education: Challenges and opportunities. Int. J. Sustain. High. Educ. 2020, 21, 959–975. [Google Scholar] [CrossRef]
  41. Kopnina, H. Education for the future? Critical evaluation of education for sustainable development goals. J. Environ. Educ. 2020, 51, 280–291. [Google Scholar] [CrossRef]
  42. Reynolds, M.; Blackmore, C.; Ison, R.; Shah, R.; Wedlock, E. The role of systems thinking in the practice of implementing sustainable development goals. In Handbook of Sustainability Science and Research; Springer International Publishing: Cham, Switzerland, 2017; pp. 677–698. [Google Scholar]
  43. Michelsen, G.; Fischer, D. Sustainability and education. In Sustainable Development Policy; Routledge: London, UK, 2017; pp. 135–158. [Google Scholar]
  44. Agbedahin, A.V. Sustainable development, education for sustainable development, and the 2030 Agenda for Sustainable Development: Emergence, efficacy, eminence, and future. Sustain. Dev. 2019, 27, 669–680. [Google Scholar] [CrossRef]
  45. Sass, W.; Boeve-de Pauw, J.; Olsson, D.; Gericke, N.; De Maeyer, S.; Van Petegem, P. Redefining action competence: The case of sustainable development. J. Environ. Educ. 2020, 51, 292–305. [Google Scholar] [CrossRef]
  46. Chen, S.Y.; Liu, S.Y. Developing students’ action competence for a sustainable future: A review of educational research. Sustainability 2020, 12, 1374. [Google Scholar] [CrossRef]
Table 1. Adaptive learning cycles.

| Cycle | Adaptive Focus | PBA Task Output |
|---|---|---|
| 1 | Systems thinking for inclusive sustainability | Analyse and redesign a school sustainability policy for inclusion |
| 2 | Ethical decision-making in diverse ELT classrooms | Respond to a case involving linguistic and cultural bias |
| 3 | Action competence and community sustainability | Design a micro-project promoting equity in an ELT context |
Table 2. Improvement in Sustainability Competencies.

| Competency | Pre-Test Mean (SD) | Post-Test Mean (SD) | t (47) | p | Effect Size (d) |
|---|---|---|---|---|---|
| Systems thinking | 3.11 (0.52) | 4.08 (0.43) | 11.27 | <0.001 | 1.63 |
| Ethical/equity decision-making | 3.24 (0.48) | 4.21 (0.46) | 10.94 | <0.001 | 1.58 |
| Action competence | 2.98 (0.61) | 4.02 (0.50) | 12.03 | <0.001 | 1.74 |
| Overall sustainability competencies | 3.11 (0.48) | 4.10 (0.41) | 13.89 | <0.001 | 2.00 |
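The quantities in Table 2 are standard paired-sample statistics. As a hedged illustration only (the study’s item-level data are not released, so the pre/post scores below are invented), the following sketch computes a paired t statistic and Cohen’s d for repeated measures:

```python
from math import sqrt
from statistics import mean, stdev

def paired_t_and_d(pre, post):
    """Paired t statistic and Cohen's d for repeated measures.

    d = mean(differences) / sd(differences); t = d * sqrt(n),
    since t = mean(diff) / (sd(diff) / sqrt(n)).
    """
    diffs = [b - a for a, b in zip(pre, post)]
    d = mean(diffs) / stdev(diffs)  # effect size for paired data
    t = d * sqrt(len(diffs))
    return t, d

# Invented scores for 4 students (the actual study had n = 48)
pre = [3.0, 3.2, 2.8, 3.1]
post = [4.0, 4.1, 3.9, 4.2]
t, d = paired_t_and_d(pre, post)
print(round(t, 2), round(d, 2))
```

Because the toy data show nearly uniform gains, the resulting effect size is implausibly large; with real, noisier data the same formulas yield values in the range reported in Table 2.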
Table 3. Performance Growth across the Adaptive Cycle.

| Adaptive Cycle | Average Score (%) | Interpretation |
|---|---|---|
| Cycle 1 | 72.4% | Baseline adaptive engagement; students are still adjusting to the platform and task structure. |
| Cycle 2 | 81.6% | Clear improvement as adaptive scaffolds begin to align with learner needs; deeper engagement with competencies. |
| Cycle 3 | 87.9% | Strong mastery and reduced time-on-task indicate fluency, confidence, and improved sustainability reasoning. |
Table 4. Learning Path Profiles Identified Through Cluster Analysis.

| Profile Type | % of Students | Key Characteristics | Interpretation |
|---|---|---|---|
| Adaptive High-Growth | 35.4% (n = 17) | Rapid improvement in cycles 2–3; strong performance in PBA tasks; short completion times | These learners benefited most from adaptive personalization and quickly reached mastery. |
| Steady-Growth | 45.8% (n = 22) | Linear consistent progress; moderate time reductions; mid-to-high PBA scores | The majority group; adaptivity supported gradual, reliable growth. |
| Low-Response | 18.8% (n = 9) | Minimal adaptive response; lower improvement in mastery; lower action competence | Lower digital readiness explains a weaker response to the adaptive system. |
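Table 4’s profiles were identified through cluster analysis; the paper does not specify the algorithm used, so the following is only a minimal k-means sketch over two invented analytics features (per-cycle score gain, final PBA %), with hypothetical data and hand-picked seed centroids:

```python
def kmeans(points, centroids, iters=10):
    """Plain k-means: assign each point to its nearest centroid (squared
    Euclidean distance), then recompute centroids as cluster means."""
    for _ in range(iters):
        labels = [min(range(len(centroids)),
                      key=lambda c: sum((p - q) ** 2
                                        for p, q in zip(pt, centroids[c])))
                  for pt in points]
        for c in range(len(centroids)):
            members = [pt for pt, lb in zip(points, labels) if lb == c]
            if members:  # keep the old centroid if a cluster empties
                centroids[c] = [sum(xs) / len(members) for xs in zip(*members)]
    return labels, centroids

# Invented (score gain per cycle, final PBA %) pairs for 6 students
points = [(9.0, 90.0), (8.5, 88.0),   # high-growth-like
          (5.0, 80.0), (5.5, 82.0),   # steady-growth-like
          (1.0, 65.0), (1.5, 63.0)]   # low-response-like
labels, _ = kmeans(points, [[9.0, 90.0], [5.0, 80.0], [1.0, 65.0]])
print(labels)  # → [0, 0, 1, 1, 2, 2]
```

In practice a library implementation (e.g., scikit-learn’s `KMeans`) with feature standardization would be used; the sketch only shows how distinct learning-path profiles fall out of simple distance-based grouping.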
Table 5. Performance-Based Assessment Outcomes.

| PBA Task | Mean (%) | SD | Competencies Most Improved |
|---|---|---|---|
| Baseline (Diagnostic) | 64.1 | 9.3 | |
| PBA 1 (Systems Thinking) | 75.8 | 7.4 | Complexity analysis, stakeholder mapping |
| PBA 2 (Ethical Decision-Making) | 83.2 | 8.1 | Equity reasoning, bias identification |
| PBA 3 (Action Competence) | 88.7 | 6.0 | Practical action planning, sustainability alignment |
| Final Integrated PBA | 90.4 | 5.7 | Holistic systems reasoning, inclusive decision-making |
Table 6. Thematic Analysis Summary.

| Theme | Codes |
|---|---|
| Theme 1: Enhanced Understanding of Sustainability Through Real-World Contexts | Authentic classroom dilemmas, Relevance to teaching practice, Practical sustainability thinking, Context-based learning, Real-world application |
| Theme 2: Inclusive and Empowering Learning Through Adaptivity | Personalized scaffolds, Reduced anxiety, Adaptive pacing, Learner-centered support |
| Theme 3: Development of Action Competence and Professional Identity | Confidence building, Teacher-as-change-agent, Ownership of sustainability projects, Transfer to workplace, Professional empowerment |
| Theme 4: Digital Literacy as a Critical Factor in Assessment Innovation | Platform navigation difficulty, Need for digital training, Technology readiness gap |

Citation:
Mahmud, S.K.; Kurt, M. Enhancing Inclusive Sustainability-Oriented Learning in Higher Education Using Adaptive Learning Platforms and Performance-Based Assessment. Sustainability 2026, 18, 1489. https://doi.org/10.3390/su18031489
