Article

Predicting Achievers in an Online Theatre Course Designed upon the Principles of Sustainable Education

by Stamatios Ntanos 1,*, Ioannis Georgakopoulos 2,* and Vassilis Zakopoulos 3
1 Department of Business Administration, University of West Attica, 122 41 Egaleo, Greece
2 Department of Industrial Design and Production Engineering, Department of Accounting and Finance, University of West Attica, 122 41 Egaleo, Greece
3 Department of Theatre Studies, National and Kapodistrian University of Athens, 157 84 Athens, Greece
* Authors to whom correspondence should be addressed.
Information 2025, 16(9), 780; https://doi.org/10.3390/info16090780
Submission received: 21 July 2025 / Revised: 1 September 2025 / Accepted: 4 September 2025 / Published: 8 September 2025
(This article belongs to the Special Issue Advancing Educational Innovation with Artificial Intelligence)

Abstract

The development of online courses aligned with sustainable education principles is crucial for equipping learners with 21st-century skills essential for a sustainable future. As online education expands, predicting achievers (in this research, students with a final grade of seven or higher) becomes essential for optimizing instructional strategies and improving retention rates. This study employs a Linear Discriminant Analysis (LDA) model to predict academic performance in an online theatre course rooted in sustainable education principles. Engagement metrics such as total logins and collaborative assignment completion emerged as decisive predictors, aligning with prior research emphasizing active learning and collaboration. The model demonstrated robust performance, achieving 90% accuracy, 80% specificity, and an 88% correct classification rate. These results underscore the potential of machine learning in identifying achievers while highlighting the significance of sustainable pedagogical components. Future research should explore emotional engagement indicators and multi-course validation to enhance predictive capabilities. By utilizing the e-learning system information, the presented methodology has the potential to assist institutional policymakers in enhancing learning outcomes, advancing sustainability goals, and supporting innovation across the educational and creative sectors.

Graphical Abstract

1. Introduction

Sustainable education transcends traditional pedagogy by fostering competencies such as systems thinking, collaborative problem-solving, and reflexivity [1]. Key principles include:
Active Learning: Students engage in real-world tasks (e.g., collaborative assignments) to apply theoretical knowledge [2,3,4,5,6,7,8,9].
Critical Thinking: Individual assignments and case studies encourage independent analysis [2,5,9].
Collaboration: Group work mirrors real-world teamwork, a cornerstone of sustainability education [2,3,5,6,7,8,9].
The competences mentioned above help students assume the role of citizens who can deal with real-world challenges [6,10]. In this sense, sustainable education is primarily concerned with preparing students to face possible crises (disasters, fires, earthquakes, and tsunamis), to be aware of environmental issues (climate change, land degradation, decay, erosion, and pollution), and to be familiar with social and economic issues, rather than with imparting unfruitful knowledge. In parallel, sustainable education emphasizes the need for collaboration to address such challenges [1,2,5,11].
For this purpose, educators design sustainable courses to help students not only develop valuable competencies but also practice them, thereby increasing their efficiency in facing real-world challenges [12,13,14]. Therefore, theoretical material, self-assessment exercises, collaborative assignments, written and online exams, quizzes, and other relevant activities are sustainable course activities that make students understand fundamental sustainable issues and equip them to thrive in a challenging sustainable world.
Moreover, designing a course upon the principles of sustainable education creates measurable engagement patterns that facilitate the prediction of performance.
The rapid expansion of online education necessitates innovative approaches to predict student success, particularly in courses aligned with sustainable education principles. Theatre education naturally operationalizes collaboration, rehearsal cycles, and reflective critique—practices aligned with ESD competencies. Drama, digital narration, and theatre games are adjusted to constitute learning activities associated with real case studies that indicate a specific problem or a real situation. A recent literature review synthesizing digital drama/theatre for sustainability reports that twenty-seven empirical/theoretical studies (2014–2023) link digital theatre activities with students’ sustainability awareness and 21st-century skills, with most research taking place in Europe and Higher Education; secondary and early childhood contexts remain under-represented [15].
Identifying achievers (students with final grades ≥ 7) in online theatre courses is paramount for:
Optimizing instructional strategies to enhance engagement and retention.
Validating the efficacy of sustainable pedagogy, which emphasizes active learning, collaboration, and critical thinking [2,5].
Addressing equity gaps by enabling early interventions for at-risk students [16,17].
The research focuses on predicting achievers in online theatre courses since such courses incorporate collaborative problem-solving competencies aligned with sustainable education. In contrast to conventional theatre courses, educators and learning designers implement online theatre courses through an e-learning platform, which allows engagement metrics to be elicited. Important studies have shown that implementing a theatre curriculum through an e-learning platform can achieve pedagogical and learning goals to a satisfactory extent [18,19,20,21,22]. These studies point out that up-to-date theatre techniques, such as digital narration and drama, can constitute e-learning activities with promising results. However, two studies underline that physical presence is irreplaceable, indicating that the learning objectives are better met in conventional teaching [20,22]. In this sense, these studies suggest that hybrid theatre courses are educationally preferable. The studies mentioned do not attenuate the strength of the e-learning part of theatre courses, but they denote that the e-learning part alone is not enough for full course delivery. In parallel, these studies view the online implementation of modern theatre techniques as a challenge, accentuating the need for careful online learning design to meet the course's objectives.
Considering the need for careful online learning design, our research attempts to predict achievers in an online theatre course designed to meet the needs of sustainable education. However, the research does not directly investigate the relationship between the achievement of learning objectives and students' performance. In this sense, our study does not examine the online theatre course's success; the course's successful delivery is reflected only by grades. This is a common practice when building a risk or prediction model for students at risk or achievers [15,16,17,23,24,25,26,27,28].
Moreover, although the literature refers to prediction models for various online courses, this is not the case for online theatre courses, leaving room for important scientific output.
The research-added value of this study lies in its dual focus: (1) it applies machine learning techniques to predict achievers in a course explicitly designed around sustainable education principles, and (2) it identifies specific engagement indicators (e.g., collaborative assignments) that align with sustainable pedagogy. Prior works have examined predictive models in online learning, but none have contextualized these models within sustainable education frameworks [16,29]. This study addresses this gap by demonstrating how predictive analytics can enhance the design and delivery of sustainability-aligned courses. In detail, the research attempts to predict achievers, indicating factors that affect their performance.
Significant studies consider students' attitudes towards online education and the level of their e-learning efficacy as factors associated with their achievement [30,31]. The analysis of a relevant study [30] indicates that students enrolled in legal, political, and philological studies demonstrated higher levels of conscientious participation in remote classes. Conversely, psychology students exhibited lower levels of participation and rated remote learning less favourably. These findings suggest that the nature of the discipline may impact students' engagement with and attitudes toward e-learning. These studies [30,31] also highlight the importance of emotional engagement in student performance. Nevertheless, our study does not focus on students' emotional engagement since it is not easily measurable. Therefore, the study examines the association of behavioural engagement factors (derived from the students' interaction with the learning activities), since learning analytics tools can provide measurable data regarding these factors. In parallel, one important study has proved that students' behavioural engagement metrics are associated with students at risk in online sustainable education courses [24]. However, risk and prediction models, such as the one presented in that study, have not been used to predict achievers. To address this gap, the study examines whether strong predictors of achievers fall within the scope of sustainable education. For this purpose, the research assumptions are:
Assumption 1: Students’ engagement data (reflecting their effort) is a strong predictor of achievers in online theatre courses.
Assumption 2: Sustainable education components (such as teamwork and problem-solving competencies) play a significant role in students' high achievement in online theatre courses.
To serve this purpose, an LDA was conducted on students’ engagement data, elicited by implementing the structure of an online theatre course on Moodle. That is because LDA delivers the following results:
  • It provides a forecast model for achievers and non-achievers.
  • Through the linear discriminant functions, it indicates strong and weak predictors, and therefore, it points out the contribution of each factor to high or critical students’ performance.
  • It constructs a model with high training potential, promising that the strong predictors indicated in the LDA model’s coefficients can constitute potential predictors in any course with a similar structure.
  • It delivers a classification table for achievers and non-achievers, allowing for developing a remedial intervention strategy (given that verified LDA prediction models can lead to early warning systems).
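As a concrete sketch of this workflow, the fragment below fits an LDA classifier to synthetic engagement data. The column names (`total_logins`, `collab_grade`, `stach`) mirror the study's variables but the data, class means, and spreads are invented for illustration only:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
n = 200

# Synthetic engagement metrics: achievers (stach = 0) are assumed to
# log in more often and score higher on the collaborative assignment
# than non-achievers (stach = 1).
stach = rng.integers(0, 2, size=n)
total_logins = rng.normal(60 - 25 * stach, 10, size=n)
collab_grade = rng.normal(8.5 - 3 * stach, 1.2, size=n)
X = np.column_stack([total_logins, collab_grade])

lda = LinearDiscriminantAnalysis()
lda.fit(X, stach)

# The discriminant coefficients indicate each predictor's contribution
# to separating achievers from non-achievers.
print("coefficients:", lda.coef_)
print("training accuracy:", lda.score(X, stach))
```

The fitted coefficients play the role described in the second bullet above: their relative magnitudes flag strong versus weak predictors.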

2. Literature Review

2.1. Predictive Methods in Education

Predicting student success in online courses is a critical challenge in educational research, particularly as digital learning environments become ubiquitous [32,33,34,35,36]. While predictive analytics has been widely applied to identify at-risk students and achievers, only a few studies have explored its intersection with sustainable education [37], a pedagogical framework that emphasizes lifelong learning, collaboration, and real-world problem-solving [11,38]. This research gap is significant, as sustainable education aligns with constructivist theories and offers a robust foundation for designing courses that foster deep engagement and critical thinking [1,39].
Predictive modelling in education has employed diverse methods, including logistic regression, decision trees, and neural networks [36]. Linear Discriminant Analysis (LDA) has been particularly effective due to its simplicity and interpretability [40]. Table 1 compares the most important studies across their methods and results.
The studies indicated in Table 1 highlight significant findings:
LDA excels in interpretability and performance for linearly separable data [41,42].
Assignments and logins consistently emerge as strong predictors across studies [16,17].
Hybrid models (e.g., LDA + clustering) improve accuracy but sacrifice transparency [39].
In parallel, these comparative studies highlight the strengths and limitations of LDA. While LDA outperforms decision trees in linearly separable datasets, it may underperform compared to non-linear methods, such as support vector machines (SVMs), in complex scenarios. Nonetheless, LDA remains popular in educational research due to its computational efficiency and transparency [32,36,43].
Recent work has explored hybrid models. For example, one study combined LDA with clustering techniques to improve the early prediction of at-risk students, thereby achieving better outcomes for achievers. However, such approaches often sacrifice interpretability for marginal gains in accuracy, underscoring the trade-offs in model selection [39].
While some studies demonstrate that risk factors are course-specific, similar predictors emerge in courses with comparable designs [17,44]. Although the focus has primarily been on at-risk students, the same methodologies hold in the case of achievers [16,17,23].

2.2. Predictors of Students’ Achievement in Sustainable Education Courses

Across studies, specific predictors appear consistently. One study underlines the importance of behavioural persistence metrics, such as logins, video views, and forum activity [32,38]. Other studies stress collaborative engagement indicators such as group assignments [45]. In parallel, a specific study has proved that cognitive mastery metrics, such as theory-based assessments (quizzes), are strong predictors of students' final achievement [37]. Finally, another study focuses on digital literacy indicators, measured via LMS metrics (such as multimedia use) [10].
However, there are no significant studies that refer to predictors of students' performance in online courses that meet the standards of sustainable education. There is one attempt to predict students' performance in online sustainable education courses, which proved that sustainable education components, such as self-assessment exercises and practical exercises (aimed at developing critical thinking), are strong predictors of students' final learning outcome. Nevertheless, that study focuses on students at risk rather than specifically on achievers, and it concerns remote lab courses, not theatre courses [24].
Hence, there are no significant studies that incorporate sustainability-aligned indicators, like systems thinking or collaboration, explicitly framed within ESD frameworks in online courses, and especially in online theatre courses. This study addresses this gap.

3. Materials and Methods

3.1. Our Course

In terms of the principles of sustainable education and exploiting the Moodle LMS capabilities, our course had the following structure:
  • Theoretical slides on theatre-wise topics on Moodle.
  • Videos on theatre techniques on Moodle as YouTube links.
  • Self-evaluation quizzes on the Moodle Course page.
  • A large-scale collaborative exercise, reflecting a theatre-wise problem, was assigned to groups.
  • An online final exam (in the format of a Moodle online test) defined the student’s final achievement to a great extent.
The theoretical part consisted of slides and videos, designed to familiarize students with theatre practices and problems. The team assignment helped students develop skills to overcome difficulties in implementing theatre practices in real cases. Since educational sustainability refers to how citizens are trained to tackle problems that might occur in a sustainable world, the topics of the collaborative assignment revolved around theatre practices, such as digital narration, theatre games, and drama, that can be employed to teach students about effective disaster management.
The self-assessment exercises helped students evaluate their own understanding and competency. The theoretical material, along with the self-assessment exercises, constituted the vehicle for students to excel at the collaborative assignment. It is important to point out that students had three (3) tries to complete a self-assessment quiz, and the system recorded the maximum grade of the three tries. Students completed a self-assessment quiz if the maximum grade (achieved in at least one try) was greater than or equal to 50%. The digital resources (videos and slides) were marked as "viewed" once the student clicked on (accessed) them. For PPT files (slides), clicking on a slide automatically launched it within the Moodle environment; likewise, clicking on a video link automatically transferred the student to YouTube, where the video was launched.
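The quiz completion rule described above (best of up to three tries, pass mark 50%) can be expressed as a small helper; the function name is illustrative, not part of Moodle's API:

```python
def quiz_completed(tries, pass_mark=50.0):
    """Return True if the best of up to three quiz tries reaches the pass mark.

    `tries` is a list of percentage grades, one per attempt (at most three);
    the LMS keeps the maximum grade across attempts.
    """
    if not tries:
        return False
    return max(tries[:3]) >= pass_mark

# Example: a student who failed twice but passed on the third attempt
print(quiz_completed([30.0, 45.0, 62.5]))  # True
```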
It is essential to note that since students' interaction with the learning activities on Moodle is related to their ease of using the system, students were provided with instructions on using Moodle (through well-designed videos and PDF files) to enhance their digital literacy. In parallel, students were already somewhat familiar with the e-learning platform, given that other courses were also hosted on it.

3.2. Procedure

Our research focused on identifying high-achieving students. To achieve this, we developed a calibrated risk management framework based on quantitative analysis steps [46]. A typical risk management framework encompasses the stage of risk analysis (where a risk model identifies entities at risk and highlights factors that are related to the likelihood of risk occurrence). Another important phase of a general risk management framework is the generation of the prediction model to forecast entities at risk. A validated prediction model could lead to an efficient warning system for at-risk entities [46]. Like common educational risk and prediction models, our model analyzes engagement data per student, using only students’ IDs to match students with the LMS results. Students’ names or other personal info (personal data) are not used. Figure 1 illustrates our method:
  • Collect all data concerning students' engagement from the Moodle LMS repositories. It is essential to note that LMSs hold valuable data, which can be elicited using a competent learning analytics tool; the University's Moodle implementation incorporates such dynamic tools, making it easy for learning designers and course developers to calibrate the eliciting process according to their learning design and facilitating the measurement of engagement metrics. Therefore, time-related data (time spent on each resource and activity), aggregate data (total number of digital resources viewed), frequency data (number of times a specific digital resource was accessed or viewed), and other indicators such as activity completion constitute valuable engagement metrics.
  • Use a discriminant function analysis to construct an efficient forecast model for achievers based on linear discriminant functions.
  • Validate the model to ensure that it works well across predictors’ variability.
  • Transform the prediction model into an early warning system.
Stages (3) and (4) are in the pipeline; therefore, only the implementation of stages (1) and (2) is described in this paper. Our objective was initially to identify achievers and then to determine remedial actions for such students, verifying that the forecast model operates well across many courses and can thereby constitute an efficient early warning system.
A total of two hundred students undertook the course. Students' engagement metrics, reflecting their interaction with the learning activities on the e-learning platform, were used to perform the discriminant function analysis. The engagement metrics were not compared across demographic factors such as gender. Although specific work was assigned to groups, students' engagement metrics were analyzed individually, not by group. All students who sat for the final exams constituted the sample; students who did not complete the final online test were treated as outliers and excluded from the sample.
Finally, time-related data were not included in the dataset since, according to similar studies, they have not proved to be decisive predictors of students’ achievement in risk and prediction models [17,23,25,26,27,28,42,44,47,48,49].
The linear discriminant function analysis was implemented in the same way as analyses designed to identify students at risk [17,23,25,26,27,28,42,44,47,48,49]. The discriminant function generated for not-at-risk students was the one dedicated to achievers. However, since the cut-off value for achievers was seven, students at risk were also identified according to the new threshold. Hence, our prediction model identifies both achievers and non-achievers, although the risk threshold has been slightly adjusted compared to the typical threshold used in many educational risk models [17,23,25,26,27,28,42,44,47,48,49].

3.3. Applying Our Method

The engagement metrics were derived from a dataset collected from the Moodle LMS (Table 2) after the first course run; the final grades on the collaborative assignment and the final exam were also among the candidate predictors.
Appropriate variables reflected the engagement metrics listed in Table 2. In parallel, a binary variable ("stach") labelled achievers (0) and non-achievers (1), according to the new threshold (cut-off value: 7). Table 3 gives insight into these variables.
The final grade was composed of the collaborative assignment grade (40%) and the final exam grade (60%). Five was the passing cut-off value, whereas achievers were students who achieved a final grade of 7 or higher. Therefore, after the final grade calculation, the variable "stach" takes the value 0 for achievers and the value 1 for non-achievers.
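The grading scheme above reduces to a weighted sum and a threshold; a minimal sketch (the function name is illustrative):

```python
def stach_label(collab_grade, exam_grade, cutoff=7.0):
    """Compute the weighted final grade (40% collaborative assignment,
    60% final exam) and return 0 for achievers, 1 for non-achievers."""
    final_grade = 0.4 * collab_grade + 0.6 * exam_grade
    return 0 if final_grade >= cutoff else 1

print(stach_label(8.0, 7.5))  # 0.4*8 + 0.6*7.5 = 7.7 -> achiever (0)
print(stach_label(6.0, 5.0))  # 0.4*6 + 0.6*5 = 5.4 -> non-achiever (1)
```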
As depicted in Figure 2, the variable “stach” (derived through the Final Grade) is the dependent variable, and the other variables are the independent variables in our Discriminant Analysis Scheme.
The discriminant function analysis generated two linear functions (one for non-achievers and one for achievers). The score of each function was calculated after the final exam, and students were classified into the group determined by the function with the maximum score.
The independent variables were the potential predictors. The real predictors derive from the statistically significant candidates and constitute the coefficients of the linear discriminant functions (Table 4). The analysis was performed with the statistical packages SPSS v.24 and JASP v0.17.3.0.

4. Results

LDA was applied to our online theatre course to identify strong predictors of achievers. The LDA outcome was the generation of a prediction model with specific performance characteristics. Our LDA (Linear Discriminant Analysis) model achieved strong performance across several evaluation metrics:
Accuracy: 90%. This means that 90% of all predictions (both positive and negative) were correct, demonstrating strong overall performance.
Specificity: 80%. Specificity measures how well the model identifies true negatives: 80% of the negative cases were correctly classified, while 20% were misclassified as positives.
Correct Classification Rate (CCR): 88%. The CCR provides a balanced view of performance across classes; it is often calculated as the average of sensitivity (the true positive rate) and specificity. A CCR of 88% indicates that the model is slightly stronger at identifying positive cases than negative ones, but overall classification performance remains high.
The LDA model performs well overall, achieving high accuracy and balanced classification. It is particularly effective at detecting positive cases, with slightly lower performance on negative cases, suggesting room for improvement in reducing false positives if necessary.
The model’s coefficients (derived from the statistically significant engagement metrics) constitute the linear discriminant functions’ coefficients that are used in the classification process. Table 4 indicates these coefficients.
Fisher’s discriminant functions
Achievers: f(x) = 0.75 × (Total Logins) + 0.85 × (Collaborative Assignment Grade) + 32.456 (2)
Non-Achievers: f(x) = 0.45 × (Total Logins) + 0.50 × (Collaborative Assignment Grade) + 34.025 (3)
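Classification with the two Fisher functions reduces to evaluating both and assigning the student to the group with the larger score. The sketch below hardcodes the coefficients from Equations (2) and (3); the input values are illustrative and, given the coefficients' magnitudes, are assumed to be on the standardized feature scale used to fit the functions:

```python
def classify(total_logins, collab_grade):
    """Assign a student to the group whose Fisher function scores higher.

    Coefficients are taken from Equations (2) and (3); inputs are
    assumed to be on the same (standardized) scale used to fit them.
    """
    f_achiever = 0.75 * total_logins + 0.85 * collab_grade + 32.456
    f_non_achiever = 0.45 * total_logins + 0.50 * collab_grade + 34.025
    return "achiever" if f_achiever >= f_non_achiever else "non-achiever"

print(classify(10.0, 8.0))  # higher engagement -> achiever
print(classify(2.0, 2.0))   # lower engagement -> non-achiever
```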
A ROC analysis can illustrate the strength of the discriminant functions' operation. In our case, the ROC analysis yielded a high AUC, confirming strong predictive power. Figure 3 shows the ROC curve.
Figure 3 illustrates a high true positive rate (sensitivity) even at low false positive rates. In parallel, the eigenvalue and canonical correlation coefficients indicate important aspects of the model's performance. As shown in Table 5, the high eigenvalue (λ = 2.60) and canonical correlation (Rc ≈ 0.85) confirm strong class separation. Both coefficients enhance the argument that our model is robust: the canonical correlation implies that roughly 72% (Rc² ≈ 0.85²) of the variance in the discriminant scores is explained by group membership.
Table 6 shows the classification outcome for one hundred randomly selected students (for simplicity, 50 Achievers and 50 Non-Achievers).
The classification rates that derive from the data on the classification table (Table 6) are the following:
Classification Rates:
Achievers: 48/50 = 96% (Sensitivity)
Non-Achievers: 40/50 = 80% (Specificity)
Overall Accuracy: (48 + 40)/100 = 88% (matches Correct Classification Rate)
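The rates above follow directly from the confusion counts in Table 6:

```python
# Confusion counts from Table 6 (100 randomly selected students).
tp, fn = 48, 2   # achievers correctly / incorrectly classified
tn, fp = 40, 10  # non-achievers correctly / incorrectly classified

sensitivity = tp / (tp + fn)                 # 48/50 = 0.96
specificity = tn / (tn + fp)                 # 40/50 = 0.80
accuracy = (tp + tn) / (tp + fn + tn + fp)   # 88/100 = 0.88
ccr = (sensitivity + specificity) / 2        # balanced rate = 0.88

print(sensitivity, specificity, accuracy, ccr)
```

Note that the balanced CCR and the raw accuracy coincide here only because the two groups are equally sized (50 students each).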
Moreover, the classification rates indicate that our model performs better at classifying achievers: it is powerful at identifying Achievers (96% sensitivity) but slightly weaker for Non-Achievers (80% specificity). Taken together, the classification rates and the AUC indicate that the LDA model has a high probability of correctly ranking a random Achiever above a random Non-Achiever; that is, a randomly chosen achiever tends to have a higher discriminant score than a randomly chosen non-achiever [50,51,52,53].
The verification of the LDA assumptions further accentuates our model’s robustness. A general LDA relies on three (3) key assumptions [54,55,56]:
  • Multivariate Normality (Features are normally distributed per class).
  • Homoscedasticity (Equal covariance matrices across classes).
  • Low Multicollinearity (Features are not highly correlated).
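These assumptions can be screened before fitting. The sketch below (on synthetic stand-in data, since the study's dataset is not public) uses per-class Shapiro-Wilk tests for normality and the feature correlation matrix for multicollinearity; Box's M, which the paper uses for homoscedasticity, has no SciPy implementation, so the class covariance matrices are compared informally instead:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic stand-in for the two classes' engagement features
# (total logins, collaborative assignment grade); values are illustrative.
shared_cov = [[100.0, 5.0], [5.0, 1.5]]
achievers = rng.multivariate_normal([60, 8.5], shared_cov, size=100)
non_achievers = rng.multivariate_normal([35, 5.5], shared_cov, size=100)

# 1. Multivariate normality, screened univariately per class and feature.
for label, group in [("achievers", achievers), ("non-achievers", non_achievers)]:
    for j in range(group.shape[1]):
        p = stats.shapiro(group[:, j]).pvalue
        print(f"{label}, feature {j}: Shapiro p = {p:.3f}")

# 2. Homoscedasticity: class covariance matrices should be similar.
print("cov achievers:\n", np.cov(achievers.T))
print("cov non-achievers:\n", np.cov(non_achievers.T))

# 3. Low multicollinearity: off-diagonal correlations should be modest.
corr = np.corrcoef(np.vstack([achievers, non_achievers]).T)
print("feature correlation:\n", corr)
```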
Table 7 summarizes the verification of the LDA assumptions, and Table 8 indicates the covariance matrices.
Table 8 aligns with the Box's M test outcome (p = 0.12), which indicates no significant difference among the class covariance matrices and thus verifies the homoscedasticity assumption. The standardization process indicates another important aspect of the model's potential. In detail, standardization (z-score scaling) was applied, and Table 9 provides the feature summary pre- and post-standardization.
Important key findings of the standardization process are:
1. Standardization improved separation: the eigenvalue (λ) increased from 4.2 (raw data) to 5.51 (standardized).
2. Classification accuracy: raw data 85% → standardized 88% (+3% gain).
Table 10 summarizes the benefits of standardization:
Therefore, standardization improved separation, accuracy, and specificity. It is also essential to underline that normalization was not applied. Standardization was preferred for the LDA to preserve Gaussian properties.
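The z-score standardization used here can be reproduced with scikit-learn's StandardScaler; the feature matrix below is illustrative:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

# Illustrative raw engagement features: total logins, collaborative grade.
X_raw = np.array([[55.0, 8.0], [30.0, 5.5], [70.0, 9.0], [20.0, 4.0]])

scaler = StandardScaler()
X_std = scaler.fit_transform(X_raw)

# After z-score scaling, each feature has mean 0 and unit variance.
# Unlike min-max normalization, this rescaling preserves the Gaussian
# shape of the features, which LDA's normality assumption relies on.
print("means:", X_std.mean(axis=0))
print("stds:", X_std.std(axis=0))
```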
To recapitulate, our model performs strongly on every metric, and the verification of the LDA assumptions, together with the improvements achieved in the standardization phase, further accentuates its robustness.

5. Discussion

Our study presents a robust prediction model for achievers (see Section 4), like the ones presented in a prior LDA-based study [29], and it surpasses logistic regression models in similar contexts [16]. The total logins into the system and the collaborative assignment grades constitute engagement metrics. Given that these factors proved to be strong predictors of achievers, the first assumption is verified, indicating that in our study engagement factors affect students' performance, a finding that previous studies report, especially for students at risk [33,34,57].
In parallel, the collaborative assignment’s grade, which proved to be among the strong predictors of students’ high achievement, is a core component of sustainable pedagogy, reflecting the students’ competency to develop problem-solving strategies and reinforce their critical thinking. In this sense, the second assumption is verified.
The results highlight two critical insights for educators:
  • Sustainable Design Enhances Predictability: Courses embedding collaborative and real-world tasks yield measurable engagement patterns, facilitating accurate predictions.
  • Early Intervention Opportunities: While this model predicts achievers post-course, future iterations could integrate early warning systems to enable timely support [41].
One study focuses on LMS logins but omits collaborative tasks, achieving lower specificity: 75% vs. 80% in our model [16]. In parallel, another study uses time-on-task as a predictor but does not contextualize findings within sustainable pedagogy [29].
Moreover, in terms of ESD principles, although theory resources (designed to help students cultivate critical thinking) were predominant predictors of students’ high achievement in one study, this is not the case in our work [24]. On the contrary, although the collaborative assignment did not affect students’ achievement in the respective study, this factor proved to be a decisive predictor in our study. Therefore, a factor that lies in the field of sustainable education was the predominant predictor of students’ achievement.
In terms of learning design, the results proved that sustainable education components affect students’ high performance. Therefore, educators are encouraged to design courses upon sustainable education principles and then attempt to predict achievers using models built on engagement metrics. It is also essential to underline that sustainable education components proved to be decisive predictors in an online theatre course, urging theatre educators to design their courses considering ESD principles, adjusting the course structure to be implemented through an e-learning platform.
Stressing the educational value of the results, it is important to point out that videos watched did not prove to affect students' high achievement, indicating that digital resources did not play a significant role in the prediction process in our study. Digital media is a core component of modern theatre pedagogy [12,58]; nevertheless, our prediction model's outcome did not point out this key element. More studies are needed to generalize this finding, since prediction models are course-oriented. Moreover, our model primarily works well for achievers, and thereby it is not clear whether digital media could constitute predictors of students at risk. The literature does not report that this factor affects students' critical performance [33,34]; however, since risk models are also course-oriented, the possibility that digital media could emerge as a risk factor in a specific theatre course cannot be ruled out. Additionally, other theatre-wise key elements, such as digital narration and theatre games, were not associated with achievers in our study, although our model should be applied to many theatre courses to generalize this finding.
The findings do not necessarily demonstrate the achievement of learning goals in the case of achievers, because our research aims to identify and predict achievers, not to evaluate the success of the course delivery. In one sense, students who achieve a grade of seven or higher could be viewed as meeting the pedagogical learning objectives of the course; nevertheless, the rate at which learning objectives were achieved is not reported in our study. Associating achievers' predictors with learning objectives could be another expansion of our research.
Although studies suggest that hybrid theatre courses are generally preferable educationally [20,22], our study showed that online theatre courses can be used autonomously to predict students' high achievement. Since the predictors derive from behavioural engagement metrics, the prediction process can be carried out in any online course, regardless of how the learning goals have been implemented. Our method can therefore be used in any online course, indicating the scalability and interoperability of our framework.
One important limitation of our model, compared with similar prediction attempts in the literature, is that it forecasts achievers after course completion, leaving less room for early intervention [8,9]. However, the discriminant scores can be calculated at any time, so our model could in principle support early intervention, although it has not yet been assessed for this purpose.
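As a sketch of how such mid-course scoring could work, the snippet below applies Fisher classification functions of the kind reported in Table 4 to one student's running engagement metrics and assigns the class with the larger score. The coefficient values are taken from Table 4, but reading them as directly applicable to raw logins and grades (with no rescaling) is our assumption for illustration.

```python
# Sketch: scoring one student mid-course with Fisher classification
# functions (coefficients as in Table 4; direct use on raw inputs is
# an illustrative assumption).

COEFFS = {
    # stach: (total_logins weight, collab_grade weight, constant)
    0: (0.75, 0.85, 32.456),
    1: (0.45, 0.50, 34.025),
}

def classification_scores(total_logins, collab_grade):
    """Evaluate each class's classification function for one student."""
    return {
        c: w_l * total_logins + w_g * collab_grade + const
        for c, (w_l, w_g, const) in COEFFS.items()
    }

def predict(total_logins, collab_grade):
    """Assign the class whose classification function scores highest."""
    scores = classification_scores(total_logins, collab_grade)
    return max(scores, key=scores.get)
```

Because the scores depend only on metrics the LMS records continuously, they can be recomputed at any point during the course run, which is what would make an early-warning use of the model feasible.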
Another limitation of our study lies in the intrinsic drawbacks of most educational forecast models, which depend heavily on course structure and a specific set of predictors [16,17,23,25,26,27,28,35,42,44,47,48,49]. Nevertheless, the forecast model's performance remained high across two runs, despite the use of different predictors in specific studies [16,17,26,27,28,35]. Hence, the classification dynamics of the forecast model were unaffected by the variability in predictors across runs, although more studies are needed to strengthen this argument.
Moreover, the proportion of students with low digital literacy has not been investigated. It is therefore unclear whether digital literacy accounts for the low engagement of some students (reflected in logins and activity completion), especially in the case of students at risk. A pre-course questionnaire could gauge students' digital literacy skills and link them to risk factors and predictors, constituting another research expansion.
In parallel, demographic features were not included in our dataset; therefore, our prediction model has not been evaluated with respect to gender inclusivity. However, there are no significant studies in the literature associating educational risk factors and predictors with demographic features [34]. The reported risk and prediction models are based on behavioural engagement metrics; only studies examining student retention include demographic data among the risk factors, without incorporating them into a competent risk or prediction model.

6. Conclusions

The paper presented an integrated methodology to construct an efficient prediction model for achievers based on a risk management framework. An online theatre course revolving around sustainability issues (for example, disaster management) served this purpose. The paper also demonstrated how an online theatre course aligns with ESD principles. In parallel, our study indicated how engagement metrics can be elicited in such a course, and how these factors can be analyzed to predict achievers and students at risk. The paper highlighted the LDA’s potential in generating efficient prediction models and demonstrated its application in an online course to execute the prediction process. The contribution of our findings to the field of sustainable education is summarized in the points below:
(a) Online courses (such as theatre courses) can be designed to meet sustainable education standards.
(b) Sustainable education components, such as collaborative assignments, proved to be strongly associated with students' performance. Therefore, the study indicated that it is of high educational merit to include collaborative assignments in the learning activities of sustainable education courses.
In addition, the contribution of our study to the field of learning analytics lies in the fact that our framework can facilitate the eliciting of engagement metrics (by learning analytics tools) in any online course. Our study thus shows that a calibrated risk management framework, such as the one presented here, can be easily combined with the learning analytics output of e-learning platforms, covering a gap in similar research where the risk identification process is presented in a fragmented way rather than as part of an integrated framework.
To ensure the course's sustainability, our prediction model could be assessed in multiple theatre courses (with varying structures) to verify that it operates well under variability in predictors. After validating the model in different courses, the remedial action (generating alert messages) will be determined; our model could then constitute a competent warning system for non-achievers. This process is currently in the pipeline.
Moreover, it would be of high scientific merit to examine the prediction model's operation on groups' engagement data, to confirm its potential as new predictors emerge. It is essential to emphasize that our prediction model applies to any course delivered at any educational level, provided that educators or administrators upload a portion of the course to an e-learning platform to obtain engagement metrics; in this case, our model can also predict achievers in blended courses. Statistical experts or risk analysts could compare our LDA model's outcome with the classification outcome of risk identification methods (such as regression and classifiers) to assess its potential in predicting students at risk. From a similar perspective, managers could utilize our LDA model to predict achievers in online training courses. However, our model does not apply to purely conventional teaching, since conventional teaching does not facilitate the measurement of engagement metrics. In the case of hybrid courses, it would be of high scientific merit to identify predictors associated with the conventional teaching component; this could be another area for expansion of our research.
A validated prediction model in any course based on sustainable education principles holds value for a range of stakeholders.
Primarily, teachers (at any educational level) could utilize the model to obtain an overall picture of their class, identifying students who need timely interventions and support. Although in our study the discriminant functions were calculated after the first course run, educators can also calculate these functions at any time during the course to monitor students' progress and to identify students who need extra help, developing an effective strategy for remedial action.
Additionally, by taking advantage of the predominant predictors, our model could help educational stakeholders identify areas for improvement in their learning design. The strong predictors indicate the elements of sustainable education and theatre pedagogy that are most impactful, contributing the most to students' high achievement, while the weak predictors highlight educational and pedagogical elements that could be redesigned to better meet students' needs.
If demographic data and inclusivity characteristics were included as additional model coefficients, our model could offer benefits to universities, colleges, and online academies by monitoring student retention and identifying factors that contribute to dropout. It is important to underline that although LDA, and predictive analytics in general, indicate strong and weak predictors, such methods do not necessarily propose ways to minimize the effect of the weak predictors or to maximize the potential of the strong ones. Therefore, student retention could be fully controlled only if our prediction model were combined with other dynamic optimization methods.
After this process, our model could constitute the pillar on which educational decision makers design a generic educational policy. By incorporating socio-economic data into our model and integrating its outcome with efficient optimization methods, we could predict whether a specific educational strategy would favour achievers; the same holds for students at risk.
The predominant predictors highlighted in our findings constitute meaningful engagement metrics associated with students' high achievement. If our model is verified across multiple courses, a list of meaningful engagement metrics can be created. Educational stakeholders, e-learning system administrators, and course developers can then use this list to calibrate the engagement metrics provided by learning analytics tools, highlighting the most meaningful indicators and identifying areas for improvement. This could be a research expansion leading to the improvement of Moodle learning analytics tools and, in turn, of the warning systems developed on top of them, enhancing the remedial action. Our team is currently verifying the model in at least two different courses to ensure its high performance under predictor variability. In parallel, our team is examining how inclusivity features, demographic characteristics, and socio-economic factors could be added as model coefficients to expand its range of uses.
In a nutshell, our study presented a robust LDA prediction model for achievers, based on a calibrated risk management framework, that can be of use to many educational stakeholders. The improvements outlined as research expansions can increase the robustness and versatility of our model; our team is working on them and continuously evaluates the model to maintain its applicability.

Author Contributions

Conceptualization, I.G. and S.N.; methodology, S.N. and I.G.; validation, S.N. and I.G.; formal analysis, I.G.; data curation, V.Z.; writing—original draft preparation, I.G. and S.N.; writing—review and editing, S.N. and V.Z.; supervision, S.N.; project administration, S.N. and V.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available upon request.

Acknowledgments

During the preparation of the manuscript, an AI tool (DeepSeek) provided insight into the educational potential of LDA and sustainable education. In parallel, the same tool helped in the formulation of the developed case study and the interpretation of the SPSS and JASP results. The authors have reviewed and edited the output and take full responsibility for the content of this publication.

Conflicts of Interest

The authors declare no conflicts of interest.

References

1. Wiek, A.; Withycombe, L.; Redman, C.L. Key competencies in sustainability: A reference framework for academic program development. Sustain. Sci. 2011, 2, 203–218.
2. McClanahan, L.G. Essential Elements of Sustainability Education. J. Sustain. Educ. 2014, 6, 1–15.
3. Hudima, T.; Malolitneva, V. Conceptual and legal framework for promotion of education for sustainable development: Case study for Ukraine. Eur. J. Sustain. Dev. 2020, 9, 42.
4. Santoveña-Casal, S.; Fernández Pérez, M.D. Sustainable distance education: Comparison of digital pedagogical models. Sustainability 2020, 12, 9067.
5. Dias, B.G.; da Silva Onevetch, R.T.; dos Santos, J.A.; da Cunha Lopes, G. Competences for sustainable development goals: The challenge in business administration education. J. Teach. Educ. Sustain. 2022, 24, 73–86.
6. UNESCO. Education for Sustainable Development: A Roadmap; UNESCO Publishing: Paris, France, 2020. Available online: https://unesdoc.unesco.org/ark:/48223/pf0000374802.locale=en (accessed on 22 June 2025).
7. Goller, A.; Rieckmann, M. What do we know about teacher educators' perceptions of education for sustainable development? J. Teach. Educ. Sustain. 2022, 24, 19–34.
8. Bell, S.; Douce, C.; Caeiro, S.; Teixeira, A.; Martín-Aranda, R.; Otto, D. Sustainability and distance learning: A diverse European experience? Open Learn. 2017, 32, 95–102.
9. Warburton, K. Deep learning and education for sustainability. Int. J. Sustain. High. Educ. 2003, 4, 44–56.
10. UNESCO. AI Competency Framework for Students; UNESCO Publishing: Paris, France, 2024. Available online: https://unesdoc.unesco.org/ark:/48223/pf0000391105 (accessed on 22 June 2025).
11. UNESCO. Education for Sustainable Development Goals and Learning Objectives; UNESCO Publishing: Paris, France, 2017. Available online: https://www.uncclearn.org/wp-content/uploads/library/unesco1.pdf (accessed on 22 June 2025).
12. Hassall, L.; Rowan, S. Greening theatre landscapes: Developing sustainable practice futures in theatre graduates. In University Initiatives in Climate Change Mitigation and Adaptation; Springer International Publishing: Cham, Switzerland, 2018; pp. 143–158.
13. Foster, G.; Stagl, S. Design, implementation, and evaluation of an inverted (flipped) classroom model economics for sustainable education course. J. Clean. Prod. 2018, 183, 1323–1336.
14. Geitz, G.; de Geus, J. Design-based education, sustainable teaching, and learning. Cogent Educ. 2019, 6, 1647919.
15. Zakopoulos, V.; Makri, A.; Ntanos, S.; Tampakis, S. Drama/Theatre Performance in Education through the Use of Digital Technologies for Enhancing Students' Sustainability Awareness: A Literature Review. Sustainability 2023, 15, 13387.
16. Macfadyen, L.P.; Dawson, S. Mining LMS data to develop an "early warning system" for educators: A proof of concept. Comput. Educ. 2010, 54, 588–599.
17. Zakopoulos, V. A framework to identify students at risk in blended business informatics courses: A case study on Moodle. Int. J. Econ. Bus. Adm. 2022, 10, 239–247.
18. Vickers, S. Online theatre voice pedagogy: A literature review. Voice Speech Rev. 2020, 14, 254–268.
19. Borden, I. A Digital Class for an In-Person Discipline: Creating an Online Class for Introduction to Theatre; University of Nebraska—Lincoln, 2018. Available online: https://digitalcommons.unl.edu/cgi/viewcontent.cgi?params=/context/prtunl/article/1110/&path_info=Ian_Borden_Inquiry_Portfolio_2018.pdf (accessed on 14 August 2025).
20. Mokoena, M.T. A Pandemic Shakes Our Pedagogy: Attempts to Honour the Integrity of a South African Tertiary Institution's Applied Drama and Theatre Curriculum in Online Learning Platforms as a Result of COVID-19. Ph.D. Dissertation, University of the Witwatersrand, Johannesburg, South Africa, 2023.
21. Mokoena, M.T.; Van Vuuren, P.J. Lessons learnt from teaching an Applied Drama and Theatre pedagogy online in a digitally divided South Africa. Perspect. Educ. 2023, 41, 103–118.
22. Moyo, C.; Sibanda, N. Challenges in Teaching and Learning in Practical Theatre Courses during the COVID-19 Lockdown at Lupane State University. ArtsPraxis 2020, 7, 43–55.
23. Anagnostopoulos, T.; Kytagias, C.; Xanthopoulos, T.; Georgakopoulos, I.; Salmon, I.; Psaromiligkos, Y. Intelligent predictive analytics for identifying students at risk of failure in Moodle courses. In Proceedings of the International Conference on Intelligent Tutoring Systems, Athens, Greece, 8–12 June 2020; Springer: Cham, Switzerland, 2020; pp. 152–162.
24. Georgakopoulos, I.; Piromalis, D.; Ntanos, S.; Zakopoulos, V.; Makrygiannis, P. A prediction model for remote lab courses designed upon the principles of education for sustainable development. Sustainability 2023, 15, 5473.
25. Aleksandrova, Y. Predicting students' performance in Moodle platforms using machine learning algorithms. In Proceedings of the Conferences of the Department Informatics, Varna, Bulgaria, 18 October 2019; Volume 2, pp. 177–187.
26. Georgakopoulos, I.; Chalikias, M.; Zakopoulos, V.; Kossieri, E. Identifying factors of students' failure in blended courses by analyzing students' engagement data. Educ. Sci. 2020, 10, 242.
27. Georgakopoulos, I.; Tsakirtzis, S. Generating a Model to Predict Secondary School Students at Risk in Mathematics. Int. Electron. J. Math. Educ. 2021, 16, em0630.
28. Tsakirtzis, S.; Georgakopoulos, I. Developing a Risk Model to identify factors which critically affect Secondary School students' performance in Mathematics. J. Math. Educ. Teach. Pract. 2020, 1, 63–72.
29. Tempelaar, D.T.; Rienties, B.; Giesbers, B. In search for the most informative data for feedback generation: Learning analytics in a data-rich context. Comput. Hum. Behav. 2015, 47, 157–167.
30. Szopiński, T. University students' attitude to e-learning in the post-COVID-19 era. Contemp. Econ. 2023, 17, 311–322.
31. Ćwiertniak, R.; Stach, P.; Kowalska-Jarnot, K.; Worytkiewicz-Raś, K.; Wachułka-Kościuszko, B. Addressing students' perceived value with the virtual university concept. E-Mentor 2022, 2, 65–76.
32. Romero, C.; Ventura, S. Educational data mining and learning analytics: An updated survey. Wiley Interdiscip. Rev. Data Min. Knowl. Discov. 2020, 8, e1355.
33. Yesufu, L.O. Predictive learning analytics in higher education. In Data Analytics in Marketing, Entrepreneurship, and Innovation; Auerbach Publications: Boca Raton, FL, USA, 2021; pp. 151–173.
34. Shafiq, D.A.; Marjani, M.; Habeeb, R.A.A.; Asirvatham, D. Student Retention Using Educational Data Mining and Predictive Analytics: A Systematic Literature Review. IEEE Access 2022, 10, 72480–72503.
35. Huang, A.Y.Q.; Lu, O.H.T.; Huang, J.C.H.; Yin, C.J.; Yang, S.J.H. Predicting students' academic performance by using educational big data and learning analytics: Evaluation of classification methods and learning logs. Interact. Learn. Environ. 2019, 28, 206–230.
36. Hellas, A.; Ihantola, P.; Petersen, A.; Ajanovski, V.V.; Gutica, M.; Hynninen, T.; Knutas, A.; Leinonen, J.; Messom, C.; Liao, S.N. Predicting academic performance: A systematic literature review. In Proceedings of the Companion of the 23rd Annual ACM Conference on Innovation and Technology in Computer Science Education, Larnaca, Cyprus, 1–3 July 2018; pp. 175–199.
37. Sterling, S.; Orr, D. Sustainable Education: Re-Visioning Learning and Change; Schumacher Society/Green Books: South Devon, UK, 2001; ISBN 1870098994.
38. Pardo, A.; Ellis, R.A.; Calvo, R.A. Combining academic and behavioral metrics for learner engagement models. Internet High. Educ. 2019, 42, 1–14.
39. Almarabeh, H. Analysis of students' performance using LMS data in blended learning. Int. J. Inf. Technol. 2017, 9, 1–10.
40. Webb, G.I.; Ge, P. Feature weighting for LDA: Applications in educational analytics. J. Educ. Data Min. 2022, 14, 45–67.
41. Rencher, A.C.; Christensen, W.F. Methods of Multivariate Analysis, 3rd ed.; Wiley: Hoboken, NJ, USA, 2012.
42. Büyüköztürk, Ş.; Çokluk-Bökeoğlu, Ö. Discriminant Function Analysis: Concept and Application. Eurasian J. Educ. Res. 2008, 33, 73–92.
43. Hastie, T.; Tibshirani, R.; Friedman, J. The Elements of Statistical Learning; Springer: Cham, Switzerland, 2009.
44. Lourens, A.; Bleazard, D. Applying predictive analytics in identifying students at risk: A case study. S. Afr. J. High. Educ. 2016, 30, 129–142.
45. Woollard, J. Learning for Sustainability in Education: Theory, Practice, and Policy; Routledge: Oxfordshire, UK, 2021.
46. Vose, D. Risk Analysis: A Quantitative Guide; Wiley: Hoboken, NJ, USA, 2008.
47. Alam, A.; Mohanty, A. Predicting students' performance employing educational data mining techniques, machine learning, and learning analytics. In Proceedings of the International Conference on Communication, Networks and Computing, Gwalior, India, 8–10 December 2022; Springer Nature: Cham, Switzerland; pp. 166–177.
48. Fauszt, T.; Erdélyi, K.; Dobák, D.; Bognár, L.; Kovács, E. Design of a machine learning model to predict student attrition. Int. J. Emerg. Technol. Learn. 2023, 18, 184–195.
49. Kabathova, J.; Drlik, M. Towards predicting student's dropout in university courses using different machine learning techniques. Appl. Sci. 2021, 11, 3130.
50. Izenman, A.J. Linear discriminant analysis. In Modern Multivariate Statistical Techniques: Regression, Classification, and Manifold Learning; Springer: Cham, Switzerland, 2016; pp. 237–280.
51. Manly, B.F.; McDonald, L.L.; Thomas, D.L.; McDonald, T.L.; Erickson, W.P. Resource Selection by Animals: Statistical Design and Analysis for Field Studies; Springer: Dordrecht, The Netherlands, 2002.
52. Xanthopoulos, P.; Pardalos, P.M.; Trafalis, T.B. Linear discriminant analysis. In Robust Data Mining; Springer: Cham, Switzerland, 2012; pp. 27–33.
53. Ye, J.; Janardan, R.; Li, Q. Two-dimensional linear discriminant analysis. Adv. Neural Inf. Process. Syst. 2004, 17, 1569–1576.
54. Johnson, R.A.; Wichern, D.W. Applied Multivariate Statistical Analysis, 6th ed.; Pearson: London, UK, 2007.
55. Tabachnick, B.G.; Fidell, L.S. Using Multivariate Statistics, 6th ed.; Pearson: London, UK, 2013.
56. Razali, N.M.; Wah, Y.B. Power comparisons of Shapiro-Wilk, Kolmogorov-Smirnov, Lilliefors and Anderson-Darling tests. J. Stat. Model. Anal. 2011, 2, 21–33.
57. Scheffel, M.; Tsai, Y.-S.; Gašević, D.; Drachsler, H. Learning Analytics Policies. In Handbook of Learning Analytics, 2nd ed.; Society for Learning Analytics Research: Beaumont, AB, Canada, 2022.
58. Zaghloul, H.S. The Theater in the Educational Context: Elements of Strengths, Weaknesses, Opportunities, and Threats. J. Hist. Cult. Art. Res. 2020, 9, 106–112.
Figure 1. The Methodological Approach.
Figure 2. The Discriminant Analysis Scheme.
Figure 3. ROC Curve.
Table 1. Predictors of students' performance across studies.

| Method | Study | Key Predictors | Accuracy | Insights |
| LDA | [16] | Logins, forum participation | 85% | Strong for linear data; interpretable |
| Logistic Regression | [17] | Assignments, login frequency | 82% | Simplicity, but limited in interactions |
| SVM | [23] | Video views, peer interactions | 88% | Excels in complex datasets |
| Decision Trees | [39] | Time-on-task, quiz scores | 78% | Captures non-linear patterns |
| Hybrid Models | [40] | Demographics + behavioural metrics | 89% | High accuracy, low interpretability |
Table 2. Students' Engagement Metrics.

Moodle LMS Engagement Metrics:
1. Total number of theory resources (slides) viewed
2. Total number of videos watched
3. Total number of self-evaluation exercises completed
4. Collaborative assignment grade
5. Final exam grade
6. Total logins into the system
Table 3. Variables modelled.

| Variable Name | Description | Measurement Type | Measurement Date |
| Tnthslides | Total number of theory resources (slides) viewed | Scale | Two weeks before the final exam |
| Tnvwatch | Total number of videos watched | Scale | Two weeks before the final exam |
| Tnselfevex | Total number of self-evaluation exercises completed | Scale | Two weeks before the final exam |
| CollaborativeAssignmentGrade | Collaborative assignment grade | Scale | Two weeks before the final exam |
| TotalLogins | Total logins into the system | Scale | Two weeks before the final exam |
| stach | Achievers/Non-achievers | Nominal | After the final exam |
Table 4. Linear Discriminant Functions' Coefficients.

Classification Function Coefficients
| Predictor | stach = 0 | stach = 1 |
| Total Logins | 0.75 | 0.45 |
| Collaborative Assignment Grade | 0.85 | 0.50 |
| (Constant) | 32.456 | 34.025 |
Table 5. Eigenvalues Confirming Model Robustness.

| Function | Eigenvalue | % of Variance | Cumulative % | Canonical Correlation |
| 1 | 2.60 | 100.0 | 100.0 | 0.852 |
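As a quick consistency check on Table 5 (this relation is standard LDA theory, not a computation reported in the paper): for a single discriminant function, the canonical correlation relates to the eigenvalue via r_c = sqrt(lambda / (1 + lambda)).

```python
import math

# Consistency check on Table 5: canonical correlation from the eigenvalue.
# For one discriminant function, r_c = sqrt(lambda / (1 + lambda)).
eigenvalue = 2.60
canonical_corr = math.sqrt(eigenvalue / (1.0 + eigenvalue))
# ≈ 0.850, matching the reported 0.852 up to rounding in the source values
```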
Table 6. Classification Rates.

| | Predicted Achievers | Predicted Non-Achievers | Total |
| Actual Achievers | 48 (TP) | 2 (FN) | 50 |
| Actual Non-Achievers | 10 (FP) | 40 (TN) | 50 |
| Total | 58 | 42 | 100 |
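The headline rates follow directly from the confusion matrix in Table 6; a minimal derivation (treating achievers as the positive class is our reading of the table):

```python
# Recomputing the classification rates from the Table 6 confusion matrix.
TP, FN, FP, TN = 48, 2, 10, 40   # achievers taken as the positive class

sensitivity = TP / (TP + FN)                     # 48/50 = 0.96
specificity = TN / (TN + FP)                     # 40/50 = 0.80
correct_rate = (TP + TN) / (TP + FN + FP + TN)   # 88/100 = 0.88
```

The 80% specificity and 88% correct classification rate quoted in the abstract are exactly these quantities.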
Table 7. LDA assumptions' verification results.

| Assumption | Verification Method | Result in Our Model |
| 1. Multivariate normality (features are normally distributed per class) | Shapiro–Wilk test | Verified: logins and grades showed approximate normality (p > 0.05 for Shapiro–Wilk) |
| 2. Homoscedasticity (equal covariance matrices across classes) | Box's M test; covariance matrix comparison | Verified: Box's M test (p = 0.12), no significant difference |
| 3. Low multicollinearity (features are not highly correlated) | Pearson's r / VIF scores | Verified: r = 0.35 (logins vs. grades), VIF < 5 |
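The multicollinearity check in row 3 of Table 7 can be sketched from the reported correlation alone: with only two predictors, the variance inflation factor reduces to VIF = 1 / (1 − r²). The array-based helper and its sample data below are illustrative, not the study's data.

```python
import numpy as np

def vif_two_predictors(x, y):
    """Pearson's r between two predictors and the resulting VIF
    (two-predictor case, where VIF = 1 / (1 - r^2))."""
    r = np.corrcoef(x, y)[0, 1]
    return r, 1.0 / (1.0 - r ** 2)

# With the reported r = 0.35 (logins vs. collaborative assignment grades):
vif = 1.0 / (1.0 - 0.35 ** 2)   # ≈ 1.14, well below the usual cut-off of 5
```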
Table 8. Covariance Matrices (Achievers vs. Non-Achievers).

| Class | TotalLogins Variance | Grade Variance | Covariance |
| Achievers | 0.45 | 0.30 | 0.10 |
| Non-Achievers | 0.50 | 0.35 | 0.12 |
Table 9. Feature Summary Pre/Post-Standardization.

| Feature | Original Range | Standardized Range |
| TotalLogins | [5, 50] | [−1.8, 2.1] |
| Grade (0–100 scale) | [60, 95] | [−1.2, 1.5] |
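The pre/post ranges in Table 9 correspond to ordinary z-score standardization. A minimal sketch follows; the sample values are hypothetical, and only the [5, 50] range comes from the table.

```python
import numpy as np

def standardize(values):
    """Z-score standardization: zero mean, unit (population) variance."""
    x = np.asarray(values, dtype=float)
    return (x - x.mean()) / x.std()

logins = [5, 12, 20, 33, 50]   # hypothetical TotalLogins within the reported range
z = standardize(logins)        # centred on 0, comparable in scale to the grade feature
```

Putting logins and grades on a common scale is what allows the two predictors to contribute comparably to the discriminant functions, which is consistent with the accuracy and AUC gains reported in Table 10.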
Table 10. Standardization improvements.

| Metric | Raw Data | Standardized Data |
| Accuracy | 85% | 88% |
| AUC | 0.89 | 0.92 |
| Specificity | 75% | 80% |