Article

A Prediction Model for Remote Lab Courses Designed upon the Principles of Education for Sustainable Development

by Ioannis Georgakopoulos 1,2,*, Dimitrios Piromalis 3, Stamatios Ntanos 4,*, Vassilis Zakopoulos 2 and Panagiotis Makrygiannis 1

1 Department of Industrial Design and Production Engineering, University of West Attica, 122 41 Athens, Greece
2 Department of Accounting and Finance, University of West Attica, 122 41 Athens, Greece
3 Department of Electrical and Electronics Engineering, University of West Attica, 122 41 Athens, Greece
4 Department of Business Administration, University of West Attica, 122 41 Athens, Greece
* Authors to whom correspondence should be addressed.
Sustainability 2023, 15(6), 5473; https://doi.org/10.3390/su15065473
Submission received: 8 February 2023 / Revised: 7 March 2023 / Accepted: 16 March 2023 / Published: 20 March 2023
(This article belongs to the Special Issue Sustainable Governance in Urban Regeneration)

Abstract:
The COVID-19 pandemic halted the physical-presence operation of many laboratory-based university courses. In response, higher education courses turned to distance learning. Distance education can foster sustainability through the resource savings offered by technology use. Therefore, there is a need to establish a pathway for sustainability practices in light of increasing distance-education enrollment and technological progress. Under this concept, this research paper presents a remote lab for the "Data Acquisition Systems" course, delivered during the pandemic as the digital twin of its respective conventional lab. This remote lab was designed on the Education for Sustainable Development (ESD) principles to help students develop critical thinking, problem-solving, and collaboration competencies. This paper aims to develop a concrete framework for identifying factors that critically affect students' performance during remote lab courses. The analysis is based on students' engagement data collected by the NI-ELVIS remote lab measurement system during the spring academic semester of 2020 at the University of West Attica, Greece. Furthermore, the paper develops a competent prediction model for students at risk of failing the lab. The findings indicate that content comprehension and theory-exercise familiarization were the main risk factors in the case of the specific remote lab. In detail, a unit increase in content comprehension led to a 2.7 unit decrease in the log-odds of the risk occurrence, while a unit increase in theory familiarization through exercises led to a 3.2 unit decrease. The findings also underlined that risk factors such as critical thinking were associated with ESD competencies. Moreover, delivering distance-learning labs according to the proposed methodology yields environmental benefits by contributing to resource and energy savings, since students who are about to fail can be identified early and assisted.

1. Introduction

The COVID-19 pandemic created the need for remote lab sessions, and research advances have made delivering such sessions possible. Consequently, during the pandemic, remote teaching was the prevalent educational practice for laboratory courses [1,2,3]. Traditional laboratory courses aim at helping students develop practical skills, which are mastered through well-designed experiments. In the case of remote labs, remote experiments serve the same end.
A specific study accentuates the role of remote experiments, underlining that experiments constitute an important part of remote labs. The same study also focuses on an important aspect of remote experiments, clarifying that they remove barriers for students with special needs. In contrast to traditional experiments, which are only performed once, remote experiments can be repeated at any time and from any place [4]. From a pedagogical perspective, experiments in traditional labs contribute to students' conceptual learning and help students develop analytical and experimental skills. Most experiments in traditional labs call for student collaboration; in this sense, they could help students develop collaborative skills [5]. Remote experiments should achieve the same goals. Another study emphasizes experiments, underlining that they are part of an active, learner-centered approach, implementing inquiry-based learning with a focus on critical thinking and problem-solving [6]. Remote experiments should serve the same purpose. Since remote education is on the rise, remote experiments are being integrated into the new educational strategy, and their design should therefore be well considered in the case of remote labs. In this sense, modern remote labs are designed upon "digital twin" capabilities. A study in Environment and Planning B: Urban Analytics and City Science clarifies that the term "digital twin" refers to an exact mirroring of a physical process: the digital twin executes the respective operation in the same way the physical process is executed. The use of digital twins has been expanded to entire systems; a system that mirrors another system's operation could be deemed an abstraction of the critical features of the existing system [7]. From this point of view, the operation of experiments could be mirrored by their respective digital twins, and well-designed digital twins can thus help ensure the success of remote labs.
The introductory section is divided into seven subsections, providing the theory of remote lab features (Section 1.1), describing the equipment and its assessment metrics (Section 1.2 and Section 1.3), predicting students at risk (Section 1.4), presenting the challenges for Education for Sustainable Development (Section 1.5), designing remote labs according to ESD principles (Section 1.6), and stating our research aims (Section 1.7).

1.1. Remote Labs’ Features

Remote labs could be deemed environments in which experiments are carried out and controlled through the Internet [8]. Remote labs aid educational institutions in covering needs for space, instrumentation, and human support [6]. Remote labs can be used by many students [9]. Some issues that should be considered to make labs remotely accessible center on the following processes: hardware selection and installation, data digitization and collection, visualization, and network selection and installation [10].
Remote labs are mainly based on a client-server architecture to deal with the issue of complexity. Two main parts stand out in this architecture. The first is the Remote Lab Client Side, a web-based application that interacts with the server. The other is the Remote Lab Server Side, which is usually built on the philosophy of LabVIEW or MATLAB simulation [11].
Several benefits related to remote labs are:
  • Various technologies could be employed to implement a remote lab tailored to students’ needs.
  • It is easy for students to perform assessment practices and experiments.
  • Remote labs are suitable for industrial applications owing to their remote monitoring potential [12].
  • Safety issues could be more easily addressed given that specific risks related to dangerous experiments do not occur [13,14,15].
The challenges pertain to the need for error management, re-usability, and extra security due to potentially malicious web pages [13,14,16].
Finally, the main drawbacks of remote labs are [15]:
  • Physical presence is needed to foster collaboration between students and teachers and also among students.
  • There is a need for IT administrators to solve possible technical problems and support students during the experimental process.
  • Some practical skills and hands-on dexterities cannot be easily developed through simulations and online remote experiments.

1.2. NI-ELVIS Remote Labs

NI-ELVIS remote labs offer [17]:
  • Multimedia Web-based operation.
  • Interactive labs that reinforce theory and emphasize projects.
  • Real-time experiments performed using real hardware.
  • Hardware-sharing capability at the course level.
  • Programming potential (Python and C).
In terms of the pedagogical benefits, NI-ELVIS remote labs achieve the following goals:
  • Help students develop innovative qualities through real experiments.
  • Aid students in assimilating knowledge through appropriate resources.
  • Stimulate students' engagement by offering a web-driven environment, augmenting students' desire to get involved in the learning process.
A cardinal function of NI-ELVIS remote labs is real-time measurement. This function is realized by a potent measurement station: the station is illustrated in Figure 1, and its units are depicted in Figure 2.
Another function of NI-ELVIS remote labs is based on an online teaching environment that provides students with theoretical material on the syllabus and tests students’ comprehension through appropriate exercises. This function is like the one implemented by e-learning systems. The theoretical material includes slides, videos, and other multimedia resources, whereas the exercises include multiple-choice questions, true/false questions, matching elements questions, and specific experiments.
It is essential to highlight that these exercises are graded and that students are notified about their grades. Educators can also modify the content mounted on the respective e-learning system (LMS) to provide students with courses tailored to their needs. Moreover, the e-learning system offers valuable statistical data regarding students' engagement. As a potent LMS, the NI-ELVIS e-learning system provides educators with meaningful data related to students' activities (activity/assignment grades, logins into the system, and activity completion time). Therefore, educators can obtain an overall picture of students' performance.

1.3. Metrics for Assessing Remote Labs

A study mentioned before highlights that remote labs should be widely available and widely accessible, and that they should offer a safer experimentation environment compared to traditional labs [12]. Another study has scrutinized over 100 articles on remote labs, explaining that some factors affecting a remote lab's remote monitoring process are the extent of difficulty, limitations on the number of users, reliability, and security [13].
ELVIS LabVIEW applications in automation should offer flexibility in design, allow code to be modified with little effort, and provide advanced accessibility potential [18].
A critical study raises another issue, pointing out that the success of virtual and remote laboratories hinges on appropriate software selection [19]. The same study lists a set of criteria that should be met when selecting LabVIEW application software. These criteria concern LabVIEW application capabilities in every functional territory, including modularity; compatibility of code and hardware; debugging potential; executability; performance; and intuitiveness.
Some studies offer perspective on the pedagogical issues related to remote labs [18,20,21]. In detail, ELVIS LabVIEW applications aid students in practicing their communication skills and familiarize them with learning at home, enabling them to absorb the rudimentary knowledge of experiments through a mixture of conventional teaching and self-practice [18].
One study proved that remote and virtual experimentation practices increase students’ comprehension of concepts [20]. Another study indicates that another pedagogical aspect affecting the effectiveness of remote labs is related to the opportunity to learn through a failed learning process, insinuating that students could make mistakes and learn from them [21].
Another critical study indicates that factors that should be considered in a remote lab assessment process are usability, instruction comprehension, total time allotted to exercises and experiments’ completion, and entire procedure reliability [22]. In parallel, the same study stresses students’ overall satisfaction as a cardinal metric reflecting a remote lab’s effectiveness.
Finally, some other studies shed light on another aspect that should be considered in the assessment process. They point out that only one student can use the remote experiment at a given moment, and therefore it is necessary to ensure an efficient reservation system. Students can thus choose the most suitable time for them and work at their own pace or return to the experiment at home if they do not understand something [23,24].

1.4. Predicting Students at Risk in Remote Labs

Many studies indicate e-learning platforms’ vital role in predicting at-risk students [25,26,27,28,29,30,31]. Previous studies have also indicated that a proper analysis of students’ behavioral engagement could lead to developing competent risk models and generating respective prediction models. Statistical and machine learning techniques have been employed to answer the individual purpose of these studies. For instance, some studies refer to risk models developed by employing binary logistic regression analysis [25,26,27,28,31], whereas a specific study refers to a risk model developed by employing other classification techniques. In parallel, one specific study underlines the contribution of the discriminant function analysis to the prediction model generation [26].
In the case of NI-ELVIS remote labs, one study has proved that students' interaction with the NI-ELVIS LMS strongly predicts students' critical performance. The same study has pointed out that the completion of activities concerning the theoretical material and the theoretical exercises significantly affects students' outcomes [31]. As the studies mentioned above underline, no single set of factors affects students' critical performance in every course; risk factors are course-specific. In this sense, risk models, along with prediction models, are also course-specific. Therefore, although binary logistic regression analysis and discriminant function analysis are deemed competent statistical methods for developing robust risk models and competent prediction models, they cannot be used to develop models suitable for any risk occurrence. For instance, risk and prediction models developed for a specific course are only valid for that course, given that we cannot rule out the possibility of emerging risk factors in subsequent course runs. In this sense, a prediction model generated using the respective methods should be validated across many courses before being used to develop a warning system for students at risk. Consequently, risk or prediction models developed for a specific remote lab course do not necessarily work well when applied to other remote lab courses.
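As a minimal sketch of the binary logistic regression approach these studies employ, the Python fragment below fits a risk model on synthetic engagement data; the feature names, effect sizes, and data set are illustrative assumptions rather than any study's actual data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 300  # the data here are synthetic, not a real course export

# Hypothetical engagement features elicited from an LMS report
theory_grade = rng.uniform(0, 10, n)
exercise_grade = rng.uniform(0, 10, n)
logins = rng.poisson(20, n).astype(float)

# Synthetic ground truth: lower grades raise the log-odds of being at risk
log_odds = 6.0 - 0.6 * theory_grade - 0.7 * exercise_grade
at_risk = (rng.uniform(size=n) < 1 / (1 + np.exp(-log_odds))).astype(int)

X = np.column_stack([theory_grade, exercise_grade, logins])
risk_model = LogisticRegression(max_iter=1000).fit(X, at_risk)

# Negative fitted coefficients flag engagement factors whose increase
# lowers the risk of failing the lab
print(dict(zip(["theory", "exercise", "logins"], risk_model.coef_[0].round(2))))
```

In the actual analyses cited above, statistical significance tests (rather than coefficient signs alone) determine which candidate factors are retained as risk factors.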

1.5. Education for Sustainable Development

Education for Sustainable Development (ESD) is the education sector's response, coordinated by UNESCO, to the severe and pressing problems the planet is facing. UNESCO's ESD for 2030 education program aims to promote the societal and individual transformation required to reverse the course of catastrophic human activities. UNESCO, whose most recent initiative was the 2015 establishment of the 2030 Agenda and the Sustainable Development Goals (SDGs), underlines that a comprehensive approach to environmental, social, and economic challenges is required if we are to stop global warming before it reaches catastrophic proportions. Teaching and learning about crucial sustainable development topics, such as climate change, disaster risk reduction, biodiversity, poverty alleviation, and sustainable consumerism, are known as Education for Sustainable Development [32]. However, although the integration of the SDGs in universities is becoming apparent, universities still need to play a more crucial role in achieving the SDGs. The "5 Ps", or five pillars of sustainable development, stand for people, planet, prosperity, peace, and partnerships; sustainable development is development based on these five dimensions. One study also states that the 2012 UNESCO report entitled "The Future We Want" further suggests that the following themes could be included to secure renewed political and educational commitment to sustainable development [33]:
  • Poverty eradication;
  • Water and sanitation;
  • Energy;
  • Transportation;
  • Sustainable cities and human settlements;
  • Health and population;
  • Employment;
  • Oceans and seas;
  • Forests/desertification/biodiversity;
  • Sustainable consumption and production [34].
The three pillars of sustainability are social, environmental, and economic; in addition, education could play an essential role in providing solutions to the issues above. Educational sustainability is fundamental because it provides students with an understanding of why the environment is essential and with real-world skills they can use to improve the planet [35]. Sustainable education aims to instill in students, schools, and communities the values and incentives necessary to act for sustainability, now and in the future, in one's own life, in one's community, and on a global scale. The deliberations of the 21st Conference of the Parties to the United Nations Framework Convention on Climate Change (COP21) in Paris in December 2015 stressed the importance of informed sustainability globally [36,37,38]. One of the critical objectives of the United Nations Sustainable Development Goals for 2030 is education for sustainable development and sustainable lifestyles, and higher education institutions play a significant part in this effort [39]. The same study highlights that, globally and locally, higher education may support sustainability in various social, technological, and environmental ways. Due to their large student populations, interdisciplinary teaching and learning methods, use of technology, and flexibility in the learning process, distance-learning universities are essential components of education for sustainable development.
The study mentioned before refers to interviews with the Vice Chancellors or Rectors of four selected distance-learning universities (OU, UAb, UNED, and FernU) about the sustainability of the distance-learning model and the likely impact of current disruptions, revealing that distance-learning higher education is not viewed as a significant leader in fostering a sustainable world. However, the requirement to meet the ambitions of the student body in a continuously changing world and the struggle to master technological hurdles will undoubtedly push questions of global sustainability up the remote-learning agenda. In general, sustainability is still primarily limited to the narrow sustainability of each institution's life cycle of educational delivery. On the other hand, laboratory exercises are a crucial addition to many subjects of the curriculum. Virtual and remote laboratories (VRLs) have become highly popular in assisting the learning process, in both formal and informal training, thanks to advancements in Information and Communication Technologies (ICT). Other studies report that there have been numerous attempts to create remote and virtual laboratories, and most have been effective in doing so; however, most fail to maintain the laboratory after its initial development period and to make it viable. They conclude that sustainability and expandability may place more demands on user abilities and requirements [40,41,42,43]. Crucial components in extending such a system are the use of ICT by teachers and students and the content offered by the VRL. VRLs will be popular in the future and will predominate in laboratory experiments in formal education, from secondary schools to postgraduate degree programs, since the new generations are accustomed to using ICT from very early on.
A couple of studies have systematically reviewed recent literature on teacher educators' perspectives and attitudes concerning ESD and its use, discovering that one cannot assume that teacher educators thoroughly understand ESD; those who do not concentrate on sustainability studies may have only a broad grasp. Implementation in courses is crucial to promoting ESD but remains insufficient. To encourage the adoption of ESD, further training is required, preferably subject-specific training, as well as institutional support (such as a goal statement for teaching) [43,44].
The findings of specific research suggest that higher education confronts a need to establish sustainability in distance education. University leadership regarding infrastructure and faculty support mechanisms were identified as crucial success factors. For continued success, professors should constantly question accepted teaching theories and modify their pedagogies to incorporate new technologies [44]. Another study analyzed how digital pedagogical models of three kinds (collaborative, social, and independent) influence the learning experience. The results showed that social and collaborative models foster effective learning and are crucial for sustainable education; they improve relationships between students, a sense of belonging to a group with similar interests, and feelings of togetherness.
In contrast to a more interactive model, an autonomous model may hinder students' perceptions of their present knowledge and the development of collaboration skills [44]. Another study investigated ways to advance the development of competencies aimed at the Sustainable Development Goals (SDGs) in business administration education. That study is based on the proposal presented by UNESCO (2017), "Education Goals for Sustainable Development: Learning Goals", and structured eight competencies and the specific learning goals for each SDG (see Table 1) [32]. According to the same study, the document created by UNESCO (2017) does not advance the operationalization, development, or evaluation of competencies or goals, despite being based on prior acknowledged studies [45,46]. Despite the abundance of studies on the topic, there is still no proof of how these skills are acquired by students, how they might help the 2030 Agenda, or how they relate to the SDGs [40,45]. With rare exceptions, most of the suggestions are generic and aim to define the levels of competencies for undergraduate and graduate programs [45]. The assumption that identifying and describing competencies for sustainable development is straightforward has already been challenged and highlighted as a barrier to further progress in the sector [45].
According to UNESCO (2017) [32], the skills mentioned earlier are necessary to help people solve current socio-environmental issues and to provide them with a deeper understanding of the 2030 Agenda for Sustainable Development. Although the timeframe for attaining the 2030 Agenda targets is less than ten years away, there are obstacles to the insertion, development, and evaluation of these capabilities [40,45]. There is also a lack of assessment tools measuring how such initiatives affect people's education. Another study focuses on the resistance of many educational institutions to developing effective strategies for sustainable development [45]. Some studies clarify that the way various pedagogical approaches can be used to support the articulation of competencies for sustainable development still needs to be addressed [45].

1.6. Designing Remote Labs upon Sustainability Principles

Some critical studies refer to remote labs' design based on sustainability principles [47,48,49,50]. One study stresses the need to design remote labs that foster collaboration [47]. Another study underlines that a sustainable model for remote labs could achieve scalability, interoperability, and federation of remote experiments [48]. Another critical study focuses on the sustainability of system architecture and teaching outcomes [49]. Finally, a specific study points out that a sustainable approach for remote engineering labs could be based on well-designed experiments that favor a "learning from failure" process and could promote creativity [50]. In parallel, the studies mentioned before emphasize the need for efficient management of resources.

1.7. Our Research Objective

Some studies refer to remote lab design based on sustainability principles [47,48,49,50]. However, remote lab effectiveness has yet to be associated with student performance. Moreover, none of the studies we located in the relevant literature address students at risk in remote labs. Hence, there is space for scientific output in this field. This paper focuses on an NI-ELVIS remote lab designed upon sustainability principles to help students develop ESD competencies, such as critical thinking, problem-solving, and collaborative skills. The paper also demonstrates a prediction model based on students' engagement that could be used to identify non-achievers (students who are about to fail) in remote labs to ensure their sustainability.
Given that students’ engagement appears to affect students’ performance, our first research question is [19,20,21,22,23,24,25]:
Which students’ engagement data critically affected their performance in our remote lab?
In parallel, given that our remote lab was designed upon ESD principles, the second research question is:
Were the risk factors associated with ESD competencies?
Therefore, our research is directed toward identifying factors that critically affected students’ performance in our remote lab and examining whether these factors are associated with ESD competencies. In parallel, our research attempts to contribute to the control of the risk of students’ failure in remote labs by presenting a way to generate a prediction model for students at risk.

2. Materials and Methods

2.1. Procedure

Our research objective was to identify non-achievers (students about to fail the course). Therefore, we developed a framework based on the stages of a generic risk management framework [51]. Our method included the following stages:
  • Collecting all students' engagement data elicited from the NI-ELVIS LMS.
  • Analyzing the respective data with binary logistic regression to develop a risk model for students at risk (non-achievers).
  • Using discriminant function analysis, based on the binary logistic regression outcome, to predict students at risk (non-achievers).
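The first stage, assembling the LMS engagement export into a feature matrix and an at-risk target variable, might be sketched as follows in Python; the column names and values are illustrative assumptions, not the NI-ELVIS LMS's actual export fields.

```python
import pandas as pd

# Hypothetical excerpt of an NI-ELVIS LMS engagement report (three students)
lms_export = pd.DataFrame({
    "student_id": [101, 102, 103],
    "theory_activities_completed": [8, 3, 10],
    "exercise_avg_grade": [7.5, 4.0, 9.0],
    "logins": [25, 6, 31],
    "final_result": ["pass", "fail", "pass"],
})

# Independent variables: behavioral-engagement measures
X = lms_export[["theory_activities_completed", "exercise_avg_grade", "logins"]]
# Dependent variable rlstrisk: 1 = student at risk (non-achiever), 0 = not at risk
y = (lms_export["final_result"] == "fail").astype(int)
```

The resulting X and y arrays feed the binary logistic regression in the second stage.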

2.2. Our Remote Lab

Our remote lab was designed as the digital twin of the conventional Data Acquisition lab, which was designed for the course "Data Acquisition Systems", delivered at the Industrial Informatics department of the University of West Attica. The objectives of the conventional lab were to familiarize students with data-acquisition systems, to help students perform experiments and obtain measurements, and to upskill students to address real data-acquisition problems. The conventional lab was designed in line with the theoretical framework of the course. The remote lab was initially designed to compensate for the shutting down of the conventional lab's operation during the pandemic. Ultimately, however, it was established as an essential part of the "Data Acquisition Systems" course. The remote lab course was delivered in parallel with the respective theoretical course, and students enrolled in the "Data Acquisition Systems" course also participated in it. In our case, 300 students participated in the remote lab course.
Considering ESD principles, our NI-ELVIS remote lab included theoretical material, theoretical exercises, and a large-scale collaborative LabVIEW exercise to help students develop necessary ESD competencies such as critical thinking, problem-solving, and collaboration. It is essential to clarify that our remote lab was divided into specific modules, each of which included theoretical material and theoretical exercises. The theoretical material contributed to students' comprehension of fundamental problems and to critical thinking. It included digital resources such as videos, PowerPoint presentations, and PDF files and revolved around critical data-acquisition-system concepts such as sampling, sensors, and signals to cover each module's needs. The theoretical material incorporated examples drawn from real data-acquisition problems, aiming to enable students to complete the theoretical exercises. The theoretical exercises were designed to improve students' comprehension of fundamental problems and to help students develop critical thinking; multiple-choice and true/false questions were used for this purpose. The questions could be answered only when students had absorbed rudimentary knowledge of the theoretical material. The theoretical exercises demanded higher-order cognitive skills, which the material aspired to help students develop; students could then master these skills by completing the respective exercises. Real-time experiments aspired to familiarize students with real cases by obtaining measurements in real time. A collaborative large-scale practical exercise that could be solved using LabVIEW aspired to help students develop problem-solving competencies and collaborative skills.
The individual parts constituted well-designed activities that were mounted on a specific LMS included in the NI-ELVIS architecture. Experiments were performed through a specific NI-ELVIS platform with the use of which students had the opportunity to obtain measurements in real time.
It is essential to underline the potential of our NI-ELVIS LMS. Our LMS provides an analytic report showing the activities completed by students, the grades on their activities, and the time allotted to each activity's completion. Therefore, much meaningful data on students' engagement can be elicited from the respective LMS report.

2.3. Applying Our Method

A meaningful data set reflecting students' engagement was elicited from the NI-ELVIS LMS. This data set is listed in Table 2. The data set was collected after completion of the remote lab course (first course run).
It is essential to clarify that the data were collected from the engagement analytics of our NI-ELVIS LMS. The LMS provided statistics regarding activities and resources. Therefore, the only data that could be elicited from the respective LMS were those presented in the previous table (Table 2). In this sense, the data reflected the students’ interaction with the learning activities designed for the specific course. It is also essential to point out that a similar data set was used in some studies to reflect students’ behavioral engagement [25,26,27,28,29,30,31].
Appropriate variables were defined, reflecting the data set. These variables, together with a binary variable termed rlstrisk (0 for students not at risk and 1 for students at risk), were employed in a binary logistic regression analysis to develop the risk model and identify the individual risk factors [25,26,27,28,29,30,31].
As indicated in some studies, binary logistic regression serves three cardinal purposes [25,26,27,28,29,30,31]:
  • To develop a risk model in the context of which risk factors could be identified.
  • To provide a classification table for at-risk students and those not at risk.
  • To constitute the basis on which a prediction model could be generated.
The variables reflecting the students’ engagement data set were the independent variables, while the binary variable reflecting students at risk (rlstrisk) was the dependent variable in our binary logistic regression scheme. The independent variables reflecting the data presented in Table 2 constituted candidate risk factors; a risk model was therefore needed to decide which of them could be viewed as objective risk factors. In the context of a binary logistic regression approach, the independent variables that prove to be statistically significant (according to the risk model) constitute the real factors leading to the risk occurrence. In addition, the risk model estimates the contribution of each factor to the risk occurrence, so the binary logistic regression model can be used to prioritize the risk factors with respect to their contribution.
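To make the scheme concrete, the following sketch fits a binary logit model with a plain Newton–Raphson (IRLS) iteration in NumPy. The variable names mirror the paper’s variables (content comprehension, exercise familiarization, rlstrisk), but the data and coefficients are entirely hypothetical, simulated for illustration; this is not the study’s SPSS procedure or its data.

```python
import math

import numpy as np

rng = np.random.default_rng(0)
n = 300

# Hypothetical engagement counts (names mirror the paper's variables).
content = rng.integers(0, 10, n).astype(float)    # resources viewed
exercises = rng.integers(0, 8, n).astype(float)   # exercises completed

# Hypothetical at-risk indicator: low engagement raises failure risk.
true_logit = 4.0 - 0.8 * content - 0.9 * exercises
rlstrisk = rng.binomial(1, 1.0 / (1.0 + np.exp(-true_logit)))

# Newton-Raphson (IRLS) fit of the binary logit model.
X = np.column_stack([np.ones(n), content, exercises])
beta = np.zeros(3)
for _ in range(25):
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    H = X.T @ (X * (p * (1.0 - p))[:, None])   # observed information
    beta += np.linalg.solve(H, X.T @ (rlstrisk - p))

# Wald z-tests: a coefficient with p < 0.05 marks a real risk factor.
se = np.sqrt(np.diag(np.linalg.inv(H)))
pvals = [math.erfc(abs(b / s) / math.sqrt(2.0)) for b, s in zip(beta, se)]
print("B:", beta.round(3))
print("p:", [round(v, 4) for v in pvals])
```

With simulated data of this kind, both engagement predictors come out negative and statistically significant, which is exactly the pattern the risk model looks for when promoting candidate factors to real risk factors.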
The classification table is another vital part of a typical binary logistic regression outcome, as noted in some studies [25,26,27,28,29,30,31]. The classification table could act as a prediction model in some cases, given that it classifies students into at-risk and not-at-risk groups. However, following the process of one relevant study, the risk factors derived from the risk model could be fed into a discriminant function analysis to generate a prediction model for students at risk [27]. The same study proved that the binary logistic regression classification outcome is very close to that of the discriminant function analysis (the prediction model). A typical discriminant function analysis provides two linear functions, one for students at risk and one for students not at risk, which reflect its classification potential. At any time, the score of each function can be calculated, and the function with the maximum score defines the group into which a student is classified. For instance, if the score of the function for students at risk is greater than the score of the function for students not at risk, the prediction model classifies the student into the at-risk group; otherwise, the student is classified into the not-at-risk group.
In our case, the binary logistic regression outcome was a potent risk model in the context of which risk factors derived from the students’ engagement data set were deemed statistically significant. In parallel, a classification table was created indicating the classification potential of the risk model.
A discriminant function analysis was carried out on the risk factors to generate a prediction model for at-risk students. Its outcome was two linear functions (one for students at risk and one for students not at risk). The score of each function was calculated two weeks before the completion of the course, and the function with the maximum score determined each student’s classification group. In parallel, a classification table was created to indicate the classification potential of the prediction model. The statistical packages SPSS v26 and JASP were used to perform the requisite analysis; the classification tables and the figures (graphs) included in this paper were produced with the JASP software.

3. Results

3.1. Risk Model Characteristics

Table 3 presents some essential risk model characteristics.
Our model accounts for 90.5% of the liable risk factors (Nagelkerke R2 = 0.905), implying that only approximately 9.5% of the liable risk factors remain unidentified (see Table 3). Nagelkerke R2 ranges between 0 and 1, with the value 1 denoting a perfect model fit [52,53]; since our value is close to 1, the model is reasonable. It is worth clarifying that Nagelkerke R2 is a correction of the Cox and Snell R2. Table 4 indicates the classification potential of our model.
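For reference, the Nagelkerke correction divides the Cox and Snell R2 by its maximum attainable value, both computed from the log-likelihoods of the null and fitted models. The sketch below uses hypothetical log-likelihood values chosen purely for illustration; it does not reproduce the study’s statistics.

```python
import math

def cox_snell_r2(ll_null, ll_model, n):
    """Cox & Snell pseudo-R^2: 1 - (L0 / L1)^(2/n)."""
    return 1.0 - math.exp(2.0 * (ll_null - ll_model) / n)

def nagelkerke_r2(ll_null, ll_model, n):
    """Nagelkerke's correction: divide by the maximum attainable value,
    1 - L0^(2/n), so the index can reach 1 for a perfect fit."""
    max_r2 = 1.0 - math.exp(2.0 * ll_null / n)
    return cox_snell_r2(ll_null, ll_model, n) / max_r2

# Hypothetical log-likelihoods for illustration only.
ll_null, ll_model, n = -120.0, -35.0, 300
print(round(cox_snell_r2(ll_null, ll_model, n), 3))
print(round(nagelkerke_r2(ll_null, ll_model, n), 3))
```

Because the Cox and Snell index cannot reach 1 even for a perfect model, the Nagelkerke value is always the larger of the two, which is why Table 3 reports 0.905 alongside a Cox and Snell value of 0.600.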
According to Table 4, our model correctly classifies 99% of the cases; only 1% of the cases are misclassified. Specifically, a small number of at-risk students are incorrectly classified as not at risk, whereas no not-at-risk student is misclassified. Thus, our model classifies the great majority of risk cases correctly.
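The percentages in Table 4 follow directly from the classification counts. As a quick arithmetic check using the counts reported in Table 4:

```python
# Confusion counts from Table 4 (rows: observed; columns: predicted).
tn, fp = 230, 0   # observed not at risk: 230 correct, 0 misclassified
fn, tp = 3, 67    # observed at risk: 3 misclassified, 67 correct

overall = (tn + tp) / (tn + fp + fn + tp)   # 297 / 300
sensitivity = tp / (tp + fn)                # at-risk students correctly flagged

print(round(100 * overall, 1))      # overall correct-classification %
print(round(100 * sensitivity, 1))  # per-class % for the at-risk group
```

This reproduces the 99.0% overall and 95.7% at-risk percentages reported in Table 4.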

3.2. Risk Factors

The data set presented in Table 2 constitutes the candidate risk factors, whereas Table 5 shows which of the candidate risk factors are real risk factors. The risk factors are determined by the significance value (p): a statistically significant factor (p < 0.05) is deemed a real risk factor contributing to the risk occurrence.
According to Table 5, a real risk factor is content comprehension, which reflects the total number of theoretical material resources viewed. Moreover, another risk factor is the theory-exercise familiarization, reflecting the total number of submitted exercises.
Column B in Table 5 denotes that a unit increase in content comprehension leads to a 2.759 unit decrease in the log-odds of the risk occurrence. In a parallel manner, a unit increase in theory familiarization through exercises leads to a 3.224 unit decrease in the log-odds of the risk occurrence. Theory familiarization through exercises is therefore the most influential risk factor.
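Since the B coefficients are on the log-odds scale, exponentiating them yields odds ratios, which are often easier to read. The illustrative snippet below converts the two significant coefficients from Table 5; the interpretation (each additional resource viewed or exercise completed multiplies the odds of being at risk by exp(B)) is standard logistic regression practice, not an additional claim of the study.

```python
import math

# B coefficients of the two significant risk factors (Table 5, log-odds scale).
b_content = -2.759     # content comprehension
b_exercises = -3.224   # theory familiarization through exercises

# exp(B) is the multiplicative change in the odds of being at risk
# per additional resource viewed / exercise completed.
or_content = math.exp(b_content)
or_exercises = math.exp(b_exercises)
print(round(or_content, 3), round(or_exercises, 3))
```

Both odds ratios are far below 1, i.e., each unit of engagement sharply shrinks the odds of the risk occurrence, with the exercise factor shrinking them the most.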
The estimates for the risk factors are illustrated in the following Figure 3 and Figure 4.
Figure 3 depicts how the probability of the risk occurrence changes with content comprehension. The graph shows that the probability of the risk occurrence decreases as the extent of content comprehension increases.
In parallel, Figure 4 depicts how the probability of the risk occurrence changes with exercise-theory familiarization. The graph shows that the probability of the risk occurrence decreases as the extent of familiarization through theoretical exercises increases.

3.3. Prediction Model

The prediction model’s Canonical Correlation is 0.735, as indicated in Table 6. A Canonical Correlation value close to 1 denotes a solid relationship between the dependent and the independent variables, indicating the competency of our prediction model [54].
Table 7 presents the classification function coefficients (Fisher’s linear discriminant functions).
According to Table 7, our prediction model is based on two classification functions:
Classification function for risk occurrence (below the threshold)
y1 = 1.788 × Content_Comprehension + 3.654 × Exercise_Theory Familiarization − 17.198
Classification function for no risk occurrence (over or equal to the threshold)
y2 = 3.325 × Content_Comprehension + 4.592 × Exercise_Theory Familiarization − 33.560
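For illustration, the two classification functions can be applied directly to a student’s engagement counts: whichever function scores higher determines the predicted group. The sketch below hard-codes the coefficients from Table 7; the two example students are hypothetical.

```python
def classify(content_comprehension, exercise_theory_familiarization):
    """Assign a student to the group whose classification function scores higher.

    Coefficients taken from Table 7 (Fisher's linear discriminant functions).
    """
    # y1: function for the risk-occurrence group (risk = 1)
    y1 = (1.788 * content_comprehension
          + 3.654 * exercise_theory_familiarization - 17.198)
    # y2: function for the no-risk-occurrence group (risk = 0)
    y2 = (3.325 * content_comprehension
          + 4.592 * exercise_theory_familiarization - 33.560)
    return "at risk" if y1 > y2 else "not at risk"

# Hypothetical students: one with low engagement, one with high engagement.
print(classify(1, 1))   # -> at risk
print(classify(8, 6))   # -> not at risk
```

With the published coefficients, low engagement counts land on the at-risk side of the decision boundary and high counts on the not-at-risk side, matching the direction of the risk factors identified in Table 5.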
Figure 5 and Figure 6 illustrate the operation of the respective classification functions:
Figure 5 shows how students are classified into the not-at-risk group according to the discriminant scores (represented by the respective mean values in standardized format). The y-axis indicates the number of students classified into this group for each value of the first classification function score (x-axis). Summing the columns across the score values yields the total number of students classified as not at risk.
Figure 6 shows how students are classified into the at-risk group according to the discriminant scores (represented by the respective mean values in standardized format). The y-axis indicates the number of students classified into this group for each value of the second classification function score (x-axis). Summing the columns across the score values yields the total number of students classified as at risk [55,56].
The same information is provided in Table 8, including the correct classification percentage of our prediction model:
Table 8 points out that our model classifies 98.7% of cases correctly, and therefore our prediction model is deemed to be robust. In this sense, only a small number of students are incorrectly classified.

4. Discussion and Conclusions

Table 3 shows that our risk model could be deemed a good model, given that it accounts for a significant percentage of the liable risk factors. Table 4 shows that the risk model achieves a high correct classification percentage. In terms of our prediction model, Table 6 indicates a strong relationship between the dependent variable (rlstrisk) and the independent variables (the real risk factors), given that the Canonical Correlation value of 0.735 is close to 1, indicating the competency of the prediction model. Table 8 indicates that the prediction model works well in 98.7% of cases, which is another metric of its robustness.
Regarding the first research question, Table 5 indicated that the total number of theoretical material resources viewed (content comprehension) and the total number of theoretical exercises completed (theory familiarization through exercises) were the engagement variables constituting the risk factors that critically affected students’ performance. Column B in Table 5 shows that theory familiarization through exercises was the cardinal risk factor, given that it led to the larger reduction in the log-odds of the risk occurrence. Therefore, in this remote lab, students’ familiarization with theory was the main factor affecting their performance. It is also essential to underline that familiarization with theory through exercises and comprehension of concepts are among the recognized pedagogical benefits of remote labs [18,19,20,21,22].
However, it is vital to clarify that risk factors could vary among courses [25,26,27,28,29,30,31]. We cannot insinuate that these individual risk factors affect the success of any remote lab course, only that they hold for the course under study; more remote lab courses would be needed to conclude that they hold in general. The risk factors also indicate areas of the course design that call for amendment: making the content more comprehensible would promote a successful outcome, and improving the theoretical exercises would further strengthen the remote lab course.
The risk factors led to a prediction model that helps predict students’ performance during the delivery of the remote lab course. Although our model depends heavily on the identified risk factors, it works well for this specific course, in line with studies clarifying that prediction models are course oriented. Thus, if a new risk factor emerges, the risk model will change, and the underlying prediction model will change with it [25,26,27,28,29,30,31].
A couple of studies indicate that although the risk factors in two course runs were slightly different, the prediction model’s potential in each run was only insignificantly altered; the two prediction models achieved a similar correct classification percentage (approximately 90%) [26,28]. Hence, the prediction model’s classification potential seemed unaffected by the different risk factors in each course run. Nevertheless, more studies would be needed to compare the classification potential of a prediction model across variability in risk factors. Still, the aforementioned studies suggest that a single prediction model could work well across many course runs.
Additionally, although time spent on activities is typically a metric of student effort, this factor did not appear to affect students’ critical performance in our remote lab, a finding in line with previous studies [26,27,28,29,30,31].
Regarding the second research question, the theoretical material and exercises (aimed at helping students develop critical thinking) appeared to be strong predictors of students’ outcomes in our study. Given that critical thinking is one of the ESD competencies, the risk factors thus appeared to be associated with an ESD competence. However, the experiments and the large-scale LabView exercise did not affect students’ critical performance in our remote lab. Thereby, problem-solving, another cardinal ESD competence, did not appear to be associated with the risk factors, given that the large-scale LabView exercise designed to develop problem-solving skills did not play a significant role in students’ critical achievement. In parallel, the extent of students’ collaboration in the LabView exercise was not examined. Therefore, the risk factors did not appear to relate to all ESD competencies, which is reasonable given that risk factors are course oriented. Nevertheless, we cannot rule out the possibility that new risk factors, associated with other ESD competencies, could emerge in a subsequent course run.
The paper indicates that an NI-ELVIS remote lab can be designed in light of ESD principles to help students develop ESD competencies such as critical thinking, problem-solving, and collaboration. The paper also presented a concrete framework for generating a competent prediction model for students at risk. In our study, the theoretical material and the theoretical exercises assumed a cardinal role in helping students develop critical thinking (a necessary ESD competence). However, this finding cannot be easily generalized to any remote lab course designed on ESD principles, given that risk factors are course oriented. Additionally, although problem-solving and collaboration (necessary ESD competencies) did not appear to play a significant role in the case of students at risk, we cannot rule out the possibility that factors related to these competencies positively affected the performance of the achievers. In this sense, our research could be expanded to identify factors affecting the achievers’ performance so as to fully determine the success of the remote lab in light of ESD competencies.
Our team is currently working on determining the remedial course of action for non-achievers (students predicted to fail the course) to ensure the remote lab’s sustainability. In addition, the extent of students’ collaboration in the LabView exercise could be examined to investigate why the large-scale exercise did not appear to strongly predict students’ critical performance. Furthermore, early identification of students who are at risk of failing serves a twofold purpose: it contributes to the amelioration of the learning process and leads to resource savings. Finally, further qualitative research could examine the role that the theoretical material and the theoretical exercises assume in students’ critical thinking development.

Author Contributions

D.P., I.G. and S.N. gathered the data, carried out the implementation, and performed the calculations and computer programming. I.G., V.Z. and P.M. gathered and implemented the theoretical background of the paper and, with input from the experimental development, reviewed and discussed the study’s results. S.N. and D.P. performed an overall grammar check and smoothed the narrative throughout the text. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author. The data are not publicly available because they are stored in LMS repositories and cannot be openly accessed without the system administrator’s consent.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Williamson, B.; Eynon, R.; Potter, J. Pandemic politics, pedagogies and practices: Digital technologies and distance education during the coronavirus emergency. Learn. Media Technol. 2020, 45, 107–114. [Google Scholar] [CrossRef]
  2. Anderson, J. The Coronavirus Pandemic is Reshaping Education. Quartz Daily Brief. 2020. Available online: https://qz.com/1826369/how-coronavirus-is-changing-education/ (accessed on 1 January 2022).
  3. Zimmerman, J. Coronavirus and the great online-learning experiment. Chron. High. Educ. 2020, 10. [Google Scholar]
  4. Gerhátová, Ž. Experiments on the Internet-removing barriers facing students with special needs. Procedia-Soc. Behav. Sci. 2014, 114, 360–364. [Google Scholar] [CrossRef] [Green Version]
  5. Schauer, F.; Ožvoldová, M.; Lustig, F. Integrated e-Learning—New Strategy of Cognition of Real Word in Teaching Physics. In INNOVATIONS 2009: World Innovations in Engineering Education and Research; iNEER: Arlington, VA, USA, 2009; ISBN 978-0-9741252-9-9. [Google Scholar]
  6. Gerhátová, Ž; Perichta, P.; Drienovský, M.; Palcut, M. Temperature Measurement—Inquiry-Based Learning Activities for Third Graders. Educ. Sci. 2021, 11, 506. [Google Scholar] [CrossRef]
  7. Grieves, M. Digital twin: Manufacturing excellence through virtual factory replication. White Pap. 2014, 1, 1–7. [Google Scholar]
  8. Chen, X.; Song, G.; Zhang, Y. Virtual and remote laboratory development: A review. In Earth and Space 2010: Engineering, Science, Construction, and Operations in Challenging Environments; ASCE: Reston, VA, USA, 2010; pp. 3843–3852. [Google Scholar]
  9. Song, G.; Olmi, C.; Bannerot, R. Enhancing vibration and controls teaching with remote laboratory experiments. In Proceedings of the 2007 Annual Conference & Exposition, Honolulu, HI, USA, 24–27 June 2007; pp. 12–677. [Google Scholar]
  10. Chang, T.; Jaroonsiriphan, P.; Sun, X. Integrating Nanotechnology into Undergraduate Experience: A Web-based Approach. Int. J. Eng. Educ. 2002, 18, 557–565. [Google Scholar]
  11. Chen, X.; Jiang, L.; Shahryar, D.; Kehinde, L.; Olowokere, D. Technologies for the Development Of Virtual And Remote Laboratories: A Case Study. In Proceedings of the 2009 Annual Conference & Exposition, Austin, TX, USA, 14–17 June 2009; pp. 14–1168. [Google Scholar]
  12. Heradio, R.; De La Torre, L.; Galan, D.; Cabrerizo, F.J.; Herrera-Viedma, E.; Dormido, S. Virtual and remote labs in education: A bibliometric analysis. Comput. Educ. 2016, 98, 14–38. [Google Scholar] [CrossRef]
  13. Mokhtar, A.; Mikhail, G.; Joo, C. A survey on Remote Laboratories for E-Learning and Remote Experimentation. Contemp. Eng. Sci. 2014, 7, 1617–1624. [Google Scholar] [CrossRef] [Green Version]
  14. Riman, C. A Remote Robotic Laboratory Experiment Platform with Error Management. Int. J. Comput. Sci. Issues 2011, 8, 242. [Google Scholar]
  15. Faulconer, E.K.; Gruss, A.B. A review to weigh the pros and cons of online, remote, and distance science laboratory experiences. Int. Rev. Res. Open Distrib. Learn. 2018, 19, 3386. [Google Scholar] [CrossRef]
  16. Yoo, S.; Kim, S.; Choudhary, A.; Roy, O.P.; Tuithung, T. Two-phase malicious web page detection scheme using misuse and anomaly detection. Int. J. Reliab. Inf. Assur. 2014, 2, 1–9. [Google Scholar] [CrossRef]
  17. NI-Elvis Product Flyer. Available online: https://www.ni.com/pdf/product-flyers/ni-elvis.pdf (accessed on 10 December 2022).
  18. Singh, S.; Singh, S.; Kumat, N. A Comparative Study on Different Applications of National Instrument ELVIS LabVIEW in automation. In Proceedings of the ISTE Trends & Innovations in Management Engineering, Education & Sciences (TIMEES), Ropar, India, 4–5 November 2016. [Google Scholar]
  19. Ertugrul, N. Towards virtual laboratories: A survey of LabVIEW-based teaching/learning tools and future trends. Int. J. Eng. Educ. 2000, 16, 171–180. [Google Scholar]
  20. Zacharia, Z.C. Comparing and combining real and virtual experimentation: An effort to enhance students’ conceptual understanding of electric circuits. J. Comput. Assist. Learn. 2007, 23, 120–132. [Google Scholar] [CrossRef]
  21. Feisel, L.D.; Rosa, A.J. The role of the laboratory in undergraduate engineering education. J. Eng. Educ. 2005, 94, 121–130. [Google Scholar] [CrossRef]
  22. Nickerson, J.V.; Corter, J.E.; Esche, S.K.; Chassapis, C. A model for evaluating the effectiveness of remote engineering laboratories and simulations in education. Comput. Educ. 2007, 49, 708–725. [Google Scholar] [CrossRef]
  23. Lowe, D.; Yeung, H.; Tawfik, M.; Sancristobal, E.; Castro, M.; Orduña, P.; Richter, T. Interoperating remote laboratory management systems (RLMSs) for more efficient sharing of laboratory resources. Comput. Stand. Interfaces 2016, 43, 21–29. [Google Scholar] [CrossRef]
  24. Toderick, L.; Mohammed, T.; Tabrizi, M.H. A reservation and equipment management system for secure hands-on remote labs for information technology students. In Proceedings of the Frontiers in Education 35th Annual Conference, Indianapolis, IN, USA, 19–22 October 2005; p. S3F. [Google Scholar]
  25. Macfadyen, L.P.; Dawson, S. Mining LMS data to develop an “early warning system” for educators: A proof of concept. Comput. Educ. 2010, 54, 588–599. [Google Scholar] [CrossRef]
  26. Georgakopoulos, I.; Kytagias, C.; Psaromiligkos, Y.; Voudouri, A. Identifying risks factors of students’ failure in e-learning systems: Towards a warning system. Int. J. Decis. Support Syst. 2018, 3, 190–206. [Google Scholar] [CrossRef]
  27. Georgakopoulos, I.; Chalikias, M.; Zakopoulos, V.; Kossieri, E. Identifying factors of students’ failure in blended courses by analyzing students’ engagement data. Educ. Sci. 2020, 10, 242. [Google Scholar] [CrossRef]
  28. Tsakirtzis, S.; Georgakopoulos, I. Developing a Risk Model to identify factors that critically affect Secondary School students’ performance in Mathematics. J. Math. Educ. Teach. Pract. 2020, 1, 63–72. [Google Scholar]
  29. Lytras, D.M.; Serban, C.A.; Torres-Ruiz, M.J.; Ntanos, S.; Sarirete, A. Translating knowledge into innovation capability: An exploratory study investigating the perceptions on distance learning in higher education during the COVID-19 pandemic—The case of Mexico. J. Innov. Knowl. 2022, 7, 100258. [Google Scholar] [CrossRef]
  30. Anagnostopoulos, T.; Kytagias, C.; Xanthopoulos, T.; Georgakopoulos, I.; Salmon, I.; Psaromiligkos, Y. Intelligent predictive analytics for identifying students at risk of failure in Moodle courses. In Intelligent Tutoring Systems, Proceedings of the 16th International Conference, ITS 2020, Athens, Greece, 8–12 June 2020; Springer: Cham, Switzerland, 2020; pp. 152–162. [Google Scholar]
  31. Georgakopoulos, I.; Piromalis, D.; Makrygiannis, P.S.; Zakopoulos, V.; Drosos, C. A Robust Risk Model to Identify Factors that Affect Students’ Critical Achievement in Remote Lab Courses. Int. J. Econ. Bus. Adm. 2022, 10, 3–22. [Google Scholar] [CrossRef]
  32. UNESCO. Education for Sustainable Development Goals and Learning Objectives. 2017. Available online: https://millenniumedu.org/wp-content/uploads/2017/08/en-unescolearningobjectives (accessed on 28 January 2023).
  33. McClanahan, L.G. Essential Elements of Sustainability Education. J. Sustain. Educ. 2014, 6, 3582. [Google Scholar]
  34. UNESCO. Global Action Programme; UNESCO: Paris, France, 2013. [Google Scholar]
  35. Tetiana, H.; Malolitneva, V. Conceptual and legal framework for promotion of education for sustainable development: Case study for Ukraine. Eur. J. Sustain. Dev. 2020, 9, 42–54. [Google Scholar] [CrossRef]
  36. Ourbak, T.; Magnan, A.K. The Paris Agreement and climate change negotiations: Small Islands, big players. Reg. Environ. Chang. 2018, 18, 2201–2207. [Google Scholar] [CrossRef] [Green Version]
  37. Burns, W. Loss and damage and the 21st Conference of the Parties to the United Nations Framework Convention on Climate Change. ILSA J. Int’l Comp. L. 2015, 22, 415. [Google Scholar] [CrossRef] [Green Version]
  38. Zangerolame Taroco, L.S.; Sabbá Colares, A.C. The UN Framework Convention on Climate Change and the Paris Agreement: Challenges of the Conference of the Parties. Prolegómenos 2019, 22, 125–135. [Google Scholar] [CrossRef] [Green Version]
  39. Bell, S.; Douce, C.; Caeiro, S.; Teixeira, A.; Martín-Aranda, R.; Otto, D. Sustainability and distance learning: A diverse European experience? Open Learn. J. Open Distance E-Learn. 2017, 32, 95–102. [Google Scholar] [CrossRef] [Green Version]
  40. Arora, N.K.; Mishra, I. United Nations Sustainable Development Goals 2030 and environmental sustainability: Race against time. Environ. Sustain. 2019, 2, 339–342. [Google Scholar] [CrossRef] [Green Version]
  41. Kara, A.; Ozbeka, M.E.; Cagiltaya, N.E.; Aydin, E. Maintenance, sustainability, and extendibility in virtual and remote Laboratories. Procedia Soc. Behav. Sci. 2011, 28, 722–728. [Google Scholar] [CrossRef] [Green Version]
  42. Gkika, E.C.; Anagnostopoulos, T.; Ntanos, S.; Kyriakopoulos, G.L. User Preferences on Cloud Computing and Open Innovation: A Case Study for University Employees in Greece. J. Open Innov. Technol. Mark. Complex. 2020, 6, 41. [Google Scholar] [CrossRef]
  43. Goller, A.; Rieckmann, M. What do We Know About Teacher Educators’ Perceptions of Education for Sustainable Development? A Systematic Literature Review. J. Teach. Educ. Sustain. 2022, 24, 19–34. [Google Scholar] [CrossRef]
  44. Kyriakopoulos, G.; Ntanos, S.; Asonitou, S. Investigating the environmental behavior of business and accounting university students. Int. J. Sustain. High. Educ. 2020, 21, 819–839. [Google Scholar] [CrossRef]
  45. Santoveña-Casal, S.; Pérez, M.D.F. Sustainable Distance Education: Comparison of Digital Pedagogical Models. Sustainability 2020, 12, 9067. [Google Scholar] [CrossRef]
  46. Dias, B.G.; Onevetch, R.T.d.S.; dos Santos, J.A.R.; Lopes, G.d.C. Competences for Sustainable Development Goals: The Challenge in Business Administration Education. J. Teach. Educ. Sustain. 2022, 24, 73–86. [Google Scholar] [CrossRef]
  47. Gustavsson, I.; Nilsson, K.; Zackrisson, J.; Garcia-Zubia, J.; Hernandez-Jayo, U.; Nafalski, A.; Nedic, Z.; Gol, O.; Machotka, J.; Pettersson, M.I.; et al. On objectives of instructional laboratories, individual assessment, and use of collaborative remote laboratories. IEEE Trans. Learn. Technol. 2009, 2, 263–274. [Google Scholar] [CrossRef]
  48. Orduña, P.; Garcia-Zubia, J.; Rodriguez-Gil, L.; Angulo, I.; Hernandez-Jayo, U.; Dziabenko, O.; López-de-Ipiña, D. The weblab-deusto remote laboratory management system architecture: Achieving scalability, interoperability, and federation of remote experimentation. In Cyber-Physical Laboratories in Engineering and Science Education; Springer: Cham, Switzerland, 2018; pp. 17–42. [Google Scholar]
  49. Tabunshchyk, G.; Kapliienko, T.; Arras, P. Sustainability of the remote laboratories based on systems with limited resources. In Smart Industry & Smart Education, Proceedings of the 15th International Conference on Remote Engineering and Virtual Instrumentation, Bangalore, India, 3–6 February 2019; Springer: Cham, Switzerland, 2019; pp. 197–206. [Google Scholar]
  50. Felgueiras, C.; Costa, R.; Alves, G.R.; Viegas, C.; Fidalgo, A.; Marques, M.A.; Lima, N.; Castro, M.; García-Zubía, J.; Pester, A.; et al. A sustainable approach to laboratory experimentation. In Proceedings of the Seventh International Conference on Technological Ecosystems for Enhancing Multiculturality, León, Spain, 16–18 October 2019; pp. 500–506. [Google Scholar]
  51. Vose, D. Risk Analysis: A Quantitative Guide; John Wiley & Sons: Hoboken, NJ, USA, 2008. [Google Scholar]
  52. Allison, P.D. Measures of fit for logistic regression. In Proceedings of the SAS Global Forum 2014 Conference, Washington, DC, USA, 23–26 March 2014; SAS Institute Inc.: Cary, NC, USA, 2014; pp. 1–13. [Google Scholar]
  53. Smith, T.J.; McKenna, C.M. A comparison of logistic regression pseudo R2 indices. Mult. Linear Regres. Viewp. 2013, 39, 17–26. [Google Scholar]
  54. Bach, F.R.; Jordan, M.I. A Probabilistic Interpretation of Canonical Correlation Analysis; University of California: Berkeley, CA, USA, 2005. [Google Scholar]
  55. Büyüköztürk, Ş.; Çokluk-Bökeoğlu, Ö. Discriminant function analysis: Concept and application. Eurasian J. Educ. Res. 2008, 33, 73–92. [Google Scholar]
  56. Brown, M.T.; Wicker, L.R. Discriminant analysis. In Handbook of Applied Multivariate Statistics and Mathematical Modeling; Academic Press: Cambridge, MA, USA, 2000; pp. 209–235. [Google Scholar]
Figure 1. NI-ELVIS measurement station (board view) [17].
Figure 2. NI-ELVIS remote lab measurement station (operation view) [17].
Figure 3. P(risk = 1) versus Content_Comprehension.
Figure 4. P(risk = 1) versus Exercise_Theory Familiarization.
Figure 5. Classification function for no risk occurrence (x-axis represents z-values, y-axis represents number of students).
Figure 6. Classification Function for Risk Occurrence (x-axis represents z-values, y-axis represents number of students).
Table 1. Skills for the SDGs.
Competence | Constitutive Definition
Systemic thinking competence | Ability to recognize and understand relationships; analyze complex systems; consider how systems are incorporated within different domains and scales; and deal with uncertainty.
Anticipatory competence | Ability to understand and evaluate various futures possible, probable, and desirable; create your visions for the future; apply the precautionary principle; assess the consequences of actions; and deal with risks and changes.
Normative competence | Ability to understand and reflect on the norms and values that underlie people’s actions; and negotiate sustainability values, principles, objectives, and goals in a context of conflicts of interest and concessions, uncertain knowledge, and contradictions.
Strategic competence | Ability to collectively develop and implement innovative actions that promote sustainability at the local level and in a broader context.
Collaboration competence | Ability to learn from others; understand and respect other people’s needs, perspectives, and actions (empathy); understand, relate, and be sensitive to others (empathic leadership); deal with conflicts in a group; and facilitate collaboration and participation in problem-solving.
Critical thinking competence | Ability to question norms, practices, and opinions; reflect on one’s values, perceptions, and actions; and take a stand in sustainability discourse.
Self-knowledge competence | Ability to reflect on one’s role in the local community and (global) society; continuously evaluate and further motivate one’s actions; and deal with one’s feelings and desires.
Integrated problem-solving competence | Ability to apply different problem-solving frameworks to complex sustainability problems and develop viable, inclusive, and equitable solution options that promote sustainable development by integrating the competencies mentioned above.
Source: UNESCO (2017) [32].
Table 2. Students’ engagement data.
Total number of theoretical material resources viewed
Total number of theoretical exercises completed
Total number of experiments completed
Grades on theoretical exercises
Grades on experiments
Grades on the large-scale exercise (LabView exercise)
Time spent on experiments' completion
Time spent on theoretical exercises' completion
Time spent on the large-scale exercise's completion
Table 3. Risk model characteristics.
Nagelkerke R²: 0.905
Cox and Snell R²: 0.600
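Both statistics in Table 3 are pseudo-R² measures derived from the log-likelihoods of the fitted and null (intercept-only) logistic models, with Nagelkerke rescaling Cox and Snell so that its maximum is 1 (which is why 0.905 exceeds 0.600). A minimal sketch of the two formulas; the log-likelihood values and sample size below are purely illustrative, not figures from the paper:

```python
import math

def cox_snell_r2(ll_null, ll_model, n):
    """Cox & Snell pseudo-R²: 1 - (L_null / L_model)^(2/n),
    computed from log-likelihoods for numerical stability."""
    return 1.0 - math.exp(2.0 * (ll_null - ll_model) / n)

def nagelkerke_r2(ll_null, ll_model, n):
    """Nagelkerke pseudo-R²: Cox & Snell divided by its attainable maximum."""
    cs = cox_snell_r2(ll_null, ll_model, n)
    cs_max = 1.0 - math.exp(2.0 * ll_null / n)
    return cs / cs_max

# Illustrative values only (hypothetical log-likelihoods, n = 300 cases):
cs = cox_snell_r2(ll_null=-150.0, ll_model=-25.0, n=300)
ng = nagelkerke_r2(ll_null=-150.0, ll_model=-25.0, n=300)
```

As in Table 3, the Nagelkerke value is always at least as large as the Cox and Snell value.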
Table 4. Classification table.
Observed              Predicted risk = 0    Predicted risk = 1    Percentage correct
Risk = 0              230                   0                     100.0
Risk = 1              3                     67                    95.7
Overall percentage                                                99.0
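The percentages in Table 4 follow directly from the cell counts: the overall percentage is the share of cases on the diagonal of the confusion matrix. A quick plain-Python check using the counts reported in Table 4:

```python
def accuracy_percent(confusion):
    """Overall classification accuracy (%) from a square confusion matrix,
    where confusion[i][j] counts cases observed in group i, predicted as j."""
    correct = sum(confusion[i][i] for i in range(len(confusion)))
    total = sum(sum(row) for row in confusion)
    return 100.0 * correct / total

# Rows: observed risk 0 and 1; columns: predicted 0 and 1 (Table 4 counts)
table4 = [[230, 0],
          [3, 67]]
overall = accuracy_percent(table4)   # (230 + 67) / 300 cases
row1_correct = 100.0 * 67 / 70       # percentage correct for risk = 1
```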
Table 5. Risk factors.
Coefficient                                              B          p-value
Total number of theoretical material resources viewed    −2.759     0.017
Total number of theoretical exercises completed          −3.224     0.037
Grades on theoretical exercises                          0.813      0.867
Grades on experiments                                    4.149      0.828
Grades on the large-scale exercise (LabView exercise)    −0.938     0.745
Time spent on experiments' completion                    23.094     0.990
Time spent on theoretical exercises' completion          −26.679    0.989
Time spent on the large-scale exercise's completion      25.325     0.998
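The B coefficients in Table 5 are effects on the log-odds of risk: for example, each additional completed theoretical exercise (B = −3.224) multiplies the odds of risk occurrence by exp(−3.224) ≈ 0.04. A hedged sketch of how such a fitted model scores a student; only the two significant predictors from Table 5 are used, and the intercept is a hypothetical placeholder since it is not reported here:

```python
import math

# Coefficients from Table 5 (the two significant predictors);
# the intercept below is hypothetical, not a value from the paper.
INTERCEPT = 2.0          # hypothetical placeholder
B_MATERIALS = -2.759     # theoretical material resources viewed
B_EXERCISES = -3.224     # theoretical exercises completed

def risk_probability(materials_viewed, exercises_completed):
    """Logistic model: P(risk) = 1 / (1 + exp(-z))."""
    z = (INTERCEPT
         + B_MATERIALS * materials_viewed
         + B_EXERCISES * exercises_completed)
    return 1.0 / (1.0 + math.exp(-z))

# Completing one more exercise lowers the predicted risk probability:
p_before = risk_probability(0, 0)
p_after = risk_probability(0, 1)
```

The negative signs on the two significant coefficients mirror the paper's finding that greater content comprehension and theory-exercise familiarization reduce the probability of risk occurrence.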
Table 6. Canonical Correlation.
Eigenvalues
Function    Eigenvalue    % of Variance    Cumulative %    Canonical Correlation
1           1.175 a       100.0            100.0           0.735
a. First canonical discriminant function was used in the analysis.
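The canonical correlation in Table 6 is fully determined by the discriminant function's eigenvalue through r = sqrt(λ / (1 + λ)), so the reported pair of values can be cross-checked directly:

```python
import math

def canonical_correlation(eigenvalue):
    """Canonical correlation of a discriminant function from its eigenvalue:
    r = sqrt(lambda / (1 + lambda))."""
    return math.sqrt(eigenvalue / (1.0 + eigenvalue))

# Eigenvalue reported in Table 6
r = canonical_correlation(1.175)
```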
Table 7. Classification Function Coefficients.
Predictor                           Risk = 0    Risk = 1
Content_Comprehension               3.325       1.788
Exercise_Theory Familiarization     4.592       3.654
(Constant)                          −33.560     −17.198
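The values in Table 7 are Fisher classification function coefficients: each case is assigned to whichever group (risk = 0 or risk = 1) yields the larger linear score. A sketch using the Table 7 coefficients; the two example input values are hypothetical:

```python
# Fisher classification function coefficients from Table 7
FUNCTIONS = {
    0: {"const": -33.560, "content": 3.325, "exercise": 4.592},  # no risk
    1: {"const": -17.198, "content": 1.788, "exercise": 3.654},  # risk
}

def classify(content_comprehension, exercise_theory_familiarization):
    """Assign the group whose classification function scores highest."""
    scores = {
        group: (f["const"]
                + f["content"] * content_comprehension
                + f["exercise"] * exercise_theory_familiarization)
        for group, f in FUNCTIONS.items()
    }
    return max(scores, key=scores.get)

# Hypothetical inputs: strong engagement scores -> no risk; weak -> risk
group_strong = classify(8.0, 8.0)
group_weak = classify(2.0, 2.0)
```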
Table 8. Classification results a.
Observed risk          Predicted 0    Predicted 1    Total
Original count    0    230            0              230
                  1    4              66             70
Original %        0    100.0          0.0            100.0
                  1    5.7            94.3           100.0
a. 98.7% of original grouped cases are correctly classified.

Share and Cite

MDPI and ACS Style

Georgakopoulos, I.; Piromalis, D.; Ntanos, S.; Zakopoulos, V.; Makrygiannis, P. A Prediction Model for Remote Lab Courses Designed upon the Principles of Education for Sustainable Development. Sustainability 2023, 15, 5473. https://doi.org/10.3390/su15065473

Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
