Article

Usability of a Virtual Learning Environment in Down Syndrome Adult Learning

by
María Consuelo Sáiz-Manzanares
1,*,
Cristina Arranz Barcenilla
2,
Sara Gutiérrez-González
3 and
Lourdes Alameda Cuenca-Romero
3
1
Research Group DATAHES, Departamento de Ciencias de la Salud, Facultad de Ciencias de la Salud, Universidad de Burgos, Pº Comendadores s/n, 09001 Burgos, Spain
2
Doctoral Programme in Industrial Technologies and Civil Engineering, Campus Milanera: C/Villadiego s/n, 09001 Burgos, Spain
3
Departamento de Construcciones Arquitectónicas e Ingeniería de la Construcción y del Terreno, Campus Milanera: C/Villadiego s/n, 09001 Burgos, Spain
*
Author to whom correspondence should be addressed.
Sustainability 2023, 15(23), 16404; https://doi.org/10.3390/su152316404
Submission received: 25 October 2023 / Revised: 21 November 2023 / Accepted: 27 November 2023 / Published: 29 November 2023

Abstract

The use of virtual learning environments (VLEs) is becoming increasingly common in teaching. Nevertheless, analysis of how effective these prove to be for the learning of persons with disabilities remains scarce. In this study, we worked with a sample of 34 people aged between 16 and 44 (14 women and 20 men) who have Down Syndrome. The aims of the study were to (1) explore whether there were any significant differences before and after teaching when using a VLE; (2) determine whether the frequency of use and time spent on the VLE impacted learning outcomes; (3) examine clusters vis à vis learning behaviour in the VLE; and (4) gauge perceived user satisfaction with the use of the VLE. Significant differences in learning outcomes before and after teaching using a VLE were found. The frequency and time spent using the VLE were seen to have no impact on learning outcomes. Three clusters were identified in terms of VLE behaviour, and perceived user satisfaction with the VLE was high. There is a need to increase the number of studies addressing the impact of VLEs on learning in persons with different disabilities.

1. Introduction

In recent years, the use of learning management systems (LMS)—or virtual learning environments—has become a part of learning in all stages of both formal as well as non-formal education [1]. This practice has intensified even further as a result of the worldwide health crisis triggered by COVID-19 [2,3]. Specifically, the use of virtual learning environments (VLEs) helps to monitor students during the learning process [3,4]. Amongst other things, this follow-up helps to personalise learning [5], which in turn boosts student motivation [6] and so aids self-regulated learning (SRL) [7,8], in addition to fostering the use of cognitive and metacognitive strategies [9,10]. Nevertheless, if VLEs are to prove effective in the learning process, they need to be well designed from the pedagogical and technological standpoints [11]. The most effective VLEs include feedback for the user, which provides them with information concerning their successes and failures as well as their progress [12,13,14], and interactive material in a multichannel format (including information through visual and auditory stimuli) [15,16]. These resources normally include the figure of an avatar or assistant who accompanies the student during the learning process [17], specifically in gamification-related tasks [18]. Furthermore, the use of a VLE enables the student’s interactions with the activities, with the teacher and with other students to be recorded [19]. In addition, analysis of these records through educational data mining (EDM) techniques allows each student’s learning patterns to be pinpointed [20,21,22,23,24]. Analysing these patterns also provides teachers with information concerning the learning styles of each of their students. This offers teachers the relevant information required to adapt teaching material so as to personalise teaching and achieve the most efficient learning possible for their students [25,26,27].
Moreover, the use of VLEs helps to pinpoint student procrastination from the very start of the learning process, and so aids in correcting said behaviour, thereby preventing academic failure [28,29]. Research into these topics has developed over the last decade and has focused particularly on university students [30,31] and on adult education in non-formal settings [32], with one example being research into massive open online courses (MOOCs) [33]. Yet, research on the education of adults who have special educational needs remains scant [34].

Applying VLEs in Environments for Persons with Special Educational Needs

In a recent study, Hurwitz et al. [35] reported that designing materials in virtual learning environments is key to achieving personalised learning. These authors also pointed to the importance which the use of these learning environments and digital resources had for users—in this instance, those with autism spectrum disorder—and for their families [36]. The authors also highlighted two aspects: adapting material to the characteristics of each particular problem, and the need to train both teachers as well as users and their families in the use of virtual tools and environments. In another study, Dos Santos Dourado et al. [37] worked with individuals who had Down Syndrome (28 persons aged between six and fifty-one), applying Sandplay augmented reality (AR) technology. The results of their study showed that the use of digital resources in interactive learning environments boosted sensorial integration and behavioural regulation amongst participants and led to reduced costs in education. In another study, Ma et al. [38] reported that virtual reality technology with gamification activities displayed enormous potential in the rehabilitation/recovery of individuals with disabilities since, among other aspects, it aided the development of perceptive (visual and auditory) skills. In another study, Bonilla-Del-Río and Sánchez Calero [39] worked with 39 adults aged between 21 and 72—who had different kinds of disabilities and who were enrolled at an occupational centre—and with families and educators. The conclusions that emerged from this study indicated that the use of technology enhanced the social, educational, and workplace inclusion of users at the occupational centre. The authors also highlighted that the use of virtual learning environments (VLEs) showed great potential for the education of persons with disabilities and of their families, although these technological resources did need to be adapted to each user’s particular characteristics.
The authors also pointed out that much remains to be done in order to achieve the proper implementation of technological resources in the education of people with disabilities. In another study, Dharmarathne et al. [40] designed STEP-UP. This is a teaching system that consists of four modules based on gamification aimed at people with autism spectrum disorder, Down Syndrome, or low intellectual capacity. The goal was to foster motivation and learning amongst users. In another work, Barron et al. [41] found that the use of a VLE in face-to-face spaces could offer support to teachers, particularly when working with small groups of students. This is because VLEs enable more efficient and effective teaching for groups with disabilities. Finally, Reyes et al. [42] conducted a systematic review study of teaching through VLEs in individuals with disabilities. Their study concluded that the use of technology-based learning environments aided the inclusion of students with disabilities when it came to their education and, therefore, their social inclusion. Nevertheless, these authors emphasised the need for further research to explore teaching practices and the development of learning in virtual environments in real-life situational use in greater depth.
In this context, it is also important to bear in mind that the use of virtual learning platforms involves the recording of a lot of interaction data [43], and that the information recorded needs to be processed using machine learning techniques [44]. The use of these techniques facilitates the understanding of students’ learning patterns [43]. Finally, it is important to underline how educational work through the use of virtual learning platforms is also proving to be highly effective in working with people with Down Syndrome [45].
The aim of this study was to analyse the usability of a virtual learning environment that dealt with environmental sustainability in the learning of adults with Down Syndrome through EDM, with the ultimate aim being to gauge its impact on learning amongst users. Based on the previously mentioned studies, the research questions posed in this work were as follows:
RQ1.
Will there be significant differences in the learning outcomes in the concepts of sustainability before and after the use of a VLE?
RQ2.
Will there be significant differences in the learning outcomes in the concepts of sustainability depending on students’ frequency of use of a VLE?
RQ3.
Will there be significant differences in the learning outcomes in the concepts of sustainability depending on the time students spend using a VLE?
RQ4.
Will there be different clusters amongst students vis à vis their learning behaviour in the VLE?
RQ5.
What will be the perceived student satisfaction with the thematic content and the VLE?
This work is structured into the following sections: a description of the methodology applied, the results found in relation to the research questions tested, the discussion, and conclusions.

2. Materials and Methods

2.1. Participants

Work was carried out with 34 students with Down Syndrome who had a serious intellectual disability [41], with a CPM percentile between 1 and 35, following the criteria of the American Psychiatric Association (APA, 2014). Of the 34 students, 14 were women (mean age = 29 years, SD = 7.75; range 16–42) and 20 were men (mean age = 29.7 years, SD = 8.18; range 19–44). Participants were students at the Estela Special Education Centre and users of the Centre for Promoting Personal Independence (Burgos Down Syndrome Association, Spain). One person dropped out of the study during the course. Convenience sampling was used to construct the sample.

2.2. Instruments

(a)
Open-access SusKids virtual learning environment: https://suskids.bjaland.co/en/courses/ (accessed on 26 November 2023). This platform was developed as part of the SusKids project, co-funded by the European Commission. It involves working with concepts related to sustainable behaviour and is made up of four courses. Course 1, Environment, contains five thematic blocks (environment; air; the Earth and mountains; water and the oceans; and animals and plants). Course 2, Waste, contains three blocks (a description of waste; waste as a problem; and what to do with waste). Course 3, What to do with waste?, contains seven thematic blocks (where does waste go?; incinerators; dumps; reducing; recycling; review activities). Course 4, Construction and environment, includes eight blocks (buildings; rocks; bricks, tiles and ceramics; cement; concrete; mortar; construction and environment; evaluation). Each course contains a progress bar.
A visual representation of the virtual learning environment (VLE) of the SusKids project can be seen in Figure 1.
The VLE contains a progress bar which offers the student feedback on their progress during each course. It also offers a function that displays the correct and incorrect answers given when carrying out the proposed learning activities. An example of each of the functionalities can be seen in Figure 2.
(b)
Initial test of their knowledge of sustainability. One test per course was applied.
(c)
Raven’s Progressive Matrices test [46]. An evaluation test of the g factor of intelligence. Specifically, the Coloured Progressive Matrices Scale (CPM) was applied, which is designed to evaluate children aged four to nine or people with an intellectual deficit. The test has a test–retest reliability of r = 0.82 and an internal consistency of α = 0.86.
(d)
Adaptive Behaviour Assessment System II (ABAS-II) [47]. This scale evaluates the everyday functional skills required to operate independently in daily life. It analyses the areas of communication, use of community resources, functional academic skills, life in the home or life at school, health and safety, leisure, self-care, self-guidance, social, motor, and employment. It also provides the Global Index of Adaptive Behaviour (CAG). In its Spanish version, the test obtained an α = 0.91. In this study, the overall reliability index was α = 0.93.
(e)
Survey on perceived user satisfaction with the use of the VLE. An ad hoc questionnaire was designed, consisting of 12 closed-response items measured on a scale of 1 to 3 (ranging from 1 = not at all satisfied to 3 = very satisfied) and open-response questions addressing the strengths and weaknesses of the VLE (see Appendix A). The questionnaire obtained an α = 0.77 and Ω = 0.74 for the whole questionnaire, with ranges of α = 0.72–0.78 and Ω = 0.67–0.76 across the individual items.
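The reliability indices reported above can be reproduced from raw item scores. The following sketch computes Cronbach’s α in NumPy; the response matrix is hypothetical (the study’s actual questionnaire had 12 items and 34 respondents), so the printed value illustrates the computation only.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 1-3 satisfaction responses (rows = respondents, cols = items)
scores = np.array([
    [3, 3, 2, 3],
    [2, 3, 2, 2],
    [3, 2, 3, 3],
    [1, 2, 1, 1],
    [2, 2, 2, 3],
])
print(round(cronbach_alpha(scores), 2))  # → 0.82
```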

2.3. Procedure

Prior to commencing the study, a positive report was obtained from the University of Burgos Bioethics Committee (No. IO 10/2022). We also sought the written informed consent of the participants or their legal guardians. Before starting the instructional intervention on the VLE, teaching staff were given four weeks of training. All of the teaching staff had at least ten years of experience working with students with Down Syndrome. Students’ intellectual capacity and adaptive behaviour were then tested (Raven test [46] and ABAS-II test [47]). An initial test of knowledge was then conducted, after which learning the concepts of environmental sustainability within the SusKids VLE commenced. This programme consisted of four courses (see the Instruments section). Once the whole course had been completed, the test of knowledge was repeated in order to evaluate the knowledge acquired. Instructional teaching was given over one academic year, with approximately two hours devoted to it each week. Figure 3 shows the procedure followed in the study.

2.4. Data Analysis

Prior to commencing the study, we checked whether the students displayed similar characteristics in terms of their results in the Raven test [46]. We also tested whether the sample fulfilled the assumption of normality with regard to the “age” variable, using the Shapiro–Wilk test (which tests the normality of a dataset), given that there were fewer than 50 participants in the sample. In order to test RQ1, we applied the t test for dependent samples and, to measure the effect size, Cohen’s d [48] and Hedges’ g [49], where values between 0.2 and 0.3 indicate a low effect, values between 0.5 and 0.8 a medium effect, and values above 0.8 a high effect. To examine RQ2 and RQ3, we applied a one-factor fixed-effects ANOVA (analysis of variance is a statistical method used to ascertain whether differences between group means are significant, i.e., whether the null hypothesis should be rejected in favour of the alternative hypothesis) and the eta squared test (η2) to measure the effect size. We also applied the Duncan–Tukey post hoc (DMS) test (used in ANOVA to create confidence intervals for all pairwise differences between factor-level means while controlling the family-wise error rate at a specified level) to assess the differences between groups. To test RQ4, we applied the non-supervised k-means machine learning technique (k-means clustering is a method of vector quantization that aims to partition n observations into k clusters in which each observation belongs to the cluster with the nearest mean (cluster centre or centroid), which serves as a prototype of the cluster), as well as cluster visualisation techniques. Analyses were carried out using the SPSS v. 28 statistical package [50] and Orange v. 3.32 software [51].
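The RQ1 pipeline (normality check, dependent-samples t test, and effect sizes) can be sketched in Python with SciPy. The pre/post scores below are hypothetical, not the study’s data, and the Hedges’ g correction shown is the common small-sample approximation.

```python
import numpy as np
from scipy import stats

# Hypothetical pre/post knowledge-test scores for the same ten students
pre  = np.array([35, 42, 28, 50, 33, 47, 39, 44, 31, 38])
post = np.array([58, 61, 45, 72, 50, 66, 55, 63, 49, 57])
diff = post - pre

# Shapiro-Wilk normality check on the differences (suitable for n < 50)
w, p_norm = stats.shapiro(diff)

# t test for dependent (paired) samples
t, p = stats.ttest_rel(post, pre)

# Cohen's d for paired data: mean difference / SD of the differences
d = diff.mean() / diff.std(ddof=1)

# Hedges' g: small-sample bias correction applied to d
n = len(diff)
g = d * (1 - 3 / (4 * (n - 1) - 1))

print(f"t = {t:.2f}, p = {p:.4f}, d = {d:.2f}, g = {g:.2f}")
```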
To examine RQ5, we applied a descriptive analysis (means and standard deviations, cross-tabulation, and the contingency coefficient). Text mining analysis (text mining comprises techniques subsumed under data mining, which can be defined as the mathematical analysis of data to deduce patterns and trends) was applied to study the responses to the open questions. Atlas.ti v. 22 [52] qualitative analysis software was used for this purpose.
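The study used Atlas.ti on the real responses; as a minimal illustration of the text mining idea (deducing patterns from the open answers), a plain word-frequency count can be sketched with the Python standard library. The answers and stopword list here are illustrative only.

```python
import re
from collections import Counter

# Hypothetical open-ended answers (the study analysed the real responses in Atlas.ti)
answers = [
    "The videos", "I liked everything", "Nothing",
    "The explanatory videos", "Everything", "The exercises",
]

STOPWORDS = {"the", "i", "a", "an", "of", "and"}

tokens = [
    word
    for answer in answers
    for word in re.findall(r"[a-z]+", answer.lower())
    if word not in STOPWORDS
]
print(Counter(tokens).most_common(3))
```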

2.5. Design

Following Campbell and Stanley [53], we applied a quasi-experimental before–after design without a control group. This design was used because the population sample was very specific, and it was not possible to have a contrast sample (control group).

3. Results

3.1. Prior Analysis

We carried out a homogeneity analysis of participants with regard to the results obtained in the g factor of intelligence measured with the CPM test [46] and the ABAS-II CAG [47]. No significant differences were found in any of the scores (Raven CPM [46]: F = 1.81, p = 0.20, η2 = 0.13; ABAS CAG [47]: F = 0.79, p = 0.39, η2 = 0.06). As regards testing the sample distribution, the Shapiro–Wilk normality test provided a statistic of 0.97, p = 0.39, such that the distribution was assumed to be normal. We therefore used parametric statistics to test the hypotheses.

3.2. Analysis of Hypothesis Testing

3.2.1. Differences in Learning Outcomes Before and After Instruction in the VLE (RQ1)

Significant differences were found between before and after the intervention in the learning programme on sustainability concepts carried out in the SusKids VLE, in favour of the post-intervention stage (see Table 1). A medium effect size was observed for Course 1 and a high effect size for Courses 2, 3 and 4.

3.2.2. Influence of Frequency of Use of the Automatic Reader of the VLE on Learning Outcomes (RQ2)

Three groups were established in terms of frequency of use of the automatic reader of the SusKids VLE platform: Group 1—low access interval (1–22 times); Group 2—medium access interval (136–728 times); Group 3—high access interval (1660–9368 times). Significant differences in learning outcomes in Course 4 were found in terms of the frequency of use (see Table 2). In particular, these differences were found between Group 1 and Group 2 (difference of means = 12.91, p = 0.03) and between Group 1 and Group 3, in both cases in favour of Group 1.
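The group-comparison analysis applied to RQ2 (and, analogously, RQ3) can be sketched as a one-factor ANOVA with an eta squared effect size and a Tukey post hoc test (via `scipy.stats.tukey_hsd`, available in SciPy ≥ 1.8). The group scores below are hypothetical, not the study’s data.

```python
import numpy as np
from scipy import stats

# Hypothetical Course 4 scores for the three frequency-of-use groups
low    = np.array([78, 82, 75, 80, 77], dtype=float)   # Group 1: 1-22 accesses
medium = np.array([64, 69, 61, 66, 63], dtype=float)   # Group 2: 136-728 accesses
high   = np.array([70, 74, 68, 72, 71], dtype=float)   # Group 3: 1660-9368 accesses

# One-factor fixed-effects ANOVA
f, p = stats.f_oneway(low, medium, high)

# Eta squared: between-group sum of squares over total sum of squares
groups = (low, medium, high)
grand = np.concatenate(groups)
ss_between = sum(len(g) * (g.mean() - grand.mean()) ** 2 for g in groups)
ss_total = ((grand - grand.mean()) ** 2).sum()
eta_sq = ss_between / ss_total

# Tukey HSD post hoc comparisons between all pairs of groups
post_hoc = stats.tukey_hsd(low, medium, high)

print(f"F = {f:.2f}, p = {p:.4f}, eta^2 = {eta_sq:.2f}")
```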

3.2.3. Influence of the Time Students Spent on the VLE on Learning Outcomes (RQ3)

Three groups were set up based on the time that the student spent on the SusKids VLE platform: Group 1—low time interval (4–11 h); Group 2—medium time interval (12–20 h); Group 3—high time interval (21–32 h). Significant differences were seen to exist in the learning outcomes obtained in Course 4. Specifically, these differences were found between Groups 1 and 2 (difference of means = 16.86, p = 0.02, in favour of the former) (see Table 3).

3.2.4. Cluster Analysis (RQ4)

Because of the results in RQ2 and RQ3, in which significant differences were found in learning outcomes in Course 4 in favour of students who had accessed the SusKids VLE less often and who had spent less time on it, a cluster analysis was carried out to determine the groupings/clusters amongst students with no prior classification variable with regard to the learning outcomes. The learning outcomes were analysed before and after teaching on sustainability content in the SusKids VLE. Specifically, a k = 3 was applied, since three groups had been found in RQ2 and RQ3. Significant differences were seen between the three clusters before and after teaching in the VLE (see Table 4 and Table 5).
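The non-supervised k-means grouping described here can be illustrated with a minimal NumPy implementation. The (pre-score, post-score) pairs are hypothetical, and the deterministic initialisation (evenly spaced observations) is a simplification of the usual random or k-means++ seeding.

```python
import numpy as np

def kmeans(X: np.ndarray, k: int, n_iter: int = 100):
    """Minimal k-means; centroids start at evenly spaced observations."""
    centroids = X[:: len(X) // k][:k].copy()
    for _ in range(n_iter):
        # assign each observation to the cluster with the nearest centroid
        labels = ((X[:, None] - centroids[None]) ** 2).sum(-1).argmin(1)
        # move each centroid to the mean of its assigned observations
        new = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return labels, centroids

# Hypothetical (pre-score, post-score) pairs for nine students
X = np.array([[30, 45], [32, 48], [28, 44],
              [50, 70], [52, 73], [49, 68],
              [70, 92], [72, 95], [68, 90]], dtype=float)
labels, centroids = kmeans(X, k=3)
print(labels)  # three well-separated clusters
```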
Figure 4 and Figure 5 show the position of each participant in each cluster before and after receiving instruction in the VLE. The visual comparison provides information concerning student progress after having been taught in the VLE.
We then compiled a cross-tabulation to analyse the agreement between participants’ positions in the clusters before and after the intervention. Table 6 shows the results of this agreement. We also obtained a contingency coefficient of C = 0.71, p = 0.001, indicating a difference between participants’ positions before and after the intervention.
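This agreement analysis combines a chi-squared test on the cross-tabulation with Pearson’s contingency coefficient, C = sqrt(χ² / (χ² + n)). The cross-tabulation below is hypothetical (it matches the study only in summing to n = 34), so the printed coefficient is illustrative.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical cross-tabulation of cluster membership:
# rows = cluster before instruction, columns = cluster after instruction
crosstab = np.array([
    [8, 3, 0],
    [1, 9, 2],
    [0, 2, 9],
])  # sums to n = 34; the cell values are illustrative

chi2, p, dof, expected = chi2_contingency(crosstab)
n = crosstab.sum()

# Pearson's contingency coefficient: C = sqrt(chi2 / (chi2 + n))
C = np.sqrt(chi2 / (chi2 + n))
print(f"C = {C:.2f}, p = {p:.4f}")
```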

3.2.5. Perceived Student Satisfaction with the Usability of the VLE (RQ5)

Table 7 shows the descriptive study of student participants’ responses with regard to their perceived satisfaction with the use of the VLE in the closed questions. A high degree of perceived satisfaction with the content and usability of the VLE is evident (interval of 2.45–2.88 out of 3). Students also pointed out that the appearance of the VLE did not need changing (1.74 out of 3). There was also low dispersion, which indicates a high level of agreement amongst participants.
Appendix B shows the students’ answers to the open questions “What did you like most about the platform?” and “What did you like least about the platform?”. Text mining analysis can be seen in Figure 6.

4. Discussion

The use of the SusKids VLE—which applies SRL and feedback techniques on learning outcomes—has aided the learning of students with Down Syndrome [7,9,10,12,13,15]. The use of VLEs, together with the application of EDM techniques, has enabled the personalised follow-up of students [20,21,22,23,24]. All of this has benefitted a personalised understanding of student learning behaviour [4,19,32]. Likewise, monitoring through non-supervised EDM cluster analysis techniques and visualisation EDM techniques has enabled those students who displayed greater procrastination behaviour to be pinpointed [28,29]. Moreover, the best learning outcomes were not always seen to be linked to a greater use of or more time spent on the VLE. Given that the students involved in this study exhibited similar cognitive development, future studies will analyse what other variables might be influencing enhanced performance. In addition, student participants expressed a high level of perceived satisfaction regarding the functionality and usability of the SusKids VLE, as well as a high degree of motivation for the work it involved. These aspects support the findings of Dharmarathne et al. [40] and Ma et al. [38] concerning the functionality of use and motivation expressed in students who have certain disabilities when learning on a VLE with supervised follow-up and monitoring.
In sum, the use of the SusKids VLE enables students to be monitored during the teaching–learning process [3,32]. Said follow-up bolsters the personalisation of learning [5] and boosts participant motivation [6]. Likewise, the design of the SusKids VLE is referential, since the assistance provided through progress bars and feedback on the answers serves to support self-regulated learning [7,8,12,13,14] by applying a gamified learning space [18]. Nevertheless, the intervention was not equally effective in terms of improved learning results. Significant differences were only found in Course 4, both with regard to the frequency of access as well as the length of time spent on the VLE. This may be due to the fact that participants gradually became skilled over time in the use of the VLE and in their understanding of how the VLE works. In this regard, a comparative analysis will be carried out in future research. Furthermore, the use of machine learning techniques enabled us to pinpoint patterns of learning in each participant [20,21,22,23,24]. In this regard, the use of non-supervised clustering techniques provided a great deal of information concerning the grouping of participants in different clusters before vs. after the instructional intervention in the VLE. It is important to highlight that some participants changed cluster, whereas others did not. This opens the door to future inquiry exploring the effectiveness of using VLEs in instructional methodology, since this data analysis technique allows for the individualised follow-up of each participant throughout the instructional process. All of this will help educators to gain an understanding of each student’s learning process, which will then allow them to design personalised teaching programmes.

5. Conclusions

This study offers an approach to exploring how effective the use of a VLE can prove to be in the learning process of persons with an intellectual disability, specifically Down Syndrome. The study also assesses perceived user satisfaction. Although satisfactory, the results should be approached with caution with regard to their generalisability, due to the characteristics of the sample and the absence of a control group; future studies will seek to address these limitations. It should, however, be borne in mind that this work was carried out with a very specific group in which experimental control of all the variables involved proves highly complex. This work has also included a study of student learning behaviour in the VLE and its link to learning outcomes.
The main relevance of this work lies in the use of a virtual platform developed ad hoc, in which a specific instruction programme aimed at people with Down Syndrome was implemented.

Future Lines of Work and Educational Implications

Future research will explore what impact the use of these learning techniques has on students’ social and workplace inclusion, in line with Reyes et al. [42]. Future studies will also examine teachers’ perception of the use of a VLE as a teaching support resource in face-to-face classrooms, following Barron et al. [41]. An analysis will also be carried out of how families perceive the functionality of a VLE in their children’s learning, in line with the proposal put forward by Bonilla-Del-Río and Sánchez-Calero [39].
In addition, analysis of students’ task behaviour on the platform with regard to each activity will enable teachers to know each student’s learning specificity. This knowledge will make it easier for the teacher to personalise training, in line with the suggestions made by Dharmarathne et al. [40] and Ma et al. [38].
This study has highlighted the relevance of developing ad hoc VLE platforms such as SusKids, in this case for people with Down Syndrome. The results point to a high degree of group involvement and enhanced learning outcomes. Future studies might therefore seek to adapt the tool to people with other disorders (e.g., autism spectrum disorder, or sensory (visual or auditory) disorders). In addition, an attempt will be made to increase the sample size and to apply a longitudinal design. It is also vital to consider the importance of using machine learning techniques to analyse the learning behaviour of people using the VLE. This is because processing the data reveals the learning patterns of each user, based on which it is then possible to personalise learning proposals in the VLE.
Summing up, it can be concluded that further studies are required to address the functionality of learning in VLEs, specifically amongst groups with different disabilities and/or cognitive deterioration. The findings will help to improve both the VLE and the virtual resources used therein, and will thereby enhance the quality of education and the effective learning of those who belong to these groups, as well as boosting their social inclusion.

Author Contributions

Conceptualization, M.C.S.-M. and S.G.-G.; methodology, M.C.S.-M. and S.G.-G.; software, S.G.-G., C.A.B. and L.A.C.-R.; validation, M.C.S.-M. and S.G.-G.; formal analysis, M.C.S.-M.; investigation, C.A.B., S.G.-G. and L.A.C.-R.; resources, M.C.S.-M. and S.G.-G.; data curation, C.A.B.; writing—original draft preparation, M.C.S.-M.; writing—review and editing, M.C.S.-M., S.G.-G., C.A.B. and L.A.C.-R.; visualization, M.C.S.-M., S.G.-G., C.A.B. and L.A.C.-R.; supervision, M.C.S.-M., S.G.-G., C.A.B. and L.A.C.-R.; project administration, S.G.-G.; funding acquisition, S.G.-G. and L.A.C.-R. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the European Commission, grant number 2018-1-ES01-KA201-050639.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and was approved by the Institutional Review Board (or Ethics Committee) of the Universidad de Burgos (protocol code No. IO 10/2022) for studies involving humans.

Informed Consent Statement

Written informed consent was obtained from all the subjects who participated in this study.

Data Availability Statement

The database will be available to interested authors with a signed agreement with the Universidad de Burgos regarding the scientific and responsible use of the data.

Acknowledgments

The authors would like to acknowledge the collaboration of all the participants in the study as well as the management team of the Estela Special Education Centre (Burgos, Spain).

Conflicts of Interest

The authors declare no conflicts of interest; all materials and the VLE are open access.

Appendix A

Items / Scores
Closed questions
1. Did you understand all of the content worked with in the VLE? 1 2 3
2. Were the activities clear? 1 2 3
3. Were the activities easy? 1 2 3
4. Did you enjoy the activities? 1 2 3
5. Was the work on the VLE easy? 1 2 3
6. Would you have liked to learn other things in the VLE courses? 1 2 3
7. Was it easy to stop inside an activity? 1 2 3
8. Was it easy to move from one activity to another? 1 2 3
9. Was it easy to return to an activity/do an activity again? 1 2 3
10. Did you find the VLE attractive? 1 2 3
11. Did you find the VLE clear? 1 2 3
12. Would you like to change the appearance of the VLE? 1 2 3
Open questions
13. What did you like most about the platform?
14. What did you like least about the platform?

Appendix B

Appendix B.1. Answers to Question 13: What Did You Like Most about the Platform?

“The videos
The ease of navigation and how intuitive it is.
I liked learning more about compost, environment, and construction.
I liked the last topic the most, because I like it, I’m interested and I’m going to work with these things.
I liked everything.
Everything.
Some exercises about rubbish, because I don’t like where things go and I get blocked.
Nothing.
Nothing.
Nothing.
Very good, great fun.
I liked it very much.
Everything.
Everything
I liked the explanatory videos very much.
The exercises.
“I really liked the bricks, tiles, and clay”.
The environment.
Everything.
All the contents: rubbish, pollution, air, water, construction.
I liked everything in this new experience.”

Appendix B.2. Answers to Question 14: What Did You Like Least about the Platform?

Nothing. Everything is good.
I liked everything, but I especially liked the last one a lot.
It was difficult and I didn’t understand the exercises.
Everything was fine.
Nothing. I understood everything.
The exercise at the bottom right was more complicated.
Nothing.
Nothing.
Nothing.
Nothing.
Nothing.
The errors.
Nothing.
I don’t know. I think it was all interesting, but the rubbish and the first two topics could have had more information.
The rubbish.
I didn’t like to read.
Nothing.
The errors and that the internet didn’t work and that I want to change those things.

References

  1. Malmberg, J.; Järvelä, S.; Järvenoja, H.; Panadero, E. Promoting Socially Shared Regulation of Learning in CSCL: Progress of Socially Shared Regulation Among High- and Low-Performing Groups. Comput. Hum. Behav. 2015, 52, 562–572. [Google Scholar] [CrossRef]
  2. García-Peñalvo, F.J. Digital Transformation in the Universities: Implications of the COVID-19 Pandemic. Educ. Knowl. Soc. 2021, 22, e25464. [Google Scholar] [CrossRef]
  3. Sáiz-Manzanares, M.C.; Marticorena-Sánchez, R.; Rodríguez-Díez, J.J.; Rodríguez-Arribas, S.; Díez-Pastor, J.F.; Ji, Y.P. Improve Teaching with Modalities and Collaborative Groups in an LMS: An Analysis of Monitoring Using Visualisation Techniques. J. Comput. High. Educ. 2021, 33, 747–778. [Google Scholar] [CrossRef] [PubMed]
  4. Sáiz-Manzanares, M.C.; Rodríguez-Díez, J.J.; Díez-Pastor, J.F.; Rodríguez-Arribas, S.; Marticorena-Sánchez, R.; Ji, Y.P. Monitoring of Student Learning in Learning Management Systems: An Application of Educational Data Mining Techniques. Appl. Sci. 2021, 11, 2677. [Google Scholar] [CrossRef]
  5. Dias, S.B.; Hadjileontiadou, S.J.; Diniz, J.A.; Hadjileontiadis, L.J. Computer-based Concept Mapping Combined with Learning Management System Use: An Explorative Study under the Self- and Collaborative-Mode. Comput. Educ. 2017, 107, 127–146. [Google Scholar] [CrossRef]
  6. Zimmerman, B.J.; Moylan, A.R. Self-regulation: Where Metacognition and Motivation Intersect. In Handbook of Metacognition in Education; Hacker, D.J., Dunlosky, J., Graesser, A.C., Eds.; Routledge/Taylor & Francis Group: New York, NY, USA, 2009; pp. 299–315. [Google Scholar]
  7. Azevedo, R.; Taub, M.; Mudrick, N. Technologies Supporting Self-regulated Learning. In Encyclopedia of Educational Technology; Spector, J.M., Ed.; SAGE Publications: Thousand Oaks, CA, USA, 2015; pp. 731–734. [Google Scholar]
  8. Gutiérrez, A.P.; Schraw, G. Effects of Strategy Training and Incentives on Students’ Performance, Confidence, and Calibration. J. Exp. Educ. 2015, 83, 386–404. [Google Scholar] [CrossRef]
  9. Azevedo, R.; Moos, D.C.; Johnson, A.M.; Chauncey, Á.D. Measuring Cognitive and Metacognitive Regulatory Processes during Hypermedia Learning: Issues and Challenges. Educ. Psychol. 2010, 45, 210–223. [Google Scholar] [CrossRef]
  10. Tugba Bulu, S.; Pedersen, S. Supporting Problem-solving Performance in a Hypermedia Learning Environment: The Role of Students’ Prior Knowledge and Metacognitive Skills. Comput. Hum. Behav. 2012, 28, 1162–1169. [Google Scholar] [CrossRef]
  11. Park, Y.; Jo, I.-H. Using Log Variables in a Learning Management System to Evaluate Learning Activity Using the Lens of Activity Theory. Assess. Eval. High. Educ. 2017, 42, 531–547. [Google Scholar] [CrossRef]
  12. Harks, B.; Rakoczy, K.; Hattie, J.; Besser, M.; Klieme, E. The Effects of Feedback on Achievement, Interest and Self-evaluation: The Role of Feedback’s Perceived Usefulness. Educ. Psychol. 2014, 34, 269–290. [Google Scholar] [CrossRef]
  13. Hattie, J.; Clarke, S. Visible Learning: Feedback; Routledge: London, UK, 2018. [Google Scholar] [CrossRef]
  14. Wisniewski, B.; Zierer, K.; Hattie, J. The Power of Feedback Revisited: A Meta-Analysis of Educational Feedback Research. Front. Psychol. 2020, 10, 3087. [Google Scholar] [CrossRef]
  15. Azevedo, R.; Gašević, D. Analyzing Multimodal Multichannel Data about Self-Regulated Learning with Advanced Learning Technologies: Issues and Challenges. Comput. Hum. Behav. 2019, 96, 207–210. [Google Scholar] [CrossRef]
  16. Malmberg, J.; Järvelä, S.; Holappa, J.; Haataja, E.; Huang, X.; Siipo, A. Going Beyond What Is Visible: What Multichannel Data Can Reveal about Interaction in the Context of Collaborative Learning? Comput. Hum. Behav. 2019, 96, 235–245. [Google Scholar] [CrossRef]
  17. Cloude, E.B.; Taub, M.; Lester, J.; Azevedo, R. The Role of Achievement Goal Orientation on Metacognitive Process Use in Game-Based Learning. In Artificial Intelligence in Education; Isotani, S., Millán, E., Ogan, A., Hastings, P., McLaren, B., Luckin, R., Eds.; Springer International Publishing: Cham, Switzerland, 2019; pp. 36–40. [Google Scholar] [CrossRef]
  18. Taub, M.; Mudrick, N.V.; Azevedo, R.; Millar, G.C.; Rowe, J.; Lester, J. Using Multi-Channel Data with Multi-Level Modeling to Assess In-Game Performance during Gameplay with CRYSTAL ISLAND. Comput. Hum. Behav. 2017, 76, 641–655. [Google Scholar] [CrossRef]
  19. Wiedbusch, M.D.; Kite, V.; Yang, X.; Park, S.; Chi, M.; Taub, M.; Azevedo, R. A Theoretical and Evidence-Based Conceptual Design of MetaDash: An Intelligent Teacher Dashboard to Support Teachers’ Decision Making and Students’ Self-Regulated Learning. Front. Educ. 2021, 6, 570229. [Google Scholar] [CrossRef]
  20. Cerezo, R.; Sánchez-Santillán, M.; Paule-Ruiz, M.P.; Núñez, J.C. Students’ LMS Interaction Patterns and their Relationship with Achievement: A Case Study in Higher Education. Comput. Educ. 2016, 96, 42–54. [Google Scholar] [CrossRef]
  21. Bannert, M.; Reimann, P.; Sonnenberg, C. Process Mining Techniques for Analysing Patterns and Strategies in Students’ Self-regulated Learning. Metacognition Learn. 2014, 9, 161–185. [Google Scholar] [CrossRef]
  22. Li, L.-Y.; Tsai, C.-C. Accessing Online Learning Material: Quantitative Behavior Patterns and their Effects on Motivation and Learning Performance. Comput. Educ. 2017, 114, 286–297. [Google Scholar] [CrossRef]
  23. Mudrick, N.V.; Azevedo, R.; Taub, M. Integrating Metacognitive Judgments and Eye Movements Using Sequential Pattern Mining to Understand Processes Underlying Multimedia Learning. Comput. Hum. Behav. 2019, 96, 223–234. [Google Scholar] [CrossRef]
  24. Sáiz-Manzanares, M.C.; Marticorena Sánchez, R.; García Osorio, C.I.; Díez-Pastor, J.F. How Do B-learning and Learning Patterns Influence Learning Outcomes? Front. Psychol. 2017, 8, 745. [Google Scholar] [CrossRef]
  25. Carbonero, M.A.; Román, J.M.; Ferrer, M. Programa para “aprender estratégicamente” con estudiantes universitarios: Diseño y validación experimental. An. Psicol. 2013, 29, 876–885. [Google Scholar] [CrossRef]
  26. Reoyo, N.; Carbonero, M.A.; Martín, L.J. Características de eficacia docente desde las perspectivas del profesorado y futuro profesorado de secundaria. Rev. Educ. 2017, 376, 62–86. [Google Scholar] [CrossRef]
  27. Taub, M.; Azevedo, R.; Rajendran, R.; Cloude, E.B.; Biswas, G.; Price, M.J. How Are Students’ Emotions Related to the Accuracy of Cognitive and Metacognitive Processes during Learning with an Intelligent Tutoring System? Learn. Instr. 2021, 72, 101200. [Google Scholar] [CrossRef]
  28. Martín-Antón, L.J.; Aramayo-Ruiz, K.P.; Rodríguez-Sáez, J.L.; Sáiz-Manzanares, M.C. Procrastination in Pre-service Teachers: The Role of Learning Strategies and Academic Achievement. Educ. XX1 2022, 25, 65–88. [Google Scholar] [CrossRef]
  29. Cerezo, R.; Esteban, M.; Sánchez-Santillán, M.; Núñez, J.C. Procrastinating Behavior in Computer-Based Learning Environments to Predict Performance: A Case Study in Moodle. Front. Psychol. 2017, 8, 1403. [Google Scholar] [CrossRef]
  30. Greig, A.; Priddle, J. Mapping Students’ Development in Response to Sustainability Education: A Conceptual Model. Sustainability 2019, 11, 4324. [Google Scholar] [CrossRef]
  31. Al-Ansi, A.M.; Jaboob, M.; Garad, A.; Al-Ansi, A. Analyzing augmented reality (AR) and virtual reality (VR) recent development in education. Soc. Sci. Humanit. Open 2023, 8, 100532. [Google Scholar] [CrossRef]
  32. Sáiz-Manzanares, M.C.; Ramos Pérez, I.; Arnaiz-Rodríguez, Á.; Rodríguez-Arribas, S.; Almeida, L.; Martín, C.F. Analysis of the Learning Process through Eye Tracking Technology and Feature Selection Techniques. Appl. Sci. 2021, 11, 6157. [Google Scholar] [CrossRef]
  33. Qaffas, A.A.; Kaabi, K.; Shadiev, R.; Essalmi, F. Towards an Optimal Personalization Strategy in MOOCs. Smart Learn. Environ. 2020, 7, 14. [Google Scholar] [CrossRef]
  34. Sáiz-Manzanares, M.C.; Gutiérrez-González, S.; Rodríguez, Á.; Alameda Cuenca-Romero, L.; Calderón, V.; Queiruga-Dios, M.Á. Systematic Review on Inclusive Education, Sustainability in Engineering: An Analysis with Mixed Methods and Data Mining Techniques. Sustainability 2020, 12, 6861. [Google Scholar] [CrossRef]
  35. Hurwitz, S.; Garman-McClaine, B.; Carlock, K. Special Education for Students with Autism During the COVID-19 Pandemic: Each Day Brings New Challenges. Autism 2022, 26, 889–899. [Google Scholar] [CrossRef]
  36. Sokolikj, Z.; Ke, F.; Chakraborty, S.; Moon, J. Using Deep Learning to Track Representational Flexibility Development of Children with Autism in a Virtual World. In Proceedings of the 11th International Conference on Information and Education Technology (ICIET), Fujisawa, Japan, 18–20 March 2023; pp. 51–55. [Google Scholar] [CrossRef]
  37. Dos Santos Dourado, G.; Silva, J.A.O.S.; Di Menezes, A.R.C.; Silva, K.C.R.; De Souza Hannum, J.S.; De Andrade Barbosa, T.M.G. An AR Sand play System for People with Down Syndrome. In Proceedings of the 2019 IEEE MIT Undergraduate Research Technology Conference (URTC 2019), Cambridge, MA, USA, 11–13 October 2019; pp. 1–4. [Google Scholar] [CrossRef]
  38. Ma, L.; Yu, Q.; Huang, J.; Sun, Y. Based on the Exploration of VR in the Teaching and Training of Special Children. In Proceedings of the 3rd International Conference on Education Development and Studies, Hilo, HI, USA, 9–11 March 2022; pp. 37–41. [Google Scholar] [CrossRef]
  39. Bonilla-Del-Río, M.; Sánchez-Calero, M.L. Educational Inclusion in Times of COVID-19: Use of Social Media for People with Intellectual Disabilities. RIED-Rev. Iberoam. Educ. Distancia 2022, 25, 141–161. [Google Scholar] [CrossRef]
  40. Dharmarathne, R.S.C.K.; Medagedara, K.A.; Madhubashinee, N.B.W.N.; Maitipe, P.T.; Sriyaratna, D.; Abeywardena, K. STEP UP: Systematically Motivating the Children with Low Psychological Maturity Level and Disabled Children Using Gamification and Human Computer Interaction. In Proceedings of the 7th International Conference for Convergence in Technology (I2CT), Pune, India, 7–9 April 2022; pp. 1–6. [Google Scholar] [CrossRef]
  41. Barron, T.; Friend, M.; Dieker, L.; Kohnke, S. Co-Teaching in Uncertain Times: Using Technology to Improve Student Learning and Manage Today’s Complex Educational Landscape. J. Spec. Educ. Technol. 2021, 37, 439–446. [Google Scholar] [CrossRef]
  42. Reyes, J.I.; Meneses, J.; Melián, E. A Systematic Review of Academic Interventions for Students with Disabilities in Online Higher Education. Eur. J. Spec. Needs Educ. 2021, 37, 569–586. [Google Scholar] [CrossRef]
  43. Aguagallo, L.; Salazar-Fierro, F.; García-Santillán, J.; Posso-Yépez, M.; Landeta-López, P.; García-Santillán, I. Analysis of Student Performance Applying Data Mining Techniques in a Virtual Learning Environment. Int. J. Emerg. Technol. Learn. iJET 2023, 18, 175–195. [Google Scholar] [CrossRef]
  44. Colpo, M.P.; Primo, T.T.; de Aguiar, M.S. Lessons learned from the student dropout patterns on COVID-19 pandemic: An analysis supported by machine learning. Br. J. Educ. Technol. 2023, 1–26. [Google Scholar] [CrossRef]
  45. de Miranda, A.; da Silva, M.L.L.; da Silva, J.R.A.; Correia, A.A.; de Oliveira Rodrigues, C.M.; Lins, F.A.A.; de Oliveira Nobrega, O.; Falcao, T.P. E-Down: Uma Metodologia para Apoio à Escolha e Configuração de AVAs na Formação de Estudantes com Síndrome de Down [E-Down: A Methodology to Support the Selection and Configuration of VLEs in the Education of Students with Down Syndrome]. In Proceedings of the 2023 18th Iberian Conference on Information Systems and Technologies, Aveiro, Portugal, 20–23 June 2023. [Google Scholar] [CrossRef]
  46. Raven, J.C. Manual de RAVEN. Matrices Progresivas. CPM-SPM-APM (B*); [RAVEN Manual. Progressive Matrices]; Pearson-Clinical: London, UK, 2011. [Google Scholar]
  47. Harrison, P.L.; Oakland, T. Sistema de Evaluación de la Conducta Adaptativa ABAS-II. [Adaptive Behavior Assessment System. 2005], 2nd ed.; Montero Centeno, D.; Fernández-Pinto, I., Translators; TEA: Madrid, Spain, 2013. [Google Scholar]
  48. Cohen, J. Statistical Power Analysis. Curr. Dir. Psychol. Sci. 1992, 1, 98–101. [Google Scholar] [CrossRef]
  49. Hedges, L.V. Distribution Theory for Glass’s Estimator of Effect size and Related Estimators. J. Educ. Stat. 1981, 6, 107–128. [Google Scholar] [CrossRef]
  50. IBM Corp. SPSS Statistical Package for the Social Sciences (SPSS), Version 28; IBM Corp.: Armonk, NY, USA, 2022. [Google Scholar]
  51. Demsar, J.; Curk, T.; Erjavec, A.; Gorup, C.; Hocevar, T.; Milutinovic, M.; Mozina, M.; Polajnar, M.; Toplak, M.; Staric, A.; et al. Orange: Data Mining Toolbox in Python. J. Mach. Learn. Res. 2013, 14, 2349–2353. [Google Scholar]
  52. ATLAS.ti Corp. Software Package Qualitative Data Analysis; Version 22; ATLAS.ti Corp: Berlin, Germany, 2022. [Google Scholar]
  53. Campbell, D.F.; Stanley, J. Diseños Experimentales y Cuasiexperimentales en la Investigación Social, 9th ed.; Amorrortu: Buenos Aires, Argentina, 2005. [Google Scholar]
Figure 1. Thematic content in the SusKids VLE.
Figure 2. Feedback and progress in the SusKids VLE.
Figure 3. Study procedure.
Figure 4. Student distribution in clusters before the intervention in the VLE.
Figure 5. Student distribution in clusters after the intervention in the VLE.
Figure 6. Analysis of text mining on student perception of the strengths and weaknesses of the VLE.
Table 1. t test for dependent samples and effect size.

| Thematic Courses | Before (n = 33) M (SD) | After (n = 33) M (SD) | df | t | p | d | g |
|---|---|---|---|---|---|---|---|
| Course 1 | 68.09 (21.02) | 82.58 (18.75) | 32 | −6.76 | 0.001 * | 0.73 | 0.72 |
| Course 2 | 45.90 (20.95) | 72.18 (19.06) | 32 | −9.63 | 0.001 * | 1.31 | 1.30 |
| Course 3 | 63.13 (14.00) | 81.71 (19.29) | 32 | −8.29 | 0.001 * | 1.10 | 1.09 |
| Course 4 | 50.84 (12.00) | 64.51 (15.97) | 32 | −7.14 | 0.001 * | 0.97 | 0.96 |

* p < 0.05. M = mean; SD = standard deviation; df = degrees of freedom; d = Cohen’s d; g = Hedges’ g (both standardised effect sizes).
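The d and g columns in Table 1 follow the standard pooled-SD formulation with Hedges’ small-sample correction. A minimal sketch that reproduces the Course 1 values (assuming equal-weight pooling of the before and after SDs):

```python
import math

def cohens_d(m1, sd1, m2, sd2):
    """Cohen's d with a pooled standard deviation (equal group sizes)."""
    sd_pooled = math.sqrt((sd1 ** 2 + sd2 ** 2) / 2)
    return (m2 - m1) / sd_pooled

def hedges_g(d, n1, n2):
    """Hedges' g: Cohen's d scaled by a small-sample bias correction."""
    correction = 1 - 3 / (4 * (n1 + n2 - 2) - 1)
    return d * correction

# Course 1 in Table 1: before M = 68.09 (SD 21.02), after M = 82.58 (SD 18.75), n = 33
d = cohens_d(68.09, 21.02, 82.58, 18.75)
g = hedges_g(d, 33, 33)
print(round(d, 2), round(g, 2))  # 0.73 0.72
```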
Table 2. Single-factor fixed-effects ANOVA (frequency of access to the VLE) and effect size.

| Thematic Courses | Group 1 (n = 18) M (SD) | Group 2 (n = 11) M (SD) | Group 3 (n = 4) M (SD) | df | F | p | η² |
|---|---|---|---|---|---|---|---|
| Course 1 | 88.46 (16.03) | 74.37 (19.77) | 76.21 (23.32) | (2,32) | 2.31 | 0.12 | 0.13 |
| Course 2 | 76.65 (17.55) | 70.80 (20.54) | 57.13 (18.92) | (2,32) | 1.83 | 0.18 | 0.11 |
| Course 3 | 87.66 (18.37) | 74.13 (20.68) | 71.71 (15.64) | (2,32) | 2.32 | 0.12 | 0.13 |
| Course 4 | 70.86 (13.65) | 57.95 (16.14) | 52.21 (14.79) | (2,32) | 4.20 | 0.03 * | 0.22 |

* p < 0.05. M = mean; SD = standard deviation; df = degrees of freedom; η² = eta squared effect size.
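Eta squared can be recovered from the reported F statistic via the identity η² = (df_b · F)/(df_b · F + df_e). A minimal sketch for the four courses in Table 2 (assuming an error df of 30, which is what three groups drawn from 33 participants give, and which matches the reported η² column):

```python
def eta_squared(f, df_between, df_error):
    """Eta squared recovered from F: SS_between/SS_total = (dfb*F)/(dfb*F + dfe)."""
    return (df_between * f) / (df_between * f + df_error)

# F values for Courses 1-4 in Table 2
for f in (2.31, 1.83, 2.32, 4.20):
    print(round(eta_squared(f, 2, 30), 2))  # 0.13, 0.11, 0.13, 0.22
```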
Table 3. Single-factor fixed-effects ANOVA (time spent on the VLE) and effect size.

| Thematic Courses | Group 1 (n = 19) M (SD) | Group 2 (n = 10) M (SD) | Group 3 (n = 3) M (SD) | df | F | p | η² |
|---|---|---|---|---|---|---|---|
| Course 1 | 82.58 (19.23) | 79.54 (18.23) | 88.28 (28.13) | (2,32) | 0.24 | 0.79 | 0.02 |
| Course 2 | 70.00 (19.59) | 76.60 (20.80) | 80.88 (15.38) | (2,32) | 0.65 | 0.53 | 0.04 |
| Course 3 | 81.00 (20.35) | 77.35 (20.01) | 89.36 (15.74) | (2,32) | 0.42 | 0.66 | 0.03 |
| Course 4 | 69.62 (15.16) | 52.76 (13.85) | 65.27 (13.58) | (2,32) | 4.35 | 0.02 * | 0.24 |

* p < 0.05. M = mean; SD = standard deviation; df = degrees of freedom; η² = eta squared effect size. Note. One participant dropped out during the course.
Table 4. Centres of the clusters and ANOVA on the clusters of learning outcomes before the intervention.

| Thematic Courses | Cluster 1 (n = 14) | Cluster 2 (n = 10) | Cluster 3 (n = 9) | df | F | p |
|---|---|---|---|---|---|---|
| Course 1 | 66.91 | 45.28 | 94.85 | (2,30) | 66.75 | 0.001 * |
| Course 2 | 42.55 | 25.84 | 73.34 | (2,30) | 53.48 | 0.001 * |
| Course 3 | 62.23 | 19.09 | 79.48 | (2,30) | 39.86 | 0.001 * |
| Course 4 | 50.16 | 40.18 | 61.16 | (2,30) | 19.27 | 0.001 * |

* p < 0.01; df = degrees of freedom.
Table 5. Centres of the clusters and ANOVA on the clusters of learning outcomes after the intervention.

| Thematic Courses | Cluster 1 (n = 9) | Cluster 2 (n = 10) | Cluster 3 (n = 14) | df | F | p |
|---|---|---|---|---|---|---|
| Course 1 | 62.08 | 77.31 | 98.81 | (2,30) | 31.52 | 0.001 * |
| Course 2 | 49.40 | 75.73 | 86.67 | (2,30) | 27.97 | 0.001 * |
| Course 3 | 58.40 | 73.72 | 101.24 | (2,30) | 105.11 | 0.001 * |
| Course 4 | 45.65 | 62.99 | 77.21 | (2,30) | 29.88 | 0.001 * |

* p < 0.01; df = degrees of freedom.
Table 6. Cross table of correspondences between students’ positions in the clusters before and after the intervention.

| Cluster after | Cluster before 1 | Cluster before 2 | Cluster before 3 | Total |
|---|---|---|---|---|
| 1 | 1 | 8 | 5 | 14 |
| 2 | 8 | 2 | 0 | 10 |
| 3 | 0 | 0 | 9 | 9 |
| Total | 9 | 10 | 14 | 33 |
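Table 6 is a contingency table of cluster membership before and after the intervention. A minimal sketch of how such a cross table can be assembled from per-participant labels (the label lists below are hypothetical; the real assignments come from the clustering runs behind Tables 4 and 5):

```python
from collections import Counter

# Hypothetical before/after cluster labels, one entry per participant
before = [1, 1, 2, 3, 3, 2, 1]
after = [1, 2, 2, 3, 3, 2, 1]

# Count each (after, before) pair, then print one row per "after" cluster
counts = Counter(zip(after, before))
for a in (1, 2, 3):
    row = [counts.get((a, b), 0) for b in (1, 2, 3)]
    print(a, row, sum(row))
```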
Table 7. Descriptive statistics of perceived satisfaction with the use of the VLE.

| Items | Mean | Standard Deviation |
|---|---|---|
| 1. Did you understand the content worked with in the VLE? | 2.52 | 0.73 |
| 2. Were the activities clear? | 2.66 | 0.55 |
| 3. Were the activities easy? | 2.45 | 0.75 |
| 4. Did you like the activities? | 2.83 | 0.47 |
| 5. Was the work on the VLE easy? | 2.71 | 0.65 |
| 6. Would you have liked to learn other things in the VLE courses? | 2.71 | 0.60 |
| 7. Was it easy to stop in an activity? | 2.64 | 0.64 |
| 8. Was it easy to move from one activity to another? | 2.64 | 0.67 |
| 9. Was it easy to return to an activity/do an activity again? | 2.74 | 0.58 |
| 10. Did you find the VLE attractive? | 2.88 | 0.42 |
| 11. Did you find the VLE clear? | 2.83 | 0.43 |
| 12. Would you like to change the appearance of the VLE? | 1.74 | 0.92 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Sáiz-Manzanares, M.C.; Arranz Barcenilla, C.; Gutiérrez-González, S.; Alameda Cuenca-Romero, L. Usability of a Virtual Learning Environment in Down Syndrome Adult Learning. Sustainability 2023, 15, 16404. https://doi.org/10.3390/su152316404
