A Learning Analytics Theoretical Framework for STEM Education Virtual Reality Applications
Abstract
1. Introduction
2. Background
2.1. Virtual Reality in STEM Education
2.2. Instructional Design in Virtual Reality
- Student-centered learning: Aligned with the principles of (social) constructivism and constructionism, the visually rich environment and the experiential nature of VR enable students to develop strong mental representations of the information sources through hands-on and collaborative activities [21,22].
2.3. Learning Analytics
- Learners: Adjust their learning habits by identifying patterns and paths that can support the attainment of the learning objectives and ensure the achievement of the predefined goals.
- Educators: Improve the quality of teaching based on real-time and summative data that mirror learners’ performance, involvement, and engagement over time.
- Instructional designers: Increase the quality of instruction based on the analysis of the elements that have been utilised the most, the feedback from the students on the provided interventions, and the comments of the teachers.
- Policymakers: Develop clear and accurate awareness of current and future tendencies to inform the subsequent decisions and policymaking.
3. The Theoretical Framework Design
3.1. Rationale and Purpose
- Development of a theoretical design framework which takes into consideration the research gaps identified in the examination of the relevant literature.
- Analysis of an instructional approach that can classify students, educators, and practitioners from different STEM fields while uncovering the most relevant variables related to this classification.
- Identification of the most efficient ML models for the analysis of the error-related behaviors and the determination of the patterns that will improve the provided instruction.
- Use of statistical analysis models to classify students after collecting data from several VR-supported training sessions. The initial dataset includes information related to the course design, the learners’ profile, and the interactions that the students had during the VR training task. For the construction of the final model, several statistical models are expected to be considered so as to increase the prediction accuracy and the reliability of the results.
- Use of different feature importance analysis (FIA) methods to identify the most effective classifiers per task, the relevant variables, and their impact on determining students’ success or failure for the task under consideration.
- Use of an exploratory data analysis (EDA) tool to identify the relationships between the recorded errors. To this end, the clustered information is exported visually to develop different hypotheses related to the underlying reasons that drive these relations. For the visual representation, the LA guidelines that Baker and Yacef [35] have proposed can be applied.
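As an illustration of how the feature importance analysis step could operate, the following sketch ranks hypothetical session features by a simple single-feature accuracy proxy. The feature names, values, and records are invented for this example and are far simpler than the FIA methods the framework envisages.

```python
from collections import Counter

# Hypothetical VR training session records: categorical features plus a binary
# outcome ("pass"/"fail"). All names and values are illustrative only.
sessions = [
    {"errors": "high", "time_on_task": "low",  "prior_vr_use": "no",  "outcome": "fail"},
    {"errors": "high", "time_on_task": "high", "prior_vr_use": "no",  "outcome": "fail"},
    {"errors": "low",  "time_on_task": "high", "prior_vr_use": "yes", "outcome": "pass"},
    {"errors": "low",  "time_on_task": "low",  "prior_vr_use": "yes", "outcome": "pass"},
    {"errors": "low",  "time_on_task": "high", "prior_vr_use": "no",  "outcome": "pass"},
    {"errors": "high", "time_on_task": "low",  "prior_vr_use": "yes", "outcome": "fail"},
]

def single_feature_accuracy(records, feature):
    """Accuracy of predicting the outcome from one feature alone:
    each feature value predicts its majority outcome class."""
    majority = {}
    for value in {r[feature] for r in records}:
        outcomes = [r["outcome"] for r in records if r[feature] == value]
        majority[value] = Counter(outcomes).most_common(1)[0][0]
    hits = sum(majority[r[feature]] == r["outcome"] for r in records)
    return hits / len(records)

# Rank features by how well they separate pass from fail on their own.
features = ["errors", "time_on_task", "prior_vr_use"]
ranking = sorted(features, key=lambda f: single_feature_accuracy(sessions, f),
                 reverse=True)
```

In this toy data, the error rate alone perfectly separates the two outcome classes, so it tops the ranking; a real analysis would use cross-validated models and established FIA techniques instead.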
3.2. Theoretical Framework Analysis
- LA models are applied primarily to data that originate from learning management systems (LMS), without considering alternative or supplementary tools.
- The main data collection sources capture information deriving from either the technological or the pedagogical perspective of the tool/intervention, but partially, or even completely, disregard the psychological one.
- Relevant studies examine the correlations that may exist between a finite set of dependent variables (e.g., demographics, credits, grades) and non-classified parameters that are relevant to specific contexts and fields. This endangers the essence and the further evolution of LA, as it prevents the collection and sharing of large and homogeneous datasets.
- By cross-examining the latest (systematic) literature reviews, it became apparent that there is still a lack of a universally accepted comprehensive framework and/or system capable of providing the involved stakeholders with suggestions on the typology of the data that should be collected or recommendations on how to interpret such data to evaluate specific elements and improve their practices.
- (a) the technical affordances of the utilised tools;
- (b) the instructional design choices that practitioners and educators make; and
- (c) the psychological elements that influence learning.
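To make these three dimensions concrete, the following sketch defines one possible record schema for a logged VR learning event. All field names and groupings are illustrative assumptions rather than a prescribed data model.

```python
from dataclasses import dataclass, asdict

# Illustrative schema for one logged VR learning event, grouping fields by the
# three dimensions discussed above. All field names are hypothetical.

@dataclass
class TechnicalContext:        # (a) technical affordances of the tools
    device: str                # e.g. "HMD", "tablet"
    engine: str                # e.g. "Unity"
    frame_rate: float

@dataclass
class InstructionalContext:    # (b) instructional design choices
    learning_theory: str       # e.g. "constructionism"
    strategy: str              # e.g. "problem-based"
    technique: str             # e.g. "tutorial"

@dataclass
class PsychologicalContext:    # (c) psychological elements
    motivation: float          # e.g. self-reported, 0-1 scale
    engagement: float
    affect: str                # e.g. "interested"

@dataclass
class VRLearningEvent:
    learner_id: str
    technical: TechnicalContext
    instructional: InstructionalContext
    psychological: PsychologicalContext

event = VRLearningEvent(
    learner_id="s-001",
    technical=TechnicalContext(device="HMD", engine="Unity", frame_rate=72.0),
    instructional=InstructionalContext(
        learning_theory="constructionism", strategy="problem-based",
        technique="tutorial"),
    psychological=PsychologicalContext(motivation=0.8, engagement=0.7,
                                       affect="interested"),
)
record = asdict(event)  # nested dict ready for storage or analysis
```

Recording all three dimensions in one event record is what would later allow the cross-examination of technical, pedagogical, and psychological variables that the framework calls for.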
3.3. Design Decisions
- the software toolkits utilised for the development of the VR application (e.g., Unity, Maya, Net, Photoshop)
- the specifications of the hardware equipment utilised for the conduct of the interventions (e.g., smartphone, tablet, laptop, desktop PC, head-mounted display)
- the type of the VR approach (e.g., HMD-based, CAVE, 360° video) and the companion equipment (e.g., VR-enabled laboratory handbooks or discipline-related specialised equipment)
- the supplementary resources that may be required for the conduct of the intervention (e.g., multimedia resources, web-based educational platforms, 3D models)
- the learning theories on which the design of the intervention relies (e.g., constructionism, cognitivism, (social) constructivism, embodied cognition),
- the instructional strategies (learning models) that convey the didactic essence of the respective theories (e.g., activity-based, experiential, collaborative, situated, problem-based, game-based, agent-based learning) and the instructional techniques utilised for the conduct of the intervention (e.g., lecture, demonstration, seminar, tutorial, case study), and
- the evaluation focus points related to the effectiveness and efficiency of the application, the intervention, and the instructional approach (e.g., learners’ performance, learning outcomes, learning gains).
- the behavioral elements (e.g., the impact/effect of reinforcement, user experience, visual attractiveness/intuitiveness),
- the cognitive elements (e.g., attention and memory span, problem-solving skills),
- the affective elements (e.g., interest, attachment, satisfaction, degree of arousal, social communication, nature of the activities),
- the motivational elements (e.g., self-belief, self-regulation, self-efficacy, self-goals, self-concept, self-esteem, situational interest)
- the information that can be collected from the different stakeholders (e.g., administrators, educators, students, assessment tools),
- the data collection approach, which includes information related to the research design (e.g., experimental, quasi-experimental, non-experimental) and the research methods utilised (e.g., qualitative, quantitative, mixed),
- the data analysis approach which includes the use and combination of different methods (e.g., item response theory, cognitive diagnosis, evolutionary algorithms) and educational data mining models (e.g., decision tree, naïve Bayes, k-nearest neighbor), and
- the data visualisation models for the dissemination of the processed data (e.g., graphs/charts, scatterplots, sociograms, tag clouds, signal lights).
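As a minimal example of one of the educational data mining models listed above, the following sketch implements a plain k-nearest-neighbour classifier over hypothetical (error rate, time-on-task) features; the data and labels are invented for illustration.

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest training points.
    `train` is a list of (feature_vector, label) pairs."""
    nearest = sorted(train, key=lambda pair: math.dist(pair[0], query))
    votes = Counter(label for _, label in nearest[:k])
    return votes.most_common(1)[0][0]

# Hypothetical features: (normalised error rate, normalised time-on-task).
train = [
    ((0.9, 0.2), "at-risk"),
    ((0.8, 0.3), "at-risk"),
    ((0.7, 0.1), "at-risk"),
    ((0.1, 0.8), "on-track"),
    ((0.2, 0.9), "on-track"),
    ((0.3, 0.7), "on-track"),
]

print(knn_predict(train, (0.85, 0.25)))  # nearest neighbours are all "at-risk"
```

In practice, the choice between k-NN, decision trees, or naïve Bayes would follow from the data analysis approach selected for the intervention, not from a fixed preference.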
3.4. Overview of the Learning Analytics System
- How can skill cultivation be assessed across novice and expert students in VR STEM training scenarios?
- How can the most appropriate instructional design elements be selected to increase the effectiveness of the VR intervention, given the difficulty of the topic and the learners’ abilities?
- How can error diagnosis be performed in VR-supported instructional settings in conjunction with LA?
- How can timely support be provided to low-performing students, and additional development opportunities to high-performing ones?
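The last question can be made concrete with a simple rule-based sketch that maps a learner's recent performance to an indicative intervention. The thresholds and action names are illustrative assumptions, not empirically derived values.

```python
def support_action(score, history, low=0.5, high=0.85):
    """Map a learner's latest performance score (0-1) and recent score history
    to an indicative intervention. Thresholds are illustrative, not empirical."""
    trend = score - (sum(history) / len(history)) if history else 0.0
    if score < low:
        return "remedial-support"          # timely help for low performers
    if score >= high and trend >= 0:
        return "enrichment-task"           # stretch goals for high performers
    if trend < -0.15:
        return "check-in"                  # sudden decline: flag for educator
    return "continue"

print(support_action(0.4, [0.5, 0.55]))   # remedial-support
print(support_action(0.9, [0.85, 0.88]))  # enrichment-task
```

A deployed LA system would replace these hand-set thresholds with model-driven predictions, but the decision surface it exposes to educators would have this general shape.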
4. Contribution and Implications
4.1. Conceptual Implications
- Orchestration of instruction by teachers and reflection on the strategies utilised from the options available to them.
- Evaluation methods to assess not only the students’ performance but also that of the teachers, with regard to the mode of operation and the practices followed in both formal and informal contexts.
- Provision of personalised suggestions and appropriate structures to support the implementation of similar scenarios in the future.
- Development of a deep understanding of the core elements that influence the educational process and adaptation of the educational resources based on the needs and interests of the students.
- Assessment of the course curriculum with particular focus on the parameters that affect the success and the effectiveness of the interventions in STEM training tasks.
- Support from the administration for the reshaping of educational units and the allocation of financial resources for the development of VR applications in formal teaching conditions.
4.2. Theoretical Implications
- The decisions related to the data collection should be driven by the principles of the applied instructional design method. Hence, the involved stakeholders are encouraged to provide detailed information about the utilised instructional approaches, the educational subjects that were under investigation, and the analysis methods that have been followed for the examination of the correlations. In doing so, the replication of the intervention in similar contexts facilitates and supports future research efforts to validate (collectively) the gathered information and develop well-grounded theoretical perspectives.
- The potential of interactions should be examined holistically and not just unilaterally (i.e., both between the users and the VR system and among the users themselves). Under this consideration, we recommend cross-examination and correlation clustering of different pedagogical and psychological elements using ML models to aid the development of prototype profiles and allow the systematic mapping of the factors that influence students’ outcomes and performance.
- The classification of the gathered information should be done in accordance with the areas of interest of the different beneficiaries (e.g., administrators, instructional designers, teachers, students), and the outcomes should be disseminated following the data analytics maturity scale that Howson et al. [46] proposed (e.g., descriptive, diagnostic, predictive, and prescriptive analytics). In doing so, the involved stakeholders are able to determine the suitability and the effectiveness of the intervention and, thus, perform any adjustments that may be required before designing or implementing new interventions.
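As one possible realisation of the clustering of pedagogical and psychological elements suggested above, the following sketch applies plain k-means to hypothetical learner profiles; a production system would use richer features and more robust ML models, and the profile values here are invented.

```python
import math
from statistics import mean

def kmeans(points, centroids, iters=10):
    """Plain k-means: assign points to the nearest centroid, then recompute
    each centroid as its cluster mean. Returns (centroids, assignments)."""
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for p in points:
            idx = min(range(len(centroids)),
                      key=lambda i: math.dist(p, centroids[i]))
            clusters[idx].append(p)
        centroids = [
            (mean(x for x, _ in c), mean(y for _, y in c)) if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    assignments = [min(range(len(centroids)),
                       key=lambda i: math.dist(p, centroids[i]))
                   for p in points]
    return centroids, assignments

# Hypothetical learner profiles: (engagement, motivation), both on a 0-1 scale.
profiles = [(0.1, 0.2), (0.2, 0.1), (0.15, 0.25),
            (0.8, 0.9), (0.9, 0.85), (0.85, 0.8)]
centroids, labels = kmeans(profiles, centroids=[(0.0, 0.0), (1.0, 1.0)])
```

The resulting clusters are the "prototype profiles" mentioned above: groups of learners whose measured elements behave similarly and can therefore be mapped against outcomes and performance.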
4.3. Practical Implications
- VR technology produces huge amounts of data, but not all of it is meaningful in the context of educational studies. For exemplification purposes, we summarise the data sources pertinent to the aim of the proposed LA system, followed by some indicative examples:
- visual (e.g., eye motion tracking)
- auditory (e.g., pitch/intensity of the environmental noise levels)
- haptic (e.g., movement, rotation, force)
- network (e.g., packet loss, time delay)
- The essence of educational VR applications lies in the provision of immediate feedback, which offers answer-revision opportunities and leads to errorless learning. In the same vein, the comprehensive implementation of a visual LA dashboard is expected to influence the learning dynamics (e.g., motivation, competitiveness, goal orientation) and positively impact learners’ outcomes, achievements, and performance.
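To illustrate how such multimodal data streams might be reduced to the few indicators a visual LA dashboard surfaces, here is a minimal sketch; the channel structure, metric names, and thresholds are assumptions for this example.

```python
from statistics import mean

# Hypothetical raw telemetry from one VR session, keyed by source channel.
telemetry = {
    "visual":  {"gaze_on_target_ratio": [0.6, 0.7, 0.8]},   # eye-tracking samples
    "haptic":  {"grab_errors": [1, 0, 2, 0]},               # per-task error counts
    "network": {"latency_ms": [35, 40, 120, 38]},           # round-trip times
}

def dashboard_summary(data, latency_budget_ms=100):
    """Reduce raw channel streams to the handful of indicators a visual
    LA dashboard might surface. Metric names are illustrative."""
    return {
        "attention": mean(data["visual"]["gaze_on_target_ratio"]),
        "error_total": sum(data["haptic"]["grab_errors"]),
        "degraded_frames": sum(
            1 for ms in data["network"]["latency_ms"] if ms > latency_budget_ms),
    }

summary = dashboard_summary(telemetry)
```

Keeping the reduction step explicit matters for LA: the dashboard should show interpretable indicators (attention, errors, technical quality) rather than the raw sensor streams themselves.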
5. Discussion and Conclusions
6. Limitations of the Study
Author Contributions
Funding
Conflicts of Interest
References
- STEM Education Data. Available online: https://nsf.gov/nsb/sei/edTool/explore.html (accessed on 6 October 2020).
- Pellas, N.; Kazanidis, I.; Konstantinou, N.; Georgiou, G. Exploring the educational potential of three-dimensional multi-user virtual worlds for STEM education: A mixed-method systematic literature review. Educ. Inf. Technol. 2016, 22, 2235–2279.
- Pellas, N.; Kazanidis, I.; Palaigeorgiou, G. A systematic literature review of mixed reality environments in K-12 education. Educ. Inf. Technol. 2019, 25, 2481–2520.
- Potkonjak, V.; Gardner, M.; Callaghan, V.; Mattila, P.; Guetl, C.; Petrović, V.M.; Jovanović, K. Virtual laboratories for education in science, technology, and engineering: A review. Comput. Educ. 2016, 95, 309–327.
- Makransky, G.; Borre-Gude, S.; Mayer, R.E. Motivational and cognitive benefits of training in immersive virtual reality based on multiple assessments. J. Comput. Assist. Learn. 2019, 35, 691–707.
- Christopoulos, A.; Conrad, M.; Shukla, M. Increasing student engagement through virtual interactions: How? Virtual Real. 2018, 22, 353–369.
- García, A.A.; Bobadilla, I.G.; Arroyo-Figueroa, G.; Pérez, M.; Román, J.M. Virtual reality training system for maintenance and operation of high-voltage overhead power lines. Virtual Real. 2016, 20, 27–40.
- Villagrá-Arnedo, C.J.; Gallego-Durán, F.J.; Llorens-Largo, F.; Compañ-Rosique, P.; Satorre-Cuerda, R.; Molina-Carmona, R. Improving the expressiveness of black-box models for predicting student performance. Comput. Hum. Behav. 2017, 72, 621–631.
- Christopoulos, A.; Conrad, M.; Shukla, M. Between virtual and real: Exploring hybrid interaction and communication in virtual worlds. Int. J. Soc. Media Interact. Learn. Environ. 2016, 4, 23.
- Christopoulos, A.; Kajasilta, H.; Salakoski, T.; Laakso, M.-J. Limits and Virtues of Educational Technology in Elementary School Mathematics. J. Educ. Technol. Syst. 2020, 49, 59–81.
- Pellas, N.; Fotaris, P.; Kazanidis, I.; Wells, D. Augmenting the learning experience in primary and secondary school education: A systematic review of recent trends in augmented reality game-based learning. Virtual Real. 2018, 23, 329–346.
- Heim, M. Virtual Realism; Oxford University Press: Oxford, UK, 1998.
- Wolfartsberger, J. Analyzing the potential of Virtual Reality for engineering design review. Autom. Constr. 2019, 104, 27–37.
- Gigante, M. Virtual reality: Definitions, history and applications. In Virtual Reality Systems; Earnshaw, R.A., Gigante, M.A., Jones, H., Eds.; Academic Press: San Diego, CA, USA, 1993.
- Virtual Reality (VR)—Statistics & Facts. Available online: https://www.statista.com/topics/2532/virtual-reality-vr (accessed on 6 October 2020).
- Akdeniz, C. Instructional process and concepts. In Theory and Practice: Improving the Teaching Process; Springer: Singapore, 2016.
- Yeh, S.-C.; Hwang, W.-Y.; Wang, J.-L.; Zhan, S.-Y. Study of co-located and distant collaboration with symbolic support via a haptics-enhanced virtual reality task. Interact. Learn. Environ. 2013, 21, 184–198.
- Christopoulos, A.; Conrad, M.; Shukla, M. Objects, Worlds, and Students: Virtual Interaction in Education. Educ. Res. Int. 2014, 2014, 1–20.
- Christopoulos, A.; Conrad, M.; Shukla, M. Interaction with Educational Games in Hybrid Virtual Worlds. J. Educ. Technol. Syst. 2018, 46, 385–413.
- Moore, K. Classroom Teaching Skills; McGraw-Hill: Boston, MA, USA, 2007.
- Concannon, B.J.; Esmail, S.; Roberts, M.R. Head-Mounted Display Virtual Reality in Post-secondary Education and Skill Training. Front. Educ. 2019, 4.
- Segura, R.J.; Pino, F.J.; Ogáyar, C.J.; Rueda, A.J. VR-OCKS: A virtual reality game for learning the basic concepts of programming. Comput. Appl. Eng. Educ. 2019, 28, 31–41.
- Johari, A.; Chen, C.J.; Toh, S.C. A feasible constructivist instructional development model for virtual reality (VR)-based learning environments: Its efficacy in the novice car driver instruction of Malaysia. Educ. Technol. Res. Dev. 2005, 53, 111–123.
- Pellas, N.; Kazanidis, I. Online and hybrid university-level courses with the utilization of Second Life: Investigating the factors that predict student choice in Second Life supported online and hybrid university-level courses. Comput. Hum. Behav. 2014, 40, 31–43.
- Chen, Y.-L.; Hsu, C.-C. Self-regulated mobile game-based English learning in a virtual reality environment. Comput. Educ. 2020, 154, 103910.
- Pellas, N. The influence of computer self-efficacy, metacognitive self-regulation and self-esteem on student engagement in online learning programs: Evidence from the virtual world of Second Life. Comput. Hum. Behav. 2014, 35, 157–170.
- Junco, R.; Clem, C. Predicting course outcomes with digital textbook usage data. Internet High. Educ. 2015, 27, 54–63.
- Siemens, G. Learning analytics: The emergence of a discipline. Am. Behav. Sci. 2013, 57, 1380–1400.
- Xing, W.; Guo, R.; Petakovic, E.; Goggins, S. Participation-based student final performance prediction model through interpretable Genetic Programming: Integrating learning analytics, educational data mining and theory. Comput. Hum. Behav. 2015, 47, 168–181.
- Drachsler, H.; Kalz, M. The MOOC and learning analytics innovation cycle (MOLAC): A reflective summary of ongoing research and its challenges. J. Comput. Assist. Learn. 2016, 32, 281–290.
- Slade, S.; Prinsloo, P. Learning analytics: Ethical issues and dilemmas. Am. Behav. Sci. 2013, 57, 1510–1529.
- Long, P.; Siemens, G. Penetrating the Fog: Analytics in Learning and Education. Educ. Rev. 2011, 46, 31–40.
- Peña-Ayala, A. Learning analytics: A glance of evolution, status, and trends according to a proposed taxonomy. Wiley Interdiscip. Rev. Data Min. Knowl. Discov. 2018, 8, e1243.
- Sciarrone, F. Machine Learning and Learning Analytics: Integrating Data with Learning. In Proceedings of the 17th International Conference on Information Technology Based Higher Education and Training, Olhao, Portugal, 26–28 April 2018; pp. 1–5.
- Baker, R.S.; Yacef, K. The State of Educational Data Mining in 2009: A Review and Future Visions. J. Educ. Data Min. 2009, 1, 3–17.
- Hevner, A.R.; March, S.T.; Park, J.; Ram, S. Design Science in Information Systems Research. MIS Q. 2004, 28, 75.
- Gawlik-Kobylinska, M. The Four-Dimensional Instructional Design Approach in the Perspective of Human-Computer Interactions. In Proceedings of the 1st Frontiers in Artificial Intelligence and Applications, Las Palmas de Gran Canaria, Spain, 8–12 January 2018; pp. 146–156.
- Parong, J.; Mayer, R.E. Learning science in immersive virtual reality. J. Educ. Psychol. 2018, 110, 785–797.
- Chung, J.Y.; Lee, S. Dropout early warning systems for high school students using machine learning. Child. Youth Serv. Rev. 2019, 96, 346–353.
- Gray, C.C.; Perkins, D. Utilizing early engagement and machine learning to predict student outcomes. Comput. Educ. 2019, 131, 22–32.
- Pérez, R.C.; Muñoz-Merino, P.J.; Alexandron, G.; Pritchard, D.E. Using Machine Learning to Detect ‘Multiple-Account’ Cheating and Analyze the Influence of Student and Problem Features. IEEE Trans. Learn. Technol. 2017, 12, 112–122.
- Asif, R.; Merceron, A.; Ali, S.A.; Haider, N.G. Analyzing undergraduate students’ performance using educational data mining. Comput. Educ. 2017, 113, 177–194.
- Pliakos, K.; Joo, S.-H.; Park, J.Y.; Cornillie, F.; Vens, C.; Noortgate, W.V.D. Integrating machine learning into item response theory for addressing the cold start problem in adaptive learning systems. Comput. Educ. 2019, 137, 91–103.
- Alexandron, G.; Yoo, L.Y.; Ruipérez-Valiente, J.A.; Lee, S.; Pritchard, D.E. Are MOOC Learning Analytics Results Trustworthy? With Fake Learners, They Might Not Be! Int. J. Artif. Intell. Educ. 2019, 29, 484–506.
- Zhang, H.; Huang, T.; Lv, Z.; Liu, S.; Yang, H. MOOCRC: A Highly Accurate Resource Recommendation Model for Use in MOOC Environments. Mob. Netw. Appl. 2018, 24, 34–46.
- Yang, F.; Li, F.W. Study on student performance estimation, student progress analysis, and student potential prediction based on data mining. Comput. Educ. 2018, 123, 97–108.
- Yin, M.S.; Haddawy, P.; Suebnukarn, S.; Rhienmora, P. Automated outcome scoring in a virtual reality simulator for endodontic surgery. Comput. Methods Programs Biomed. 2018, 153, 53–59.
- Howson, C.; Sallam, R.L.; Richardson, J.L.; Kronz, A. Magic Quadrant for Analytics and Business Intelligence Platforms. Available online: http://www.sift-ag.com/wp-content/uploads/Qlik-Gartner-2019.pdf (accessed on 6 October 2020).
- Viberg, O.; Hatakka, M.; Bälter, O.; Mavroudi, A. The current landscape of learning analytics in higher education. Comput. Hum. Behav. 2018, 89, 98–110.
- Ferguson, R.; Clow, D. Where is the evidence? A call to action for learning analytics. In Proceedings of the 7th International Learning Analytics Knowledge Conference, New York, NY, USA, 13–17 March 2017; pp. 56–65.
- Leitner, P.; Khalil, M.; Ebner, M. Learning analytics in higher education—A literature review. In Learning Analytics: Fundaments, Applications, and Trends (Studies in Systems, Decision and Control); Peña-Ayala, A., Ed.; Springer International Publishing AG: New York, NY, USA, 2017; Volume 94, pp. 1–23.
- Aldowah, H.; Al-Samarraie, H.; Fauzy, W.M. Educational data mining and learning analytics for 21st century higher education: A review and synthesis. Telemat. Inform. 2019, 37, 13–49.
- Dutt, A.; Ismail, M.A.; Herawan, T. A Systematic Review on Educational Data Mining. IEEE Access 2017, 5, 15991–16005.
- Sclater, N.; Bailey, P. Code of Practice for Learning Analytics. JISC. Available online: https://www.jisc.ac.uk/guides/code-of-practice-for-learning-analytics (accessed on 6 October 2020).
Parameters | Method | Stakeholder |
---|---|---|
Gender, Age | Survey, Registry | Admins |
Grades, Credits, Achievements, Enrolments, Dropouts, Attendance | Registry | Admins, Educators |
Produced artifacts (Documents, Code, 3D models) | LMS, VR | Students |
Log-in time/frequency, Time-on-task, Resources use, e-Assessment/Feedback | LMS, VR | Students, Designers |
Activity setting (Blended, Distance, F2F, Individual/Collaborative) | LMS, VR | Educators, Designers |
Usability (VR), User experience (VR) | Survey | Students |
Attitude, Motivation | Survey | Students |
Gaze, Gesture, Speech | Sensors | Students |
Categories | Metrics |
---|---|
Academic Progression | Domain knowledge proficiency, Skills mastery, Knowledge retention, Learning strategies, Learning preferences, Learning styles, Performance, Achievements, Misconceptions, Cognition, Aptitude |
Cognitive Performance | Efficiency, Evaluation, Achievement, Competence, Resource consumption, Elapsed time, Correctness, Deficiencies |
Behaviour | Gambling, Guessing, Inquiring, Requesting Help, Willingness to collaborate, Time series of access and response, Persistence, Emotions |
Aim | Machine Learning models | References |
---|---|---|
Feedback to educators’ and instructional designers’ scenarios. | Decision Trees, Random Forest | [39,40] |
Investigation of learners’ behavior during and after the VR-supported intervention. | Naïve Bayes | [41] |
Course adaptation and learning recommendations based on learners’ behavior. | Decision Trees, Random Forest | [42,43] |
Assessment of the VR-supported learning material and content. | Decision Trees, Random Forest, Naïve Bayes | [44,45] |
Prediction of student’s learning performance. | Decision Trees, Logistic Regression, Support Vector Machines | [46,47] |
Analytics | Description | Outcome |
---|---|---|
Descriptive | What happened? | Insights into historical patterns of behavior/performance. |
Diagnostic | Why did it happen? | Evaluation of the examined data. |
Predictive | What could happen in the future? | Identify trends / predict future behavior. |
Prescriptive | How should we respond in the future? | Generate recommendations and make decisions based on algorithmic models. |
Evaluation | Visualisation Method |
---|---|
Collaboration | Mathematical graph, Statistical, Timeline, Interaction Matrix, Heatmap |
Instructional Design | Mathematical graph, Statistical, Timeline, Word cloud, Interaction matrix, Circular graph, Bubble plot, Concept map, Glyph, Geomap |
Learning Progress | Statistical, Circular graph, Heatmap, Radar |
Retention | Statistical, Timeline, Word cloud, Glyph |
Motivation | Statistical |
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Christopoulos, A.; Pellas, N.; Laakso, M.-J. A Learning Analytics Theoretical Framework for STEM Education Virtual Reality Applications. Educ. Sci. 2020, 10, 317. https://doi.org/10.3390/educsci10110317