Article

Modeling Learning Outcomes in Virtual Reality Through Cognitive Factors: A Case Study on Underwater Engineering

by
Andrei-Bogdan Stănescu
1,*,
Sébastien Travadel
1,*,
Răzvan-Victor Rughiniș
2 and
Rocsana Bucea-Manea-Țoniș
3
1
Collège des Sciences Navales, MINES Paris—PSL, 75272 Paris, France
2
Faculty of Automatic Control and Computers, National University of Science and Technology “Politehnica” Bucharest, 060042 Bucharest, Romania
3
The Faculty of Physical Education and Sport, National University of Physical Education and Sport, 060057 Bucharest, Romania
*
Authors to whom correspondence should be addressed.
Electronics 2025, 14(17), 3369; https://doi.org/10.3390/electronics14173369
Submission received: 6 August 2025 / Revised: 19 August 2025 / Accepted: 23 August 2025 / Published: 25 August 2025
(This article belongs to the Special Issue Virtual Reality Technology, Systems and Applications)

Abstract

Virtual reality offers unique opportunities to personalize learning by adapting instructions to individual learning styles. This study explores the relationships between learning styles, cognitive load, and learning outcomes in a virtual reality environment designed for engineering education. Drawing on Kolb’s experiential learning theory, the research investigates how immersion and flow, in relation to learning styles, influence learning outcomes within the Submarine Simulator, an educational tool for underwater engineering. To enhance instructional design in virtual reality, this study proposes to aggregate existing and validated models, such as Kolb’s framework, to develop new models tailored specifically for learning environments in virtual reality. This research aims to highlight the interplay of these variables in a learning process focused on acquiring knowledge in the Science, Technology, Engineering, and Mathematics fields, specifically hydrodynamics, through designing and operating a simulated submarine model in virtual reality. A cohort of 26 students from MINES Paris—PSL participated in a three-phase testing process to evaluate the effectiveness of original virtual reality software designed to support learning in underwater engineering. The findings enhance our understanding of how learning styles influence learner engagement and performance and how virtual reality environments can be optimized through adaptive instructional design guided by these novel models tailored specifically for such immersive settings.

1. Introduction

In engineering higher education, virtual reality (VR) has emerged as an effective instructional platform, improving understanding of educational tasks [1], supporting knowledge retention via gamification and observational approaches [2], and elevating student involvement, with learners expressing strong approval of its user-friendliness [3].
VR presents an innovative potential to elevate educational experiences and results via customized strategies that promote students’ favorable views towards immersive learning tools and settings [4]. As educators adapt their instructional methods to accommodate diverse cognitive preferences, understanding the complex interplay between learning styles, cognitive load, and learning outcomes becomes essential.
Learning styles encompass the visible approaches and preferences that students adopt within educational settings [5]. In-depth reviews have examined a range of learning style frameworks and their effects on educational processes. These analyses conclude that awareness and grasp of varied learning styles allow students to personalize their study approaches [6] and enable instructors to adapt their teaching strategies, thus enhancing involvement and deep comprehension via research-supported tools [7].
Investigating the effects of VR-based learning environments on learners with different learning styles, specialists found that learning styles may affect the sense of presence and cognitive load in VR environments [8] and emphasize how these environments modulate the impact of learning styles on educational results [9]. Certain studies indicate that VR holds potential for adaptation to individual variations in learning styles [10], and students gained equivalent benefits from it, without notable differences in their educational experiences or outcomes [10,11].
Immersive VR systems demonstrate considerable potential by supporting individual variability and contributing to improved learning outcomes [12]. This demonstrates that VR-based learning environments hold significant resources for accommodating diverse learning styles and individual differences.
Within engineering education, VR platforms are engineered to accommodate personal learning inclinations, thereby elevating student involvement and achievement [13]. Research indicates that while learning styles do not significantly affect learning performance in VR, they do influence attention levels, with visual learners showing greater attention than verbal learners [14].
Researchers highlight that internal learner factors, such as the ability to absorb and process information, significantly influence knowledge acquisition in VR environments [15]. In relation to learning factors, studies indicate that VR has strong potential to induce flow, which is positively associated with continued use of VR [16]. In VR, the state of flow is defined by profound immersion, a distorted sense of time, and elevated performance [17]. It also serves as a mediator between immersion and educational results, thereby amplifying motivation, curiosity, and cognitive gains [18].
Immersion is considered a prerequisite for achieving flow, significantly impacting user satisfaction and the intention to continue engaging with VR environments [16,19]. Incorporating principles related to flow theory in designing VR software applications may optimize the learning state and learning performance through cognitive absorption [12]. These results underscore the significance of accounting for personal cognitive variations in designing VR applications, given their role as interfaces towards learning achievement levels. Overall, VR appears to offer potential benefits for learners across different styles [20]. Future research will aid in comprehensively grasping VR’s effects on varied learning preferences.
While the educational research community continues to debate the prescriptive value of learning style models and learning factors influencing the efficiency of VR, this study approaches learning preferences not as rigid labels but as flexible cognitive tendencies. We argue that the highly sensorimotor and immersive nature of VR creates a unique context in which these underlying preferences in perceiving and processing information become more pronounced, influencing engagement and flow. Hence, our study examines these interconnections to guide the creation of more adaptable VR educational designs rather than to endorse the learning style frameworks themselves.
Adopting an interdisciplinary viewpoint, this research investigates the interconnections among learning styles, flow state, and educational achievements in teaching core hydrodynamics principles using VR. This study focuses exclusively on fundamental hydrodynamics concepts, Kolb’s learning style inventory, established flow metrics, and assessments of post-training achievements. The use of VR applications in other engineering fields and for non-academic populations is out of scope.
To examine the interactions among learning styles (grounded in Kolb’s experiential learning theory), flow state, and educational results in an immersive virtual reality setting, the Submarine Simulator software application, a custom tool crafted for underwater engineering, has been created (Video S1). This custom-built VR application aims to support and enhance the process of learning underwater engineering by enabling users to create three-dimensional (3D) models of small-scale submarines and rigorously test them in a realistic simulated underwater setting.
The software provides a dedicated pedagogical platform that integrates three key elements: a realistic underwater physics engine for authentic hydrodynamic feedback, a VR 3D modeling toolkit that allows direct hands-on design and iteration, and a gamified competitive framework featuring head-to-head racing challenges.
This interdisciplinary approach aims to make important contributions to understanding the underlying factors for learning efficiency, but also to developing new educational VR tools, by answering the following research questions:
  • Leveraging insights from an interdisciplinary viewpoint, to what extent do immersive engagement and flow state mediate the connection between learning styles (e.g., Kolb’s experiential learning theory) and learning outcomes in a VR-based educational environment?
  • Incorporating insights from educational sciences, psychology, and immersive technologies, in what ways do differences in learning styles (e.g., Kolb’s experiential learning theory) influence student performance in VR educational environments?
To address these research questions, we review the relevant literature in Section 2, which provides a theoretical foundation on the application of Kolb’s experiential learning theory and learning styles in relation to VR design and implementations. Additionally, we have developed an interdisciplinary experimental methodology grounded in this theoretical framework. This approach is expanded in detail throughout Section 3.
Section 4 outlines the experimental outcomes and delivers precise responses to our research questions. Section 5 discusses the research findings, while Section 6 sums up the main conclusions and outlines the limitations of this study.

1.1. Interdisciplinary Approaches in Virtual Reality Applications

Our study is positioned at the crossroads of multiple key scientific disciplines, blending both fundamental and applied domains. In the field of engineering, this study mainly relies on hydrodynamics to set the technical content and simulation settings. The computer science side includes VR development, simulation technologies, and human–computer interaction, all of which help build engaging and useful learning environments.
In the realm of educational sciences, this study incorporates principles from educational technology and learning sciences, placing special emphasis on cognitive load, learning styles, and educational psychology within Science, Technology, Engineering, and Mathematics (STEM) contexts. The measurement and evaluation sciences aspect incorporates psychometric techniques for evaluating the flow state, along with practical statistical methods for analyzing learning outcomes. Finally, within the interdisciplinary field, principles from cognitive engineering and human factors direct the enhancement of user experience, ensuring that the VR system promotes both technical proficiency and cognitive effectiveness.
This article is relevant to researchers, educators, and practitioners involved in the integration of immersive technologies into STEM higher education, particularly those seeking to enhance learning efficiency and engagement through simulation-based training. It also targets curriculum designers and policymakers interested in integrating VR into technical training. The specialized scientific audience includes experts in naval engineering pedagogy, hydrodynamics, educational technology research, human–computer interaction, and engineering.

2. Kolb’s Experiential Learning Theory in Designing Virtual Reality Environments for Engineering Education

We propose the application of Kolb’s experiential learning theory to the design of VR activities, with the aim of aligning educational experiences to individual learner profiles while examining the learning processes in relation to targeted outcomes in hydrodynamics. We employ Kolb’s experiential learning theory to investigate the interactions between varied learning styles and immersive settings, exploring their effects on knowledge gain and skill building while also evaluating the pedagogical efficacy of VR software applications in underwater engineering.
Prior research has combined Kolb’s experiential learning framework with VR to examine elements influencing students’ willingness to adopt VR for educational purposes [10], to develop customized VR workspaces [21,22], and to boost examination performance [23].
Kolb’s experiential learning theory outlines a four-stage cycle: Concrete Experience (CE), Reflective Observation (RO), Abstract Conceptualization (AC), and Active Experimentation (AE). While Kolb’s original experiential learning model defined four primary learning styles—Diverging, Assimilating, Converging, and Accommodating—these stem from earlier iterations of his theory. In more recent updates, particularly with the Kolb Learning Style Inventory (KLSI) 4.0, Kolb has refined and expanded this framework into nine styles to provide greater granularity across the learning quadrants: Experiencing, Imagining, Reflecting, Analyzing, Thinking, Deciding, Acting, Initiating, and Balancing. Each learning style represents an individual’s preference for a distinct phase of Kolb’s learning cycle, shaped by how they engage with, reflect on, conceptualize, and experiment with new information (Figure 1).
In Kolb’s experiential learning theory, a student’s primary learning style reflects their preferred approach to the learning cycle. This primary style is a preference, not a rigid limitation, indicating where a learner naturally feels most comfortable. Alongside this, students exhibit flex styles where they also perform effectively, showcasing their adaptability. Non-flex styles, or developing styles, are areas where learners are less proficient but can improve through practice and self-awareness, leading to a more complete and balanced learning profile.
Drawing from the theory, the tasks in our software application are structured as sequential phases, meticulously aligned with Kolb’s experiential learning cycle (CE, RO, AC, AE). This design fosters a comprehensive approach that accommodates diverse learner preferences and enhances engagement throughout the learning process (Figure 2).
Our VR simulation provides a tangible and immersive environment for studying and comprehending complex hydrodynamics concepts that are otherwise difficult to conceptualize. However, this novel VR environment does present a unique challenge for participants, most of whom have little to no prior experience with VR technology or advanced underwater simulations. The software application demands a seamless blend of creative problem-solving and precise interaction, pushing the boundaries of realistic 3D modeling and navigation tasks.
By mirroring Kolb’s framework, our research protocol not only facilitates hands-on interaction but also encourages reflection, theoretical integration, and practical application, ultimately aiming to quantify how these elements influence knowledge acquisition, skill retention, and flow in a simulated submarine modeling and underwater testing scenario. The learning outcomes are assessed through repeated VR sessions, enabling the evaluation of improvements in engineering competencies, such as problem-solving and technical precision. The protocol unfolds in four primary phases, as detailed in Table 1.
This dynamic process, rooted in Kolb’s learning cycle, fosters adaptive learning in the novel VR context, where participants can test hypotheses and refine skills in a realistic, immersive setting, ultimately strengthening their ability to tackle complex engineering challenges. Starting with these premises and by applying Kolb’s experiential learning principles, we seek to explore how customized VR experiences can enhance personalization, increase engagement, and improve educational outcomes, ultimately fostering more adaptive and effective instructional strategies in engineering higher education.

3. Research Methodology

To address the research questions and to test the proposed hypotheses, we designed an interdisciplinary observational (non-manipulative) experiment. The use of diverse assessment tools within naturalistic settings allowed for richer, multi-faceted data collection, providing a more complete picture of the phenomena under investigation.
The independent variables in this study were the original software and individual student characteristics. The dependent variables were the outcomes measured across the three distinct phases, which corresponded to the software’s testing stages (as described in Section 3.4).

3.1. Research Aim

Building upon the ongoing evolution of the learning styles framework designed by Kolb, our investigation seeks to explore the intricate relationships among learning styles, flow states, and student satisfaction as they relate to educational outcomes.
Particularly, our study focuses on engineering students immersed in VR settings, with the goal of determining the interplay among these elements to enhance engagement and overall academic performance within technology-driven educational contexts. This approach not only refines our understanding of personalized education but also highlights VR’s potential as a transformative tool in fostering deeper engagement and measurable success in STEM fields.

3.2. Hypotheses

Throughout our experiments, we tested the following hypotheses:
H1: 
The perceived quality of the VR learning environment, in conjunction with students’ learning styles, significantly predicts the level of immersive engagement and the overall satisfaction of the learning experience.
H2: 
Learning styles significantly influence student performance in the VR-based Submarine Simulator.
H3: 
Immersive engagement and the flow state in VR mediate the relationship between learning styles and learning outcomes.
H4: 
Learning styles significantly influence students’ perceived immersion and the flow state in the VR-based Submarine Simulator.
H5: 
Student performance in VR conditions is influenced by the perceived quality of VR software.

3.3. Target Group

The research protocol was applied to a group of 26 fourth-year students at MINES Paris—PSL during the underwater engineering course. The course was conducted on a full-time basis over a 10-week on-campus period, during which students were not enrolled in any other courses. Each student completed all three phases described in Section 3.4.5. Following each phase, the facilitator conducted a brief 10 min interview to confirm that the experience was well-received and that students enjoyed the session and felt comfortable, without experiencing nausea or discomfort.
All students signed an agreement to participate in the research protocol (Document S1) and were informed that during the research and its dissemination the anonymity of the participants would be respected (Document S2). For reference purposes, during the data processing stage, only randomly assigned numerical codes were used. The characteristics of the group, from the point of view of learning styles, flow, immersion, satisfaction, and performance, are presented in Appendix B and Dataset S1.

3.4. Methods

To highlight the research variables, the following instruments were used throughout our experiment:

3.4.1. The Kolb Experiential Learning Profile (KELP)

The Kolb Experiential Learning Profile (KELP) is a practical self-assessment instrument for identifying an individual’s learning style, with the advantage of taking only 15–25 min to complete [24]. Based on the test results, students received a score on each of the following four quadrants, describing how they process and transform experiences into knowledge:
(a)
Concrete Experience (CE)—Learners immerse themselves fully, openly, and without preconceived biases in novel experiences, emphasizing direct involvement and sensory engagement.
(b)
Reflective Observation (RO)—They contemplate and examine these experiences from diverse viewpoints, fostering introspection and nuanced understanding.
(c)
Abstract Conceptualization (AC)—They synthesize observations into coherent concepts, forming logically robust theories that explain patterns and relationships.
(d)
Active Experimentation (AE)—They apply these theories practically to inform decision-making and address real-world problems, testing ideas through action.
Drawing from these quadrants, Kolb’s refined framework identifies nine distinct learning styles, each offering greater granularity and reflecting unique preferences in how individuals navigate the learning cycle: Experiencing, Imagining, Reflecting, Analyzing, Thinking, Deciding, Acting, Initiating, and Balancing. The analysis of the questionnaire involved characterizing the group in terms of learning dimensions, learning styles, and flex learning styles.
Learning styles are determined by an individual’s preferences along two key bipolar dimensions, which are calculated as difference scores from questionnaire responses. These dimensions capture how people balance opposing approaches to perceiving (grasping information) and processing (transforming information). The two dimensions are named Abstract Conceptualization minus Concrete Experience (ACCE) and Active Experimentation minus Reflective Observation (AERO), both described in more detail in Appendix A.
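For illustration, the two difference scores can be computed directly from the four quadrant scores. The sketch below uses hypothetical raw quadrant values; actual KELP scoring follows the published inventory.

```python
# Illustrative sketch of Kolb's two bipolar dimensions, computed as
# difference scores from the four quadrant scores. Input values are
# hypothetical examples, not real KELP norms.
def kolb_dimensions(ce, ro, ac, ae):
    """Return (ACCE, AERO): the perceiving and processing difference scores."""
    acce = ac - ce  # Abstract Conceptualization minus Concrete Experience
    aero = ae - ro  # Active Experimentation minus Reflective Observation
    return acce, aero

# A positive ACCE indicates a preference for abstract conceptualization;
# a positive AERO indicates a preference for active experimentation.
acce, aero = kolb_dimensions(ce=24, ro=30, ac=35, ae=31)  # -> (11, 1)
```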

3.4.2. The Flow State Scale (FSS)

The Flow State Scale (FSS), developed by Jackson and Marsh (1996), is a 36-item psychometric tool designed to measure the nine dimensions of flow as outlined by Csikszentmihalyi, capturing the optimal psychological state of complete immersion and engagement in an activity [25]. These dimensions include the following:
  • Challenge–Skill Balance: Perceiving that personal skills match the task’s demands;
  • Action–Awareness Merging: Experiencing seamless integration of actions and awareness;
  • Clear Goals: Having a clear understanding of objectives;
  • Unambiguous Feedback: Receiving immediate, clear feedback on performance;
  • Concentration on the Task: Maintaining deep focus without distractions;
  • Sense of Control: Feeling in command of the activity;
  • Loss of Self-Consciousness: Becoming less aware of self and external judgments;
  • Transformation of Time: Perceiving time as altered, either speeding up or slowing down;
  • Autotelic Experience: Finding the activity intrinsically rewarding.
Each dimension is assessed through four items, rated on a 5-point Likert scale (1 = strongly disagree, 5 = strongly agree), enabling precise measurement of flow intensity.
For each dimension, the four item scores are summed and divided by four to calculate an average score. This method quantifies flow experiences, facilitating analysis of their relationship with learning styles and performance outcomes in immersive VR settings like the Submarine Simulator, where engagement is key to learning.
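This averaging rule can be sketched as follows; the snippet assumes the 36 items are ordered so that each dimension’s four items are consecutive, which may differ from the published answer key.

```python
FSS_DIMENSIONS = [
    "Challenge-Skill Balance", "Action-Awareness Merging", "Clear Goals",
    "Unambiguous Feedback", "Concentration on the Task", "Sense of Control",
    "Loss of Self-Consciousness", "Transformation of Time", "Autotelic Experience",
]

def fss_dimension_scores(item_scores):
    """Average each block of four 5-point Likert items into one dimension
    score; assumes items are grouped consecutively by dimension."""
    if len(item_scores) != 36:
        raise ValueError("The FSS has exactly 36 items")
    means = [sum(item_scores[i:i + 4]) / 4 for i in range(0, 36, 4)]
    return dict(zip(FSS_DIMENSIONS, means))
```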

3.4.3. Immersive Tendencies Questionnaire (ITQ)

The Immersive Tendencies Questionnaire (ITQ), developed by Bob G. Witmer and Michael J. Singer, assesses an individual’s inherent propensity to become immersed in everyday activities, media, and environmental situations, particularly as a predictor of how readily they might experience presence in virtual environments [26]. Presence, in this context, refers to the psychological state of feeling “there” in a mediated or simulated environment, and the ITQ aims to capture individual differences that could influence immersion levels. It is typically administered prior to VR exposure to stratify participants, predict performance, or identify factors contributing to immersion, such as involvement in activities like reading, watching movies, or playing games. The questionnaire is supported by correlations with related measures such as the Tellegen Absorption Scale (r ≈ 0.40–0.60), and factor analysis in follow-up research confirms loadings on immersion-related constructs. Higher ITQ scores have been linked to better virtual environment performance and higher presence ratings in some studies. Each item is rated on a 7-point Likert scale, with higher scores indicating stronger immersive tendencies.

3.4.4. Basic Needs in Games (BANG)

The Basic Needs in Games (BANG) scale is an open-access, free-to-use questionnaire developed to evaluate the satisfaction and frustration of basic psychological needs—autonomy, competence, and relatedness—experienced by players during video game play and is rooted in Self-Determination Theory (SDT) [27]. The BANG scale includes six subscales—three assessing satisfaction (autonomy, competence, relatedness) and three measuring frustration—allowing researchers to compute mean scores for each need separately, offering detailed and interpretable insights into how well a game supports these psychological needs.
The “satisfaction” dimension of the BANG results measures the fulfillment of three core needs—autonomy, competence, and relatedness—which significantly enhance player enjoyment, engagement, and psychological well-being. Autonomy satisfaction arises from players’ control over meaningful choices, competence satisfaction from mastering challenges and achieving goals, and relatedness satisfaction from forming social connections with others, including players and non-player characters. Fulfilling these needs fosters motivation, immersion, and sustained engagement, contributing to a rewarding gaming experience.
The scale has been statistically validated through rigorous psychometric testing, with initial studies demonstrating good internal consistency (Cronbach’s alpha values typically exceeding 0.80 for the satisfaction subscales) and construct validity, confirmed through factor analysis that aligns with SDT constructs. Furthermore, its reliability and applicability have been supported across diverse gaming contexts and populations, with ongoing research continuing to refine its sensitivity and predictive power.
This validation ensures that the BANG scale provides a robust tool for researchers and designers to assess and enhance the psychological impact of games, making the “satisfaction” dimension a critical metric for optimizing player-centered design in virtual environments.

3.4.5. Scoring Performance

To examine the interplay between learning styles, immersion and flow states, and learning outcomes in the Submarine Simulator VR environment, a tailored scoring system was established. This system draws on the distinct tasks across the three phases of software interaction, assigning scores that capture objective performance in each phase (Figure 3):
  • Phase 1 (Basic Submarine Construction and Testing, scored on a 0–20 Scale): This structured phase emphasizes foundational building and initial testing.
  • Phase 2 (Designing Tight and Loose Spiral Models, scored on a 0–2 Scale): Focused on strategic planning and iterative refinement, the binary scoring reflects a pass/fail mechanism for the two required models: 2 points for successfully completing both, 1 point for one, and 0 points for none.
  • Phase 3 (Competitive Racing on Underwater Tracks, scored on a 0–20 Scale): This dynamic phase demands real-time adaptation and quick decision-making. The scoring system was designed to be simple yet engaging. Points were awarded according to achievement on each track, with 1 point for Track 1, 2 points for Track 2, and 3 points for the more complex Track 3. Additional points would be awarded after winning the race against an opponent.
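As a sketch, the Phase 3 rule might be implemented as below. The per-win bonus value and the cap at 20 are our assumptions for illustration; the text above specifies only the track points and the 0–20 scale.

```python
TRACK_POINTS = {1: 1, 2: 2, 3: 3}  # points per completed track (from the text)

def phase3_score(completed_tracks, races_won, bonus_per_win=2, cap=20):
    """Phase 3 score: track points plus a per-win bonus, capped at 20.
    bonus_per_win is hypothetical; the article does not state its value."""
    score = sum(TRACK_POINTS[t] for t in completed_tracks)
    score += races_won * bonus_per_win
    return min(score, cap)

# A student who finished all three tracks and won two races:
phase3_score([1, 2, 3], races_won=2)  # -> 10 under the assumed bonus
```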

3.5. Statistical Methods

Using SmartPLS 3.0 [28], a Partial Least Squares Structural Equation Model (PLS-SEM) was used to investigate whether a student’s VR experience—comprising flow, immersion, satisfaction, and mindset—mediates the relationship between their learning styles and their learning performance across three distinct instructional phases (operationalized as Points1, Points2, and Points3). As a valuable method for understanding causal processes and testing mediation effects, PLS-SEM combines multiple aspects of factor analysis (learning style components; cognitive, affective, and mindset aspects) with path analysis to test and assess the relationships among variables.
This model allows researchers to simultaneously estimate two key components of a study [28]:
  • The measurement model: which examines how well the observed variables (indicators) relate to their underlying theoretical concepts (latent variables);
  • The structural model: which tests the hypothesized relationships between the latent variables themselves.
The main statistical indicators analyzed within PLS-SEM were the following:
  • Path Analysis: depicts the standardized path coefficients between latent variables (learning styles → VR experience → performance). These coefficients represent the strength and direction of the relationships, scaled between −1 and +1. Path coefficients were obtained by SmartPLS through an iterative algorithm that maximizes the explained variance (R2) in the dependent variables (performance or learning outcomes). The percentages shown in the diagrams correspond to R2 values, representing the proportion of variance in each dependent latent variable explained by its predictors (e.g., R2 = 0.494 means 49.4% of performance variance is explained).
  • Covariance: refers to the degree to which two latent variables vary together. SmartPLS calculates these values from the estimated model using the covariance of their composite scores.
  • Average Variance Extracted (AVE): measures convergent validity—in other words, how well a construct explains the variance of its indicators. AVE > 0.50 indicates adequate convergence [29].
  • Discriminant Validity (Fornell–Larcker Criterion): ensures that a construct is more strongly related to its own indicators than to other constructs. The square root of AVE for each construct should be greater than its correlations with other constructs.
  • f2 (Cohen’s effect size): indicates the strength of a predictor’s contribution. Values of 0.02, 0.15, and 0.35 are considered small, medium, and large effects [29,30]. No subjective weighting was applied. All indicators were treated equally.
  • Cronbach’s α and Composite Reliability: assess internal consistency (values > 0.70 indicate good reliability).
  • Variance Inflation Factor (VIF): checks whether indicators are excessively correlated, which can bias estimates. VIF < 3.3 indicates no critical collinearity.
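Two of these indicators, AVE and Cronbach’s α, are simple enough to sketch directly. SmartPLS computes them internally; the snippet below is only a minimal reference implementation of the standard formulas.

```python
import numpy as np

def ave(loadings):
    """Average Variance Extracted: mean of the squared standardized outer
    loadings; values above 0.50 indicate adequate convergent validity."""
    l = np.asarray(loadings, dtype=float)
    return float(np.mean(l ** 2))

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix;
    values above 0.70 indicate good internal consistency."""
    x = np.asarray(items, dtype=float)
    k = x.shape[1]
    item_variances = x.var(axis=0, ddof=1).sum()
    total_variance = x.sum(axis=1).var(ddof=1)
    return float(k / (k - 1) * (1 - item_variances / total_variance))
```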
The model fit was assessed using the Standardized Root Mean Square Residual (SRMR), the squared Euclidean distance (d_ULS), and the geodesic discrepancy (d_G). An SRMR value below the recommended threshold of 0.08 indicates an acceptable model fit. Similarly, d_ULS and d_G values within the 95% bootstrapped confidence intervals suggest that the discrepancy between the empirical and model-implied covariance matrices is not statistically significant.
To assess the robustness of parameter estimates, we complemented the 5000-sample bootstrapping procedure with a jackknife resampling procedure, systematically omitting one case at a time. Path coefficients and their significance remained stable across all jackknife subsamples, with variations in β values < ±0.04 and no change in significance levels. This suggests the results are not unduly influenced by any single participant, supporting the stability of the findings.
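The jackknife logic itself is straightforward. The sketch below uses an ordinary least-squares slope as a stand-in for a PLS path coefficient, on synthetic data sized like the study cohort; the actual re-estimation was performed by SmartPLS on the full structural model.

```python
import numpy as np

def jackknife(data, estimator):
    """Re-run an estimator n times, leaving out one case each time."""
    return [estimator(np.delete(data, i, axis=0)) for i in range(len(data))]

def slope(xy):
    """OLS slope of column 1 on column 0 (stand-in for a path coefficient)."""
    return float(np.polyfit(xy[:, 0], xy[:, 1], 1)[0])

# Synthetic two-variable data with a true slope of 0.5 and n = 26,
# matching the cohort size (values are illustrative only).
rng = np.random.default_rng(0)
x = rng.normal(size=26)
data = np.column_stack([x, 0.5 * x + rng.normal(scale=0.1, size=26)])

estimates = jackknife(data, slope)
spread = max(estimates) - min(estimates)  # a small spread -> stable estimate
```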

3.6. Technical Setup and Architecture

Each participant was given identical hardware setups during all phases of the study, consisting of Meta Quest 3 VR headsets (Meta Platforms, Menlo Park, CA, USA), matching controllers, and high-powered rendering laptops (AMD Ryzen 9 6900HX, Advanced Micro Devices, Inc., Santa Clara, CA, USA; NVIDIA GeForce RTX 3070Ti, NVIDIA Corp., Santa Clara, CA, USA). The headsets and laptops were linked using USB-C cables (Meta Quest Link) to achieve low latency, avoid any lag, and exclude any internet dependencies.
As seen in Figure 4, in order to ensure optimal experiential quality, an .apk file was not installed directly on the headset, as this approach would necessitate downscaling graphical fidelity by reducing polygon counts, resulting in suboptimal visuals prone to inducing motion sickness. Instead, the .exe application (built via Unity for Windows) was executed on the laptops, with the output streamed to the VR headsets. This configuration enabled the facilitator to both monitor and record participants’ activities in real time via the laptop display.
From a software standpoint, the Unity-based project encompasses two primary scenes, each encapsulating distinct functionalities essential to the system’s operation:
The 3D Construction Scene: integrates all logic required for the incorporation of 3D objects (cubes, spheres, and other basic shapes) and their assembly into submarine models. Key functionalities include the following:
  • Inserting, manipulating, and duplicating 3D objects using VR controllers;
  • Scaling 3D objects proportionally, by mirroring, or in free form, enabling users to optimize the dimensions of all submarine components;
  • Snapping mechanisms (directly onto an object face or at an angle) to facilitate combining different shapes into an ensemble;
  • Activating vertical and horizontal guides to enhance construction accuracy;
  • Symmetrical construction modes, contexts in which modifications on any plane (OX/OY/OZ) are automatically mirrored on the opposing side.
Furthermore, users can select and modify components by altering their mass, density, or material properties or by removing elements, as needed. A save and retrieval mechanism is also implemented. All completed models are serialized as JSON files that encapsulate comprehensive construction data.
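The save-and-retrieval mechanism can be illustrated with a minimal round-trip sketch. The application itself is built in Unity (C#); the Python below only demonstrates the idea, and the schema (field names such as `shape`, `mass`, `density`) is hypothetical, not the actual keys used by the Submarine Simulator.

```python
import json

# Hypothetical construction data for one saved submarine model.
model = {
    "name": "prototype_01",
    "components": [
        {"shape": "sphere", "position": [0.0, 0.0, 0.0],
         "scale": [1.2, 1.2, 1.2], "mass": 4.0, "density": 0.9,
         "material": "steel"},
        {"shape": "cube", "position": [0.0, 0.0, -1.5],
         "scale": [0.4, 0.4, 1.0], "mass": 1.0, "density": 1.1,
         "material": "aluminium"},
    ],
}

def save_model(model, path):
    """Serialize a model's construction data to a JSON file."""
    with open(path, "w", encoding="utf-8") as f:
        json.dump(model, f, indent=2)

def load_model(path):
    """Deserialize a previously saved model."""
    with open(path, encoding="utf-8") as f:
        return json.load(f)
```

Serializing to JSON keeps saved models human-readable and portable between the construction and simulation scenes.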
The Underwater Simulation Scene: contains all logic associated with the evaluation of submarine models within the simulated underwater environment. Core features encompass the following:
  • Activation and deactivation of thrusters to accelerate the model directionally;
  • Application of the custom-built physics simulations to visualize model behavior under submerged conditions;
  • Rendering of a green trajectory trace to visualize the model’s path;
  • Toggling between third-person and first-person perspectives;
  • Restarting the underwater simulation by re-immersing the submarine model;
  • Transitioning back to the construction scene for iterative refinements.

3.7. Research Design

This study adopted a multi-phase experimental design, carefully structured to align with Kolb’s experiential learning cycle, ensuring an incremental progression that fosters both practical engagement and theoretical understanding. Each phase was tailored to build upon the previous one, creating a cohesive framework that supports iterative learning, accommodates diverse learning styles, and captures nuanced data on participant performance and psychological states. This methodical design enhances the study’s ability to generate reliable, actionable insights into the efficacy of VR-based instruction in STEM education.
All figures presented in this section are screenshots captured directly from the original Submarine Simulator VR software application.
Phase 1: Introduction and foundational training
In the initial phase of the protocol, participants were gradually introduced to the VR platform, with the main goal of helping them feel comfortable and confident in using it. This introductory session focused on familiarizing them with the system’s interactive features, including user-friendly navigation controls and tools for building models.
Through guided, hands-on tutorials led by a facilitator, learners actively engaged in practical exercises, learning to create submarine prototypes (Figure 5) and conducting initial tests of navigation dynamics in the virtual underwater environment (Figure 6).
The primary focus was on cultivating a core understanding of hydrodynamic principles, buoyancy, and propulsion mechanisms relevant to submarine design. This study does not evaluate performance metrics during this phase; instead, it functions as a preparatory stage to ensure all participants achieve a consistent baseline proficiency. Phase 1 allocated a strict 45 min duration for each participant to complete the task, ensuring uniform timing across all sessions.
The facilitator read each of the instructions listed in Table 2, and the participant was then expected to follow them. Participants could receive additional help from the facilitator until the task was completed correctly (assessed on a binary done/not-done basis).
Students learn how to use the application through guided experiential learning, systematically progressing through Kolb’s cycle under the facilitator’s direction, with an emphasis on hands-on practice and learning by doing.
Phase 2: Sketching and designing models for spiral trajectories
Building upon the foundational skills developed in Phase 1, Phase 2 prioritizes independent design and iterative refinement within the Submarine Simulator VR environment. Participants start the phase with a low-tech activity, sketching preliminary submarine designs on paper to promote conceptual planning and creative visualization unconstrained by digital tools. This approach cultivates deliberate ideation, enabling learners to freely explore concepts prior to immersing themselves in the VR platform. The underlying intention of this protocol was to examine whether VR tends to suppress creativity.
In the VR environment, the primary task during the phase involves designing submarines capable of navigating two distinct spiral trajectories in the simulated underwater setting:
  • Tight spirals: Requiring high precision and a small radius, challenging participants to optimize control and maneuverability (Figure 7a).
  • Loose spirals: Emphasizing stability over a larger radius, testing the model’s structural integrity under varying conditions (Figure 7b).
Real-time feedback during testing provides data on trajectory accuracy, speed, and stability, enabling participants to refine their designs iteratively. This phase aimed to enhance problem-solving, adaptive design thinking, and practical application of hydrodynamic principles. Each participant had 35 min in VR during this phase.
Following Phase 2, a brief 10 min interview was conducted to assess the students’ planning approaches to the submarine design task. Based on these interviews, participants were divided into three categories: (1) those who started with an initial plan but adapted it iteratively based on intuition and observations (75%); (2) those who proceeded without a predefined plan, adapting dynamically along the way (20%); and (3) those who entered VR with a well-defined plan and followed it consistently throughout the phase (5%).
Phase 3: Paired competition on underwater racetracks
In the final phase of the study, a competitive twist was added by pairing participants into 13 teams, encouraging teamwork within each pair and friendly rivalry between groups. Each team collaborated to brainstorm ideas, then worked individually to design submarine models optimized for three increasingly challenging virtual underwater racetracks. These tracks were designed to reflect real-world engineering challenges, testing navigation, performance, and design skills in a dynamic, engaging, and gamified way.
Track 1—Straight-Line Navigation: Teams engineered submarine prototypes optimized for seamless straight-line propulsion, emphasizing robust propulsion mechanisms and consistent directional stability to prevent any unintended deviations from the intended path (Figure 8).
Track 2—Horizontal Maneuverability: This racetrack challenged teams to design submarine models capable of executing precise left and right maneuvers, integrating effective buoyancy adjustments, depth regulation, and thrust management to navigate the course successfully (Figure 9).
Track 3—Multi-Directional Agility: The most challenging track demanded complete maneuverability, requiring submarine models to execute fluid up, down, left, and right movements while maintaining precise control within a highly complex virtual underwater environment (Figure 10).
The evaluation focused on key performance indicators, such as the model’s ability to fully complete the designated course. Scoring was simple yet motivating for each track. Points were allocated based on achievement—Track 1, which involved straightforward linear paths, offered 1 point; Track 2 granted 2 points; and the more intricate Track 3 provided 3 points.
To heighten the excitement and competitive spirit, an additional layer was added—individuals could gain 1 bonus point by outperforming a rival in a direct race, with the opponent’s performance visualized as a “shadow” submarine (Figure 11).
For clarity, this shadow was not a live competitor but a digital replay drawn from a prior participant’s successful run on the same track, enabling the current user to race alongside this ghostly echo in real time within the immersive VR world, fostering a sense of head-to-head rivalry while building on collective progress.
If the current student finished the race ahead of this shadow opponent based on time, they would secure the additional point, adding a layer of competitive incentive to the exercise.
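A replay of this kind is typically implemented by recording timestamped poses during the original run and interpolating between them during playback. The sketch below shows the principle in Python; the actual application records poses inside Unity, and the class and method names here are illustrative.

```python
import bisect

class GhostReplay:
    """Replays a previously recorded run as a 'shadow' submarine."""

    def __init__(self):
        self.times, self.positions = [], []

    def record(self, t, pos):
        """Append a (time, position) sample during the original run."""
        self.times.append(t)
        self.positions.append(pos)

    def position_at(self, t):
        """Linearly interpolate the shadow's position at playback time t."""
        if t <= self.times[0]:
            return self.positions[0]
        if t >= self.times[-1]:
            return self.positions[-1]
        i = bisect.bisect_right(self.times, t)
        t0, t1 = self.times[i - 1], self.times[i]
        a = (t - t0) / (t1 - t0)
        p0, p1 = self.positions[i - 1], self.positions[i]
        return tuple(x0 + a * (x1 - x0) for x0, x1 in zip(p0, p1))
```

Because playback only reads stored samples, the shadow consumes negligible resources and cannot interfere with the live participant’s physics.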
This phase emphasized interpersonal dynamics, such as competition and rivalry, within a paired setting. Phase 3 spanned about 45 min, including team brainstorming, design collaboration, individual building, and racing. Notably, students could reattempt the same track until successful or switch to a different track.
To conclude, the research design included three evaluation phases, each with its own set of instructions and a specific system for assessing the level of learning. Figure 12 presents the research design, illustrating when and how the tests were administered and indicating which software testing phase was used in each case.

4. Results

This section outlines the findings from the empirical analyses, exploring the connections between learning styles, the immersive VR experience, and learning outcomes.

4.1. Path Analyses

The model demonstrated solid explanatory power, with R2 = 0.247 for the VR latent variable (a moderate effect) and R2 = 0.494 for performance (approaching a substantial effect) (Figure 13).
The path coefficient from learning styles to VR was statistically significant (β = 0.497), suggesting that individual cognitive preferences (especially AE and CE, with loadings of 0.552 and 0.691, respectively) strongly influence the degree of VR engagement. Additionally, the path from VR to performance was strong (β = 0.703), indicating that perceived experiential quality within the VR environment is a critical determinant of learner success (Figure 13). H1 is confirmed.
The direct effect of learning styles on performance was comparatively weak (β = 0.290) but still statistically significant, indicating that learning preferences have an independent, albeit smaller, impact on outcomes. However, the indirect effect—estimated as the product of the learning styles → VR and VR → performance paths (0.497 × 0.703 ≈ 0.349)—was stronger than the direct path, supporting a partial mediation model (Figure 13). H2 is partially confirmed, meaning that additional factors also influence student performance.
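The mediation arithmetic can be made explicit with the reported coefficients. The variance-accounted-for (VAF) ratio computed at the end is an additional derived quantity, not reported in the article; by a common rule of thumb, VAF between roughly 20% and 80% is read as partial mediation, consistent with the interpretation above.

```python
# Reported standardized path coefficients from the structural model
beta_ls_vr = 0.497    # learning styles -> VR experience
beta_vr_perf = 0.703  # VR experience -> performance
beta_direct = 0.290   # learning styles -> performance (direct)

indirect = beta_ls_vr * beta_vr_perf  # indirect effect, ~0.349
total = beta_direct + indirect        # total effect of learning styles
vaf = indirect / total                # share of total effect that is mediated
```

Here the indirect path carries more than half of the total effect, which is why the VR experience emerges as the stronger channel.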
These findings align with recent research emphasizing the role of immersive engagement (flow and presence) in VR-based learning [16,17]. The observed mediation effect supports the hypothesis that adaptive instructional design in VR should not merely align with learner preferences but should also optimize experiential features that facilitate cognitive absorption and sustained motivation.

4.2. Covariance Analysis of Latent Constructs

The covariance matrix for the latent variables provides additional insight into the structural relationships within the model. As shown in Table 3, the strongest covariance emerged between the VR experience construct and performance (Cov = 0.703), indicating a substantial shared variance between learners’ immersive engagement (flow, immersion, and mindset) and their performance outcomes across the VR learning phases, confirming H3.
A moderate covariance was observed between learning styles and VR (Cov = 0.497), suggesting that individual cognitive preferences are meaningfully aligned with how learners perceive and engage with the VR environment. In contrast, the learning styles–performance covariance was weaker (Cov = 0.290), reinforcing the interpretation that learning styles alone are not sufficient predictors of learning performance in immersive environments. The H4 hypothesis is partially confirmed, meaning that students have been able to adapt to the VR environment despite any learning style preference.
These results align with prior work highlighting the mediating role of engagement and flow in technology-enhanced learning [16] and suggest that instructional designers should prioritize experiential elements that activate learner motivation and presence over static alignment with predefined learning style categories.
The model’s explanatory power was assessed using R2 and f2 values for the endogenous constructs. The results indicated that the VR latent construct (comprising flow, immersion, and mindset) was explained by learning styles, with an R2 value of 0.247, suggesting that approximately 24.7% of the variance in VR experience could be attributed to individual learning preferences and cognitive process traits. As a result, H4 was partially confirmed; the relationships between learning styles and flow, immersion, and satisfaction require further study.
In turn, performance (aggregated across three instructional phases) showed a higher R2 of 0.494, meaning nearly half the variance in student outcomes was accounted for by the model’s predictors—primarily the VR construct (Table 4).
Effect size estimates using Cohen’s f2 revealed that learning styles had a moderate effect on VR (f2 = 0.329), while VR had a very large effect on performance (f2 = 0.977), far exceeding the threshold for a large effect [28]. These findings reinforce the mediating role of the VR experience, suggesting that its quality is a far stronger predictor of learning outcomes than learning styles alone. H5 is confirmed. The high f2 for VR on performance underscores the central importance of immersive and engaging instructional design in achieving measurable academic gains in VR-based learning environments (Table 4).
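The f2 values above follow directly from the reported R2 values. For a predictor, Cohen’s f2 = (R2_included − R2_excluded) / (1 − R2_included); with a single predictor, R2_excluded = 0, so f2 reduces to R2 / (1 − R2), which reproduces the reported 0.329 and 0.977 to within rounding:

```python
def cohens_f2(r2_included, r2_excluded=0.0):
    """Cohen's f-squared for a predictor's contribution.

    With a single predictor, r2_excluded = 0 and f2 = R2 / (1 - R2)."""
    return (r2_included - r2_excluded) / (1 - r2_included)

f2_ls_on_vr = cohens_f2(0.247)    # ~0.33, a moderate effect
f2_vr_on_perf = cohens_f2(0.494)  # ~0.98, far beyond the 0.35 "large" cutoff
```

This makes the contrast concrete: the VR construct’s contribution to performance is roughly three times the conventional threshold for a large effect.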

4.3. Construct Reliability and Validity

To assess the internal consistency and convergent validity of the latent variables, multiple reliability indices were examined, including Cronbach’s alpha, rho_A, composite reliability, and AVE. The learning styles and VR constructs exhibited excellent internal consistency, each achieving perfect reliability across all indices (Cronbach’s α = 1.000, rho_A = 1.000, and composite reliability = 1.000). This reflects high inter-item correlation among the indicators used for these latent variables (Table 5).
The performance construct showed acceptable reliability, with Cronbach’s alpha of 0.741 and composite reliability of 0.738, both exceeding the standard minimum threshold of 0.70 [30]. Furthermore, its AVE value of 0.513 surpassed the 0.50 benchmark, confirming adequate convergent validity and indicating that more than half the variance in observed indicators was captured by the construct. Collectively, these metrics support the robustness and psychometric adequacy of the measurement model (Table 5).

4.4. Discriminant Validity

Discriminant validity among the latent constructs was assessed using the Fornell–Larcker criterion, which compares the square root of the AVE for each construct with its correlations with other constructs. As shown in Table 6, the square root of the AVE for performance was 0.716, which exceeded its correlations with learning styles (r = 0.290) and VR (r = 0.703). Similarly, the square root of the AVE for VR (0.703) was higher than its correlation with learning styles (r = 0.497). These results confirm that each latent variable shares more variance with its indicators than with any other construct, thereby satisfying the criterion for adequate discriminant validity [31]. This supports the distinctiveness of the latent dimensions measured in the model, ensuring that learning styles, VR experience, and performance are empirically separable constructs (Table 6).
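The Fornell–Larcker check amounts to a simple element-wise comparison, sketched below with the correlations reported above. The sqrt(AVE) of 1.000 for learning styles is an assumption here, inferred from the construct’s reported perfect reliability indices rather than taken from the article’s tables.

```python
import numpy as np

# Latent-construct correlations as reported
# (order: learning styles, VR, performance)
corr = np.array([
    [1.000, 0.497, 0.290],
    [0.497, 1.000, 0.703],
    [0.290, 0.703, 1.000],
])
# sqrt(AVE) per construct; the first value is an assumption (see lead-in)
sqrt_ave = np.array([1.000, 0.703, 0.716])

def fornell_larcker_ok(corr, sqrt_ave):
    """Each construct's sqrt(AVE) must be at least as large as its
    correlation with every other construct."""
    k = len(sqrt_ave)
    return all(sqrt_ave[i] >= abs(corr[i, j])
               for i in range(k) for j in range(k) if i != j)
```

With the reported values, the check passes for every construct, matching the conclusion of discriminant validity.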
Multicollinearity diagnostics were assessed using the VIF. All indicator VIF values ranged between 1.017 and 3.036, which are well below the conservative threshold of 3.3 [31]. This confirms that multicollinearity is not a concern in the measurement model.
The highest VIF was observed for AE learners (3.036), yet it remained within acceptable limits. These results affirm the stability of the model estimation process and the robustness of path coefficient calculations (Table 7).

4.5. Model Fit

The overall model fit was evaluated using several common indices within the PLS-SEM framework. The SRMR was 0.096 for the saturated model and 0.097 for the estimated model—both values below the lenient threshold of 0.10, though slightly above the stricter 0.08 benchmark, indicating an acceptable, if not ideal, model fit [31].
The discrepancy measures d_ULS (0.724–0.730) and d_G (0.449–0.454) were low, further supporting the structural integrity of the model. The Chi-square values (44.431 and 44.725) provide additional support for a moderate-to-adequate fit, although Chi-square statistics in PLS-SEM are typically interpreted with caution due to distributional assumptions [32] (Table 8).

4.6. Bootstrapping Path Coefficients

The significance of the structural model paths was assessed using a bootstrapping procedure with 5000 subsamples. The path from learning styles to VR experience was statistically significant, with a standardized coefficient of β = 0.497 (t = 2.723, p = 0.007) [31]. This indicates that learners’ cognitive preferences (e.g., CE, AE) positively and significantly influence their perceived engagement and flow within the VR environment. The strength of this relationship underscores the importance of tailoring immersive environments to accommodate cognitive variability (Table 9).
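The bootstrap logic can be sketched as follows. As in the jackknife illustration, Pearson’s r serves as a hypothetical stand-in for the PLS path estimate, the data are simulated, and the demo uses 500 resamples for brevity (the study used 5000).

```python
import numpy as np

def bootstrap_path(x, y, n_boot=5000, seed=0):
    """Bootstrap a standardized coefficient and derive a t-value
    as estimate / bootstrap standard error."""
    rng = np.random.default_rng(seed)
    n = len(x)

    def est(xi, yi):
        return float(np.corrcoef(xi, yi)[0, 1])

    beta = est(x, y)
    boots = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)  # resample cases with replacement
        boots.append(est(x[idx], y[idx]))
    se = float(np.std(boots, ddof=1))
    return beta, se, beta / se

rng = np.random.default_rng(2)
x = rng.normal(size=26)                       # n = 26, as in the cohort
y = 0.5 * x + rng.normal(scale=0.9, size=26)  # synthetic outcome
beta, se, t = bootstrap_path(x, y, n_boot=500)
```

A |t| above roughly 1.96 corresponds to significance at the 5% level under the usual normal approximation.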
In turn, the path from VR experience to performance was also strong, with a standardized coefficient of β = 0.703, suggesting a substantial impact of experiential quality on learning outcomes. While no p-value is listed in the table for this path, the high coefficient aligns with previous model outputs (e.g., R2 = 0.494 for performance), reinforcing the mediating role of VR in translating cognitive predispositions into successful performance. These results lend empirical support to the conceptual model in which VR acts as a critical conduit through which learning styles exert their effect on performance (Table 9).

5. Discussion

A central finding of this study is the strong mediating role of the VR experience, encompassing flow, immersion, and satisfaction, between learning styles and performance. This shows that while learning preferences influence how students engage with the virtual environment, the quality of the immersive experience itself is the more potent predictor of success. The actual experience acts as a strong mediator because it leverages multisensory and interactive elements that can adapt to various learning styles, creating a unified engagement pathway that transcends individual preferences and directly enhances cognitive processing and retention.
By inducing states of flow and immersion, the learning environment minimizes distractions and boosts intrinsic motivation, allowing all learning styles to yield learning performance through emotional satisfaction and sustained focus. Ultimately, while learning styles shape initial engagement, the immersive VR experience’s ability to deliver universal psychological benefits, such as heightened focus and motivation, establishes it as the primary driver of learning outcomes.
A well-designed VR environment can act as a cognitive “leveling field,” providing multiple, simultaneous pathways for understanding (kinesthetic, visual, and analytical) that cater to a wide range of learners. In such an environment, the user’s ability to achieve a state of flow appears to override their initial cognitive predispositions, becoming the primary driver of learning outcomes.
To summarize, the results support the proposed mediation effect—flow and immersive presence partially bridge the relationship between learning styles and performance outcomes. This aligns with prior work on VR personalization and supports the notion that cognitive adaptation—both by learners and systems—enhances digital learning outcomes [12].
Our results indicate that VR can effectively support experiential learning, consistent with previous findings [22,23]. Nevertheless, the effectiveness of VR hinges greatly on its ability to facilitate the full cyclical learning process. Experiential learning does not involve confining the learner to their main learning style throughout the entire experience; instead, it entails granting entry into the learning cycle via their preferred comfort zone and guiding them through the complete cycle, thereby fostering growth in both their flexible styles and emerging styles.
Remarkably, learners with a strong action orientation (high AERO scores) displayed enhanced engagement and performance in dynamic, competitive tasks like the Phase 3 racing simulation, indicating that kinesthetic interactivity boosts flow for those who thrive on Active Experimentation. Conversely, learners with high abstraction preferences (high ACCE scores) reported lower flow when VR tasks lacked sufficient theoretical context, especially during early, hands-on phases. These findings confirm that task-type congruence with learning preferences is critical in maximizing immersive potential [16].
In engineering education research, the Assimilating and Converging styles, referred to as Analyzing and Deciding in Kolb’s latest framework used in this paper, demonstrate greater task effectiveness in model-building simulations. In contrast, the Accommodating style, termed Initiating in the same framework, shows a negative correlation with non-personalized descriptive tasks, indicating underperformance in non-hands-on phases unless adapted [21]. Our study’s findings suggest that while learning styles play a role, other factors more strongly influence performance, as evidenced by a weaker correlation with learning styles alone.
A seminal study from 2005—one of the earliest investigations into the interplay between individual learning styles and VR—demonstrated that learners gain the most significant advantages from guided exploration modes in VR environments, irrespective of their personal learning preferences, underscoring VR’s capacity to adapt effectively to a wide range of educational needs [10]. This finding aligns with our results and offers partial validation for Hypothesis 2, in that students, regardless of their learning preferences, could adapt and achieve varying degrees of success across all three phases of our study.
Interestingly, learners identified as Balanced (with near-zero AERO and ACCE scores) maintained consistent flow across all task types, reinforcing the value of flexibility in learning approaches, which is crucial for student development, as it enables them to adapt to circumstances by flexing into all learning styles. This group’s performance suggests that versatile cognitive strategies can help navigate the diverse demands of VR environments.

6. Conclusions

Learning environments built in virtual reality, like the Submarine Simulator, are powerful tools for nurturing these developing styles. By engaging students in immersive tasks, such as designing and testing submarine models, VR encourages experimentation, reflection, and refinement, helping learners strengthen unfamiliar learning styles. However, achieving full-cycle learning, where students adeptly navigate all stages, relies on a supportive learning environment. A well-crafted VR platform, combined with guided instruction and reflective opportunities, creates an ideal setting to enhance engagement and foster growth across all learning styles, enabling students to become versatile, well-rounded learners.
Adaptive instructional design, grounded in theoretical frameworks like Kolb’s experiential learning cycle, is essential for creating inclusive and effective VR-based learning environments by tailoring content and interactions to accommodate diverse learner preferences, such as Reflective Observation, Abstract Conceptualization, Concrete Experience, and Active Experimentation. This approach not only enhances individual learner engagement and retention but also addresses potential disparities in cognitive load, ensuring that VR tools are accessible and beneficial across a wide range of educational contexts, from technical training to creative skill development.
Moreover, the modality effect highlights the importance of thoughtfully designing how information is presented in VR, as it can variably affect cognitive load. Although VR provides notable benefits for improving cognitive functions, it is critical to address potential issues like cognitive overload and prioritize effective instructional design to optimize learning results.
By creating feedback loops between students, educators, and developers, we can cultivate an ecosystem that continuously evolves, ensuring that VR tools remain relevant and impactful in addressing the complexities of modern education. In addition to the importance of user feedback, it is crucial to explore the role of interdisciplinary collaboration in enhancing VR learning environments.
Our findings translate into actionable guidance for instructional designers. To better support learners with a preference for Reflective Observation (negative AERO scores), VR platforms should include features that allow students to replay and analyze their performance from multiple perspectives. Conversely, for learners with a high preference for Abstract Conceptualization (positive ACCE scores), who reported lower flow in hands-on phases, embedding access to theoretical principles and real-time data visualizations directly within the VR environment could be crucial for bridging the gap between theory and application.
Thus, continued research and innovative practices are essential to fully harness the transformative power of VR in education, enabling the development of tailored learning experiences that cater to diverse student needs and learning styles. This ongoing exploration should focus on refining VR technologies and pedagogical strategies to maximize engagement, enhance skill acquisition, and foster critical thinking, thereby preparing students for a future where technology is an integral and dynamic component of their learning journeys across various disciplines and contexts.

Limitations

A holistic approach that includes comprehensive teacher training and the incorporation of user feedback will be vital to maximizing the potential of VR technologies in fostering inclusive and engaging learning environments. Nevertheless, the study involved a single, relatively homogeneous group of 26 students enrolled in the underwater engineering course, which may limit the generalizability of the findings to more diverse populations.
To address the limitations of a small, homogeneous participant group, future studies should recruit larger and more varied cohorts: participants from different institutions, with a mix of novice and experienced VR users, and from diverse ethnic, gender, age, and cultural backgrounds. This would reduce sampling bias, broaden the applicability of the results, and give a clearer picture of how well the VR Submarine Simulator works as a teaching tool in different contexts.
Another notable limitation is that the experiment was heavily dependent on a facilitator to guide participants. While this ensured baseline proficiency, it also suggests that complex, open-ended educational VR tools like the Submarine Simulator may not yet be suitable for fully independent learning. This can be viewed as an interesting finding for practical implementation; the role of the educator evolves from a knowledge dispenser to a “VR facilitator” who is very important for scaffolding the experience, managing cognitive load, and ensuring pedagogical goals are met.
The time interval between phases was strictly capped at a maximum of 7 days, allowing for some degree of individual variation among students, though additional constraints could potentially be applied in future studies to refine this aspect.
Future research should investigate the incorporation of real-time monitoring, including metrics like heart rate variability (HRV), heart rate, and other physiological sensors, to dynamically adapt and personalize VR experiences based on learners’ immediate states. Gaining a deeper understanding of this intricate interplay between physiological responses and virtual learning environments will be crucial, particularly in engineering education, where optimizing performance and sustaining high levels of engagement are paramount to success. Thus, continued research and innovative practices are necessary to fully harness the transformative power of VR in education, preparing students for a future where technology plays an integral role in their learning journeys.

Supplementary Materials

The following supporting information can be downloaded at https://www.mdpi.com/article/10.3390/electronics14173369/s1, Video S1: Introduction to the VR software application; Document S1: Informed consent from students; Document S2: Signed IRB approval letter; Dataset S1: Collected raw data.

Author Contributions

Conceptualization, A.-B.S., S.T. and R.-V.R.; methodology, A.-B.S., S.T. and R.-V.R.; software, A.-B.S.; validation, A.-B.S., S.T., R.-V.R. and R.B.-M.-Ț.; formal analysis, A.-B.S., S.T., R.-V.R. and R.B.-M.-T.; investigation, A.-B.S.; data curation, A.-B.S., S.T. and R.-V.R.; writing—original draft preparation, A.-B.S., S.T., R.-V.R. and R.B.-M.-Ț.; writing—review and editing, A.-B.S., S.T., R.-V.R. and R.B.-M.-Ț.; visualization, A.-B.S., S.T., R.-V.R. and R.B.-M.-Ț.; supervision, S.T. and R.-V.R. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

The raw data has been added as Supplementary Material.

Acknowledgments

The research was performed as part of doctoral studies and supervised by coordinators from MINES Paris—PSL and the National University of Science and Technology “POLITEHNICA” Bucharest. The main author (A.-B.S.) is an Experiential Learning Practitioner and Facilitator, certified by the Institute for Experiential Learning. Certification IDs: 1a500d6d-db52-4f46-ac23-6a8abe4cc321 and 0f037352-d1f4-4cfc-81e9-1110742f0c2c.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A

The Kolb Experiential Learning Profile (KELP) is based on the experiential theory and model of learning developed by David Kolb and on the seminal contributions of John Dewey, Kurt Lewin, and Jean Piaget. Based on the results of the test, students received a score based on the following four quadrants, describing how they process and transform experiences into knowledge:
  • Concrete Experience (CE)—Learners immerse themselves fully, openly, and without preconceived biases in novel experiences, emphasizing direct involvement and sensory engagement;
  • Reflective Observation (RO)—They contemplate and examine these experiences from diverse viewpoints, fostering introspection and nuanced understanding;
  • Abstract Conceptualization (AC)—They synthesize observations into coherent concepts, forming logically robust theories that explain patterns and relationships;
  • Active Experimentation (AE)—They apply these theories practically to inform decision-making and address real-world problems, testing ideas through action.
Drawing from these four quadrants, Kolb’s refined framework identifies nine distinct learning styles, each offering greater granularity and reflecting preferences in how individuals navigate the learning cycle:
  • Experiencing: Thrives on hands-on engagement, relying on intuition and emotional connection to immerse fully in new experiences;
  • Imagining: Combines creativity and reflection, brainstorming innovative ideas by exploring diverse perspectives with empathy;
  • Reflecting: Focuses on deep observation, analyzing experiences from multiple angles to uncover patterns and insights;
  • Analyzing: Excels at systematic analysis, organizing observations into structured, logical frameworks;
  • Thinking: Prioritizes logical reasoning, developing precise, theory-driven solutions through objective analysis;
  • Deciding: Focuses on practical problem-solving, using theories to make informed decisions and achieve measurable outcomes;
  • Acting: Thrives in dynamic settings, implementing ideas and adapting quickly to achieve tangible results;
  • Initiating: Proactively embraces new challenges, combining risk-taking with enthusiasm to explore innovative solutions;
  • Balancing: Adapts flexibly across all learning cycle stages, seamlessly integrating experiencing, reflecting, theorizing, and acting.
These dimensions capture how people balance opposing approaches to perceiving (grasping information) and processing (transforming information):
  • ACCE (Abstract Conceptualization minus Concrete Experience): This represents the “perceiving” dimension, measuring an individual’s preference for abstract thinking versus concrete feeling when grasping new information. A positive ACCE score indicates a stronger inclination toward AC—favoring logical analysis, theoretical models, and objective reasoning (e.g., conceptualizing principles like buoyancy in a submarine simulation). A negative score leans toward CE, emphasizing tangible, hands-on experiences and intuitive, relational approaches (e.g., immersing in sensory feedback from the VR environment). This dimension highlights how learners prefer to initially encounter and internalize experiences, with balanced scores suggesting flexibility between the two.
  • AERO (Active Experimentation minus Reflective Observation): This is the “processing” dimension, assessing the preference for active doing versus reflective watching when transforming experiences into knowledge. A positive AERO score points to a bias toward AE—prioritizing practical application, experimentation, and risk-taking to test ideas (e.g., iteratively redesigning a submarine model and running tests). A negative score favors RO, focusing on careful observation, contemplation, and diverse viewpoints before acting (e.g., reviewing simulation outcomes to understand failures). Like ACCE, this dimension underscores processing strategies, and neutral scores indicate adaptability.
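The two dimension scores above are simple differences of the quadrant scores. The following sketch makes this explicit; the function name and dictionary layout are ours for illustration, not the official KELP scoring code.

```python
def kolb_dimensions(ce: int, ro: int, ac: int, ae: int) -> dict:
    """Compute the two Kolb dimension scores from the four quadrant scores.

    ACCE = AC - CE (perceiving: abstract > 0, concrete < 0)
    AERO = AE - RO (processing: active > 0, reflective < 0)
    """
    return {"ACCE": ac - ce, "AERO": ae - ro}

# Example using (rounded) group means from Table A1 in Appendix B:
print(kolb_dimensions(ce=20, ro=27, ac=35, ae=27))  # {'ACCE': 15, 'AERO': 0}
```

With these rounded means the group lands clearly on the abstract side of the perceiving dimension and almost exactly balanced on the processing dimension, matching the interpretation in Appendix B.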

Appendix B

Descriptive Statistics of Learning Styles and Dimensions Within Target Group

Below are the key descriptive statistics for the core Kolb dimensions (CE, RO, AC, AE) and their differences (ACCE, AERO), reported in Table A1. These provide an overview of central tendencies and variability. The group shows a clear preference for Abstract Conceptualization (higher mean AC than CE and a positive ACCE), aligning with analytical and logical thinking in Kolb’s model. The near-zero mean AERO indicates an overall balance between reflection and action, with moderate variability suggesting individual differences.
Table A1. Learning styles of the target group members.
Dimension | Mean | Std Dev | Min | 25% | Median | 75% | Max
CE (Concrete Experience) | 20.0 | 6.7 | 11 | 16.0 | 17.0 | 22.0 | 37
RO (Reflective Observation) | 26.6 | 5.5 | 16 | 22.3 | 26.5 | 29.5 | 39
AC (Abstract Conceptualization) | 35.1 | 5.9 | 24 | 30.3 | 36.0 | 39.0 | 44
AE (Active Experimentation) | 27.3 | 5.9 | 14 | 25.0 | 27.0 | 29.0 | 41
ACCE (AC − CE) | 15.1 | 8.5 | −9 | 11.0 | 16.5 | 20.0 | 27
AERO (AE − RO) | 0.7 | 8.8 | −15 | −5.8 | 0.0 | 8.3 | 15
Table A2 presents the distribution of the main learning styles. Analyzing dominates (38.5%), followed by Thinking (23.1%), indicating a sample skewed toward assimilative (high AC/RO) and convergent (high AC/AE) styles in Kolb’s framework.
Table A2. Distribution of main learning styles.
Main Style | Count | Percentage
Analyzing | 10 | 38.5%
Thinking | 6 | 23.1%
Acting | 3 | 11.5%
Reflecting | 2 | 7.7%
Balancing | 2 | 7.7%
Initiating | 2 | 7.7%
Deciding | 1 | 3.8%
The target group was also characterized by their flex learning styles (in the Kolb model), counted across all participants: Balancing (20), Reflecting (19), Thinking (13), Analyzing (11), Imagining (11), Deciding (10), Experiencing (7), Acting (7), and Initiating (3), totaling 101 mentions across 26 participants (an average of ~3.9 flexes per person) (Table A3).
Table A3. Distribution of secondary (flex) learning styles.
Flex | Count | Percentage of Total Mentions
Balancing | 20 | 19.8%
Reflecting | 19 | 18.8%
Thinking | 13 | 12.9%
Analyzing | 11 | 10.9%
Imagining | 11 | 10.9%
Deciding | 10 | 9.9%
Experiencing | 7 | 6.9%
Acting | 7 | 6.9%
Initiating | 3 | 3.0%
Balancing and Reflecting are the most common, appearing in 77% and 73% of profiles, respectively. Notably, 84.6% (22/26) of participants include Balancing in the main style or flexes. The high prevalence of Balancing implies versatility, allowing adaptation across Kolb’s cycle (Experiencing, Reflecting, Thinking, Acting). This could indicate a mature or trained group, as Kolb emphasizes balanced styles for effective learning.
The results obtained by the participants in the other tests are presented below (Table A4).
Table A4. Performance scores and questionnaire results for each participant.
User ID | Main Learning Style | Flow Result (FSS) | Immersive Result (IQT) | BANG Result (Satisfaction) | Learning Outcome Phase 1 | Learning Outcome Phase 2 | Learning Outcome Phase 3
290 | Thinking | 3.778 | 4.071 | 15 | 20 | 2 | 12
709 | Acting | 4.028 | 3.429 | 17 | 19 | 2 | 11
227 | Analyzing | 4.528 | 4.286 | 14.5 | 20 | 2 | 10
512 | Initiating | 4.222 | 3.857 | 17 | 19 | 2 | 8
177 | Balancing | 4.389 | 3.893 | 16 | 17 | 2 | 4
443 | Balancing | 4.194 | 3.929 | 18.5 | 18 | 2 | 3
187 | Initiating | 3.972 | 4.214 | 15.5 | 17 | 2 | 2
159 | Thinking | 4.333 | 4.179 | 12 | 17 | 2 | 2
830 | Analyzing | 4.194 | 3.571 | 17.5 | 17 | 2 | 0
670 | Thinking | 3.778 | 4.464 | 17.5 | 20 | 2 | 0
590 | Analyzing | 3.556 | 4.071 | 17.5 | 18 | 2 | 0
595 | Analyzing | 3.361 | 4.036 | 16 | 19 | 1 | 5
682 | Deciding | 3.972 | 4.071 | 15.5 | 16 | 1 | 3
341 | Reflecting | 4.361 | 4.536 | 15.5 | 19 | 1 | 2
411 | Thinking | 3.861 | 3.893 | 15 | 18 | 1 | 2
840 | Analyzing | 3.194 | 3.679 | 16.5 | 15 | 1 | 2
388 | Analyzing | 3.972 | 3.857 | 16 | 18 | 1 | 1
723 | Analyzing | 4.111 | 4.143 | 14.5 | 17 | 1 | 0
985 | Reflecting | 2.528 | 3.786 | 18.5 | 17 | 1 | 0
779 | Thinking | 3.361 | 4.179 | 20 | 18 | 1 | 0
298 | Acting | 3.333 | 4.286 | 13 | 18 | 1 | 0
484 | Acting | 4.528 | 3.679 | 16.5 | 16 | 1 | 0
770 | Analyzing | 3.222 | 4.036 | 15.5 | 16 | 1 | 0
955 | Thinking | 3.722 | 4.179 | 15.5 | 15 | 1 | 0
642 | Analyzing | 2.500 | 3.750 | 13 | 14 | 1 | 0
256 | Analyzing | 3.778 | 3.500 | 17 | 15 | 0 | 4
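Per-participant records such as those in Table A4 can be tabulated and summarized in a few lines. The rows below are a small excerpt transcribed from the table; the "total outcome" aggregate is our illustrative quantity, not a measure defined in the paper.

```python
import statistics as st

# A few rows transcribed from Table A4:
# (user_id, main style, FSS flow, IQT immersion, outcomes for phases 1-3).
rows = [
    (290, "Thinking",  3.778, 4.071, (20, 2, 12)),
    (709, "Acting",    4.028, 3.429, (19, 2, 11)),
    (227, "Analyzing", 4.528, 4.286, (20, 2, 10)),
    (256, "Analyzing", 3.778, 3.500, (15, 0, 4)),
]

# Illustrative aggregate (not defined in the paper): sum of the three phases.
totals = {uid: sum(phases) for uid, _, _, _, phases in rows}

# Mean flow score over this excerpt only.
mean_flow = round(st.mean(r[2] for r in rows), 3)

print(totals)     # {290: 34, 709: 32, 227: 32, 256: 19}
print(mean_flow)  # 3.778 + 4.028 + 4.528 + 3.778 over 4 rows = 4.028
```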

References

  1. Ghazali, A.K.; Aziz, N.A.A.; Aziz, K.A.; Kian, N.T. The usage of virtual reality in engineering education. Cogent Educ. 2024, 11, 2319441. [Google Scholar] [CrossRef]
  2. Wong, J.Y.; Azam, A.B.; Cao, Q.; Huang, L.; Xie, Y.; Winkler, I.; Cai, Y. Evaluations of Virtual and Augmented Reality Technology-Enhanced Learning for Higher Education. Electronics 2024, 13, 1549. [Google Scholar] [CrossRef]
  3. Conesa, J.; Martínez, A.; Mula, F.; Contero, M. Learning by Doing in VR: A User-Centric Evaluation of Lathe Operation Training. Electronics 2024, 13, 2549. [Google Scholar] [CrossRef]
  4. Bano, F.; Alomar, M.A.; Alotaibi, F.M.; Serbaya, S.H.; Rizwan, A.; Hasan, F. Leveraging Virtual Reality in Engineering Education to Optimize Manufacturing Sustainability in Industry 4.0. Sustainability 2024, 16, 7927. [Google Scholar] [CrossRef]
  5. Manolis, C.; Burns, D.J.; Assudani, R.; Chinta, R. Assessing experiential learning styles: A methodological reconstruction and validation of the Kolb Learning Style Inventory. Learn. Individ. Differ. 2013, 23, 44–52. [Google Scholar] [CrossRef]
  6. Poirier, L.; Ally, M. Considering Learning Styles When Designing for Emerging Learning Technologies. In Emerging Technologies and Pedagogies in the Curriculum; Yu, S., Ally, M., Tsinakos, A., Eds.; Bridging Human and Machine: Future Education with Intelligence; Springer: Singapore, 2020. [Google Scholar] [CrossRef]
  7. Whitman, G.M. Learning Styles: Lack of Research-Based Evidence. Clear. House: A J. Educ. Strateg. Issues Ideas 2023, 96, 111–115. [Google Scholar] [CrossRef]
  8. Huang, C.L.; Luo, Y.F.; Yang, S.C.; Lu, C.M.; Chen, A.S. Influence of Students’ Learning Style, Sense of Presence, and Cognitive Load on Learning Outcomes in an Immersive Virtual Reality Learning Environment. J. Educ. Comput. Res. 2019, 58, 596–615. [Google Scholar] [CrossRef]
  9. Hauptman, H.; Cohen, A. The synergetic effect of learning styles on the interaction between virtual environments and the enhancement of spatial thinking. Comput. Educ. 2011, 57, 2106–2117. [Google Scholar] [CrossRef]
  10. Chen, C.J.; Seong, C.T.; Wan, M.F.; Wan, I. Are Learning Styles Relevant to Virtual Reality? J. Res. Technol. Educ. 2005, 38, 123–141. [Google Scholar] [CrossRef]
  11. Pedram, S.; Howard, S.; Kencevski, K.; Perez, P. Investigating the Relationship Between Students’ Preferred Learning Style on Their Learning Experience in Virtual Reality (VR) Learning Environment. In Human Interaction and Emerging Technologies, Proceedings of the International Conference on Human Interaction and Emerging Technologies (IHIET 2019), Nice, France, 22–24 August 2019; Ahram, T., Taiar, R., Colson, S., Choplin, A., Eds.; Advances in Intelligent Systems and Computing; Springer: Cham, Switzerland, 2020; Volume 1018. [Google Scholar] [CrossRef]
  12. Kampling, H. The Role of Immersive Virtual Reality in Individual Learning. In Proceedings of the Hawaii International Conference on System Sciences, Hilton Waikoloa Village, HI, USA, 2–6 January 2018. [Google Scholar] [CrossRef]
  13. Lin, Y.; Wang, S. The Study and Application of Adaptive Learning Method Based on Virtual Reality for Engineering Education. In Advances in Intelligent Systems and Computing; Springer: Cham, Switzerland, 2019; Volume 1015, pp. 301–308. [Google Scholar] [CrossRef]
  14. Cao, L.; He, M.; Wang, H. Effects of Attention Level and Learning Style Based on Electroencephalo-Graph Analysis for Learning Behavior in Immersive Virtual Reality. IEEE Access 2023, 11, 53429–53438. [Google Scholar] [CrossRef]
  15. Gyed, F.A.S.; Tehseen, M.; Tariq, S.; Muhammad, A.K.; Yazeed, Y.G.; Habib, H. Integrating educational theories with virtual reality: Enhancing engineering education and VR laboratories. Soc. Sci. Humanit. Open 2024, 10, 101207. [Google Scholar] [CrossRef]
  16. Hassan, L.; Jylhä, H.; Sjöblom, M.; Hamari, J. Flow in VR: A Study on the Relationships Between Preconditions, Experience and Continued Use. In Proceedings of the Hawaii International Conference on System Sciences, Maui, HI, USA, 7–10 January 2020. [Google Scholar] [CrossRef]
  17. Rutrecht, H.; Wittmann, M.; Khoshnoud, S.; Igarzábal, F.A. Time Speeds Up During Flow States: A Study in Virtual Reality with the Video Game Thumper. Timing Time Percept. 2021, 9, 353–376. [Google Scholar] [CrossRef]
  18. Guerra-Tamez, C.R. The Impact of Immersion through Virtual Reality in the Learning Experiences of Art and Design Students: The Mediating Effect of the Flow Experience. Educ. Sci. 2023, 13, 185. [Google Scholar] [CrossRef]
  19. Mütterlein, J. The Three Pillars of Virtual Reality? Investigating the Roles of Immersion, Presence, and Interactivity. In Proceedings of the Hawaii International Conference on System Sciences, Hilton Waikoloa Village, HI, USA, 2–6 January 2018. Available online: http://hdl.handle.net/10125/50061 (accessed on 22 July 2025).
  20. Zaharias, P.; Andreou, I.; Vosinakis, S. Educational Virtual Worlds, Learning Styles and Learning Effectiveness: An Empirical Investigation. 2010. Available online: https://api.semanticscholar.org/CorpusID:14092908 (accessed on 22 July 2025).
  21. Horváth, I. An Analysis of Personalized Learning Opportunities in 3D VR. Front. Comput. Sci. 2021, 3, 673826. [Google Scholar] [CrossRef]
  22. Majgaard, G.; Weitze, C. Virtual Experiential Learning, Learning Design and Interaction in Extended Reality Simulations. In Proceedings of the 14th European Conference on Games Based Learning ECGBL 2020, Virtual, 24–25 September 2020. Available online: https://www.researchgate.net/publication/344412172_Virtual_Experiential_Learning_Learning_Design_and_Interaction_in_Extended_Reality_Simulations (accessed on 5 March 2025).
  23. Konak, A.; Clark, T.K.; Nasereddin, M. Using Kolb’s Experiential Learning Cycle to improve student learning in virtual computer laboratories. Comput. Educ. 2014, 72, 11–22. [Google Scholar] [CrossRef]
  24. Kolb, D. The Kolb Experiential Learning Profile a Guide to Experiential Learning Theory, KELP Psychometrics and Research on Validity. The Kolb Experiential Learning Profile (KELP) Tech Guide. 2021. Available online: https://www.academia.edu/87570273/The_Kolb_Experiential_Learning_Profile_A_Guide_to_Experiential_Learning_Theory_KELP_Psychometrics_and_Research_on_Validity (accessed on 22 July 2025).
  25. Jackson, S.A.; Marsh, H. Development and validation of a scale to measure optimal experience: The Flow State Scale. J. Sport Exerc. Psychol. 1996, 18, 17–35. [Google Scholar] [CrossRef]
  26. Witmer, B.G.; Singer, M.J. Measuring Presence in Virtual Environments: A Presence Questionnaire. Presence 1998, 7, 225–240. Available online: https://cs.uky.edu/~sgware/reading/papers/witmer1998measuring.pdf (accessed on 6 March 2024). [CrossRef]
  27. Ballou, N.; Denisova, A.; Ryan, R.; Scott Rigby, C.; Deterding, S. The Basic Needs in Games Scale (BANGS): A new tool for investigating positive and negative video game experiences. Int. J. Hum.-Comput. Stud. 2024, 188, 103289. [Google Scholar] [CrossRef]
  28. Ringle, C.M.; Wende, S.; Becker, J.-M. SmartPLS 3 Boenningstedt: SmartPLS GmbH: Boenningstedt 2015. Available online: http://www.smartpls.com (accessed on 5 March 2025).
  29. Hair, J.F.; Hult, G.T.M.; Ringle, C.M.; Sarstedt, M. A Primer on Partial Least Squares Structural Equation Modeling (PLS-SEM), 3rd ed.; Sage: Thousand Oaks, CA, USA, 2022. [Google Scholar]
  30. Cohen, J. Statistical Power Analysis for the Behavioral Sciences, 2nd ed.; Routledge: Oxford, UK, 1988. [Google Scholar] [CrossRef]
  31. Sarstedt, M.; Ringle, C.M.; Hair, J.F. Partial Least Squares Structural Equation Modeling. In Handbook of Market Research; Homburg, C., Klarmann, M., Vomberg, A.E., Eds.; Springer: Cham, Switzerland, 2021. [Google Scholar] [CrossRef]
  32. Van Laar, S.; Braeken, J. Caught off Base: A Note on the Interpretation of Incremental Fit Indices. Struct. Equ. Model. A Multidiscip. J. 2022, 29, 935–943. [Google Scholar] [CrossRef]
Figure 1. The nine learning styles defined by Kolb in relation to the cycle.
Figure 2. How the virtual reality application fits with Kolb’s learning cycle.
Figure 3. The methodological framework applied.
Figure 4. High-level technical architecture.
Figure 5. Student in the process of building a submarine model.
Figure 6. Student in the process of controlling the motors placed on a submarine model.
Figure 7. (a) Submarine model going in a tight trajectory; (b) submarine model going in a loose trajectory.
Figure 8. The straight-line racetrack.
Figure 9. The racetrack testing horizontal maneuverability.
Figure 10. The racetrack testing both vertical and horizontal maneuverability.
Figure 11. Showcasing a race against another student’s model (represented as a shadow).
Figure 12. Research design for VR experiential learning.
Figure 13. Path analysis of the relationship between learning styles and learning outcomes.
Table 1. Methodological framework based on Kolb’s learning cycle and hands-on tasks.
Kolb’s Learning Stage | Learning Objectives and Cognitive Tasks
Concrete Experience (CE)
Learning Objective:
  • This hands-on interaction in a realistic virtual setting bridges theoretical knowledge with tangible application, sparking curiosity and laying the groundwork for deeper learning in subsequent phases.
Cognitive Tasks:
  • Participants dive into the Submarine Simulator VR environment, actively building and testing their custom-designed 3D submarine models.
  • They observe critical hydrodynamic characteristics, such as movement, stability, and response to simulated underwater forces, fostering an initial sensory connection and practical engagement with engineering concepts.
Reflective Observation (RO)
Learning Objective:
  • This introspective process encourages learners to document personal reflections, emotional responses, and emerging patterns, transforming raw experiences into thoughtful observations that inform future iterations.
  • By fostering this deliberate pause for contemplation, the phase aligns with Kolb’s cycle, enhancing awareness and bridging practical encounters with conceptual understanding, particularly valuable in an unfamiliar VR context where initial challenges can spark profound learning breakthroughs.
Cognitive Tasks:
  • Participants carefully review the outcomes of their VR simulations, analyzing successes and failures—such as why a submarine model sank, exhibited instability, or performed inefficiently—to derive meaningful insights.
Abstract Conceptualization (AC)
Learning Objective:
  • This phase encourages learners to connect their hands-on VR experiences with foundational engineering theories, fostering a deeper understanding of how design choices impact performance in simulated underwater environments. By integrating practical insights with academic frameworks, participants build a robust conceptual foundation, paving the way for informed experimentation in subsequent phases of Kolb’s learning cycle.
  • This reflective synthesis is particularly impactful in the novel VR context, where complex phenomena become tangible, enhancing learners’ ability to grasp relatively abstract principles.
Cognitive Tasks:
  • Participants synthesize their observations to formulate conclusions about essential hydrodynamic principles, such as buoyancy, drag, and propulsion, while weaving in relevant theoretical concepts.
Active Experimentation (AE)
Learning Objective:
  • This hands-on iteration allows learners to apply insights gained from prior learning cycles, actively experimenting to optimize their designs and enhance performance.
Cognitive Tasks:
  • Participants refine their approach by iteratively redesigning or adjusting their 3D submarine models—tweaking elements such as shape, materials, or component placement—and prepare for subsequent testing in the VR environment.
Table 2. The instructions given to all participants.
Chapter 1: Building Mode
  • Create a new submarine model.
  • Add a cube to the scene and move it to the center of the table.
  • Scale the cube proportionally.
  • Change the material of the cube to ABS.
  • Make sure the weight of the cube is between 7 and 10 kg.
  • Add a cylinder to the scene.
  • Snap the cylinder on top of the cube exactly in the middle using the “Face Snap” function.
  • Attach two motors to the cube, one motor on the left side of the cube and one motor on the right side of the cube.
  • Save your model.
  • Move to simulation mode with the current model.
Chapter 2: Floatability Testing and Underwater Navigation
  • In front of you, on the virtual table, you have two handles—use the left one to rotate around the object and the right one to zoom in and zoom out.
  • Below, under the table, you have two handles. Each one has the same color as the motor it controls.
  • Use the motors individually or together to move the ensemble underwater.
  • Experiment with navigating the model underwater—rotate it/move forward/backwards.
  • Press “A” to change into a first-person view, and use the joysticks to move underwater.
  • Press “A” to come back again to the initial third-person view.
  • Restart the simulation using the green button on the console.
  • Exit the simulation using the red button on the console.
  • Open Q&A.
Chapter 3: Exploratory Learning
  • Create a model made of four shapes and two motors that sinks to a depth between 1000 and 2000 m. The model needs to be as controllable as possible (able to move forward, backwards, left, and right in a controlled manner).
  • Participants are informed that they have complete creative freedom to design their 3D submarine models.
Table 3. Latent variable covariances.
Variables | Learning Styles | Performance | VR
Learning styles | 1.000 | 0.290 | 0.497
Performance | 0.290 | 1.000 | 0.703
VR | 0.497 | 0.703 | 1.000
Table 4. Quality indicators.
Variables | f²: VR | f²: Learning Styles | f²: Performance | R² | R² Adjusted
VR | - | 0.329 | - | 0.247 | 0.216
Learning styles | - | - | - | - | -
Performance | 0.977 | - | - | 0.494 | 0.473
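The f² and R² columns of Table 4 are internally consistent: when an endogenous construct has a single predictor, Cohen's f² reduces to R²/(1 − R²). The sketch below applies this identity to the reported R² values; small differences from the reported f² (0.329, 0.977) come from rounding the R² values to three decimals.

```python
def f_squared(r2_included: float, r2_excluded: float = 0.0) -> float:
    """Cohen's f-squared effect size: (R2_incl - R2_excl) / (1 - R2_incl)."""
    return (r2_included - r2_excluded) / (1 - r2_included)

# Table 4: R2(VR) = 0.247, R2(Performance) = 0.494. With a single predictor
# the excluded-R2 term is 0, so f2 = R2 / (1 - R2).
print(round(f_squared(0.247), 3))  # 0.328, close to the reported 0.329
print(round(f_squared(0.494), 3))  # 0.976, close to the reported 0.977
```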
Table 5. Construct reliability and validity.
Variables | Cronbach’s Alpha | rho_A | Composite Reliability | AVE
Learning styles | - | 1.000 | - | -
Performance | 0.741 | 0.831 | 0.738 | 0.513
VR | - | 1.000 | - | -
Table 6. Fornell–Larcker criterion.
Variables | Learning Styles | Performance
Learning styles | - | -
Performance | 0.289629 | 0.716043
VR | 0.49748 | 0.702998
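The Fornell–Larcker criterion requires each construct's √AVE (its diagonal entry) to exceed its correlations with the other constructs. The check below ties Tables 3, 5, and 6 together: the Performance diagonal in Table 6 (0.716) is the square root of its AVE in Table 5 (0.513).

```python
import math

ave_performance = 0.513                    # AVE for Performance (Table 5)
diag = round(math.sqrt(ave_performance), 3)
print(diag)                                # 0.716, the diagonal in Table 6

# Criterion: sqrt(AVE) must exceed the construct's correlations with the
# other constructs (Performance vs. Learning styles and VR, from Table 3).
correlations = [0.290, 0.703]
print(all(diag > abs(r) for r in correlations))  # True, criterion satisfied
```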
Table 7. Collinearity statistics (VIF).
Variables | VIF
Abstract Conceptualization | 2.644784
Active Experimentation | 3.273493
Concrete Experience | 2.00095
Reflective Observation | 2.122368
Style | 1.127902
Flow | 1.201899
Immersion | 1.173842
Mindset | 1.314606
Satisfaction | 1.07815
Points1 | 1.684684
Points2 | 1.450584
Points3 | 1.416396
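Each VIF in Table 7 equals 1/(1 − R²_j), where R²_j comes from regressing indicator j on the remaining indicators. For just two indicators this auxiliary R² is the squared correlation, which gives a minimal, self-contained illustration (the correlations below are hypothetical, not taken from the study):

```python
# For two indicators, the auxiliary R-squared is the squared correlation r^2,
# so VIF = 1 / (1 - r^2). Values well below 5, as in Table 7, suggest
# collinearity is not a concern.
def vif_two_indicators(r: float) -> float:
    return 1.0 / (1.0 - r * r)

print(round(vif_two_indicators(0.5), 3))  # 1.333: mild collinearity
print(round(vif_two_indicators(0.9), 3))  # 5.263: problematic collinearity
```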
Table 8. Model fit.
Indicator | Saturated | Estimated
SRMR | 0.096 | 0.096
d_ULS | 0.724 | 0.730
d_G | 0.449 | 0.454
Chi-Square | 44.431 | 44.725
Table 9. Bootstrapping path coefficients (β).
Path | Original Sample (O) | Sample Mean (M) | STDEV * | T Statistics (|O/STDEV|) | p Values
Learning styles -> VR | 0.497 | 0.708 | 0.183 | 2.723 | 0.007
VR -> Performance | 0.703 | - | - | - | -
* STDEV = standard deviation; p = significance level.
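The T statistic in Table 9 is the path coefficient divided by its bootstrap standard deviation, and squaring the standardized paths recovers the R² values in Table 4. The quick check below reproduces both from the rounded table entries; the small gap to the reported T value (2.723) comes from the coefficients being rounded to three decimals.

```python
beta_ls_vr, stdev = 0.497, 0.183           # Learning styles -> VR (Table 9)
t_stat = round(abs(beta_ls_vr) / stdev, 3)
print(t_stat)                               # 2.716, vs. the reported 2.723

# Squared standardized paths match the R2 values in Table 4 (up to rounding):
print(round(0.497 ** 2, 3))                 # 0.247 = R2(VR)
print(round(0.703 ** 2, 3))                 # 0.494 = R2(Performance)
```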
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
