Article

Virtual Reality for Hydrodynamics: Evaluating an Original Physics-Based Submarine Simulator Through User Engagement

by Andrei-Bogdan Stănescu 1,*, Sébastien Travadel 2 and Răzvan-Victor Rughiniș 1
1 Faculty of Automatic Control and Computers, National University of Science and Technology “Politehnica” Bucharest, 060042 Bucharest, Romania
2 Collège des Sciences Navales, MINES Paris—PSL, 75272 Paris, France
* Author to whom correspondence should be addressed.
Computers 2025, 14(9), 348; https://doi.org/10.3390/computers14090348
Submission received: 22 July 2025 / Revised: 22 August 2025 / Accepted: 22 August 2025 / Published: 24 August 2025

Abstract

STEM education is constantly seeking innovative methods to enhance student learning. Virtual Reality technology can serve as a critical tool for effectively teaching complex engineering subjects. This study evaluates an original Virtual Reality software application, entitled Submarine Simulator, developed specifically to support competencies in hydrodynamics within an Underwater Engineering course at MINES Paris—PSL. Our application uniquely integrates a customized physics engine explicitly designed for realistic underwater simulation, significantly improving user comprehension through accurate real-time representation of hydrodynamic forces. The study involved a homogeneous group of 26 fourth-year engineering students sharing similar academic backgrounds in robotics, electronics, programming, and computer vision. This uniform cohort, primarily aged 22–28 and enrolled in the same 3-month course, was intentionally chosen to minimize variations in skills, prior knowledge, and learning pace. Through a combination of quantitative assessments and Confirmatory Factor Analysis, we find that Virtual Reality affordances significantly predict user flow state (path coefficient: 0.811), which in turn predicts user engagement and satisfaction (path coefficient: 0.765). These findings show the substantial educational potential of tailored Virtual Reality experiences in STEM, particularly in engineering, and highlight directions for further methodological refinement.

1. Introduction

Virtual Reality (VR) software applications for learning Science, Technology, Engineering, and Mathematics (STEM) subjects offer a transformative approach to education by providing immersive, interactive, and engaging environments that enhance the process of understanding complex and abstract concepts. VR’s capability to simulate real-world scenarios and environments that are otherwise inaccessible makes it a valuable tool in STEM education.

1.1. Existing Tools and Frameworks

Various VR platforms and frameworks have been developed for STEM education, including ScienceVR for science laboratories [1], Unity-based modules for K-12 education [2], and smartphone-based VR plotting systems [3]. These tools enable the creation of 3D visualizations and interactive simulations, enhancing students’ understanding of complex concepts. Studies have demonstrated that VR can improve learning efficiency, especially when incorporating environmental traversal capabilities [4].
VR has been effectively integrated across diverse STEM disciplines in higher education, including engineering and chemistry experiments, where it has demonstrated significant gains in spatial abilities, knowledge retention, student motivation, engagement, and overall learning outcomes [5,6].
Mobile VR platforms enhance the affordability and accessibility of VR technology for educational institutions, facilitating its widespread adoption in instructional practices [5,7]. To further support this, VR software must remain user-friendly and compatible with budget-conscious hardware solutions, such as head-mounted displays and haptic gloves, which institutions prioritize for cost-effectiveness [8].
Virtual laboratories have garnered significant attention in research, offering a secure alternative for experiments that might endanger participants. By generating immersive three-dimensional digital environments, VR enables students to engage with and investigate intricate STEM concepts in a risk-free, controlled manner—proving especially advantageous for hazardous or impractical real-world procedures [6,9]. Moreover, the heightened sense of presence and interactive learning via physical movements and gestures in a VR setting can enrich the educational process, rendering abstract ideas more concrete and comprehensible [10].

1.2. Research Gap

Significant challenges remain in designing effective VR experiences for education, with researchers proposing design principles and guidelines to optimize learning outcomes [10,11,12,13]. The current literature lacks comprehensive studies on the affordances of VR for fostering flow states in hydrodynamics learning, leaving educators without evidence-based guidelines for integrating such immersive simulations into curricula.
While numerous VR tools support general STEM learning—such as those for molecular manipulation, ecosystem exploration, or basic physics experiments—few integrate realistic water simulations with intuitive interfaces that enable rapid skill acquisition, particularly for complex underwater scenarios. For instance, projects like Stanford’s VR simulation for ocean acidification provide educational dives into marine environments, but they often prioritize visualization over dynamic, physics-accurate interactions [14].
Similarly, underwater VR modules for marine science courses, developed through US National Science Foundation (NSF) funded initiatives, focus on immersive labs but may lack the fidelity needed for engineering-specific tasks like manipulating submerged objects under realistic fluid forces [15,16].
Existing educational VR tools frequently rely on generic physics engines (e.g., those in Unity or Unreal Engine), which are optimized for real-time performance in games but fall short in accurately modeling complex underwater forces such as buoyancy, drag, turbulence, and viscous flow [17,18].
A comparative study of physics engines like ODE, Bullet, and MuJoCo reveals limitations in handling industrial or underwater robotics simulations, where inaccuracies in collision detection, force application, and environmental realism can lead to deviations from expected behaviors [19].
These engines often prioritize computational efficiency over precision, resulting in simplified approximations that do not capture the full complexity of underwater dynamics, such as variable water densities or multi-body interactions in submerged environments. For example, in underwater robotic simulators, generic engines struggle with sensor fidelity and sim-to-real transfer, leading to runtime failures or unrealistic outcomes that undermine training effectiveness [19].
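To make the forces named above more concrete, the following is a minimal Python sketch of the standard formulations (Archimedes' buoyancy and quadratic drag) that an underwater simulation must capture. The constants, axis convention (y up), and example values are illustrative assumptions and are not taken from the simulator described later in this paper.

```python
import numpy as np

RHO_WATER = 1000.0  # fresh-water density in kg/m^3 (illustrative constant)
G = 9.81            # gravitational acceleration in m/s^2

def buoyancy_force(displaced_volume_m3: float, water_density: float = RHO_WATER) -> np.ndarray:
    """Archimedes' principle: an upward force equal to the weight of the displaced water."""
    return np.array([0.0, water_density * G * displaced_volume_m3, 0.0])

def drag_force(velocity: np.ndarray, drag_coefficient: float, frontal_area_m2: float,
               water_density: float = RHO_WATER) -> np.ndarray:
    """Quadratic drag: F_d = -1/2 * rho * C_d * A * |v| * v, opposing the motion."""
    speed = np.linalg.norm(velocity)
    if speed == 0.0:
        return np.zeros(3)
    return -0.5 * water_density * drag_coefficient * frontal_area_m2 * speed * velocity

# Example: a 0.02 m^3 hull moving forward at 1.5 m/s
v = np.array([1.5, 0.0, 0.0])
print(buoyancy_force(0.02))      # ~196 N upward
print(drag_force(v, 0.8, 0.05))  # ~45 N opposing the motion
```

Even this simplified form illustrates why generic game engines struggle: turbulence, viscous effects, and varying water density add further terms that real-time approximations typically omit.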
This gap significantly limits the educational value of VR in advanced engineering courses, where high-fidelity simulations are essential for developing intuition in non-intuitive settings like underwater operations [20].
Research on high-fidelity modeling for submarines and remotely operated vehicles (ROVs) demonstrates that custom, physics-based simulations, which often use specialized tools like GPU-accelerated renderers or dedicated emulators, better replicate real-world conditions, improving skill transfer and decision-making in virtual training [21,22].
Expanding on this, bridging the gap could involve hybrid approaches such as integrating deep reinforcement learning for adaptive water flow simulations with user-friendly interfaces to foster rapid mastery [23]. Such advancements would not only boost immersion and flow states in learning but also prepare students for real-world challenges in fields like marine engineering, where a precise understanding of hydrodynamic forces is critical for innovation and safety [24,25].
Overall, the literature underscores the need for more specialized VR tools to elevate STEM education from generic experiences to targeted, high-impact simulations that mirror professional demands. Addressing these shortcomings, our application introduces a specialized, custom-developed physics engine and software application—Submarine Simulator—explicitly tailored for hydrodynamics and submarine modeling, significantly enhancing realism and practical applicability by incorporating advanced fluid dynamics principles and precise force modeling that generic engines overlook.
This custom approach allows for more accurate replication of submarine behaviors, such as multi-body interactions in variable water densities, which supports better sim-to-real transfer and decision-making in virtual training environments.
Additionally, the software is engineered for rapid skill acquisition and an efficient, iterative design process. By emphasizing intuitive, high-fidelity interactions, our solution not only addresses the fidelity gap in underwater engineering education but also fosters active learning, significantly enhancing retention and practical applications of complex engineering concepts in real-world underwater environments.
To conclude, despite the rapid advancements in VR technology and education, there remains a significant gap in empirical research examining how VR applications enhance student engagement and understanding in specialized fields like underwater engineering, particularly in simulating hydrodynamics and 3D modeling in VR.
The urgency of addressing this gap is underscored by the escalating demand for innovative STEM education tools amid global challenges such as climate change and ocean exploration, where effective training in underwater engineering can accelerate real-world innovations and sustainability efforts. Underwater engineering plays a critical role in designing sustainable offshore structures, renewable energy systems (e.g., tidal and wave energy), and submersible technologies for environmental monitoring.

1.3. Aim of the Research

In our research, we aim to assess our original VR software, designed for acquiring knowledge in hydrodynamics, through users’ self-reported engagement. The software provides the information needed to understand the basic principles of building a small-scale submarine and to experiment with how the shape and properties of the assembly affect its behavior in a simulated underwater environment.
To reach the aim of the research, the software has been tested during a course on underwater engineering at MINES Paris—PSL, offering students an additional opportunity to prototype and iterate on their ideas digitally. Gaining a deep understanding of the unique characteristics of the VR-based learning process is essential, as it empowers students to actively contribute to the iterative refinement of educational software aimed at fostering specialized expertise.
It is important to note that this study does not seek to benchmark the learning efficacy of VR-based techniques against conventional instructional methods. Instead, it assesses the ability of the original VR system designed for underwater engineering education to evoke targeted psychological and cognitive responses.
The main contribution of this paper is the empirical exploration of VR in enhancing underwater engineering education within challenging, non-intuitive environments that are difficult to visualize mentally. Through a custom-developed VR application, the study addresses three key research questions:
RQ1: How do the affordances of custom-developed VR applications in engineering education impact users’ state of flow?
RQ2: In what ways does a user’s perceived state of flow influence the overall perceived value of VR-based learning environments?
RQ3: What is the relationship between VR affordances, user flow states, and the effectiveness of engineering education as evaluated through perceived learning value?

2. Features of VR for STEM Education

To ensure an effective learning environment, VR software should satisfy a set of criteria that establish the standard expected of an educational tool. The current research highlights key features that VR software should include and suggests approaches for designing new tools in this field. These features are crucial for motivating students and improving their learning outcomes in STEM disciplines [5]. Efficient VR software for STEM education is characterized by its ability to create high levels of immersion, interactivity, and engaging learning environments [10].

2.1. Immersive Learning Environments

Immersive features of VR software are pivotal in creating engaging and realistic experiences across various domains, including STEM. Immersive VR can create a strong sense of presence and boost engagement. Engagement in educational contexts can be defined as the sustained cognitive, emotional, and behavioral involvement of users, achieved through immersion that captivates senses, interactivity that empowers actions, and engaging learning environments that provide meaningful context and motivation. The immersive nature of VR is achieved through a combination of technological and design elements that simulate real-world experiences in a virtual environment. These features leverage advanced visualization, interaction, and sensory integration to enhance user engagement and understanding. Additionally, the design of VR environments must ensure sufficient fluidity and immersion to avoid negatively impacting learning outcomes [5,26].
VR’s immersive features allow for personalized learning experiences, helping students understand and explore complex and abstract concepts that are often difficult to replicate in traditional educational settings through interactive and engaging methods [27]. VR is used in training scenarios to replicate real-world tasks, utilizing scripting and game engines to create customizable and adaptive training environments [28].
The effectiveness of immersive features in educational VR applications is mixed, highlighting the need for careful alignment of design features with educational goals [29]. While immersive features in VR software offer significant potential for enhancing user experiences, they also present challenges that need to be addressed. VR applications should provide a high degree of immersion, allowing students to feel present within the virtual environment. This is essential for engaging students and enhancing their learning experience [10,26]. The effectiveness of VR in translating immersive experiences into academic performance requires further exploration [30].

2.2. Interaction: 360-Degree Video and Head-Mounted Displays

Notably, 360-degree environments and head-mounted display technologies are used for VR experiences, allowing viewers to explore environments from all angles. As a result, they can increase emotional engagement and empathy [31].
VR software leverages high-resolution graphics and 3D modeling to craft realistic environments with genuine depth. This contrasts with 360-degree videos, which lack depth and thus offer only a limited immersion experience. Nonetheless, the benefits of true 3D VR are apparent in applications like digital media and interactive art, where deep learning models elevate the immersive experience by boosting modeling accuracy and the sense of presence [32]. Platforms like CAVE and head-mounted displays provide a profound sense of immersion by creating a neuropsychological sense of “being there” [33].
Multi-sensory virtual environments, such as the Multi-Sensory Virtual Decision-Making Center, facilitate collaborative learning by allowing multiple users to engage in real-time decision-making [34]. This approach not only improves the educational experience but also enhances teamwork and communication skills among students.
The design of VR experiences must be carefully aligned with the intended outcomes to maximize their effectiveness. As VR technology continues to evolve, it is expected that these challenges will be mitigated, leading to widespread adoption and innovation across various fields.

2.3. Interactive Learning

VR applications, such as educational games and simulations, have been shown to increase student motivation and engagement by making learning more interactive and enjoyable [6,35]. Applications should enable students to interact with virtual objects and environments, facilitating hands-on learning and exploration. This can be achieved through advanced interfaces like hand gesture recognition [5,36]. VR applications should offer customizable environments that cater to individual learning needs, allowing students to explore topics at their own pace and according to their interests [26].
The promotion of active learning through movement and gestures in a three-dimensional virtual environment can lead to a deeper understanding and retention of knowledge [5]. Effective Human-Computer Interaction (HCI) is central to immersive experiences, allowing users to interact with virtual environments seamlessly [37]. In educational settings, VR applications often include interactive elements that allow users to manipulate and explore abstract concepts, although narrative and social features are less commonly integrated [29].
The software should adapt to the learner’s progress, providing tailored feedback and challenges to optimize learning outcomes [38,39]. It should be scalable to accommodate different educational settings, from individual learners to large classrooms [40], and it should support collaborative learning, enabling students to work together in virtual spaces, share ideas, and solve problems collectively [41]. Incorporating social elements can enhance engagement and motivation, making learning more enjoyable and effective [39]. While VR offers significant advantages for STEM education, challenges such as the need for teacher training and the integration of VR into existing curricula must be addressed.

2.4. Gamification in STEM VR Education

Different game-design elements within VR environments have the role of enhancing learning experiences in science, technology, engineering, and mathematics. This approach aims to increase student engagement, motivation, and knowledge retention. The use of gamification and personalized learning in VR environments can further captivate students’ attention and encourage exploration and discovery [42].
Common gamification mechanisms in VR include rewards, challenges, and avatars. Elements such as content unlocking, point systems, task difficulty levels, and achievement systems are frequently used to create engaging learning environments. These elements are designed to motivate students by providing immediate feedback and a sense of progression [43].
VR platforms like VRCoding provide interactive and immersive experiences that encourage critical thinking and problem-solving. These environments allow students to actively participate in their learning process, moving away from passive listening to engaging with the content in a meaningful way [44].
Studies have shown that gamified VR environments significantly improve student engagement and motivation. For instance, the use of a gamified 3D virtual world for teaching computer architecture resulted in higher engagement levels and improved learning outcomes among engineering students [45]. While gamification can enhance learning, it is essential to balance game elements to avoid negative impacts on intrinsic motivation. For example, the use of high scores and achievements can lead to increased competition, which may detract from the learning experience [46].

3. Original VR Software to Teach Hydrodynamics

Drawing upon the previously mentioned elements, a tailored software application was developed to support the learning of core hydrodynamics principles essential for submarine design.
Submarine Simulator is a custom-made application created in Unity that facilitates engineering education both in VR and Augmented Reality (AR). The application consists of three main modes:
- Environment Selection: Once immersed in VR, students can choose a virtual environment before beginning the construction of a small-scale submarine model. The application currently lets students choose among an open world, a garage, a hangar, or an open space in nature. At this stage, students have the option to switch to AR instead of VR, which allows them to perceive their surrounding physical environment while using a real table as a blended digital-physical canvas to initiate the construction process (Figure 1).
- Construction Scene: After selecting a virtual environment, students use the VR controllers to drag, drop, and combine different 3D shapes into a submarine model (Figure 2). Multiple construction instruments and functions allow for building with precision. Additionally, students can change the weight, density, and material of each shape added to the environment. Once they decide that the shape of the submarine is complete, they may also attach motors. If a student has opted for AR, all activities are performed by manipulating holograms that blend with the real space around them.
- Simulation Scene: After completing the submarine model, students can test it in a simulated underwater scene (Figure 3). Based on preference, they can control the submarine from either a third-person or first-person perspective. The scene includes a hyper-realistic water simulator that helps students understand how the previously created model behaves underwater. Using the motors attached previously, they can control the submarine, navigate underwater, and visually understand how the thrust and the physical characteristics of the submarine model interact with the water environment (a simplified sketch of this kind of force computation is given after this list).
- Gamification: Gamification in VR has been linked to better knowledge retention and understanding (Figure 4). The integration of game elements helps students grasp complex concepts more effectively, as seen in the positive feedback from students using the VR Coding system for learning computational thinking [44,45,46]. Our setup leverages gamification principles—such as points, leaderboards, real-time competition, and immediate feedback—to transform the learning process into an interactive and motivating experience. These principles deepen comprehension by encouraging problem-solving in a dynamic VR context, where failures and successes provide direct learning opportunities.
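To make the Construction and Simulation Scenes above more concrete, the following is a minimal, hypothetical Python sketch of how a composite submarine model (shapes with student-set density and volume, plus attached motors) could feed a per-frame update combining gravity, buoyancy, drag, and thrust. All names, constants, and simplifications (point-mass dynamics, fully submerged hull, fixed drag area, no rotation) are illustrative assumptions and do not represent the actual Unity physics engine used in Submarine Simulator.

```python
from dataclasses import dataclass, field
from typing import List
import numpy as np

G = 9.81            # gravitational acceleration (m/s^2)
RHO_WATER = 1000.0  # water density (kg/m^3)

@dataclass
class Part:
    volume_m3: float   # displaced volume of the shape
    density: float     # material density chosen by the student (kg/m^3)

    @property
    def mass(self) -> float:
        return self.volume_m3 * self.density

@dataclass
class Motor:
    thrust: np.ndarray  # thrust vector in world coordinates (N)

@dataclass
class SubmarineModel:
    parts: List[Part] = field(default_factory=list)
    motors: List[Motor] = field(default_factory=list)
    velocity: np.ndarray = field(default_factory=lambda: np.zeros(3))
    position: np.ndarray = field(default_factory=lambda: np.zeros(3))
    drag_coefficient: float = 0.8
    frontal_area_m2: float = 0.05

    @property
    def mass(self) -> float:
        return sum(p.mass for p in self.parts)

    @property
    def displaced_volume(self) -> float:
        # Simplification: the hull is assumed fully submerged.
        return sum(p.volume_m3 for p in self.parts)

    def step(self, dt: float) -> None:
        """One semi-implicit Euler step: sum forces, update velocity, then position."""
        up = np.array([0.0, 1.0, 0.0])
        gravity = -self.mass * G * up
        buoyancy = RHO_WATER * G * self.displaced_volume * up
        speed = np.linalg.norm(self.velocity)
        drag = (-0.5 * RHO_WATER * self.drag_coefficient * self.frontal_area_m2
                * speed * self.velocity)
        thrust = sum((m.thrust for m in self.motors), np.zeros(3))
        acceleration = (gravity + buoyancy + drag + thrust) / self.mass
        self.velocity += acceleration * dt
        self.position += self.velocity * dt

# Example: a two-part, neutrally buoyant hull with one forward-facing motor, stepped at 60 Hz
sub = SubmarineModel(parts=[Part(0.01, 900.0), Part(0.01, 1100.0)],
                     motors=[Motor(np.array([20.0, 0.0, 0.0]))])
for _ in range(60):
    sub.step(1.0 / 60.0)
print(sub.position, sub.velocity)
```

In this toy configuration the two parts cancel each other’s net buoyancy, so the thrust alone drives forward motion against drag; changing a part’s density or a motor’s orientation immediately changes the trajectory, which is the kind of cause-and-effect feedback the simulator exposes to students.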
In our testing process, participants had the opportunity to test their custom-designed 3D submarine models by competing in virtual underwater racetracks, all simulated within the immersive VR environment. Upon successfully completing a race, students earned points based on their performance and could view their final completion time for the specific track. To enhance engagement, the system supported real-time multiplayer racing, where users could compete against their colleagues. This was facilitated through a visual “shadow” representation of peers’ submarines, allowing participants to see and react to others’ progress dynamically throughout the course. Ultimately, students achieving the highest cumulative scores across races were declared winners, fostering a competitive and collaborative learning atmosphere.
It is worth mentioning that the instructor overseeing these learning sessions could monitor everything live. The virtual simulation viewed through each student’s headset was duplicated on the instructor’s displays, enabling them to provide appropriate guidance and assistance to participants if required.

4. Methodology

The assessment of the value inherent in VR software requires a comprehensive methodology that encompasses various dimensions, including performance, usability, user engagement, and the context in which the software is being used. The evaluation process is very important for identifying potential challenges and for ensuring that the software effectively fulfills its intended objectives.

4.1. Research Objectives and Hypothesis

The primary objective of this study was to evaluate the perceived effectiveness of a custom-built software application, termed Submarine Simulator, within the context of a course on underwater engineering in the fourth year of the engineering bachelor’s program at MINES Paris—PSL, France.
Other objectives were related to assessing the software’s impact on student engagement, specifically exploring how its tailored affordances influence the educational experience in a specialized engineering curriculum. By integrating advanced simulations, the research sought to determine the extent to which the Submarine Simulator enhances students’ understanding of complex underwater engineering concepts and fosters an immersive learning environment conducive to achieving a state of flow.
More specifically, we tested the following hypotheses:
H1. 
Affordances provided by custom-developed VR applications for engineering education significantly influence users’ reported state of flow.
H2. 
The perceived state of flow among users significantly affects the perceived value of the learning environment, as measured by the CEGE framework.

4.2. Research Methods

4.2.1. The Quality of the Experience

To assess the engagement and quality of the VR experience, a custom-made affordance questionnaire was administered immediately following the first practice session (Phase 1). Affordances, as defined by Gibson (1979), refer to the possibilities for action provided by an environment, which in the context of VR systems like the Submarine Simulator, include how intuitive, accessible, and engaging the interface and interactions are for users [47]. The questionnaire was developed based on general guidelines for affordance evaluation, ensuring it captures the specific affordances relevant to the VR environment.
The affordance questionnaire consists of 20 items across four key dimensions:
  • Actions: Measures the clarity and intuitiveness of interaction possibilities (e.g., ease of navigating the VR environment).
  • Engagement: Assesses the degree to which the VR system fosters immersive and sustained user involvement.
  • Fluid Mechanics: Evaluates the smoothness and responsiveness of dynamic interactions (e.g., object manipulation, movement within the simulator).
  • Functions: Measures the effectiveness and accessibility of the VR software’s core building and maneuvering functionalities (e.g., control panels, functions).
Each item is rated on a 5-point Likert scale (1 = Strongly Disagree, 5 = Strongly Agree), allowing for nuanced quantification of user perceptions. The questionnaire, detailed in Appendix A, was administered immediately after their Phase 1 session to capture real-time feedback and minimize recall bias. The greater the total score, the stronger the perceived affordance of the VR system.

4.2.2. The Flow State Scale (FSS) Questionnaire

Developed by Jackson and Marsh (1996), the FSS is a validated psychometric tool designed to measure the psychological state of flow, characterized by deep immersion, focus, and enjoyment in an activity [48].
The FSS was administered immediately following the second practice session (Phase 2) to capture participants’ flow experiences while interacting with the VR software. The questionnaire consists of 36 items, grouped into nine subscales, with the dimensions being measured on a 5-point Likert scale (1 = Strongly Disagree, 5 = Strongly Agree):
  • Challenge-Skill Balance: Perceiving that personal skills match the task’s demands.
  • Action-Awareness Merging: Experiencing seamless integration of actions and awareness.
  • Clear Goals: Having a clear understanding of objectives.
  • Unambiguous Feedback: Receiving immediate, clear feedback on performance.
  • Concentration on the Task: Maintaining deep focus without distractions.
  • Sense of Control: Feeling in command of the activity.
  • Loss of Self-Consciousness: Becoming less aware of self and external judgments.
  • Transformation of Time: Perceiving time as altered, either speeding up or slowing down.
  • Autotelic Experience: Finding the activity intrinsically rewarding.
For each of the nine dimensions of the FSS, the scores from the four corresponding items are summed and averaged to produce a mean score. This approach quantifies participants’ flow experiences, enabling the examination of their connections to learning styles and performance outcomes in immersive VR environments, such as the Submarine Simulator, where sustained engagement is critical for effective learning.
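As a concrete illustration of the scoring rules used for the three instruments, the following Python sketch averages the four items of each of the nine FSS subscales and uses a simple sum for instruments scored by totals (the affordance questionnaire and, later, CEGE). The item-to-subscale ordering and the sample responses are hypothetical placeholders, not the study’s actual item layout or data.

```python
import numpy as np

FSS_SUBSCALES = [
    "challenge_skill_balance", "action_awareness_merging", "clear_goals",
    "unambiguous_feedback", "concentration", "sense_of_control",
    "loss_of_self_consciousness", "transformation_of_time", "autotelic_experience",
]

def score_fss(responses: np.ndarray) -> dict:
    """responses: 36 Likert ratings (1-5), ordered so that items 4*k .. 4*k+3
    belong to subscale k (an assumed ordering, for illustration only)."""
    assert responses.shape == (36,)
    return {name: float(responses[4 * k: 4 * k + 4].mean())
            for k, name in enumerate(FSS_SUBSCALES)}

def total_score(responses: np.ndarray) -> int:
    """Simple sum, as used for the affordance (20 items) and CEGE (38 items) totals."""
    return int(responses.sum())

# Hypothetical respondent
rng = np.random.default_rng(0)
answers = rng.integers(1, 6, size=36)
print(score_fss(answers))
print(total_score(rng.integers(1, 6, size=20)))  # example affordance total
```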

4.2.3. The Core Elements of Gaming Experience (CEGE) Questionnaire

Developed by Calvillo-Gámez et al. (2010), CEGE is a validated tool designed to measure the quality of a gaming experience by assessing key psychological and interactive elements [49].
It was administered at the end of the third practice session (Phase 3) to evaluate the perceived value of the Submarine Simulator’s VR environment in fostering an engaging and rewarding gaming experience. The CEGE questionnaire comprises 38 items grouped into six core dimensions:
  • Enjoyment: Measures the level of fun and satisfaction derived from the VR experience.
  • Control: Assesses the user’s sense of control and ability to interact effectively with the VR system.
  • Immersion: Evaluates the depth of psychological absorption in the VR environment.
  • Challenge: Gauges the perceived difficulty and stimulation provided by the VR tasks.
  • Focus: Measures the state of optimal engagement where users are fully focused and immersed.
  • Game Mechanics: Assesses the quality and intuitiveness of the VR system’s interactive elements (e.g., controls, in-game construction tools).
Each item is rated on a 5-point Likert scale (1 = Strongly Disagree, 5 = Strongly Agree), allowing for a nuanced quantification of the gaming experience. CEGE was chosen for its comprehensive coverage of gaming experience facets, which align with the study’s goal of evaluating how the Submarine Simulator’s affordances—actionable properties like intuitive controls and immersive tasks—contribute to perceived value and engagement.
The higher the overall CEGE score (sum of all items, ranging from 38 to 190), the stronger the perceived affordances and value of the VR system.
From both a contextual and stakeholder perspective, understanding the values, needs, and objectives of diverse stakeholders (such as educators, learners, and developers) is critical in the design and evaluation of VR applications [50]. This approach ensures that the software aligns with its intended educational goals, delivering an effective and engaging user experience.
For educational VR applications, evaluation methods often include the Technology Acceptance Model and flow tests to assess how well the software supports learning objectives. In medical education contexts, such as surgical training or stroke rehabilitation simulations, these can be complemented by specialized tools that evaluate the tolerability of artificial movement types (e.g., teleportation or floating movements) to minimize simulator sickness, ensuring safer and more effective VR adoption. This approach involves isolating movement factors in a controlled virtual environment and using validated measures like the Simulator Sickness Questionnaire (SSQ) for pre- and post-exposure assessments, as demonstrated in recent software developments aimed at reducing adverse effects for vulnerable users [51]. In our case, we assess immersivity through affordance, FSS, and CEGE. Additionally, we use a refined navigation method based on zoom-in and zoom-out motions with the controllers to mitigate simulator and motion sickness.

4.3. Statistical Methods

A Confirmatory Factor Analysis (CFA) with path analysis was conducted to determine the relationship between the software features, as evaluated by students, and other variables connected to their psychological states, using the software SmartPLS (v3.2.9) [52,53]. This approach provides a coherent explanatory model of causal mechanisms, highlighting both the direct and indirect effects of independent variables on a dependent variable within a theoretically specified network of relationships.
Partial Least Squares Structural Equation Modeling (PLS-SEM), which is particularly well-suited for exploratory research involving small sample sizes, non-normally distributed data, and formative or composite measurement models, was conducted in SmartPLS, incorporating bootstrapping with 5000 samples to estimate path coefficients. PLS-SEM offers a nonparametric, variance-based alternative that can yield meaningful results with samples as small as 20–30 observations, provided model complexity is moderate [53,54]. PLS-SEM does not impose distributional assumptions and utilizes bootstrapping procedures to estimate significance, making it an appropriate choice when data normality or scale properties are not guaranteed. As a result, these features made SmartPLS the most appropriate tool for our exploratory analysis under the conditions of limited sample size and emerging constructs in a novel educational context [54].
Beyond R2, we also reported f2 effect sizes, which quantify the contribution of each independent variable to the dependent variable, thereby complementing R2 with a measure of local effect strength [55]. Additionally, the quality of the model was assessed using model fit indices, specifically SRMR (Standardized Root Mean Square Residual), NFI (Normed Fit Index), d_ULS (Unweighted Least Squares Discrepancy), d_G (Geodesic Discrepancy), and Chi-square values, which serve a similar function to loss/error metrics in predictive modeling, particularly in variance-based SEM contexts. Discriminant validity was evaluated using the Fornell–Larcker criterion.
While traditional loss functions such as MSE (Mean Squared Error) or cross-entropy are not directly applicable in SmartPLS’s estimation framework, given its nonparametric and component-based nature, the bootstrapping procedure (5000 resamples) and outer model loadings offer robust estimates of parameter stability and indicator reliability.
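For readers unfamiliar with the quantities reported in Section 5, the following Python sketch shows their textbook definitions: Cohen’s f2 effect size, Average Variance Extracted (AVE), and composite reliability. SmartPLS computes these internally; the loadings used here are hypothetical, and only the final f2 example reuses an R2 value of the magnitude reported later (0.658).

```python
import numpy as np

def f_squared(r2_included: float, r2_excluded: float) -> float:
    """Cohen's f^2 for a predictor: (R2_incl - R2_excl) / (1 - R2_incl)."""
    return (r2_included - r2_excluded) / (1.0 - r2_included)

def average_variance_extracted(loadings: np.ndarray) -> float:
    """AVE: mean of the squared standardized outer loadings of a reflective construct."""
    return float(np.mean(loadings ** 2))

def composite_reliability(loadings: np.ndarray) -> float:
    """rho_c = (sum lambda)^2 / ((sum lambda)^2 + sum(1 - lambda^2))."""
    s = loadings.sum()
    return float(s ** 2 / (s ** 2 + np.sum(1.0 - loadings ** 2)))

# Hypothetical standardized loadings for a reflective construct
lam = np.array([0.55, 0.62, 0.70, 0.58, 0.66, 0.71, 0.60, 0.64, 0.68])
print(average_variance_extracted(lam), composite_reliability(lam))

# With a single predictor, R2_excluded is 0, so f^2 reduces to R2 / (1 - R2):
print(f_squared(0.658, 0.0))  # roughly 1.9, i.e., a very large effect
```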

4.4. Target Group

The study involved a selected group of 26 fourth-year engineering students from MINES Paris—PSL, all specializing in engineering and sharing similar academic backgrounds in robotics, electronics, programming, and computer vision. This homogeneous cohort, primarily aged 22–28 and enrolled in the same university program, was intentionally chosen to minimize variations in skills, prior knowledge, and learning pace. This homogeneity enabled a more consistent and reliable evaluation of the VR Submarine Simulator’s effectiveness, as differences in outcomes could be attributed to the software itself rather than varied participant characteristics.
All 26 students voluntarily participated in the evaluation of the VR Submarine Simulator. To ensure participant anonymity, no personal identifiers, such as names, were used at any point during the study. Instead, each student was assigned a unique numerical code for reference. Every participant successfully completed all three phases of the study, as outlined in the research design (see Section 4.5), ensuring a comprehensive assessment of the VR software.

4.5. Research Design

The study was structured as a series of three sessions, logically divided into three phases, each designed to progressively build students’ proficiency in using the VR Submarine Simulator while deepening their understanding of underwater engineering principles.
Conducted over five weeks, these sessions involved distinct tasks tailored to enhance both technical skills and conceptual knowledge. Each of the 26 students completed three distinct phases. Phases 2 and 3 were conducted approximately one week after the preceding phase, ensuring a consistent schedule for all participants.
This study was embedded within a 3-month, on-site course dedicated to the design and prototyping of mid-sized underwater remotely operated vehicles (ROVs) ranging from 50 to 150 cm in length.
The figures in this section are direct screenshots from the original Submarine Simulator VR software. The following phases were designed and executed by each of the participants in our study:

4.5.1. Phase 1: Introduction and Foundational Training

This phase has been split into four chapters:
  • Chapter 0—Video Introduction: All students were introduced to the capabilities of the VR software through a 10-minute video explaining all available functionalities, with numerous examples.
  • Chapter 1—Building Mode: All students were invited to engage in the immersive VR experience, where they completed a step-by-step tutorial led by a trained facilitator. This hands-on session introduced them to the essential functions needed to construct small-scale submarine models, fostering a clear understanding of the VR Submarine Simulator’s core features (Figure 5).
  • Floatability Testing and Underwater Navigation: After mastering the construction process, students were invited to a virtual underwater environment where they could test their submarine models. This interactive setting allowed them to observe and analyze key hydrodynamic principles, including floatability, buoyancy, acceleration, and other dynamic behaviors, as applied to their designs.
  • Exploratory Learning: Lastly, once they had demonstrated a basic level of understanding and comfort while using the VR software, all students were given a construction task. The students had complete creative freedom during the construction process.
The VR session was limited to 35 min to prevent strain from participants’ unfamiliarity with the technology. A comprehensive list of commands and guidelines provided to all students is detailed in Appendix B.

4.5.2. Phase 2: Underwater Design for Spiral Trajectories

After re-acclimating to the VR environment, participants were given two construction tasks designed to deepen their understanding of hydrodynamic principles through practical application.
Using the VR simulator’s building mode, students were tasked with creating two distinct submarine models: one to navigate a tight spiral trajectory and another to follow a loose spiral trajectory (Figure 6). The following instructions were given explicitly:
“For the next 35 min, your task is to design two unique submarine models using the VR Submarine Simulator. Each model must include at least four shapes and two motors. One submarine should follow a tight spiral path with a radius of approximately 2 m, while the other should move in a loose spiral path with a radius of more than 5 m. You can build them in any order and adjust your designs as many times as needed. Once I confirm that a model’s trajectory is correct, you can move on to the next part of the exercise. Have fun creating and experimenting! Any questions before we start?”
These tasks required students to adjust key parameters, such as buoyancy, weight distribution, and motor positioning, to achieve the desired movement patterns in the simulated underwater environment. The tight spiral task demanded precise control to maintain stability within a confined radius, testing students’ ability to balance hydrodynamic forces under constrained conditions. In contrast, the loose spiral task allowed for broader movement, emphasizing sustained control over longer distances and varying currents.
Real-time feedback from the simulator’s tailored physics engine, featuring green visual markers for model stability and trajectory accuracy, allowed students to iteratively refine their submarine designs with precision and confidence.

4.5.3. Phase 3: Paired Competition in Underwater Racetracks

In the concluding phase of the study, conducted 7 days after Phase 2, a dynamic competitive element was introduced by organizing the 26 participants into 13 pairs, fostering both collaborative teamwork within each duo and spirited rivalry across teams.
Each pair worked together to strategize and generate creative solutions, then individually crafted submarine models tailored for three progressively demanding virtual underwater race tracks:
  • Track 1: Focused on straight-line propulsion with stable, deviation-free submarine models (Figure 7).
  • Track 2: Required precise left/right maneuvers (Figure 8).
  • Track 3: Demanded full maneuverability (up/down/left/right) in a complex underwater environment (Figure 9).
These meticulously designed tracks simulated real-world engineering scenarios, rigorously evaluating participants’ navigation precision, model performance, and design ingenuity in an immersive, gamified environment that heightened engagement and practical learning.

4.6. Control Conditions

To ensure the reliability and validity of the research, several control measures were implemented to minimize confounding variables and standardize the experimental conditions across all 26 participants. These measures were implemented to isolate the VR software’s impact on user engagement, making sure that any observed differences could be attributed solely to the intervention, rather than external variables.
The VR sessions took place in a controlled laboratory at MINES Paris—PSL, with consistent lighting (ambient, 500 lux) and temperature (22–24 °C) to maintain a uniform environment and minimize distractions. No external interruptions were permitted during the sessions, ensuring uninterrupted focus on the tasks.
All participants used standardized hardware configurations, featuring Meta Quest 3 (Meta Platforms, Inc., Menlo Park, CA, USA) VR headsets, paired controllers, and high-performance rendering laptops (HP OMEN 16-inch laptop (HP Inc., Palo Alto, CA, USA) with processor AMD Ryzen 9 6900HX (Advanced Micro Devices, Inc., Santa Clara, CA, USA), and graphics card NVIDIA GeForce RTX 3070Ti (NVIDIA Corporation, Santa Clara, CA, USA)), connected via USB-C cables to ensure minimal latency, eliminate lag, and prevent variability from equipment differences. Audio levels were consistently set on the headsets across all sessions to maintain a uniform sensory experience.
To standardize participant instructions, a trained facilitator delivered all commands verbally using a scripted protocol (detailed in Appendix B), ensuring consistency in guidance across all sessions. This approach prevented variations in instruction delivery that could influence participants’ performance or understanding of the VR software’s functionality.
To maintain consistency, participants adhered to a fixed schedule: those assigned a morning slot (e.g., 9:00–10:00 AM) for one phase continued with approximately the same time for subsequent phases. This controlled timing minimized the impact of fatigue or circadian rhythm variations on performance.
Lastly, participants were screened for VR-related motion sickness susceptibility prior to the study, with none reporting significant discomfort, ensuring consistent engagement with the VR environment.

4.7. Definition of Variables

To investigate the connection between user perceptions and the software features of the technical solution, we collected the following variables and their sub-components as critical and relevant for our analysis, based on data collected via questionnaires. These variables, detailed alongside their sub-components and corresponding loading factor values, are presented in the table below (Table 1).

5. Results

The data structure and distribution alongside the raw data have been expanded in Appendix C. Each variable was standardized to z-scores, centering and scaling values relative to their distribution.
To answer the research questions, a CFA path analysis was applied, which resulted in a model comprising the formative variables flow state (FLOW) and affordance/usability (AFF) and the reflective variable Core Elements of the Gaming Experience (CEGE), with the items presented in Table 1.
The statistical analysis provides robust evidence that the immersive and usability affordances of the VR software substantially influence user engagement and the quality of their learning experience. Specifically, the high path coefficients (0.811 and 0.765) indicate strong predictive relationships, demonstrating practically that intuitive interaction and realism directly contribute to achieving an optimal learning state (flow) among students (Figure 10).
However, some statistical validity measures discussed below (such as AVE below recommended thresholds) suggest that particular survey questions might not fully capture all nuances of user experiences or that responses reflected varied interpretations of questionnaire items. Practically, these results imply that minor refinements in measurement items or additional clarifications during data collection could further strengthen empirical validation of the software’s educational benefits.
Overall, we identified that affordances significantly shape the user’s state of flow; H1 is therefore supported. The very high path coefficient of the model (0.811) supports this relationship.
Table 2 presents the Coefficient of Determination (R2) that indicates the predictive power of the model, showing the proportion of variance explained by independent variables. Flow (R2 = 0.658) shows that 65.8% of the variance in flow is explained by affordance, indicating a strong model fit for this relationship. The R2 Adjusted (0.644) corrects for model complexity and remains high, showing model stability.
Table 3 presents the effect size for predictors (f2), measuring the impact of an independent variable on a dependent variable. Affordance has a very large effect on flow, highlighting that ease of use and immersive design are critical drivers of user engagement. This is an exceptionally strong relationship, indicating that VR affordances significantly shape the user’s state of flow.
These results suggest focusing on usability, control, and immersion in the VR experience to boost user flow and enjoyment. Table 4 presents reliability and validity metrics for the constructs affordance (AFF) and flow in the VR environment. AFF is a formative variable, so no reliability calculations were conducted for it; Flow is a reflective variable.
Cronbach’s Alpha (CA) for Flow is 0.842, indicating good internal consistency. The Spearman rho_A coefficient for Flow (0.862) shows good reliability. Composite Reliability (CR) for Flow (0.839) indicates good construct reliability. The Average Variance Extracted (AVE) for Flow (0.407) is below the recommended threshold of 0.50, meaning the construct may not explain enough variance in its indicators.
Flow shows good internal consistency and reliability (Cronbach’s Alpha, rho_A, and CR are all above 0.7). However, its low AVE (0.407) suggests potential validity concerns: the indicators may not sufficiently capture the flow construct.
Table 5 presents discriminant validity using the Fornell–Larcker criterion, which checks whether a construct is distinct from the others in a model. The diagonal values report the square root of AVE: 0.811 for AFF and 0.638 for Flow (the square root of its AVE of 0.407).
The correlation between AFF and Flow (0.811) is strong, and Flow’s square root of AVE (0.638) is lower than this correlation. This indicates insufficient discriminant validity: flow and affordance might not be clearly distinguishable from one another in the model, which may reflect conceptual overlap between immersive affordances and flow in the interactive VR context. In response, the measurement model should be revised in future work to better delineate these constructs.
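As a quick numerical check of the Fornell–Larcker assessment above, the following sketch compares the square root of Flow’s reported AVE (0.407) with the reported AFF–Flow correlation (0.811); it is a worked illustration only, not SmartPLS output.

```python
import math

ave_flow = 0.407        # reported AVE for the Flow construct
corr_aff_flow = 0.811   # reported correlation between AFF and Flow

sqrt_ave_flow = math.sqrt(ave_flow)  # ~0.638, the diagonal entry for Flow
print(f"sqrt(AVE_Flow) = {sqrt_ave_flow:.3f}, corr(AFF, Flow) = {corr_aff_flow:.3f}")
# Criterion: each construct's sqrt(AVE) should exceed its correlations with other constructs.
print("Fornell-Larcker criterion satisfied:", sqrt_ave_flow > corr_aff_flow)  # False
```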
In Table 6, the SRMR (Standardized Root Mean Square Residual) value of 0.121 exceeds the recommended maximum threshold of 0.10, suggesting a modest model fit with room for improvement.
For d_ULS (Unweighted Least Squares Discrepancy) and d_G (Geodesic Discrepancy), lower values indicate a better fit. The Chi-square value of 81.449 for the estimated model is at least equal to that of the saturated model; thus, the model fit can be considered acceptable. NFI = 0.509 < 0.80 indicates a poor fit; the low NFI could be due to the small sample size.
Model fit indices such as SRMR (0.121) and NFI (0.509 and 0.641 for the two models) fall short of ideal thresholds, suggesting the need for structural refinement and more parsimonious modeling in future analyses.
In the second path analysis (Figure 11), the very high path coefficient (0.765) supports the hypothesized relationship between the software characteristics, through flow state, and CEGE. H2 is therefore supported as well.
Table 7 presents the Coefficient of Determination (R2) that indicates the predictive power of the model, showing the proportion of variance explained by independent variables. CEGE (R2 = 0.586) shows that 58.6% of the variance in CEGE is explained by flow, indicating a strong model fit for this relationship. The R2 Adjusted (0.569) corrects for model complexity and remains high, showing model stability.
Table 8 presents the effect size for predictors (f2), measuring the impact of an independent variable on a dependent variable. Flow has a very large effect on CEGE, indicating that the flow experience is a critical driver of the perceived value and enjoyment of the learning environment. This is an exceptionally strong relationship, showing that the user’s state of flow significantly shapes the perceived gaming experience.
Educators and developers should emphasize enhancements in usability, control, and immersion within the VR experience to elevate user flow and enjoyment. Table 9 presents reliability and validity metrics for CEGE and Flow (as reflective variables) in the VR environment. AFF is a formative variable, so no reliability calculations were conducted for it.
Cronbach’s Alpha (CA) for CEGE is 0.793, indicating good internal consistency. The Spearman rho_A coefficient for Flow (0.869) shows good reliability. Composite Reliability (CR) for CEGE (0.803) indicates good construct reliability. The Average Variance Extracted (AVE) for CEGE (0.432) is below the recommended threshold of 0.50, meaning the construct may not explain enough variance in its indicators.
CEGE has good internal consistency and reliability (Cronbach’s Alpha, rho_A, and CR are all above 0.7). However, CEGE’s AVE is low (0.432), which suggests potential validity concerns. This could mean that the indicators for CEGE may not sufficiently capture the construct.
The SRMR (Standardized Root Mean Square Residual) value of 0.107 is very close to the maximum threshold of 0.10, suggesting the model is close to an acceptable fit, with room for improvement. For d_ULS (Unweighted Least Squares Discrepancy) and d_G (Geodesic Discrepancy), lower values indicate a better fit. The Chi-square value of 88.938 for the estimated model is at least equal to that of the saturated model; thus, the model fit can be considered acceptable. NFI = 0.641 < 0.80 indicates a poor fit; the low NFI could be due to the small sample size of 26 subjects (Table 10).

6. Discussion

VR has emerged as a promising tool for STEM education, offering immersive experiences and active learning opportunities. Our software meets a series of quality criteria, such as usability, fidelity, and effectiveness, in accordance with the guidelines provided in the relevant literature [8,10,28].
An important limitation of this study is its reliance on a cross-sectional design with immediate post-intervention assessments. This prevents us from making empirical claims about the long-term educational impact of knowledge transfer.
This design was chosen to efficiently evaluate the immediate efficacy of the Submarine Simulator. However, the absence of longitudinal data limits our ability to assess whether observed improvements in underwater engineering and hydrodynamic principles persist over time or translate to other contexts. Future studies should employ longitudinal designs, with follow-up assessments at multiple intervals, to investigate the durability and transferability of these effects. Despite this limitation, the current findings provide valuable evidence of our VR tool’s immediate benefits, laying the groundwork for more comprehensive evaluations.
The positive affordance questionnaire responses, where most students reported the software application as intuitive and supportive of task completion, suggest that the VR environment successfully bridged the gap between theoretical knowledge and simulated practice. Our VR application for submarine design and testing provides users with an intuitive and engaging platform to explore complex engineering principles. The application attempts to represent hydrodynamic forces as accurately as possible for real-time computation, allowing for realistic experimentation and observation. This immersive environment fosters a deeper understanding of theoretical concepts by enabling direct manipulation and real-time feedback, transforming abstract equations into tangible outcomes, in accordance with the learning objectives [29].
Secondly, the data reveals that most participants experienced flow, with self-reported metrics (via the Flow State Scale questionnaire) indicating high levels of absorption and enjoyment. Notably, the strong correlation between affordance perceptions and flow suggests that the software’s design elements, such as clear navigational cues and responsive interactions, facilitated this state by minimizing frustration and maximizing control.
This correlation extends to CEGE, including aspects like challenge, control, and feedback, which collectively enhanced immersion. Drawing from Csikszentmihalyi’s flow theory, these results imply that VR’s ability to create a “presence” in non-intuitive underwater settings through sensory realism and adaptive difficulty promotes flow more effectively than passive learning tools.
Furthermore, the application’s design actively promotes problem-solving, critical thinking, and iterative design processes. These are essential conditions for an efficient learning process [6,34,41]. Users can prototype rapidly, test, and refine their submarine models, learning from failures and optimizing their designs in a risk-free virtual space.
The interdependencies among affordance, flow, and CEGE offer a compelling theoretical contribution, suggesting an expanded model for VR in education: affordances act as precursors to engagement, mediating flow through gamified structures. This builds on existing models like the Technology Acceptance Model (TAM), incorporating psychological flow as a key variable for non-intuitive domains.
Practically, these insights advocate for VR developers to prioritize iterative user testing focused on affordances to optimize flow and engagement, aligning with broader VR-in-education research that emphasizes user-centered design. In this way, immersion and learning outcomes in simulated environments can be enhanced. This approach could revolutionize underwater engineering pedagogy by transforming abstract concepts into immersive, hands-on simulations that mirror real-world challenges, fostering deeper conceptual understanding and developing practical skills through repeated virtual practice without physical risks. It also bridges the gap with traditional methods, such as lectures and static diagrams, which often struggle to convey the dynamic complexities of underwater engineering. By incorporating VR into the learning environment, educators can support students in building the skills required for innovative problem-solving in real-world scenarios.
These findings suggest to educators the potential for scalable VR integration, exemplified by hybrid curricula that blend immersive VR sessions with collaborative debriefing to strengthen overall student engagement throughout the learning process. Institutions could leverage these correlations to justify VR integration, emphasizing its role in preparing students for industry demands where simulation training is standard.
Our VR submarine application exemplifies the potential of immersive technologies to improve STEM education. We believe it empowers students to not only grasp complex scientific and engineering principles but also to innovate and apply their knowledge in a practical, meaningful way. By prioritizing active learning strategies, the original software design aligns with Johnson-Glenberg’s (2019) call for a transformation in educational practices—one that centers on student interaction and engagement, which are crucial for effective STEM education [10].
Compared to complex Computer-Aided Design (CAD) software, which typically demands extensive training and significant time to master, our VR application’s intuitive interface and immersive environment show promise for enabling users to quickly learn its building and testing capabilities. However, this observation is preliminary, as it does not include comparisons with other design tools or data on long-term skill retention and real-world application.
Additionally, the accessibility and cost of VR hardware pose significant challenges, particularly for educational institutions with limited budgets. High-quality VR systems often require substantial investment in equipment, maintenance, and technical support, creating disparities in access to these advanced learning tools. For instance, individual VR headsets such as the Meta Quest 3 can cost between USD 500 and USD 600, while more advanced models such as the HTC Vive Pro range from USD 900 to USD 1200, not including additional expenses for VR-ready computers or laptops (USD 1200–2000) or multi-headset classroom packs, whose prices can quickly exceed what a basic configuration would suggest. These financial challenges are further intensified by ongoing expenses for creating educational content and maintaining technical infrastructure, such as high-speed internet and capable graphics hardware for smooth VR experiences. Such costs are particularly burdensome for schools with limited budgets, deepening educational disparities and making it harder for students worldwide to access cutting-edge learning tools.
Beyond financial barriers, VR adoption in education poses several practical challenges. Teachers and technical staff require specialized training to integrate VR effectively into lessons, as their role shifts from traditional instructors to facilitators who guide students through immersive, self-directed learning experiences. This transition demands not only technical proficiency with VR systems but also new pedagogical skills for managing interactive and dynamic virtual environments. Furthermore, overseeing an entire class of students using VR headsets can be particularly challenging for a single teacher. Unlike traditional classroom settings, where a teacher can easily monitor and engage with students, VR environments often isolate learners in individual headsets, making it difficult to maintain classroom discipline, provide real-time support, or ensure that all students remain on task. Effective supervision may therefore require additional support staff, advanced classroom management tools, or smaller class sizes, further complicating VR adoption in educational settings.
Compatibility issues, such as outdated computers or limited network capacity, can also hinder adoption. Ensuring user comfort is equally important; ergonomic hardware choices and session guidelines can reduce issues such as motion sickness and make VR more accessible and enjoyable for students. Other barriers include resistance to change among educators unfamiliar with VR, a shortage of high-quality educational content tailored to specific disciplines such as underwater engineering, and potential health effects of prolonged use, such as eye strain, all of which can slow the broader adoption of VR in educational environments.
Lastly, a potential barrier to adopting VR in education and training is technophobia, an irrational fear or aversion to new technologies that can cause anxiety, avoidance, or frustration among users. Practically, educators may experience heightened anxiety and low confidence during the initial use of VR, leading to resistance and limited implementation. Mitigation strategies, such as targeted training and gradual exposure, can build confidence and reduce barriers, enabling VR integration and adoption.
VR can enhance inclusion in education and training by creating immersive environments that accommodate diverse needs and overcome physical barriers. As noted in recent research, the current education landscape represents a unique opportunity to build accessible and inclusive virtual worlds from the very beginning, leveraging metaverse and extended reality technologies for lifelong learning. This benefits students with disabilities through personalized simulations, adaptive training, and remote access, as evidenced by VR’s effectiveness in special education [56].
Furthermore, VR promotes equitable learning pathways, such as STEM applications for underrepresented groups, fostering creativity and empathy while aligning with sustainability goals. By integrating supportive frameworks, educators can create inclusive ecosystems that reduce disparities and broaden participation.

7. Conclusions

The originality of our VR application is rooted in several key elements that differentiate it from existing educational tools. A cornerstone of this novelty is the customized physics engine specifically designed for the underwater simulation scene. Recognizing the limitations of generic physics engines in accurately representing complex hydrodynamics, we carefully re-engineered over 60% of the code. This extensive process delivered a higher level of fidelity and realism when testing submarine models, moving the experience beyond a simplistic game and into genuine simulation territory. The realistic treatment of buoyancy, drag, and fluid dynamics allows for an authentic experience that aims to mirror real-world underwater environments, a critical factor for effective engineering education.
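The engine's source code is not reproduced in this article; purely as an illustration, the following minimal Python sketch shows the kind of per-step force balance such a simulation typically evaluates: Archimedes buoyancy against weight plus a quadratic drag term. The density, drag coefficient, and reference area values are generic placeholders and are not parameters taken from the Submarine Simulator.

```python
# Minimal illustrative sketch (not the simulator's actual engine): net vertical
# force on a fully submerged body from buoyancy, weight, and quadratic drag.

RHO_WATER = 1000.0   # kg/m^3, fresh-water density (assumed placeholder)
G = 9.81             # m/s^2, gravitational acceleration

def net_vertical_force(mass_kg, submerged_volume_m3, vertical_velocity_ms,
                       drag_coefficient=0.82, reference_area_m2=0.04):
    """Return the net upward force in newtons.

    Buoyancy follows Archimedes' principle (rho * g * displaced volume);
    drag uses the standard quadratic model 0.5 * rho * Cd * A * v^2 and
    always opposes the current direction of vertical motion. The default
    drag coefficient and reference area are placeholders for illustration.
    """
    buoyancy = RHO_WATER * G * submerged_volume_m3              # upward
    weight = mass_kg * G                                        # downward
    drag = 0.5 * RHO_WATER * drag_coefficient * reference_area_m2 * vertical_velocity_ms**2
    drag_signed = -drag if vertical_velocity_ms > 0 else drag   # opposes motion (positive v = rising)
    return buoyancy - weight + drag_signed

# Example: an 8 kg hull displacing 7.5 litres while rising at 0.3 m/s
print(net_vertical_force(8.0, 0.0075, 0.3))   # about -6.4 N, i.e., a net downward force
```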
Furthermore, the perceived quality and usability of the application were independently assessed through the affordance questionnaire, which yielded very high ratings. These ratings underscore the intuitive design and ease of interaction within the virtual environment, ensuring that users can focus on learning and experimentation rather than struggling with the interface.
Perhaps the most compelling element of novelty is the absence of any other publicly available and mature application that offers a similar comprehensive platform for virtual submarine design and testing. While isolated simulations or design tools may exist, none integrate the full spectrum of design, testing, and realistic physics within an immersive VR environment to the degree our application does. This unique positioning fills a significant gap in STEM education tools, offering an unparalleled opportunity for students and professionals to engage with complex naval architecture and engineering principles in an accessible and engaging way.
These early findings indicate that the VR application’s user-friendly design could reduce barriers for learners, potentially enabling a wider audience, including those unfamiliar with complex design software, to engage with advanced design and engineering concepts. This streamlined onboarding process appears to enhance engagement and optimize learning time, allowing users to transition quickly from learning the interface to experimenting and innovating. Further research, including comparative studies with tools like CAD and longitudinal assessments of skill durability, is needed to validate these observations and fully evaluate the application’s educational impact.

Limits and Future Work

In the evaluation process of this software, we highlight an initial limitation related to the relatively small number of participants. Repeating the study with a larger sample depends on gaining access to a group with similar characteristics. Further studies will examine the relationship between the effectiveness of the training process and the characteristics of the participants involved in it.
Additionally, the small sample size and the homogeneous nature of the cohort, drawn from a single institution, pose challenges to the generalizability of the findings. The uniformity in age, educational background, and institutional context may not reflect the diversity of learners in other settings, such as different universities, engineering disciplines, or professional environments.
This homogeneity introduces potential sample bias, as the cohort’s specific characteristics—such as their advanced engineering training or familiarity with technology—may have influenced their performance and engagement with the VR software, potentially inflating its perceived effectiveness. These limitations suggest that the study’s results may not fully apply to broader populations, such as novice learners, students from varied academic backgrounds, or professionals in underwater engineering.
To address the limitations of the small and homogeneous sample, further research should incorporate larger and more diverse participant groups, including students from multiple institutions, varying levels of expertise, and different demographic backgrounds. Such studies would enhance the generalizability of the findings and reduce potential biases, providing a more robust understanding of the VR Submarine Simulator's effectiveness across diverse educational and professional contexts.
While the evaluation of VR software for educational purposes is comprehensive, it is important to recognize the inherent challenges and limitations in the process itself. One significant limitation lies in standardizing evaluation metrics across diverse VR applications. The immersive and interactive nature of VR introduces variables not typically found in traditional software, such as motion sickness and the difficulty in isolating the specific pedagogical gains directly attributable to the VR experience versus other learning modalities. Quantifying the subtle, yet powerful, influence of presence and immersion on learning outcomes remains a complex area of research.
There are some limitations concerning the statistical validity of certain measures. Specifically, the AVE and discriminant validity indicators fell below standard thresholds, raising concerns about measurement precision and the interpretability of the results. These issues may stem from the small sample size, ambiguous questionnaire formulations, or language barriers. We acknowledge that the AVE values for the flow construct (0.407) and the Core Elements of the Gaming Experience (CEGE, 0.432) fall below the recommended threshold of 0.50 [57], indicating potential weaknesses in convergent validity.
Given the exploratory nature of this study and the novelty of the VR learning environment, we recognize the challenge of precisely defining latent constructs at this stage; future iterations will therefore involve item refinement and validation on a larger, homogeneous sample.
While the current sample size of 26 is acceptable for exploratory PLS-SEM, we acknowledge that it constrains generalizability and statistical power; subsequent studies will therefore involve larger samples to enable more robust validation, subgroup analyses, and improved reliability of parameter estimates.
To address another limitation in future studies, we recommend refining the affordance questionnaire items to more explicitly align with VR affordance and flow constructs. Clearer wording or adding illustrative examples to certain survey questions may improve participant comprehension and response accuracy, enhancing subsequent statistical robustness.
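As a transparency aid, the AVE and composite reliability figures discussed above can be recomputed directly from the standardized loadings listed in Table 1 using the standard formulas. The short Python sketch below (with the loadings keyed in by hand) is a check on the reported values, not a re-analysis of the raw data.

```python
# Sketch: recompute AVE and composite reliability (CR) from the standardized
# loadings in Table 1 (values keyed in manually for this check).
from math import sqrt

flow_loadings = [0.653, 0.813, 0.558, 0.466, 0.645, 0.664, 0.803, 0.373]
cege_loadings = [0.333, 0.971, 0.395, 0.723, 0.603, 0.705]

def ave(loadings):
    """Average Variance Extracted: mean of the squared standardized loadings."""
    return sum(l * l for l in loadings) / len(loadings)

def composite_reliability(loadings):
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)."""
    s = sum(loadings)
    error = sum(1 - l * l for l in loadings)
    return s * s / (s * s + error)

for name, loadings in [("Flow", flow_loadings), ("CEGE", cege_loadings)]:
    a = ave(loadings)
    print(f"{name}: AVE = {a:.3f}, CR = {composite_reliability(loadings):.3f}, "
          f"sqrt(AVE) = {sqrt(a):.3f}")
```

Running this reproduces the reported reliability figures (flow: AVE = 0.407, CR = 0.839; CEGE: AVE = 0.432, CR = 0.803), and the square root of the flow AVE (about 0.638) is the diagonal entry of the Fornell–Larcker comparison in Table 5.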
Another challenge involves the long-term retention and transfer of knowledge gained in VR environments. While initial studies often show strong immediate learning, more longitudinal research is needed to determine how well these skills and understandings translate to real-world applications and how durable they are over time.
Potential confounding factors further complicate VR implementation and research evaluation. These include variability in participants’ prior VR and gaming experience, which can influence perceived immersion and learning outcomes; differences in hardware performance across institutions, leading to inconsistent user experiences; and environmental factors like classroom setup, lighting, noise, or distractions that may diminish the sense of presence in VR simulations. In educational research contexts, such as studies on our Submarine Simulator, confounding often arises in media comparison designs where factors like instructional methods, novelty effects, or learner motivation are not adequately controlled, potentially skewing results on VR’s efficacy compared to traditional methods. Addressing these requires standardized protocols for VR deployment, including guidelines for hardware setup, user onboarding processes to mitigate experience discrepancies, and consistent evaluation metrics tailored to VR’s unique features, such as immersion scales and longitudinal tracking of skill transfer.
Future research should therefore focus on optimizing these environments to maximize their educational potential while addressing these challenges. This includes developing more robust and universally applicable evaluation frameworks that account for the unique characteristics of VR. Exploring how VR can be seamlessly integrated with other teaching methods to create blended learning experiences will be key to unlocking its full potential, ensuring that it complements, rather than replaces, traditional educational approaches.

Author Contributions

Conceptualization, A.-B.S., S.T. and R.-V.R.; methodology, A.-B.S., S.T. and R.-V.R.; software, A.-B.S.; validation, A.-B.S.; formal analysis, A.-B.S., S.T. and R.-V.R.; investigation, A.-B.S.; writing—original draft preparation, A.-B.S., S.T. and R.-V.R.; writing—review and editing, A.-B.S., S.T. and R.-V.R.; supervision, S.T. and R.-V.R. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

No new data was created.

Acknowledgments

The research was run within doctoral studies and supervised by the coordinators from MINES Paris—PSL and the National University of Science and Technology “POLITEHNICA” Bucharest.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A

Affordance Questionnaire Items

On a scale from 1 to 5, where 1 is Strongly Disagree and 5 is Strongly Agree:
  • I found the VR environment easy to navigate (using the menu, moving closer to or away from the submarine model, changing modes, changing views, and everything related).
  • The VR controls were easy to learn and use.
  • The overall VR experience did NOT cause any discomfort or motion sickness.
  • The VR environment provided clear visual indications of what I could interact with.
  • The VR environment felt responsive to my actions.
  • I felt confident about how I could interact with objects in the VR environment.
  • At the end of the experience, I felt confident about how to build and test submarine models.
  • I found it easy to personalize the appearance and functionality of my submarine in Build Mode.
  • The Build Mode provided enough freedom for my creativity in designing a unique submarine model.
  • The VR environment provided clear indications of what actions I could perform.
  • The tools in VR felt intuitive to use.
  • I found it easy to understand the function of the different tools available in VR.
  • When I looked at two components, I could easily tell if they were snapped together.
  • I believe that the VR simulation had a good representation of underwater fluid mechanics (in alignment and close to reality).
  • I believe that all the represented forces and calculations during the underwater simulation were enough to understand the efficiency of my submarine model.
  • I felt engaged and fully focused during the experiment.
  • I felt comfortable in the virtual submarine simulation environment (scene 2).
  • The experiment allowed me to iterate fast and efficiently over my submarine models.
  • Overall—I believe that this learning experience in Virtual Reality brings real educational value in the process of prototyping a real, small-scale submarine.
  • I did not feel that one hour had passed; I lost track of time.
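The exact scoring rule for the affordance total is not stated in the text; assuming it is the plain sum of the 20 five-point items above (a theoretical maximum of 100, consistent with the 70–98 range observed in Table A5), a minimal aggregation sketch in Python would look as follows. The function name and validation checks are illustrative only.

```python
# Minimal sketch, assuming the affordance (AFF) total is the plain sum of the
# 20 five-point items above (theoretical maximum 100). The scoring rule and
# function name are illustrative assumptions, not taken from the paper.

def affordance_score(responses):
    """responses: 20 integers, each from 1 (Strongly Disagree) to 5 (Strongly Agree)."""
    if len(responses) != 20:
        raise ValueError("Expected exactly 20 item responses.")
    if any(not 1 <= r <= 5 for r in responses):
        raise ValueError("Each response must be on the 1-5 Likert scale.")
    return sum(responses)

# Example: a participant answering mostly 4s and 5s scores 88 out of 100.
print(affordance_score([5, 4, 5, 4, 4, 5, 4, 4, 5, 4, 4, 5, 4, 4, 5, 4, 5, 4, 5, 4]))
```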

Appendix B

Table A1. Commands and guidelines given to all participants.
Chapter 0:
Video Introduction
  • The video tutorial contained an in-depth explanation of each function and mode available in the VR software application.
Chapter 1:
Building Mode
  • Create a new submarine model.
  • Add a cube to the scene and move it to the center of the table.
  • Scale the cube proportionally.
  • Change the material of the cube to ABS.
  • Make sure the weight of the cube is between 7 and 10 kg.
  • Add a cylinder to the scene.
  • Snap the cylinder on top of the cube exactly in the middle using the “Face Snap” function.
  • Attach two motors to the cube, one motor on the left side of the cube, and one motor on the right side of the cube.
  • Save your model.
  • Move to simulation mode with the current model.
Chapter 2:
Floatability Testing and
Underwater navigation
  • In front of you—on the virtual table you have two handles—use the left one to rotate around the object and the right one to zoom in and zoom out.
  • Below, under the table—you have two handles. Each one has the same color as the motor it controls.
  • Use the motors individually or together to move the ensemble underwater.
  • Experiment with navigating with the model underwater—rotate it/move forward/backwards.
  • Press “A” to change to a first-person view and use the joysticks to move underwater.
  • Press “A” again to return to the initial third-person view.
  • Restart the simulation using the green button on the console.
  • Exit the simulation using the red button on the console.
  • Open Q&A
Chapter 3:
Exploratory Learning
  • Create a model made of 4 shapes and 2 motors that sinks into the water between 1000 and 2000 m. The model needs to be as controllable as possible (can move forward, backwards, left, and right in a controlled manner).
  • Participants were informed that they had complete creative freedom to design their 3D submarine models.

Appendix C

Data and Data Distribution Review

Table A2. Distribution of data—affordance.
Count: 26
Mean: 85.69
Median: 87.00
Std Deviation: 6.92
Min: 70.00
25th percentile: 81.25
75th percentile: 91.00
Max: 98.00
Skewness: −0.42
Kurtosis: −0.42
Table A3. Distribution of data—flow.
Count: 26
Mean: 3.80
Median: 3.92
Std Deviation: 0.54
Min: 2.50
25th percentile: 3.41
75th percentile: 4.19
Max: 4.53
Skewness: −0.85
Kurtosis: 0.21
Table A4. Distribution of data—CEGE.
Count: 26
Mean: 5.17
Median: 5.24
Std Deviation: 0.64
Min: 3.39
25th percentile: 4.84
75th percentile: 5.66
Max: 6.24
Skewness: −0.68
Kurtosis: 0.58
All three dimensions show left-skewed distributions, suggesting that while most users score moderately high, a few lower scores pull the means down. In absolute terms AFF shows the widest spread, reflecting its wider scoring range, while FSS and CEGE are more consistent. With only 26 data points, these patterns should be interpreted cautiously; larger samples might reveal more about underlying normality or multimodality.
The following raw data has been utilized throughout the statistical analysis described in the section related to results:
Table A5. Raw data—questionnaire results for each participant in the study.
UserID   Affordance (AFF)   Flow (FSS)   Core Elements of Gaming Experience (CEGE)
290   73   3.778   4.303
709   83   4.028   5.455
227   81   4.528   6.152
512   93   4.222   5.212
177   91   4.389   5.788
443   90   4.194   5.788
187   76   3.972   4.667
159   89   4.333   5.152
830   93   4.194   5.667
670   88   3.778   4.909
590   79   3.556   4.333
595   83   3.361   4.818
682   82   3.972   4.818
341   87   4.361   5.515
411   91   3.861   5.273
840   91   3.194   5.788
388   94   3.972   5.061
723   94   4.111   5.273
985   84   2.528   4.939
779   79   3.361   3.394
298   80   3.333   4.303
484   98   4.528   6.242
770   87   3.222   5.273
955   87   3.722   5.697
642   70   2.500   4.939
256   85   3.778   5.636
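As a reproducibility aid, the summary statistics of Tables A2–A4 can be recomputed from the raw scores above. The sketch below does this for the affordance column using pandas, with the values keyed in manually; the FSS and CEGE columns can be processed the same way. Exact agreement of the rounded skewness and kurtosis depends on the estimator used; the bias-corrected sample estimators provided by pandas are assumed here.

```python
# Sketch: recompute the Table A2 descriptives from the affordance (AFF) column
# of Table A5; the FSS and CEGE columns can be treated the same way.
import pandas as pd

aff = pd.Series([73, 83, 81, 93, 91, 90, 76, 89, 93, 88, 79, 83, 82,
                 87, 91, 91, 94, 94, 84, 79, 80, 98, 87, 87, 70, 85],
                name="AFF")

print(aff.describe())                      # count, mean, std, min, 25%, 50%, 75%, max
print("skewness:", round(aff.skew(), 2))   # expected to be close to -0.42
print("kurtosis:", round(aff.kurt(), 2))   # expected to be close to -0.42
```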

References

  1. Qorbani, H.S.; Arya, A.; Nowlan, N.S.; Abdinejad, M. ScienceVR: A Virtual Reality Framework for STEM Education, Simulation and Assessment. In Proceedings of the 2021 IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR), Taichung, China, 15–17 November 2021; pp. 267–275. [Google Scholar] [CrossRef]
  2. Ward, T.; Ortega-Moody, J.; Khoury, S.; Wheatley, M.; Jenab, K. Virtual reality platforms for K-12 STEM education. Manag. Sci. Lett. 2025, 15, 193–204. [Google Scholar] [CrossRef]
  3. Smith, J.R.; Snapp, B.; Madar, S.; Brown, J.R.; Fowler, J.; Andersen, M.; Porter, C.D.; Orban, C. A Smartphone-Based Virtual Reality Plotting System for STEM Education. PRIMUS 2023, 33, 1–15. [Google Scholar] [CrossRef]
  4. Nersesian, E.; Vinnikov, M.; Lee, M.J. Travel Kinematics in Virtual Reality Increases Learning Efficiency. In Proceedings of the 2021 IEEE Symposium on Visual Languages and Human-Centric Computing (VL/HCC), St. Louis, MO, USA, 10–13 October 2021; pp. 1–5. [Google Scholar] [CrossRef]
  5. Ha, O. Development of a low-cost immersive virtual reality solution for STEM classroom instruction: A case in Engineering Statics. In Proceedings of the 2020 2nd International Workshop on Artificial Intelligence and Education (WAIE 2020), Montreal, QC, Canada, 6–8 November 2020; Association for Computing Machinery: New York, NY, USA; pp. 64–68. [Google Scholar] [CrossRef]
  6. Janonis, A.; Kiudys, E.; Girdžiūna, M.; Blažauskas, T.; Paulauskas, L.; Andrejevas, A. Escape the Lab: Chemical Experiments in Virtual Reality; Springer: Cham, Switzerland, 2020; pp. 273–282. [Google Scholar] [CrossRef]
  7. Laseinde, O.T.; Dada, D. Enhancing teaching and learning in STEM Labs: The development of an android-based virtual reality platform. Mater. Today Proc. 2023, 105, 240–246. [Google Scholar] [CrossRef]
  8. Roy, S.; Chang, D.Y.; Stocksdale, G.; Gunasekera, M.; Rizwan-uddin, R. Recent Advances in VR Labs for Use in STEM Education. In Proceedings of the Polytechnic University of Valencia Congress, Tenth International Conference on Higher Education Advances, Valencia, Spain, 18–21 June 2024. [Google Scholar] [CrossRef]
  9. Klän, W. Using Virtual Reality to Improve STEM Education. Ph.D. Thesis, Carleton University, Ottawa, Canada, 2023. [Google Scholar] [CrossRef]
  10. Johnson-Glenberg, M.C. The Necessary Nine: Design Principles for Embodied VR and Active Stem Education; Springer: Singapore, 2019; pp. 83–112. [Google Scholar] [CrossRef]
  11. Semerikov, S.; Mintii, M.; Mintii, I. Review of the course “Development of Virtual and Augmented Reality Software” for STEM teachers: Implementation results and improvement potentials. In Proceedings of the International Workshop on Augmented Reality in Education, Kryvyi Rih, Ukraine, 11 May 2021. [Google Scholar] [CrossRef]
  12. Pirker, J.; Dengel, A.R.; Holly, M.S.; Safikhani, S. Virtual Reality in Computer Science Education: A Systematic Review. In Proceedings of the 26th ACM Symposium on Virtual Reality Software and Technology, Ottawa, Canada, 1–4 November 2020. [Google Scholar] [CrossRef]
  13. Mystakidis, S.; Fragkaki, M.; Filippousis, G. Ready Teacher One: Virtual and Augmented Reality Online Professional Development for K-12 School Teachers. Computers 2021, 10, 134. [Google Scholar] [CrossRef]
  14. Stanford University. Stanford Researchers Release Virtual Reality Simulation that Transports Users to Ocean of the Future. 2019. Available online: https://news.stanford.edu/stories/2016/10/virtual-reality-simulation-transports-users-ocean-future (accessed on 21 August 2025).
  15. Purdue University Northwest. CIVS Developed Underwater VR Lab Modules for An NSF-Funded STEM Project. 2023. Available online: https://www.pnw.edu/civs/2023/11/07/civs-technology-applied-to-nsf-funded-national-stem-project/ (accessed on 21 August 2025).
  16. Fauville, G.; Voski, A.; Mado, M.; Lantz-Andersson, A. Underwater virtual reality for marine education and ocean literacy: Technological and psychological potentials. Environ. Educ. Res. 2024, 1–25. [Google Scholar] [CrossRef]
  17. Aldhaheri, S.; Hu, Y.; Xie, Y.; Wu, P.; Kanoulas, D.; Liu, Y. Underwater Robotic Simulators Review for Autonomous System Development. arXiv 2025, arXiv:2504.06245v1. [Google Scholar]
  18. Rosette, M.; Kolano, H.; Holm, C.; Hollinger, G.A.; Marburg, A.; Pickett, M.; Davidson, J.R. WAVE: An Open-Source Underwater Arm-Vehicle Emulator. Available online: https://research.engr.oregonstate.edu/rdml/sites/research.engr.oregonstate.edu.rdml/files/icra24_1333_fi.pdf (accessed on 21 August 2025).
  19. Yoon, J.; Son, B.; Lee, D. Comparative Study of Physics Engines for Robot Simulation with Mechanical Interaction. Appl. Sci. 2023, 13, 680. [Google Scholar] [CrossRef]
  20. Sinnott, C.; Liu, J.; Matera, C.; Halow, S.; Jones, A.; Moroz, M.; Mulligan, J.; Crognale, M.; Folmer, E.; MacNeilage, P. Underwater Virtual Reality System for Neutral Buoyancy Training: Development and Evaluation. In Proceedings of the 25th ACM Symposium on Virtual Reality Software and Technology (VRST ‘19), Parramatta, Australia, 12–15 November 2019; Association for Computing Machinery: New York, NY, USA; pp. 1–9. [Google Scholar] [CrossRef]
  21. Haklidir, M.; Guven, A.F.; Eroglu, O.; Aldogan, D.; Tasdelen, I. High Fidelity Modelling and Simulation of Submarine in a Commercial Computer Generated Forces Toolkit. In Proceedings of the Society for Modeling & Simulation International [SCS] 2009 Summer Computer Simulation Conference (SCSC’09), Istanbul, Turkey, 13–16 July 2009; Available online: https://www.researchgate.net/publication/234715910_High_Fidelity_Modeling_and_Simulation_of_Submarine_in_a_Commercial_Computer_Generated_Forces_Toolkit (accessed on 21 August 2025).
  22. Chin, C.S.; Kamsani, N.B.; Zhong, X.; Cui, R.; Yang, C. Unity3D Serious Game Engine for High Fidelity Virtual Reality Training of Remotely-Operated Vehicle Pilot. In Proceedings of the 10th International Conference on Modelling, Identification and Control (ICMIC), Guiyang, China, 2–4 July 2018; pp. 1–6. [Google Scholar] [CrossRef]
  23. Tohoku University. Let it Flow: Recreating Water Flow for Virtual Reality. ScienceDaily, 21 September 2023. Available online: www.sciencedaily.com/releases/2023/09/230920111215.htm (accessed on 21 August 2025).
  24. Makransky, G.; Terkildsen, T.S.; Mayer, R.E. Adding immersive virtual reality to a science lab simulation causes more presence but less learning. Learn. Instr. 2019, 60, 225–236. [Google Scholar] [CrossRef]
  25. Huang, F.; Chen, X.; Xu, Y.; Yang, X.; Chen, Z. Immersive virtual simulation system design for the guidance, navigation and control of unmanned surface vehicles. Ocean. Eng. 2023, 281, 114884. [Google Scholar] [CrossRef]
  26. Deck, C. Virtual reality and science, technology, engineering, and mathematics education. Elsevier Ebooks 2023, 189–197. [Google Scholar] [CrossRef]
  27. Paskova, A. Features of application of immersive technologies of virtual and augmented reality in higher education. Vestn. Majkopskogo Gos. Tehnol. Univ. 2022, 14. [Google Scholar] [CrossRef]
  28. Zikas, P.; Papagiannakis, G.; Lydatakis, N.; Kateros, S.; Ntoa, S.; Adami, I.; Stephanidis, C. Immersive visual scripting based on VR software design patterns for experiential training. Vis. Comput. 2020, 36, 1965–1977. [Google Scholar] [CrossRef]
  29. Matovu, H. Immersive virtual reality for science learning: Design, implementation, and evaluation. Stud. Sci. Educ. 2022, 59, 205–244. [Google Scholar] [CrossRef]
  30. Danmali, S.S.; Onansanya, S.A.; Atanda, F.A.; Abdullahi, A. Application of Virtual Reality in STEM Education for Enhancing Immersive Learning and Performance of At-Risk Secondary School Students. Int. J. Res. Innov. Soc. Sci. 2024, 8, 3971–3984. [Google Scholar] [CrossRef]
  31. Li, S. An Analysis of Immersive Communication in VR Documentary: A Case Study of Planet Earth II. Commun. Humanit. Res. 2023, 16, 110–115. [Google Scholar] [CrossRef]
  32. Zhu, J. Immersive Experience Design of Digital Media Interactive Art Based on Virtual Reality. Comput. Aided Des. Appl. 2024, 21, 164–177. [Google Scholar] [CrossRef]
  33. Harris, T.M. Immersive virtual reality and spatial analysis, Chapters. In Handbook of Spatial Analysis in the Social Sciences; Rey, S.J., Franklin, R.S., Eds.; Edward Elgar Publishing: Cheltenham, UK, 2022; Chapter 20; pp. 336–351. Available online: https://ideas.repec.org/h/elg/eechap/19110_20.html (accessed on 21 August 2025).
  34. Azofeifa, J.D.; Rueda-Castro, V.; Gonzalez-Gomez, L.J.; Noguez, J.; Caratozzolo, P. Revolutionizing Training in Continuing Engineering Education: An Experimental Proposal Using the Multi-Sensory Virtual Decision-Making Center. IEEE Access 2024, 12, 192440–192448. [Google Scholar] [CrossRef]
  35. Truchly, P.; Medvecky, M.; Podhradsky, P.; Vanco, M.P. Virtual Reality Applications in STEM Education. In Proceedings of the 2018 16th International Conference on Emerging eLearning Technologies and Applications (ICETA), Stary Smokovec, Slovakia, 15–16 November 2018; pp. 597–602. [Google Scholar] [CrossRef]
  36. Ma, T.; Xiao, X.; Wee, W.G.; Han, C.Y.; Zhou, X. A 3D Virtual Learning System for STEM Education. In Virtual, Augmented and Mixed Reality; Applications of Virtual and Augmented Reality; VAMR 2014; Lecture Notes in Computer Science; Shumaker, R., Lackey, S., Eds.; Springer: Cham, Switzerland, 2014; Volume 8526. [Google Scholar] [CrossRef]
  37. Drossis, G.; Birliraki, C.; Stephanidis, C. Interaction with Immersive Cultural Heritage Environments Using Virtual Reality Technologies. In HCI International 2018–Posters’ Extended Abstracts; HCI 2018; Communications in Computer and Information Science; Stephanidis, C., Ed.; Springer: Cham, Switzerland, 2018; Volume 852, pp. 177–183. [Google Scholar] [CrossRef]
  38. Gómez, P.M.M.; Gómez, R.J.M.; Antonio, D.; Borré, F. Virtual Immersion in STEM Education: A MICMAC Study on How Virtual Reality Impacts the Understanding and Application of Scientific Concepts. Evol. Stud. Imaginative Cult. 2024, 8, 162–173. [Google Scholar] [CrossRef]
  39. Johnson, D.; Mamani, B.; Salas, C. CollabVR: VR Testing for Increasing Social Interaction between College Students. Computers 2024, 13, 40. [Google Scholar] [CrossRef]
  40. Lytras, M.D.; Papadopoulou, P.; Misseyanni, A.; Marouli, C.; Alhalabi, W.; Daniela, L. Moving virtual and augmented reality in the learning cloud: Design principles for an agora of active visual learning services in stem education. In Proceedings of the 9th International Conference on Education and New Learning Technologies, Barcelona, Spain, 3–5 July 2017; pp. 7673–7678. [Google Scholar] [CrossRef]
  41. Mo, J. Improving the Middle School STEM Education in Rural China Through Virtual Reality. In Proceedings of the 4th International Conference on Educational Reform, Management Science and Sociology (ERMSS 2023), Detroit, MI, USA, 7–8 January 2023; Volume 9. [Google Scholar] [CrossRef]
  42. Lynch, T.; Ghergulescu, I. Review of virtual labs as the emerging technologies for teaching stem subjects. In Proceedings of the 11th International Technology, Education and Development Conference, Valencia, Spain, 6–8 March 2017; pp. 6082–6091. [Google Scholar] [CrossRef]
  43. Sun, W.; Chen, Q. The Design, Implementation and Evaluation of Gamified Immersive Virtual Reality (IVR) for Learning: A Review of Empirical Studies. In Proceedings of the 17th European Conference on Games Based Learning, Enschede, The Netherlands, 5–6 October 2023. [Google Scholar] [CrossRef]
  44. Gerini, L.; Delzanno, G.; Guerrini, G.; Solari, F.; Chessa, M. Gamified Virtual Reality for Computational Thinking. In Proceedings of the Gamify 2023: Proceedings of the 2nd International Workshop on Gamification in Software Development, Verification, and Validation, San Francisco, CA, USA, 4 December 2023; pp. 13–21. [Google Scholar] [CrossRef]
  45. Ruscanu, A.-M.; Ciupe, A.; Meza, S. arPcTECHture–a gamified educational 3D virtual world for introductory concepts in computer architecture. In Proceedings of the IEEE Global Engineering Education Conference, Tunis, Tunisia, 28–31 March 2022; pp. 1437–1442. [Google Scholar] [CrossRef]
  46. Tiefenbacher, F. Evaluation of Gamification Elements in a VR Application for Higher Education. In Systems, Software and Services Process Improvement; EuroSPI 2020; Communications in Computer and Information Science; Yilmaz, M., Niemann, J., Clarke, P., Messnarz, R., Eds.; Springer: Cham, Switzerland, 2020; Volume 1251. [Google Scholar] [CrossRef]
  47. Gibson, J.J. The Theory of Affordance. In Ecological Approach to Visual Perception; Houghton Mifflin Company: Boston, MA, USA, 1979; Available online: https://monoskop.org/images/c/c6/Gibson_James_J_1977_1979_The_Theory_of_Affordances.pdf (accessed on 21 August 2025).
  48. Jackson, S.A.; Marsh, H. Development and validation of a scale to measure optimal experience: The Flow State Scale. J. Sport Exerc. Psychol. 1996, 18, 17–35. [Google Scholar] [CrossRef]
  49. Calvillo-Gámez, E.H.; Cairns, P.; Cox, A.L. Assessing the Core Elements of the Gaming Experience. In Evaluating User Experience in Games; Human-Computer Interaction Series; Bernhaupt, R., Ed.; Springer: London, UK, 2010. [Google Scholar] [CrossRef]
  50. AlGerafi, M.A.M.; Zhou, Y.; Oubibi, M.; Wijaya, T.T. Unlocking the Potential: A Comprehensive Evaluation of Augmented Reality and Virtual Reality in Education. Electronics 2023, 12, 3953. [Google Scholar] [CrossRef]
  51. Bernhard, K.; Henning, S. Development of a software-tool to evaluate the tolerability of different VR-movement types. Health Technol. 2024, 14, 781–790. [Google Scholar] [CrossRef]
  52. Ringle, C.M.; Wende, S.; Becker, J.-M. “SmartPLS3” Boenningstedt: SmartPLS GmbH. 2015. Available online: http://www.smartpls.com (accessed on 21 August 2025).
  53. Hair, J.F.; Hult, G.T.M.; Ringle, C.M.; Sarstedt, M. A Primer on Partial Least Squares Structural Equation Modeling (PLS-SEM), 2nd ed.; Sage Publications: Thousand Oaks, CA, USA, 2017. [Google Scholar]
  54. Henseler, J.; Ringle, C.M.; Sinkovics, R.R. The use of partial least squares path modeling in international marketing. Adv. Int. Mark. 2009, 20, 277–319. [Google Scholar] [CrossRef]
  55. Cohen, J. Statistical Power Analysis for Behavioral Sciences, 2nd ed.; Lawrence Erlbaum Associates: Hillsdale, NJ, USA, 1988; Available online: https://utstat.utoronto.ca/brunner/oldclass/378f16/readings/CohenPower.pdf (accessed on 21 August 2025).
  56. Anastasovitis, E.; Roumeliotis, M. Enhanced inclusion and accessibility in education and training through virtual worlds. Metaverse 2024, 5, 2836. [Google Scholar] [CrossRef]
  57. Hair, J.F.; Risher, J.J.; Sarstedt, M.; Ringle, C.M. When to use and how to report the results of PLS-SEM. Eur. Bus. Rev. 2019, 31, 2–24. [Google Scholar] [CrossRef]
Figure 1. Using the main menu during construction. (a) Selecting 3D shapes. (b) Setting the properties of a 3D shape.
Figure 2. Three different submarine models showing the flexibility of the construction process. (a) Model 1. (b) Model 2. (c) Model 3.
Figure 3. The Underwater Simulation Mode. (a) Command dashboard in simulation mode. (b) Spiraling model in simulation mode.
Figure 4. Use of gamification in the Submarine Simulator. (a) Submarine model competing against other students in underwater track. (b) Submarine model winning a race.
Figure 5. Submarine designing process in VR environment using different functionalities. (a) Rescaling objects. (b) Combining objects. (c) Centering objects.
Figure 6. Examples of models designed by the students of the target group. (a) Model in a tight spiral motion. (b) Model in a loose spiral motion.
Figure 7. The linear navigation course.
Figure 8. The lateral navigation course.
Figure 9. Race track testing model’s vertical and horizontal maneuverability.
Figure 10. The path coefficients for affordance and flow (Smart PLS output version 3.2.9).
Figure 11. The path coefficients for flow and CEGE (Smart PLS output version 3.2.9).
Table 1. Variable description (loading factor values in parentheses).
FLOW (Characteristics for the Flow State):
  • Action Awareness: merging of action and awareness (0.653).
  • Autotelic Experience: the activity is its own reward (0.813).
  • Challenge Skills: balance between challenge and skill level (0.558).
  • Clear Goals: knowing what needs to be conducted in the given circumstance (0.466).
  • Concentration Tasks: focused attention on the task (0.645).
  • Loss of Self-Control: feeling of effortless action (0.664).
  • Paradox Control: sense of control without deliberate effort (0.803).
  • Unambiguity: immediate feedback on performance (0.373).
AFFORDANCE (Traits Related to Affordance):
  • Engagement: the degree to which a user interacts with or is drawn into the perceived possibilities of an object (0.483).
  • FluidMech: how close to reality the fluids and forces behave in the simulation (0.648).
  • Actions: the visible or perceptible cues an object provides about its possible actions or uses (0.595).
  • Functions: the available functions to manipulate shapes and models (0.600).
CEGE (Software Evaluation):
  • Control: a sense of agency or influence over the situation/task (0.333).
  • Enjoyment: experiencing pleasure or satisfaction from the activity (0.971).
  • Environment: the setting or context in which the activity occurs (0.395).
  • Facilitator: something in the simulation that helps make an action or process easier (0.723).
  • Gamification: how the game-design elements have been applied in the simulation (0.603).
  • Ownership: feeling of personal connection or responsibility towards the task/activity (0.705).
Table 2. R2—Coefficient of Determination for affordance and flow.
Dependent variable: Flow. R2 = 0.658; adjusted R2 = 0.644. Interpretation: strong (substantial effect, R2 > 0.50).
Table 3. Effect Size for Predictors related to affordance and flow.
Affordance → Flow: f2 (effect size) = 1.924; interpretation (Cohen’s guideline): very large effect (f2 > 0.35).
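As a consistency check, for a model with a single predictor Cohen's f2 reduces to R2 / (1 - R2), so the effect sizes in Tables 3 and 8 follow directly from the R2 values in Tables 2 and 7. A one-line verification in Python:

```python
# Check: with a single predictor, Cohen's f2 = R2 / (1 - R2).
for label, r2 in [("Affordance -> Flow", 0.658), ("Flow -> CEGE", 0.586)]:
    print(label, round(r2 / (1 - r2), 3))
# Prints 1.924 and 1.415, matching Tables 3 and 8 up to rounding.
```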
Table 4. Construct Reliability and Validity for affordance and flow.
Thresholds: Cronbach’s Alpha (CA), rho_A, and Composite Reliability (CR) ≥ 0.7; Average Variance Extracted (AVE) ≥ 0.5.
AFF: rho_A = 1.000; CA, CR, and AVE not reported.
Flow: CA = 0.842; rho_A = 0.862; CR = 0.839; AVE = 0.407.
Table 5. Fornell–Larcker Criterion.
AFF: not reported.
Flow: 0.811 (correlation with AFF); 0.638 (diagonal entry, equal to the square root of the flow AVE).
Table 6. Model Fit for Saturated and Estimated Models on affordance and flow.
The saturated and estimated models yield identical values: SRMR = 0.121; d_ULS = 1.138; d_G = 0.820; Chi-Square = 81.449; NFI = 0.509.
Table 7. R2—Coefficient of Determination for flow and CEGE.
Dependent variable: CEGE. R2 = 0.586; adjusted R2 = 0.569. Interpretation: strong (substantial effect, R2 > 0.50).
Table 8. Effect Size for Predictors related to flow and CEGE.
Flow → CEGE: f2 (effect size) = 1.414; interpretation (Cohen’s guideline): very large effect (f2 > 0.35).
Table 9. Construct Reliability and Validity for flow and CEGE.
Thresholds: Cronbach’s Alpha (CA), rho_A, and Composite Reliability (CR) ≥ 0.7; Average Variance Extracted (AVE) ≥ 0.5.
CEGE: CA = 0.793; rho_A = 0.869; CR = 0.803; AVE = 0.432.
Flow: rho_A = 1.000; CA, CR, and AVE not reported.
Table 10. Model Fit for Saturated and Estimated Models on flow and CEGE.
The saturated and estimated models yield identical values: SRMR = 0.107; d_ULS = 1.385; d_G = 0.874; Chi-Square = 88.938; NFI = 0.641.
